Reports of potential incidents of online child sexual exploitation increased by 35% in 2021 compared to 2020, according to new data released by the National Center for Missing and Exploited Children. 


What You Need To Know

  • Reports of potential incidents of online child sexual exploitation increased by 35% in 2021 compared to 2020, according to new data released by the National Center for Missing and Exploited Children

  • The National Center for Missing and Exploited Children received 29.3 million reports of suspected child sexual exploitation last year alone

  • NCMEC alerted law enforcement agencies across the country to over 4,000 potential new cases of child sexual exploitation in 2021

  • The NCMEC notes the number of reports does not necessarily equate to the number of children targeted by predatory online behavior

The NCMEC, a nonprofit organization founded by Congress in the 1980s, on Thursday published its 2021 CyberTipline Report. The CyberTipline offers the public and electronic service providers a space to report suspected incidents of child exploitation on the internet. 

"These reports concern the sexual exploitation of children around the globe,” Michelle DeLaune, NCMEC SVP and COO, wrote in a statement. “We share this data to continue building awareness of the insidious nature of child sexual exploitation occurring online.”

 

The tipline takes reports on multiple kinds of online abuse, ranging from the possession, manufacture and distribution of child sexual abuse material (CSAM), also known as child pornography, to the online enticement of children for sexual acts, child sex trafficking, child sex tourism, obscene material sent to children, misleading domain names, words or images, and the online molestation of children.

Child sexual abuse material made up 99% of the incidents recorded by the tipline last year, with 29,309,106 reports. Altogether, the tipline recorded 29,397,681 reports in 2021, an overall increase of 35% from 2020 and a 73% increase from 2019. 

The vast majority of the 29.3 million reports last year came from electronic service providers, meaning various companies reported “instances of apparent child sexual abuse material that they become aware of on their systems,” per NCMEC.

Meta’s reports made up the bulk of those from service providers, with a combined 26.8 million reports across Facebook, Instagram and WhatsApp. No other company or site reported incidents in the millions. 

“Higher numbers of reports can be indicative of a variety of things including larger numbers of users on a platform or how robust an ESP’s efforts are to identify and remove abusive content,” NCMEC wrote in a statement. “NCMEC applauds ESPs that make identifying and reporting this content a priority and encourages all companies to increase their reporting to NCMEC. These reports are critical to helping remove children from harmful situations and to stopping further victimization.”

The NCMEC notes the number of reports does not necessarily equate to the number of children targeted by predatory online behavior, saying in part: “Unfortunately, child sexual abuse images and videos are often circulated and shared online repeatedly. CSAM of a single child victim can be circulated for years after the initial abuse occurred.” 

Through a new system implemented last year, the organization can now differentiate between newly created material and content already known to be circulating online. Thanks to the upgraded technology, NCMEC alerted law enforcement agencies across the country to more than 4,000 potential new cases of child sexual exploitation.
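
NCMEC has not published the technical details of that system. As a general illustration only, services that screen for known material often compare a hash, or digital fingerprint, of each file against a shared list of hashes of previously reported content, flagging unmatched files as potentially new. The Python sketch below is a minimal, hypothetical example of that idea; the KNOWN_HASHES set, the triage helper and the use of SHA-256 are illustrative assumptions, and real systems typically rely on perceptual hashes such as PhotoDNA so that resized or re-encoded copies still match.

```python
import hashlib
from pathlib import Path

# Hypothetical: in practice this would be loaded from a shared list of
# hash values for previously reported material, not hard-coded.
KNOWN_HASHES: set[str] = set()

def is_previously_seen(file_path: Path) -> bool:
    """Return True if the file's digest matches already-reported material."""
    digest = hashlib.sha256(file_path.read_bytes()).hexdigest()
    return digest in KNOWN_HASHES

def triage(file_path: Path) -> str:
    """Label a reported file as recirculated or potentially new."""
    return "recirculated" if is_previously_seen(file_path) else "potentially new"
```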

It’s unclear what exactly contributed to the rise in reports, and NCMEC did not offer any explanations. 

Increased internet usage during the pandemic may have led to more reports of online sexual material depicting children. According to a series of Digital America reports, internet usage in the U.S. increased 0.6% between 2019 and 2020, and another 0.6% between January 2020 and January 2021; between January 2021 and January 2022, the U.S. saw a 2.8% surge, a rate more than four times that of either of the two prior years.

Still, that data is not limited to children and may not accurately reflect how much time young people spend online. A separate study from Canada’s University of Calgary and the New York University School of Medicine, published in the medical journal JAMA Pediatrics in February, found that a majority of children five years of age and younger exceeded the recommended daily screen time. 

Pediatric guidelines, as cited by the paper, maintain that children under two years old should have no screen time at all, and that children between the ages of two and five should spend no more than an hour a day in front of a screen. 

Social media companies in particular have faced growing pressure, from lawmakers and average Americans alike, to better protect children on their platforms. Some of the calls center on child safety, while others focus on the potential mental health hazards posed by frequent social media use. 

A day before NCMEC published its report, Instagram rolled out a series of highly anticipated updates aimed at making the platform safer for teens by giving parents more supervision tools and resources in the app. 

Parents of teens aged 13 to 17 can now see how much time their child spends on Instagram, set time limits, receive a notification when their child reports a user on the app, and view updates on what kind of content their teen is viewing. 

The supervision controls are optional, and the teen must consent to participate; either the child or the parent can end the arrangement at any time. The controls automatically expire when a user turns 18, based on the birth date provided when the account was created.