Child Sexual Abuse Online

Eighty‑nine thousand devices in the United States are linked to downloads and sharing of child rape videos over the past six months, according to figures presented by former NFL quarterback and child‑protection advocate Tim Tebow at a Senate Judiciary subcommittee hearing on child trafficking.

The number is not a nationwide estimate of offenders, but a mapping of IP addresses and devices suspected of circulating graphic child sexual abuse material (CSAM) within US borders in a recent six‑month window. The disclosure has intensified scrutiny over the scale of online child exploitation, the limits of law‑enforcement response, and the role of technology platforms in facilitating or obstructing investigations.

Scale of the threat in the United States

At the subcommittee hearing, titled 'Lost and Exploited: Confronting Child Trafficking and the Failure to Protect America's Most Vulnerable', Tebow told lawmakers that mapping data from US law‑enforcement cyber units shows a six‑month snapshot of the country dotted with more than 330 red pins, each representing at least one device or IP address actively downloading, sharing or distributing child rape videos, nearly all depicting victims under the age of 12.

He added that the global database of material depicting unidentified child‑abuse victims, Interpol's International Child Sexual Exploitation (ICSE) database, has grown from about 57,000 unidentified children two years ago to more than 89,000.

Those figures are not official nationwide arrest totals; they reflect a snapshot of IP‑based activity shared by US cyber‑crime units, including the Homeland Security Investigations (HSI) Cyber Crimes Center and related multi‑agency task forces. Investigators and analysts stress that a single IP address may host multiple devices or accounts, and that many offenders use virtual‑private‑network (VPN) tools, anonymising browsers and the dark web, all of which can obscure the true number of individuals behind the data.

Nonetheless, law‑enforcement officials and child‑protection advocates say the 89,000‑plus figure signals a steep rise in the visibility of child‑abuse material and in the number of distinct digital footprints law enforcement can now trace.

Role of NCMEC and the CyberTipline

The National Center for Missing and Exploited Children (NCMEC) operates the CyberTipline, a congressionally mandated clearinghouse through which US‑based electronic service providers (ESPs) must report suspected child sexual abuse material in accordance with 18 U.S.C. § 2258A.

Companies that voluntarily participate share hashes (digital fingerprints of known CSAM images and videos) so that platforms can detect and remove abusive content and report matches to the CyberTipline, which then forwards cases to the appropriate law‑enforcement agency. As of 31 December 2024, NCMEC had shared more than 9.8 million hashes with 55 traditional ESPs and 17 non‑traditional ESPs that have opted into the hash‑sharing initiative.
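To make the mechanics concrete, here is a minimal Python sketch of the kind of hash‑list matching the initiative enables. It is illustrative only: real deployments rely on systems such as Microsoft's PhotoDNA and NCMEC's actual hash formats, and the function names, file paths and placeholder digest below are hypothetical.

```python
import hashlib


def sha256_of_file(path: str) -> str:
    """Stream a file through SHA-256 so large videos never load fully into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


# In practice this set would be populated from a hash list shared by a
# clearinghouse such as NCMEC; the digest below is only a placeholder.
KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}


def should_report(path: str) -> bool:
    """Flag a file for removal and a CyberTipline report if its hash is on the list."""
    return sha256_of_file(path) in KNOWN_HASHES
```

Because a cryptographic hash changes completely if even one byte of the file changes, matching of this kind only catches exact copies of known material, a limitation that becomes important in the discussion of AI‑generated content below.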


NCMEC analysts categorise each reported item by age range, severity and context, tagging content that involves infants, toddlers, violence or bestiality so that investigators can prioritise the most urgent cases.

Some of those cases underpin the US‑side mapping that Tebow referenced; investigators at the CyberTipline and partner agencies use those tags to direct resources toward the most severe patterns of abuse and to coordinate with international bodies such as Interpol and Eurojust. Despite these tools, experts warn that the sheer volume of reports—millions per year—has outpaced the number of specialised analysts and investigators, leaving significant backlogs in victim identification and offender pursuit.
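As a rough illustration of how tag‑driven triage can work, the Python sketch below orders a queue of reports by severity. The tag names echo those in the paragraph above, but the weights and scoring scheme are hypothetical, not NCMEC's actual criteria.

```python
from dataclasses import dataclass, field

# Hypothetical severity weights; NCMEC's real prioritisation criteria
# are richer and are not published in this form.
TAG_WEIGHTS = {"infant": 5, "toddler": 4, "violence": 3, "bestiality": 3}


@dataclass
class Report:
    report_id: str
    tags: list[str] = field(default_factory=list)

    def priority(self) -> int:
        # Unrecognised tags still count for 1 so no report scores zero.
        return sum(TAG_WEIGHTS.get(tag, 1) for tag in self.tags)


def triage(reports: list[Report]) -> list[Report]:
    """Order reports so the most severe tag combinations surface first."""
    return sorted(reports, key=Report.priority, reverse=True)


# Example: a report tagged "infant" and "violence" outranks one tagged only "violence".
queue = triage([Report("A1", ["violence"]), Report("B2", ["infant", "violence"])])
assert queue[0].report_id == "B2"
```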

Technology, AI‑generated material and investigative bottlenecks

The rising number of devices downloading child‑rape videos coincides with a surge in artificial‑intelligence‑generated child sexual abuse material (AI‑CSAM). The UK‑based Internet Watch Foundation (IWF), which verifies and coordinates takedowns of CSAM worldwide, reported that it detected 1,286 AI‑generated child‑abuse videos in the first half of 2025, up from just two in the same period of 2024.

By the end of 2025, IWF data showed more than 3,400 AI‑generated CSAM videos, more than half of them classified as Category A, the most severe tier, which covers penetrative sexual activity, sadism and bestiality.

Law‑enforcement officials and outside researchers say those AI tools lower the barrier to creating and distributing CSAM, because they let users generate photo‑realistic images and videos without physically abusing a child, while still causing measurable psychological harm to real victims when their likeness is misused.

That shift complicates investigations, because AI‑generated content may not match existing hash lists, requiring manual review and forensic analysis rather than automated detection. At the same time, the adoption of end‑to‑end encryption by major messaging platforms has restricted companies' ability to scan content, even when law enforcement suspects that CSAM is being shared.
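One way platforms narrow the gap left by exact hash lists is perceptual hashing, which scores visual similarity instead of byte‑for‑byte identity. The sketch below uses the open‑source imagehash library as a stand‑in for proprietary systems such as PhotoDNA; the distance threshold and file names are illustrative assumptions.

```python
from PIL import Image   # pip install pillow
import imagehash        # pip install ImageHash


def is_visually_similar(candidate_path: str, known_path: str,
                        max_distance: int = 8) -> bool:
    """Compare perceptual hashes; a small Hamming distance means a near-duplicate.

    Unlike SHA-256, a perceptual hash changes only slightly when an image is
    re-encoded, resized or lightly edited, so near-copies can still match.
    """
    candidate = imagehash.phash(Image.open(candidate_path))
    known = imagehash.phash(Image.open(known_path))
    return (candidate - known) <= max_distance  # subtraction yields Hamming distance
```

Even so, wholly novel AI‑generated imagery has no near‑duplicate on any list, which is why human review remains the backstop described above.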

US prosecutors and federal task forces have repeatedly argued that encryption and privacy‑enhancing features, while legitimate in other contexts, must be balanced against the ability to detect, disrupt and prosecute child‑abuse networks.

In recent cases, defendants convicted of downloading hundreds or thousands of child‑abuse videos from dark‑web sites have received multi‑year prison sentences, illustrating how law enforcement is using seized digital evidence to trace offenders and link them to particular IP addresses and devices. Nonetheless, the 89,000‑plus device figure underscores that enforcement capacity still lags behind the volume of digital footprints emerging online.

Eighty‑nine thousand devices linked to child‑rape‑video traffic in half a year is a disturbing reminder of how far behind the curve law enforcement still stands and of how urgently policy, technology and enforcement must evolve to match the reach of online predators.