The number of websites illegally selling access to child sex abuse images more than doubled last year, according to new research from the UK authority responsible for finding and removing such material from the internet.
The UK's Internet Watch Foundation unearthed 15,031 commercial child sexual abuse sites in 2025, a 114% increase from a year earlier, according to a report published Thursday. The foundation described these sites as offering "premium access" to illegal and abusive content, which "can involve victims of all ages, and can include some of the most severe and extreme forms of sexual abuse."
"It is clear criminals are exploiting systemic failures and are finding it far too easy to reap huge profits from children's sexual exploitation," said Kerry Smith, the foundation's chief executive. "At every stage, we need to disrupt this system. It is an industry."
Many of the offending websites were disguised, initially showing legal content and revealing the abuse material only when accessed in a certain way, according to the report. The sites process payments via cryptocurrency, money transfer services and credit cards, the group said.
Reports of child sex abuse material are on the rise worldwide. In the US, the National Center for Missing and Exploited Children received more than 21.3 million reports of abusive content last year. Well over a million of those reports involved images created with artificial intelligence tools, a rapidly emerging threat that child safety advocates warn may normalize abusive content and overwhelm the content moderation and law enforcement systems tasked with tackling it.
At the same time, technology companies have worked to improve their ability to identify and report child sex abuse material, which may also be contributing to the increase in reports.
The Internet Watch Foundation report also highlighted 397 cases of "sextortion," a 127% increase from a year earlier. In these cases, a perpetrator threatens to publish nude or sexual imagery of a victim unless they comply with demands for money or further, more extreme images.
Boys aged 14 to 17 accounted for 98% of the reported victims. Most of the reports came through Report Remove, a helpline that allows children to report sexual imagery of themselves that has been shared online, operated by the foundation and the UK's National Society for the Prevention of Cruelty to Children.
The organizations have urged tech companies to make use of available technology that prevents children from sharing and taking nude images.
Jess Phillips, the UK's minister for safeguarding and violence against women and girls, said in a statement that technology companies and the financial sector can't keep "turning a blind eye" to these online marketplaces.
"We will use the full power of the British state to deliver the biggest crackdown against child abuse, both online and offline, that this country has ever seen," she said.
(This story has not been edited by NDTV staff and is auto-generated from a syndicated feed.)
