Artificial intelligence could be fueling an epidemic of child sexual abuse, Britain’s top law enforcement agency has warned, saying that one in 50 men poses a risk to children.
The National Crime Agency (NCA) estimates that up to 830,000 adults – 1.6% of the adult population – pose some degree of sexual danger to children, a figure described as “extraordinary” by its director general, Graeme Biggar. He added that images of online abuse had a “radicalizing” effect that “normalized” such behavior.
The rapid rise of artificial intelligence (AI) means the threat to young people will only increase as fake images flood the internet, Biggar said – while other experts have warned that guides instructing offenders how to use the new technology are already circulating online.
The head of the NCA, the agency that leads the fight against serious and organized crime, said: “We believe the viewing of these images – whether real or AI-generated – greatly increases the risk that offenders will go on to sexually abuse children themselves.”
Understanding of the scale of the threat continues to grow, and Biggar said most child sexual abuse (CSA) involves the viewing of images. Eight out of 10 people arrested in connection with such abuse are men, and Biggar agreed this meant about 2% of men pose a risk to children.
Unveiling the NCA’s annual threat assessment, Biggar said: “We estimate there are between 680,000 [and] 830,000 adults in the UK who pose some degree of child sexual risk. These are extraordinary figures: about 10 times the prison population.
“They reflect in part a better understanding of a threat that has historically been underestimated, and in part a genuine increase caused by the radicalizing effect of the internet, where the widespread availability of videos and images of children being abused and raped, and groups sharing and discussing the images, has normalized such behavior.”
The figures were produced by the NCA’s National Assessments Centre, which said its methodology was sound.
It reviewed the results of online CSA surveys. Of the abusers identified online, only one in 10 was a known sex offender, while nine in 10 had not previously been identified.
The researchers therefore scaled up the number of registered sex offenders by a factor of about 10 to arrive at the estimate.
The NCA said: “Our confidence in the validity of this figure is further informed by available intelligence and subject matter expertise.”
Biggar said people active in online abuse forums were already enthusiastically discussing what AI could do and warned it was “just the beginning”.
He added: “Using AI for child sexual abuse will make it harder for us to identify real children who need protection and further normalize abuse.”
Evidence of the threat has emerged in guides circulating online for those interested in abuse images. The Internet Watch Foundation (IWF) said the technology was being used to produce “astonishingly realistic” images of children as young as three to six years old, using sophisticated image generation tools.
The IWF said it found an online guide aimed at helping offenders train an AI tool and fine-tune their prompts to return the most realistic results.
IWF chief executive Susie Hargreaves has urged Prime Minister Rishi Sunak to treat AI-generated CSA material as a priority when he hosts a global AI safety summit in the fall: “The Prime Minister must treat the grave threat it poses as his top priority when he hosts the first Global AI Summit later this year.”
She added: “Offenders are now using AI image generators to produce sometimes incredibly realistic images of child victims of sexual abuse.”
The IWF said the number of instances of AI-generated material remained low, as use of the technology is only just beginning to spread. Between May 24 and June 30, it investigated 29 reports of web pages containing material suspected of being made by AI, and confirmed that seven of them contained AI-generated CSA material, with some containing a mixture of real and AI images.
AI-generated images of child sexual abuse are illegal in the UK under the Coroners and Justice Act 2009, which contains provisions relating to the making and possession of indecent “pseudo-photographs” of children, although the IWF would like to see the law changed to directly cover AI images.
The Ada Lovelace Institute, a data and AI research body, said on Tuesday the UK needed to strengthen its regulation of the technology. Under current government proposals to oversee AI, regulation is delegated to existing regulators, an approach the institute says does not sufficiently cover the use of AI in areas such as recruitment and policing.
The institute said in a report analyzing Britain’s proposals for AI regulation that it welcomed Sunak’s commitment to global AI safety, but added that the government should also attend to its domestic regime.
“International coordination efforts are welcome, but they are not enough,” said Michael Birtwistle, associate director of the institute. “The government must strengthen its national regulatory proposals if it is to be taken seriously on AI and achieve its global ambitions.”
The institute recommended that the government consider establishing an “AI ombudsman” to support those affected by AI. It also said ministers should introduce new legislation to provide better protections “if necessary”.
A government spokesperson said the upcoming Online Safety Bill, due to come into force this year, contains provisions for the removal of CSA material from online platforms.
The threat assessment report also contains details of the evolution of drug use in Britain. Last year, it said, record quantities were available, causing prices to plummet. Analysis of sewage in some urban areas showed a 25% increase in cocaine use in 2022.
The NCA said 40 tonnes of heroin were consumed in 2022, along with 120 tonnes of cocaine.