AI-generated child sex abuse images targeted with new laws


The government has announced four new laws to address the threat of child sexual abuse images created by artificial intelligence (AI).
The Home Office states that the UK will be the first country to make it illegal to possess, create, or distribute AI tools designed to produce child sexual abuse material (CSAM), with penalties of up to five years in prison.
It will also be illegal to possess AI manuals that teach people how to use AI for sexual abuse, with offenders facing up to three years in prison.
"What we're seeing is that AI is now supercharging online child abuse," Home Secretary Yvette Cooper told the BBC's Sunday with Laura Kuenssberg.
Cooper stated that AI is "increasing the scale" of sexual abuse against children and mentioned that government actions "may need to go further."
New laws will also make it a crime to operate websites where paedophiles can share child sexual abuse content or give advice on grooming children. This offence could lead to up to 10 years in prison.
Additionally, the Border Force will have the authority to require individuals suspected of posing a sexual risk to children to unlock their digital devices for inspection when entering the UK, as CSAM is often filmed abroad. Depending on the severity of any images found, offenders could face up to three years in prison.
Artificially generated CSAM includes images that are partly or fully computer-generated. Software can "nudify" real images and swap one child's face with another, creating realistic images.
In some instances, the real voices of children are used, causing innocent survivors of abuse to be re-victimized.
Fake images are also being used to blackmail children and force victims into further abuse.
The National Crime Agency (NCA) reported that there are 800 arrests each month related to threats against children online. It stated that 840,000 adults pose a threat to children nationwide—both online and offline—making up 1.6% of the adult population.
Cooper said, "There are perpetrators using AI to groom or blackmail teenagers and children, distorting images to lure young people into further abuse. These are some of the most horrific and increasingly sadistic acts happening."
She added, "Technology keeps advancing, and our response must also evolve to protect children."
However, some experts think the government could have done more.
Prof. Clare McGlynn, an expert in the legal regulation of pornography, sexual violence, and online abuse, said the changes were "welcome" but noted "significant gaps."
She suggested the government should ban "nudify" apps and address the "normalization of sexual activity with young-looking girls on mainstream porn sites," describing these videos as "simulated child sexual abuse videos."
These videos "feature adult actors who appear very young and are shown in children's bedrooms, with toys, pigtails, braces, and other childhood markers," she said. "This material can be found with the most obvious search terms and legitimizes and normalizes child sexual abuse. Unlike in many other countries, this material is still legal in the UK."