The breakdown below shows the increases in severity that our Assessors recorded across all categories of videos.
Category A: over 550-fold increase
Category B: over 120-fold increase
Category C: over 180-fold increase
In comparison, criminal assessments of non-AI-generated videos attributed 43% to Category A severity. This highlights the concern that AI tools are enabling the creation of significantly more severe content, often using real images of victims and survivors to generate this material, further increasing the revictimisation of children.
Some readers may find the following descriptions distressing; please feel free to skip this section.
As the quality of generative AI image tools has improved, we have seen the emergence of photorealistic AI child sexual abuse images. Inevitably, the development of generative AI video tools has brought an alarming increase in the quality of AI child sexual abuse videos.
We first encountered generative AI child sexual abuse videos in early 2024. The first examples we saw were “deepfake” videos – existing adult pornography altered with AI to replace the face of a woman with that of a known child victim. Even at this early stage of AI video, the effect was convincing. Shortly after came the next step: a fully synthetic AI sexual abuse video showing a boy and a man in a scenario entirely generated by AI. The video was of low quality, more a choppy succession of photos than a true video, but it was our first warning of the direction generative AI was taking.
The quality of generative AI videos has grown rapidly in the past two years. We are now seeing photorealistic sexual abuse videos showing known child victims in entirely new scenarios. In many cases the abuse depicted is worse than that seen in the original images and videos of a victim. And creators are not only generating videos; they are also making and sharing models of known victims that allow others to generate their own abuse videos depicting that victim.
“Deepfake” child sexual abuse videos have become effortless to generate. Numerous websites exist that require a single image of a child to generate a video of them stripping or performing sexual acts. With no safeguards in place, these websites have given anyone the ability to create child sexual abuse videos.
Generative AI is now limited only by the user’s imagination. In the wrong hands, that has terrible consequences.