In 2025 we saw a 50% increase in the number of videos we assessed as showing child sexual abuse.
The breakdown below shows increases across all severity categories.
Category A: 29% increase
Category B: 14% increase
Category C: 214% increase
Category A: Videos depicting penetrative sexual activity, sexual activity involving an animal, or sadistic conduct.
Category B: Videos depicting non-penetrative sexual activity.
Category C: Other indecent videos that do not fall within Categories A or B, for example videos showing nude, partially nude, or topless sexual posing.
We assess videos of child sexual abuse material ranging from a GIF lasting a few seconds to a self-generated video lasting several hours. A child's emotional response to direction, instruction or the act of abuse can be distressing for assessors, which is why we usually analyse videos without sound.
Assessors might see hundreds of children abused within a single video: a perpetrator could compile a series of composite images, screenshots of digital media files, spliced CSAM videos featuring multiple children, or footage of the same child in multiple settings at different times.
We have seen more videos that appear to show an adult recording themselves abusing a child. The abuse often seems performative, with the adult interacting with the recording device or with viewers, suggesting the video may have been livestreamed.
Videos can also be used to conceal CSAM: sometimes the abuse imagery is spliced into mainstream digital movie files, disguising the content from a casual observer.