Child sexual abuse material is assessed in line with the Sentencing Council’s Sexual Offences Definitive Guideline. The Indecent Photographs of Children section sets out the categories used to classify such material.
Category A: Images depicting penetrative sexual activity, sexual activity involving an animal, or sadistic conduct.
Category B: Images depicting non-penetrative sexual activity.
Category C: Other indecent images that do not fall within Categories A or B.
Category C includes, for example, images showing partially nude, nude, or topless sexual posing.
Percentages are rounded to the nearest whole number.
This chart shows the severity of child sexual abuse images according to UK Sentencing Council categories A, B and C.
Half of all the criminal images we assessed in 2025 were Category C, up from 29% in 2024.
The category of content we see can be influenced by
The overall number of images found by our Assessors to be criminal went down in 2025; however, this doesn't reflect the full scale of our team's work. Throughout the year we assess a large number of images from our proactive work, public reports and our CAID partnership. Not all images reviewed are classified as criminal, and those images are not reflected in the chart above.
Our review of material held within CAID, alongside our proactive analytical activity, led us to reconsider how we address content that falls short of the legal definition of child sexual abuse under UK law but nonetheless raises credible concerns of child exploitation. In response, we established a new “exploitative” classification to more accurately capture and manage this type of material. More about this new classification can be found in Methodology & datasets.
Some readers may find the following descriptions distressing; please feel free to skip this section.
We have ‘eyes on’ every single image graded, and the severity of the child sexual abuse material we see no longer surprises us.
To determine severity, we must pay close attention to images, ensuring we don’t miss a vital section of a complex grid image, an image depicting multiple children, or a composite image of multiple parts. We want our final grade to represent the true severity that the victims have experienced in that image.
When we assess imagery, we grade it using the UK Sentencing Council's Sexual Offences Definitive Guideline: Categories A, B and C. Under this guideline, Category A material is the most severe; however, it is not always the Category A material that feels most impactful.
The grade we apply most frequently is Category C. We encounter multiple large image sets that depict children, mostly girls, posed sexually in studio settings. The images are professionally taken but depict full or partial exposure of a child's genital area or pubescent breasts. These sets are so large, and feature so many unique victims, that their volume can influence our data. Many of these sets reach us through our work with CAID, whilst others are found online by our analysts.
Our analysts also locate many self-generated Category C images. These images are published in online forums at high volume and frequency, and contribute to a significant proportion of analyst work. These images share a theme in that they typically depict a girl alone with a device in a domestic setting, coerced into sexual activity by a remote offender.
We know that many of these Category C images are ‘stills’ cut from a video that the offender recorded during the online interaction. When posted in online forums these images act as a ‘preview’ to a downloadable video of the full online encounter. The videos escalate in severity, frequently showing masturbation (Category B) or penetration (Category A).
Sometimes, other aspects of the imagery, regardless of the assigned category (A, B or C), can make it feel severe: a distressed child, or a child we have seen abused so many times that we have come to develop an unfortunate familiarity with their images.
Some of the material we encounter falls outside the IWF's remit and can be particularly harrowing. We're trained and prepared to see the sexual abuse of children; however, we are less equipped to see imagery depicting violence or cruelty to children, animal abuse, or gore content.