Criminal imagery depicting older teenagers is seen regularly by our Internet Content Analysts, both online and via our Child Reporting Services. The child victims in this imagery are under 18 but in their upper teenage years, and it can often be challenging to visually determine their age, so much so that this imagery can be found on pornography websites, where it is arranged to ‘blend in’ with legal adult pornography.
We know that the biggest users of the Report Remove service, which allows UK children to report their own imagery to the IWF, are 14-17 year olds, specifically boys aged 16-17. A further breakdown of these reports can be found on the Report Remove page.
The older teen group is also the most likely to be reporting incidents of sexually coerced extortion, with 14-17-year-old boys accounting for 98% of these reports in 2025. Learn more about sexually coerced extortion.
Children in this age group were seen in 56,179 images.
A significant proportion (33%) of images involving 14–17-year-olds are self-generated, compared with just 5% across the remaining age groups combined.
Many of the images and videos featuring older teens that we assess have been encountered online, either by our Analysts through their proactive searches or through reports submitted by the public, members, law enforcement, other hotlines or by the children themselves.
In 2025 there were 10,034 reports of children in the 14-17-year-old age group. 8,938 of these were reports of URLs showing this imagery.
The previously reported trend of imagery involving 14–15-year-olds being primarily hosted on image hosting websites has continued this year, with 4,111 instances (59%) identified on this type of URL in 2025.
Where sexual abuse imagery involving 16–17-year-olds has been found online, it has most commonly appeared on video channels, with 506 instances (25%) identified on this type of site in 2025.
Imagery of older teens is often self-generated, and the majority of what we find spread online depicts girls, but the reasons behind its capture, creation and distribution are varied. Some imagery, self-reporters tell us, has been ‘leaked’ without their knowledge, and some may have been ‘faked’, potentially by AI tools. Other instances depict aggressive and humiliating sexually coerced extortion. Some victims may not have known they were being recorded during a coerced interaction. Victims may be unaware that their imagery is sought after, regularly posted, and shared.
Our Report Remove tool provides UK-based children with a place to self-report imagery of themselves. When children cannot use a child reporting service, our collaboration with CAID, law enforcement and charity partners helps us confirm the ages of older child victims and take steps to prevent their re-victimisation online. Some of this content may have been online for years. In fact, some adult ‘self-reporters’ ask us for help in removing indecent imagery captured when they were teenagers, which is still appearing years later in online public spaces.
Our commitment to accurate age assessments allows us to confidently hash and remove from the internet thousands of pieces of imagery depicting older teens. Much of this imagery is in the most severe categories of harm, and depicts humiliation and degradation. Our actions can help stop this material from being overlooked as adult content, missed by moderation, or hidden in plain sight.
We witness a demand for indecent material of older children. Some websites promote indecent imagery of older teens as ‘leaked’, ‘exposed’ or ‘teen’ material, using derogatory language to celebrate such leaking and exposing of teenagers. On websites like these, collections of indecent material are advertised, promoted, branded, monetised, and directly sold for payment.
Some websites expose more than just these children’s imagery; they also reveal the children’s apparent names and locations. Comments from web users encourage such posting, ask for more content, or request more information about the child. The ease of accessing shareable links to ‘full collections’ of imagery encourages the onward sharing and collecting of indecent imagery of older children. These victims are children, and every re-post and every onward share is harmful.