Criminal images and videos depicting older teenagers are seen regularly by our analysts, both online and via our child reporting services. The child victims in this imagery are under 18 but in their upper teenage years. Because of their physical maturity, it can often be challenging to determine visually whether they are a child, and what age they might be. This imagery can be found on pornography websites, where it is designed to ‘blend in’ with, and can be difficult to distinguish from, legal adult content.
For this reason, this section focuses specifically on older teens, given the distinct risks, identification challenges and harm patterns associated with this type of content.
We know that the biggest users of the Report Remove service (launched in June 2021) are 14-to-17-year-olds, specifically boys aged 16-17. A further breakdown of these reports can be found on the Report Remove page.
Since 2022, reports to our Report Remove service involving 14-to-17-year-olds have increased 13-fold, while the number of related child sexual abuse images and videos has increased 11-fold.
The older teen group is also the most likely to report incidents of sexually coerced extortion, with 14-to-17-year-old boys accounting for 98% of these reports in 2025. Learn more about sexually coerced extortion.
These children were seen in 56,179 images, some of which contained more than one child.
A significant proportion (33%) of images involving 14-to-17-year-olds are 'self-generated', compared with just 5% across the remaining age groups combined.
Many of the images and videos featuring older teens have been encountered online, either by our analysts through their proactive searches or through reports submitted by the public, Members, law enforcement, other hotlines or by the children themselves.
In 2025 there were 10,034 reports of children in the 14-to-17-year-old group. 8,939 (89%) of these were reports of URLs showing this imagery. The remaining 1,095 reports were submitted via our child reporting services and involved children reporting imagery rather than URLs.
URLs containing sexual abuse imagery involving 16-17-year-olds most commonly appeared on video channels, with 506 instances (25%) identified on this type of site in 2025. Many sites and applications use end-to-end encryption, which restricts our visibility of the content shared on them and limits our ability to identify and take action against the child sexual abuse material they host.
Some readers may find the following descriptions distressing; please feel free to skip this section.
Images and videos of older teens are usually 'self-generated', and the majority we find online depict girls, but the reasons behind their capture, creation and distribution are varied. Some imagery, children tell us, has been ‘leaked’ without their knowledge, and some may have been ‘faked’, potentially by AI tools. Other instances depict aggressive and humiliating sexually coerced extortion. Some victims may not have known they were being recorded during a coerced interaction. Others may be unaware that their imagery is sought after, regularly posted and shared.
The Report Remove tool provides UK-based children with a place to self-report imagery of themselves. Our collaboration with the UK Home Office’s Child Abuse Image Database (CAID), law enforcement and charity partners helps us confirm the ages of older child victims and to take steps to prevent their repeated victimisation online. Some of this content may have been online for years. Some adult ‘self-reporters’ ask us for help in removing online indecent imagery captured when they were teenagers, which appears years later in online public spaces.
The physical development of older teens can make it difficult to distinguish between them and adults. This means we frequently request age verification to ensure we are correctly identifying child sexual abuse material rather than intimate image abuse involving an adult.
Age verification may come from traditional sources such as UK or international police forces, child reporting services like Report Remove or Meri Trustline, or other hotlines and trusted partners. Our commitment to accurate age verification enables us to confidently hash and remove thousands of images depicting older teenagers, ensuring our assessments are as precise as possible.
We see a demand for indecent material of older children. Some websites promote indecent imagery of older teens as ‘leaked’, ‘exposed’ or ‘teen’ material, using derogatory language to celebrate exploiting them. On websites like these, collections of indecent material are advertised, promoted, branded, monetised and directly sold for payment.
Some websites expose more than just the imagery of children: they also publish a child’s apparent name and location. Comments from web users encourage such posting, ask for more content or request more information about the child. The ease of accessing shareable links to ‘full collections’ of imagery encourages the onward sharing and collecting of indecent imagery of older children. These victims are children, and every repost and onward share is degrading and cruel.