Report Remove

Overview

To support young people in removing sexual images or videos of themselves from the internet, the IWF and the NSPCC’s Childline developed Report Remove, a world-first tool that empowers children to take control of their images. The service was launched in June 2021.

The IWF and Childline's Report Remove service ensures that young people are safeguarded and supported throughout the process. The IWF assesses reported content and takes action where it meets the threshold of illegality. Where appropriate, content is assigned a unique digital fingerprint (hash), which is shared with internet companies to help prevent the imagery from being uploaded or redistributed online.

Report Remove provides a child-centred approach to image removal that can be completed entirely online. Young people do not need to disclose their identity, can make a report at any time, and can access ongoing information and support from Childline throughout the process.

Young people create or sign into a Childline account, enabling them to receive email updates about their report. This account can also be used to access ongoing support, including contact with a Childline counsellor via online chat or freephone number, as well as relevant information and advice, self-help tools, and peer support through the Childline website.

The IWF has worked closely with the National Crime Agency and law enforcement agencies across the United Kingdom to promote Report Remove and to make it easier for officers to refer children and young people to the service. Together with the NSPCC, we have also developed signposting resources for use in schools. These awareness-raising activities are likely to have contributed to the increase in reports received in 2025.

Analyst insight

When a child uses Report Remove, they need assistance right now, in real time, so we act promptly on each new Report Remove case that comes in. We feel privileged to help children who are experiencing a live, current threat. Through our hashing and content removal we can potentially help mitigate the harm they might be facing.

The imagery children report is almost always self-generated and depicts sexual posing or a close-up of the child’s genitals or breasts. Sometimes the child is depicted performing a sexual act on themselves or on another person. Each year we see that boys use the Report Remove tool most often. For analysts, this is a strong contrast to the number of girls we see exploited through our work with public and proactive reports of URLs.

Through their reports, both boys and girls tell us their nude or sexual images and videos are already circulating on messaging apps. Others want to prevent their imagery from being uploaded in the future. It is clear to us that some children are using Report Remove because they are concerned about a threat to ‘leak’ their imagery or make it ‘go viral’.

We never know what a young person is going through when they use Report Remove, but NSPCC’s Childline counsellors offer support to any young person who uses the tool. It gives us peace of mind to know that child reporters have the option to speak to such experienced and well-informed counsellors, while we tackle the practical tasks of hashing their imagery, and requesting removal of any online content. 

This year, 20% of all reports to Report Remove involved faked imagery. The growing availability of AI tools means that bad actors can create fake explicit imagery that is increasingly difficult to detect.

It is through Report Remove that we have seen in most detail how aggressively children, particularly boys, are sexually extorted. Since 2024, there has been no sign of this threat declining.

Since the launch of Report Remove in June 2021, over 4,000 young people in the UK have used the service. It feels like a step not only towards reducing the revictimisation of children online, but towards young people feeling more empowered in the online world.

Reports

Reports received

1,894

Children can report URLs that contain their sexual imagery, or individual images and videos, via the Report Remove service.

Children using the Report Remove tool can submit up to 50 images or videos in a single report and may create multiple reports. As part of the reporting process, they provide their age and sex. The breakdown below shows the number of reports received by age and sex, highlighting the scale of harm experienced by children using the service.

  • In 2025, we received 1,894 reports through the Report Remove tool, a 66% increase on 2024.
  • We took action on 1,175 reports this year, 533 more than last year.
  • These actioned reports consisted of 2,963 individual criminal images and 509 criminal videos.

Of the 1,175 reports we assessed as showing criminal imagery in 2025, 896 (76%) contained Category C imagery.

Reports by sex – Report Remove

This chart provides an overview of the sex of the children who made reports through the Report Remove tool.

Reports by sex and age group – Report Remove

This chart shows how boys in both the 14-15 and 16-17 age groups represent the biggest users of the Report Remove service.

Reports by sex and severity – Report Remove

Most criminal images and videos reported by young people through Report Remove fall into Category C, with a notable amount of imagery of boys assessed as Category B.

  • Category A: Images involving penetrative sexual activity; images involving sexual activity with an animal; or sadism.
  • Category B: Images involving non-penetrative sexual activity.
  • Category C: Other indecent images not falling within categories A or B.

Children reporting sexually coerced extortion

We see children reporting sexual extortion when submitting their imagery or URLs via the Report Remove tool. Sexually coerced extortion is explained in more detail here.

Of the 1,175 actioned reports received via Report Remove, 394 reports contained comments indicating sexually coerced extortion.

Report Remove – sexual extortion

  • Sexual extortion
  • No sexual extortion

We have observed that boys are often lured into what they believe to be consensual exchanges of sexual images, frequently thinking they are sharing content with a peer or someone close to their own age. In reality, they are being groomed into sharing images that are subsequently used without their consent.

We know that sexually coerced extortion underpins these reports from boys, as supporting evidence is frequently included. This may take the form of chat logs demonstrating coercive behaviour, or collages created by offenders in which images are combined and overlaid with threatening or intimidating text.

In 2025, boys accounted for 98% of all reports of sexually coerced extortion.

Report Remove – sexual extortion by sex

Images

Images actioned

2,963

Reports may contain one or more images or videos. For this reason, we have provided an analysis of the individual images reported to us below. This data is drawn from our IntelliGrade system, which generates hashes of illegal imagery; these hashes are shared with industry Members to prevent the imagery from being recirculated online.

In 2025, as a result of actioning 1,175 reports, we assessed 2,963 individual images and 509 videos as illegal. 

Age, sex, and severity are recorded for all single images. However, due to the complexity of assessing videos and still imagery containing multiple images, only severity is recorded in these cases. This approach improves efficiency and helps protect the wellbeing of our image assessors, without compromising the accuracy of our assessments.

Similar to last year, more boys reported actionable images to us than girls, with boys making up 67% of the total (an increase from 64% in 2024).

75% of reports were from boys aged 14-17 and were assessed as Category C child sexual abuse.

Images by sex – Report Remove

Images by sex and age group – Report Remove

Images by sex and severity – Report Remove

 

Analyst insight

Sexual extortion reveals itself to us in some of the imagery submitted by boys to Report Remove.

Boys regularly report grid-style images to us that indicate sexually coerced extortion. These images look like a collage of harmful images joined together in a grid format. Within the grid, there may be an image of the reporter’s genitals, plus an image of his face. Another section of the grid might show his contact list from a social media site. Text overlaid on the grid accuses the child of being an offender themselves: a rapist or a paedophile. The overlaid text also encourages others to share the imagery or make it go viral.

This imagery can be hard to look at, and it feels like a particularly cruel form of online abuse. These grid images, or similar, are used to extort and threaten boys: the child must pay money or risk the imagery being shared with people they know.

Some Report Remove submissions reveal the text conversations that lead up to a boy being sexually extorted. We see criminals approach boys expressing interest in a mutual exchange of nudes. Once nudes are exchanged, the tone of the conversation changes rapidly: the offender becomes aggressive and threatening, quickly demanding money in exchange for not sharing the boy’s nude imagery. The growing availability of AI tools has further exacerbated this harm. Offenders are now able to create fake explicit imagery even when a young person has not shared any sexual content.

As analysts, we find it hard to see the child’s panicked responses as they struggle to manage the level of threat they face. We feel relieved that Childline can offer support to the children experiencing sexually coerced extortion, and that our hashes can help prevent further upload of this imagery.

Many of the male and female children who use the service report nude or sexual imagery without evidence of sexually coerced extortion. Some children report just one image; others report much larger numbers of images and videos. We always hash this imagery as soon as possible, so that the child reporter gains the protection offered by IWF hashes once they are deployed by our industry Members across their services.

Videos

Videos actioned

509

We assessed 509 videos as criminal, an increase of 27% on last year, when we assessed 401 videos.

Assessing videos presents additional complexity, as they may vary in length and depict multiple children and different categories of sexual activity. In these cases, we record severity only. This approach improves efficiency and helps protect the wellbeing of our Taskforce assessors without compromising accuracy.

Videos by severity

  • Category A: Videos involving penetrative sexual activity; videos involving sexual activity with an animal; or sadism.
  • Category B: Videos involving non-penetrative sexual activity.
  • Category C: Other indecent videos not falling within categories A or B.

Analyst insight

Children can report videos via Report Remove. Most videos reported by children depict the reporter alone, and feature sexual posing or sexual activity. Just like the images we see, videos are almost always self-generated in nature, and capture the child in a private, domestic space.