Report Remove

Overview
Reports analysis
Report Remove and sexually coerced extortion
Image analysis
Video analysis

Overview

To help young people remove sexual images or videos of themselves from the internet, the IWF and NSPCC's Childline developed Report Remove, a world-first tool that empowers children to take control of their images. It launched in June 2021.

The IWF's and Childline's Report Remove service ensures that the young person is safeguarded and supported throughout the process, while the IWF assesses the reported content and takes action if it meets the threshold of illegality. Each item of illegal content is given a unique digital fingerprint (a hash), which is then shared with internet companies to help prevent the imagery from being uploaded or redistributed online.
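As an illustration only of how hash-based blocking works: the sketch below uses a simple cryptographic hash, whereas production services typically also rely on perceptual hashing technologies that tolerate resizing and re-encoding. The function names and the byte strings are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a unique digital fingerprint (hash) for an image.

    Illustrative only: a cryptographic hash such as SHA-256 matches
    exact duplicates of a file; real-world systems also use perceptual
    hashes so that re-encoded or resized copies still match.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# A platform can refuse an upload whose hash appears on a shared list
# of known illegal imagery.
blocklist = {fingerprint(b"bytes of a known illegal image")}

def should_block(upload: bytes) -> bool:
    """True if the uploaded bytes match an entry on the shared hash list."""
    return fingerprint(upload) in blocklist
```

In this model the hash list, not the imagery itself, is what gets shared with internet companies, which is why the same image can be blocked across many platforms without redistributing it.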

This solution provides a child-centred approach to image removal which can be done entirely online. The young person does not need to tell anyone who they are, can make a report at any time, and can always access further information and support from Childline.

Young people create or sign in to a Childline account, which allows them to receive Childline email updates about their report. They can use this email service for ongoing support, contact a Childline counsellor via online chat or a freephone number, and access relevant information and advice, self-help tools and peer support on the Childline website.

The IWF has worked closely with the National Crime Agency and law enforcement throughout the United Kingdom to promote Report Remove and make it easier for officers to refer children and young people to the service. Together with the NSPCC, we have also developed signposting resources to be shared within schools. These awareness-raising efforts are likely to have contributed to the increase in reports received in 2024.

Reports analysis

Via the Report Remove service, children can report either URLs containing their sexual imagery or the individual images and videos themselves.

Children who use the Report Remove tool can submit up to 50 images or videos in any one report and can create multiple reports. Below is a breakdown of the reports we received, together with the number of individual images and videos they contained, which highlights the extent and harm of this abuse.


  • In 2024, we received 1,142 reports through the Report Remove tool, a 44% increase on 2023.
  • We took action on 642 reports this year, 134 more than last year.
  • These actioned reports consisted of 1,608 individual illegal images and 401 illegal videos.

Of the 642 reports we actioned as containing criminal images in 2024, most, 474 (74%), contained Category C images. As last year, more boys than girls reported actionable images to us, with boys accounting for 64% of the total.

Reports by sex – Report Remove

This chart provides an overview of the sex of the children whose images were reported through the Report Remove tool.

While boys still account for most Report Remove reports, this year we have seen more girls reporting illegal images in this way: a 47% increase on 2023.

Reports by sex and age group – Report Remove

This chart shows that boys in the 14-15 and 16-17 age groups are the biggest users of the Report Remove service.

The largest increase of any age group was among 11-to-13-year-olds, rising from just 13 reports in 2023 to 69 reports in 2024, an increase of 431%.
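The percentage figures quoted in this section follow the standard year-on-year change and share-of-total calculations. As a sketch (the exact rounding convention used in the report is assumed to be rounding to the nearest whole percent):

```python
def pct_increase(old: int, new: int) -> int:
    """Year-on-year percentage increase, rounded to the nearest whole percent."""
    return round((new - old) / old * 100)

def pct_share(part: int, whole: int) -> int:
    """Share of a total, as a whole-number percentage."""
    return round(part / whole * 100)

# 11-to-13-year-olds: 13 reports in 2023, 69 in 2024.
print(pct_increase(13, 69))   # 431
# Category C reports as a share of all actioned reports: 474 of 642.
print(pct_share(474, 642))    # 74
```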

Reports by sex and severity – Report Remove

Most criminal images and videos reported by young people through Report Remove fall into Category C, with a notable proportion of the imagery of boys assessed as Category B.

  • Category A: Images involving penetrative sexual activity; images involving sexual activity with an animal; or sadism.
  • Category B: Images involving non-penetrative sexual activity.
  • Category C: Other indecent images not falling within categories A or B.

Report Remove and sexually coerced extortion

We see children reporting sexual extortion when submitting their imagery or URLs via the Report Remove tool. Sexually coerced extortion is explained in more detail here.


Of the 642 actioned reports received via Report Remove, 151 reports contained comments indicating sexually coerced extortion.

Report Remove – sexual extortion

  • Sexual extortion
  • No sexual extortion

We have seen how boys are typically lured into what they believe are mutual exchanges of sexual images, often thinking they are sharing with a peer or someone older than them, when in fact they are being groomed into sharing images non-consensually.

We know that sexually coerced extortion is behind these specific reports from boys because they have included evidence of it within their report. This could be a chat log in which the young person demonstrates that they are being coerced, or a collage of images created by the offender and overlaid with threatening text.

Report Remove – sexual extortion by sex

  • Boys
  • Girls

Image analysis

Reports can contain one or more images or videos and we have therefore provided an analysis of the individual images reported to us below. This data is drawn from our IntelliGrade system, which creates hashes of illegal imagery that are shared with industry Members to prevent their recirculation online.

As a result of actioning the 642 reports, we assessed 1,608 individual images and 401 videos as illegal in 2024. To date, we have processed a total of 3,467 images and 678 videos that we have been made aware of via the Report Remove service.

Please note that age, sex and severity are recorded for all single images. However, the complexity of assessing videos and composite still imagery (stills containing multiple images) means that we record only severity in these cases. This contributes to greater efficiency and helps to protect the wellbeing of our image assessors without compromising the accuracy of our work.


Images by sex – Report Remove

Images by sex and age group – Report Remove

Images by sex and severity – Report Remove

 

Video analysis

We assessed 401 videos as illegal. Each was given a severity assessment, but not an age or sex assessment.


Videos by severity

  • Category A: Images involving penetrative sexual activity; images involving sexual activity with an animal; or sadism.
  • Category B: Images involving non-penetrative sexual activity.
  • Category C: Other indecent images not falling within categories A or B.

There is additional complexity in assessing videos of varying lengths that may show several children and a number of different categories of sexual activity. For these we record only severity. This contributes to greater efficiency and helps to protect the wellbeing of our Taskforce assessors without compromising accuracy.