Hotline assessment overview

A guide to how we assess both reports and imagery, highlighting the key outcomes from 2024

The Hotline is split into two workstreams, as shown in the diagram below: the assessment of reports and the assessment of imagery.

A diagram showing the sources of our reports, the two workflows for assessing reports and imagery, and the IWF services they support.

The majority of our work (63%) comes from proactive searching online; however, we also receive reports from external sources. These include the public, the police, IWF Members, hotline agencies, and other charities and stakeholders in the child protection sector. In addition, we receive reports through our child reporting services, Report Remove and Meri Trustline, which allow children to report URLs or their own illegal imagery.

These reports are analysed by our Internet Content Analysts, who determine whether the reported content is criminal according to UK law. URLs found to be criminal are actioned and listed on our IWF URL List for our Members.

If a child reports a URL, it follows our report assessment process; if they report criminal imagery, it follows the criminal imagery process. Both processes are set out in the workflow diagram above.

All criminal imagery is then uploaded to our Intelligrade system, where our Taskforce Image Assessors analyse each image or video. These assessments provide data insights based on the images, videos and children seen. All imagery is given a severity assessment and included on the IWF Hash List.
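To make the severity assessment step concrete, the sketch below (in Python) shows the kind of record such a pipeline might produce. This is not IWF code: the record and field names are hypothetical, and only the Category A/B/C scale reflects the standard UK severity classification for this imagery.

    from dataclasses import dataclass
    from enum import Enum

    class Severity(Enum):
        # The standard UK three-tier severity classification.
        A = "Category A"  # most severe
        B = "Category B"
        C = "Category C"

    @dataclass
    class AssessmentRecord:
        # Hypothetical fields; the real system records far more detail.
        content_hash: str   # the hash ("digital fingerprint") of the item
        severity: Severity  # every item receives a severity assessment
        is_video: bool

    # Example: one assessed image, graded and ready for the Hash List.
    record = AssessmentRecord(content_hash="not-a-real-hash",
                              severity=Severity.A, is_video=False)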

The IWF has access to the government's Child Abuse Image Database (CAID) and assesses images and videos from it. This data is then shared to support law enforcement and contributes to the IWF Hash List, helping the tech sector find and remove copies of known child sexual abuse images online.

Our Quality Assurance team provide an independent audit across both workstreams within the Hotline to ensure accuracy and consistency.

 

Dataset tag example

This year we have been able to provide more detailed analysis of different areas of our work, with image, video and child analysis a new focus for 2024. We have created six dataset tags to identify clearly which data, and which part of our workstream, is being referred to throughout all sections of our report.

Below are the key outcomes across both workstreams for 2024 (1 January to 31 December 2024).

Reports assessment - key outcomes in 2024

424,047 Total number of reports assessed (an 8% increase on 2023)
291,273 Reports confirmed as containing child sexual abuse imagery (a 6% increase on 2023)
1,142 Reports received through the Report Remove tool (a 44% increase on 2023)
245 URLs containing AI-generated images of child sexual abuse (a 380% increase on 2023)
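The year-on-year changes quoted above are simple percentage increases. As a minimal illustration (not IWF code), the Python sketch below back-calculates an approximate 2023 baseline from a 2024 figure and its published increase; the 2023 value is therefore an assumption implied by rounding, not a published statistic.

    def pct_increase(current: float, previous: float) -> float:
        # Percentage increase of `current` over `previous`.
        return (current - previous) / previous * 100

    # Back-calculate an approximate 2023 total from the published 2024
    # total (424,047) and its stated 8% increase.
    reports_2024 = 424_047
    approx_2023 = reports_2024 / 1.08  # roughly 392,636
    print(round(pct_increase(reports_2024, approx_2023)))  # prints 8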

 

See Reports assessment for more details on our full report analysis.

Imagery assessment - key outcomes in 2024

1,264,393 Total number of images and videos assessed in 2024
734,048 Imagery confirmed as child sexual abuse in 2024 *
650,140 Number of children seen in child sexual abuse images
7,063 Number of AI-generated images and videos of child sexual abuse

* This figure accounts for all illegal hashes in 2024, including videos and collage images where individual children are not recorded.

 

See Imagery assessment for more details on our image and video analysis.

All three of the Hotline's teams collaborate daily to ensure the two workflows run smoothly and efficiently. In addition to this core work, they engage in specialised projects, sharing IWF findings with partners in the child protection sector to help all parties stay ahead of emerging trends.

Our Hotline team

Internet Content Analysts

The Internet Content Analysts, often referred to as Analysts, are a team of 17 people. They are responsible for proactively searching for images and videos of child sexual abuse online, responding to public reports, and monitoring new trends. It’s their job to ensure criminal content depicting the sexual abuse of children is removed from the internet.  

Image Classification Assessors

The Image Classification Assessors are a Taskforce of 11 whose role is to assess images and videos, adding metadata to each item, such as the age of the child depicted and the type of sexual activity taking place, along with other information. Once the data has been added, a hash, or "digital fingerprint", is created. These hashes are then used by our industry partners to prevent the upload, download and further dissemination of the imagery.
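As an illustration of how a hash acts as a "digital fingerprint", here is a minimal Python sketch using SHA-256. Hash lists in this field typically also include perceptual hashes, which match visually similar imagery rather than only identical files; this sketch covers just the exact-match case, and the names in it are hypothetical rather than IWF code.

    import hashlib

    def fingerprint(data: bytes) -> str:
        # Identical files always produce the same digest, so a platform
        # can recognise a known file without storing or viewing it.
        return hashlib.sha256(data).hexdigest()

    # Hypothetical hash list: digests of items already assessed as criminal.
    known_hashes = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def should_block(upload: bytes) -> bool:
        # True if an uploaded file matches a known fingerprint.
        return fingerprint(upload) in known_hashes

    print(should_block(b"test"))  # True: the digest above is SHA-256 of b"test"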

Quality Assurance Assessors and Officers

The role of our Quality Assurance team is to support the Hotline. The team of five are deliberately managed by a different Director from the rest of the Hotline, ensuring the Hotline's work is held to the highest standards. They check assessments for accuracy and consistency and track trends to ensure the IWF remains a trusted and world-leading organisation.