Types of reports

 


Reports received and assessed in 2025

451,210

This equates to one report every 70 seconds.  

This is a 6% increase on the number of reports assessed in 2024.

Of the 451,210 reports received:

  • 449,549 were reports of URLs.
  • 2,078 were reports from child reporting services (451 contained URLs and are included in the number above).
  • 34 were reports of newsgroups.

Every 101 seconds a report showed a child being sexually abused.   

311,610 reports were confirmed to contain criminal imagery of child sexual abuse, link to this illegal imagery, or advertise it (a 7% increase on 2024).

Of the 311,610 reports confirmed to be criminal:

  • 310,437 were URLs.
  • 1,261 were reports received from our child reporting services (99 contained URLs and are included in the number above).
  • 11 were newsgroups.

The IWF’s mission is to detect, disrupt, remove and prevent online child sexual abuse imagery. Our analysts assess each report against UK legal guidelines.  

The types of reports we receive:

Every report received by the IWF, whether submitted externally or generated through our proactive work, relates to a single URL or a direct report via our child reporting services, Report Remove and Meri Trustline.   

Each URL could contain one, tens, hundreds or even thousands of individual child sexual abuse images or videos.  

Many sites directly display child sexual abuse imagery; however, we also take action on sites that facilitate or enable others to commit offences involving the access, possession or distribution of child sexual abuse material. These include:

  • Gateway or referrer sites that link to child sexual abuse content
  • Advertisements for such material
  • Manuals that provide instructions or guidance on how to locate or perpetrate abuse
  • Inchoate links (non-clickable links) that share information (such as URLs) enabling others to locate criminal content.

Non-image-based criminal URLs actioned in 2025

Internet Content Analyst
Frontline observations

Some readers may find the following descriptions distressing; please feel free to skip this section.

This year, inchoate reports have become a new tool for disrupting material that encourages or assists the offence of sharing child sexual abuse material. While most of our reports are of images and videos depicting abuse, we also see bad actors attempting to evade detection by discreetly signposting criminal material in ways that only a trained eye would recognise. They might try to direct other bad actors from one platform to another, or use seemingly innocuous keywords to advertise the sale of child sexual abuse material.

The clearest example of this is bad actors posting links to abuse material as plain text rather than as clickable hyperlinks. Previously, analysts could not categorise this as criminal or request the removal of the plain text, leaving platforms vulnerable and the public at risk of encountering it. Leaving these 'plain text' links online neither resolved the issue nor disrupted the activity.

Inchoate reports closed that gap. In this context, inchoate means 'just begun' or 'undeveloped'. In English criminal law, it refers to situations where, although a substantive offence has not been committed, the defendant has taken steps to commit it or encouraged others to do so. That is exactly what these plain text links are: had they been clickable hyperlinks, we could have removed them as direct pathways to criminal material.

We were seeing this method used with increasing frequency, presumably to avoid detection or removal. Taking action on plain text links helps show bad actors that certain internet spaces are not safe havens for promoting and distributing illegal material. Where we can, we continue to cut off these signposts to child sexual abuse material.