We take action against any online site that displays child sexual abuse content. When creating reports, our analysts record data such as whether a site has been abused or was purpose-built to share child sexual abuse material. This helps us identify trends in how abuse is shared online and informs work across the IWF, from membership to policy development. Understanding site types isn’t always straightforward, but knowing how platforms are built helps explain why some appear more frequently than others.
It is important to note that the table reflects the number of reports received, not the scale or severity of the content. A single report may relate to one image or to hundreds or even thousands of images.
Image-hosting services account for a significant proportion of the criminal imagery we encounter. This is closely linked to the use of online forums, where images are often stored on separate image-hosting platforms and embedded into posts, enabling the same content to be shared across multiple forums.
By working directly with image-hosting providers to remove this material at source, we are able to eliminate it from many locations at once, making our takedown work particularly impactful and central to disrupting this form of criminal activity.
When a website is found to be displaying child sexual abuse content, we take action against the site itself and also identify and seek to remove the direct locations of the imagery. This explains why image-hosting services feature prominently in the table, as they are often the original sources from which content is shared across multiple websites.
A huge part of our work, especially our proactive work, involves monitoring and disrupting image-hosting services. One especially persistent method of distribution sees image-hosting services that we have reported and had removed reappear almost instantly under a slightly different domain name.
In this trend, bad actors bulk-upload the same thousands of files of child sexual abuse, showing the same children in the same order, and reupload the same collections almost immediately after we secure their removal. It is driven by users with a high level of technical expertise who exploit platforms in highly organised or automated ways, and keeping this material offline can feel like an impossible task. We have to remind ourselves not to lose sight of what sits at the centre of this work: each image shows a real child suffering abuse, and every time that image is reuploaded, that child is victimised again. The process of removal is not always straightforward, but it is how we, as analysts, can make a difference.
Unfortunately, our work has taught us that every platform has the potential to be abused by bad actors and host child sexual abuse material. This includes storage sites like cyberlockers as well as platforms built for social interaction, like forums or blogs. Some sites we see, like text stores, may not be familiar to most of the public.
A text store website is exactly what it sounds like: a website that acts like a notes app, allowing users to post plain text or links. Unlike a personal notes service, where a note is visible only to the user, these pages are hosted online and can be accessed by many people. Bad actors flood these text stores with notes listing sites that share child sexual abuse material, constantly updating them with the newest links so they serve as a directory, helping other bad actors find the latest criminal sites after we have secured the removal of previous ones.
Cyberlockers also account for a large number of our reports. These are sites that typically hold a file for download. We see bad actors create large numbers of cyberlocker URLs to share and distribute large amounts of child sexual abuse imagery. One download could contain hundreds of images and videos of a single child being sexually abused; another could contain a collection of abuse videos, perhaps themed by the age of the child or the nature of the abuse. Sometimes there is no pattern, but this type of distribution feels incredibly widespread.
Especially with trends that rely on generating clicks or invites, child sexual abuse material is increasingly being shared on websites we would consider household names. We see links being spammed across platforms and adverts directing users to abusive content, even on social networks with billions of monthly users. Our role is not only to remove child sexual abuse material for the sake of the victims, but also to protect the public from encountering it. No one should have to see what we see here in the hotline, especially without the training or support we are given.