Violence against women & girls


Violence Against Women and Girls (VAWG) encompasses a wide range of harms, including sexual exploitation, abuse, and coercion, much of which is increasingly facilitated online. Our work at the IWF directly intersects with VAWG through our efforts to identify, assess, and remove child sexual abuse imagery, which disproportionately impacts girls. By disrupting the online infrastructure that enables abuse to be shared, profited from, and normalised, the IWF plays a critical role in preventing further harm, supporting victim safeguarding, and addressing the digital dimensions of violence against women and girls.

Number of criminal images depicting girls over the past two years

730,998

Year on year, it is girls who are most often seen in the criminal imagery analysed by the IWF.

Since January 2024, our Image Assessors have been able to record the age and sex of every child seen within individual images.

Number of children by sex over the past two years

This chart shows a breakdown by sex of all images in which sex has been recorded over the last two years. 76% (730,998) of those child sexual abuse images depicted girls.

The disproportionate representation of girls persists in the more recent development of AI-generated child sexual abuse material.

AI-generated images by sex over the past two years

An overwhelming 99% of the AI-generated images assessed over the past two years depicted girls.


Evidence shows that real images of child sexual abuse victims have been used in the training of AI models, with these technologies often generating new imagery depicting known victims.

Even when sexual abuse is captured in a single image, AI-generated or not, the harm it represents extends well beyond that moment, with lasting effects on the children involved.

Analyst Insight

Violence against women and girls is a prevalent theme amongst the harmful and illegal material we see online. Even in imagery of very young girls, sinister sexualisation with violent undertones is clearly present.

On forum websites and image hosting sites we encounter comments posted alongside imagery of young children in swimwear or gymnastics costumes, stating how those girls should be abused. Some internet users rate and compare the girls’ bodies and invite abusive comments from others. In one instance, a girl no older than 7 years old was described as having ‘great slut potential’; in another, a three-year-old girl was described as a ‘BDSM Slut’. An Assessor recalled an image of a pre-pubescent girl juxtaposed with text describing the acts of violent sex that a girl should learn to submit to; acts also described as ‘inevitable’. In our roles, witnessing the frequent and blatant degradation of girls is a daily occurrence. We recognise the societal risk of this becoming the norm for all internet users.

Members of the public report to us sexualised content of women and girls produced in non-consenting scenarios. These reports include URLs of websites dedicated to posting voyeuristic imagery of young women’s and older girls’ bodies. Though not always illegal, this imagery depicts sexualised and suggestive close-up images of women’s and girls’ bodies, taken without their knowledge. We have also seen an increase in reports of imagery depicting non-consenting women being ejaculated on or sexually touched in crowded public places, apparently without them realising sexual contact has been made. It is evident that this imagery is posted for gratification, without concern for consent. We also know that improper use of AI, such as ‘nudifying’ technology, poses a further threat to the autonomy women and girls have over their bodies online.

We know that there are adult pornography websites hosting imagery described as ‘rape’. To our trained eyes, many of these scenes appear staged by adult performers, yet the titles of the imagery can be misogynistic and violent. Not all online ‘rape’ imagery appears staged, however, and our Hotline team has viewed multiple videos that depict apparently real gang rape and physical violence against women in certain international communities. These acts of shaming, sexual violence and degradation are published online, accessible on the open internet.

Some online indecent imagery depicting older teenage girls and young women is disturbingly humiliating and framed by misogynistic language. We encountered a web user describing the posting and sharing of indecent imagery - including that of confirmed child victims - as “trading and exposing whores”. Another user posted a nude image of a woman with the text “your body is now the public property of the web”. A sense of commodification and ownership of women’s and girls’ bodies is strongly detectable in certain online spaces.

Members of the public also report to us websites that promote the self-generated content of women and older teenage girls as ‘self-harm’, ‘incest’, ‘blackmail’, ‘forced’ or ‘self-humiliation’ material. Packs of content depicting such scenes are offered for sale. No concern is evident for the harm experienced by the women and girls depicted on such websites. What is far clearer is the appetite for viewing older girls and young women experiencing sexualised harm and humiliation.

Imagery of this nature is branded as ‘wins’ - particularly, it seems, if it was sourced without consent or created under threat. One website that posts such ‘wins’ or ‘leaks’ of older teenage girls and young women also allows users to search for imagery by ‘body type’ or ‘hair colour’, as though shopping for a product. On similar websites, users gain kudos or website currency for identifying the woman or girl in the indecent imagery, or confirming which town she lives in. Another website seen by analysts categorises its imagery of women and older teenage girls by the victim’s precise location, in an act of ‘exposing’ them to other internet users. Users’ comments on this website, and many others like it, reveal a prurient scrutiny of women and girls, a readiness to sexually humiliate them, and apathy in the face of revictimisation.