Violence against women & girls

 

Violence against women and girls (VAWG) encompasses a wide range of harms, including sexual exploitation, abuse and coercion, much of which is increasingly facilitated online. Our work at the IWF directly intersects with VAWG through our efforts to identify, assess and remove child sexual abuse imagery, which disproportionately impacts girls. By disrupting the online infrastructure that enables abuse to be shared, profited from and normalised, the IWF plays a critical role in preventing further harm, supporting victim safeguarding, and addressing the digital dimensions of violence against women and girls.

Number of girls seen in criminal images over the past two years

730,998


Year on year, girls are most often seen in the criminal imagery analysed by the IWF.

However, we have consistently seen a relatively low proportion of girls use our Report Remove service. To date, girls account for just 28% of the criminal reports processed through this service.

Number of children* recorded in child sexual abuse images by sex over the past two years

* Methodological note: Since January 2024, following the introduction of our Multichild recording capability, our Image Assessors can record the age and sex of every child appearing in a single image.

This chart shows a breakdown by sex of all images in which sex has been recorded over the past two years. 76% (730,998) of those child sexual abuse images depicted girls.

While the proportion of the most severe content (Category A) is similar for boys and girls, the distribution of the less severe categories differs between the sexes.

  • 21% of images of girls were assessed as Category A, which is similar to boys at 23%.
  • For girls, Category B and Category C were fairly evenly split, at 38% and 41%.
  • For boys, the pattern is slightly different. A higher proportion of images were Category B (52%), while fewer were Category C (26%).

The disproportionate representation of girls persists in the more recent development of AI-generated child sexual abuse material.

AI-generated images by sex over the past two years

An overwhelming 99% of the AI-generated images assessed over the past two years depicted girls, compared with 78% of non-AI-generated (real) images.

Evidence shows that real images of child sexual abuse victims have been used in the training of AI models, with these technologies often generating new imagery depicting known victims.

Even when sexual abuse is captured in a single image, AI-generated or not, the harm it represents extends well beyond that moment, with lasting effects on the children involved. Such imagery can also reinforce damaging gender norms, influencing how girls feel expected to behave and shaping how boys develop their perceptions of girls, women and sexual behaviour.

IWF Internet Content Analyst
Frontline observations

Some readers may find the following descriptions distressing; please feel free to skip this section.

Violence against women and girls is a prevalent theme in the harmful and illegal material we see online. Even in imagery of very young girls, sinister sexualisation with violent undertones is clearly present.  

On forum websites and image hosting sites we encounter comments next to imagery of young children in swimwear or gymnastics costumes, stating how those girls should be abused. Some internet users rate and compare the girls’ bodies and invite abusive comments from others. In other instances, a girl no older than 7 years old was described as having ‘great slut potential’, and a 3-year-old girl was described as a ‘BDSM Slut’. One image of a pre-pubescent girl was juxtaposed with text describing the acts of violent sex that a girl should learn to submit to; acts also described as ‘inevitable’. In our roles, witnessing frequent and blatant degradation of girls is a daily occurrence. We recognise the societal risk of this becoming the norm for all internet users.  

Members of the public report to us sexualised content of women and girls produced in non-consenting scenarios. These reports include URLs of websites dedicated to posting voyeuristic imagery of young women’s and older girls’ bodies. Though not always illegal, this imagery depicts sexualised and suggestive close-up images of women’s and girls’ bodies, taken without their knowledge. We have also seen an increase in reports of imagery depicting non-consenting women being ejaculated on or sexually touched in crowded public places, apparently without them realising sexual contact has been made. It is evident that this imagery is posted for gratification, without concern for consent. We also know that improper use of AI, such as ‘nudifying’ technology, poses a further threat to the autonomy women and girls have over their bodies online.  

We know that there are adult pornography websites hosting imagery described as ‘rape’. To our trained eyes, many of these scenes appear staged by adult performers, yet the titles of the imagery can be misogynistic and violent. Not all online ‘rape’ imagery appears staged, however, and our Hotline team has viewed multiple videos that depict apparently real gang rape and physical violence against women, often in India or Bangladesh. These acts of shaming, sexual violence and degradation are published online, accessible on the open internet.  

Some online indecent imagery depicting older teenage girls and young women is disturbingly humiliating, and framed by misogynistic language. We encountered a web user who described the posting and sharing of indecent imagery - including that of confirmed child victims - as “trading and exposing whores”. Another user posted the nude image of a female with the text “your body is now the public property of the web”. A strong sense of the commodification and ownership of women’s and girls’ bodies is evident in certain online spaces. 

We've had websites reported to us that promote the self-generated content of women and older teenage girls as ‘self-harm’, ‘incest’, ‘blackmail’, ‘forced’ or ‘self-humiliation’ material. Packs of content depicting such scenes are offered for sale. No concern is evident for the harm experienced by the women and girls depicted on such websites. What is far clearer is the appetite for viewing older girls and young women experiencing sexualised harm and humiliation.   

Imagery of this nature is branded as ‘wins’ - particularly, it seems, if it was sourced without consent or created under threat. One website that posts such 'wins’ or ‘leaks’ of older teenage girls and young women also allows users to search for imagery by body type or hair colour, as though shopping for a product. On similar websites, users gain kudos or website currency for being able to identify the female in the indecent imagery, or confirm which town she lives in. One website seen by analysts categorises its imagery of women and older teenage girls by the victim’s precise location, in an act of ‘exposing’ them to other internet users. Users’ comments on this website, and many other such websites, reveal a prurient scrutiny of women and girls, a readiness to sexually humiliate them, and apathy in the face of repeated victimisation.  

 

Policy overview

 

 

Violence against women and girls and child sexual abuse are inherently and deeply connected – with shared root causes like gender inequality, misogyny and power imbalances. A coordinated and joined-up response to these issues is essential.

We are at the front line and see how girls bear the brunt of sexual violence on and offline. Notable cases and research have highlighted the distinct gendered dimension to the requesting, creation and sharing of nude images: girls often feel pressured to share nude images and are then stigmatised, while male students have used nudify apps to ‘undress’ their classmates in an attempt to exercise power over them.

We know that girls are already forced to shrink both their physical and digital lives to stay safe – a modern echo of the age-old reality where their freedom is curtailed to avoid violence. The IWF's data serve as a stark reminder of how widespread abuse against girls remains, and how much work is needed to address it.

In December 2025 the UK Government published its long-awaited Violence Against Women and Girls (VAWG) strategy, which sets out how the Government will meet its manifesto pledge to halve VAWG in the next decade. Following advocacy from the IWF and others, the strategy includes clear and deliverable objectives to combat child sexual exploitation and abuse, with an emphasis on prevention and early intervention. However, partners such as SWGfL have called for further action to protect adult victims of technology-facilitated abuse, the majority of whom are women. Action needed includes creating a register so that platforms are able to remove and block content, and ensuring that the whole internet infrastructure is able to block access to non-consensual intimate images.

 


At the EU level, the fight against VAWG has risen steadily up the political agenda. The Directive on combating violence against women and domestic violence recognises that girls are disproportionately affected by sexual violence and requires Member States to tailor measures to both adult and child victims of gender-based violence. The Directive builds on the EU’s broader child protection framework, in particular the proposed Regulation to prevent and combat child sexual abuse online, and the Child Sexual Abuse Directive, which is currently being revised.