Should encryption be curbed to combat child abuse?

Published: Wed 19 May 2021

Written by: BBC News

For nine years, Chris Hughes has fought a battle very few people ever see.

He oversees a team of 21 analysts in Cambridge who locate, identify and remove child sexual abuse material (CSAM) from the internet.

The Internet Watch Foundation (IWF) is funded by the global tech industry.

It manually reviews online reports of suspected criminal content sent in by the public. Mr Hughes sees upsetting material every day.
