In Data: What's behind Europe's online child sex abuse dilemma?
The shadow of a child is cast as he plays on a trampoline in Nairobi, Kenya February 12, 2025. REUTERS/Thomas Mukoya
What’s the context?
European countries host the most child sexual abuse content online, but proposed rules to protect victims and identify perpetrators have stalled.
BRUSSELS - As the popularity of social media and messaging apps grows, so does the volume of child sexual abuse material, or CSAM, on the internet. A new report shows 2024 was the worst year on record.
European Union countries hosted more than 60% of the child sexual abuse web pages identified last year, according to the Internet Watch Foundation (IWF), prompting calls for better regulations and reporting tools to protect potential victims.
EU efforts to combat CSAM have stalled over privacy concerns, with campaigners and tech companies warning that some proposed rules would break end-to-end encryption on messaging services, leading to mass scanning of private messages.
Lawmakers and privacy campaigners agree that more needs to be done, but opinions diverge over how to find a balance between privacy and child protection.
How big is Europe's online child sex abuse problem?
While the problem is global, the EU hosts the largest volume of online child sex abuse content, according to the IWF, which has traced more than 290,000 web addresses containing such material to servers in Europe.
Each web address assessed by the IWF could contain anywhere from a single child sexual abuse image or video to many thousands.
The IWF said almost a third of the videos it found depicted the most severe category of child sexual abuse imagery, including rape, sadism or bestiality.
Deepfakes generated with artificial intelligence are adding to the volume of CSAM online: There were 245 reports of AI-generated CSAM in 2024, a 380% increase from 2023.
What is driving the problem?
While much of the CSAM is produced elsewhere, it requires a physical hosting location, and Europe's high-speed internet and large number of servers mean web hosting companies in EU countries are often used.
The Netherlands hosts the highest volume of child sexual abuse imagery. In 2024, the Dutch government introduced fines for hosting companies that do not remove online child sexual abuse material from their servers in a bid to combat its proliferation.
Under current EU legislation, tech platforms are permitted to detect, report and remove child abuse material under a temporary exemption to the e-Privacy Directive, which is due to expire in April 2026.
The EU's Digital Services Act also places greater responsibility on tech companies for the content they host, imposing fines if illegal material is not removed.
However, criminals can easily shift to private channels, including cloud services and chat apps like Telegram, to thwart investigations, making it harder to identify victims and perpetrators.
So how is the EU trying to combat it?
In 2022, the European Commission proposed new rules that would require tech companies like Meta and Apple to detect and remove CSAM from their messaging, social media and cloud services. But progress on the legislation remains stalled over privacy concerns.
Privacy activists and tech companies warned that certain measures in the draft proposal, such as scanning encrypted messages on services like WhatsApp, Signal or iMessage, would break or weaken end-to-end encryption and open up personal devices to mass surveillance.
Although amendments to limit scanning were introduced in 2023, there is no consensus among EU lawmakers.
Some want more targeted investigations rather than blanket scanning, alongside stronger parental controls and improved content moderation from tech platforms.
EU lawmakers continue to discuss a compromise that would close the gaps in existing rules.
(Reporting by Joanna Gill; Editing by Ayla Jean Yackley)
Context is powered by the Thomson Reuters Foundation Newsroom.