This EU law could end online privacy

Opinion

People walk past as European Union's flags flutter at half mast in memory of late European Parliament President David Sassoli, in front of European Commission building, in Brussels, Belgium January 11, 2022. REUTERS/Yves Herman

The EU’s draft child sexual abuse regulation could force providers to lay bare our digital lives, threatening privacy and anonymity online

Diego Naranjo is Head of Policy and Ella Jakubowska is Senior Policy Advisor at European Digital Rights (EDRi).

As they do every year on Global Encryption Day, Oct. 21, members of the European Parliament sent out messages in support of end-to-end encryption and secure, private messaging.

But despite the widely acknowledged benefits that secure communication brings to human rights defenders, professionals and society at large, the European Commission earlier this year put forward the draft Child Sexual Abuse Regulation (CSAR), a law that threatens the safety, security, privacy and free expression of internet users, children included.

Many messaging, email and chat services already use risky technologies to scan people’s private communications, with very little public transparency. The EU’s CSAR proposal, however, could force these providers to scan virtually our entire digital lives and make it near-impossible to remain anonymous online.


EDRi and over 100 other human rights and civil society groups have already called on the EU to reject this misguided proposal, which, despite its important goal, puts forward measures that are likely to be dangerous, ineffective and unlawful.

Even the European Commission's own Regulatory Scrutiny Board has pointed out that the efficiency of the measures proposed in the CSAR is not sufficiently demonstrated.

A report this year from the United Nations Office of the High Commissioner for Human Rights (OHCHR) confirms the serious risk that the EU’s proposal, alongside a broad, global range of anti-privacy trends and laws, poses to human rights, describing “the very real and encroaching risk of creating systems of pervasive surveillance and control that may eventually choke the development of vibrant, prosperous and rights-respecting societies”.

Similarly, the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) warn of serious data protection and privacy concerns, and risks for fundamental rights.

Finally, the EU's European Economic and Social Committee (EESC) said the CSAR proposal contains measures of a “disproportionate nature”, warned about the risk of widespread monitoring of all virtual exchanges, questioned the approach of “a general sweep of hosting and communication services” and asked the Commission to “make the text better and more specific in order to safeguard secrecy of correspondence and respect for privacy”.

The risk is real, as demonstrated by the Irish Council for Civil Liberties, which recently released data showing that the Irish police (Gardaí) hold on to the personal data of people wrongfully identified as sharing potential child abuse material, even after they have been found innocent. These people were identified partly through technologies that scan private messages.

The data showed that of all the referrals of child sexual abuse material (CSAM) online that the U.S. National Center for Missing and Exploited Children (NCMEC) sent to the Irish police in 2020, many of which resulted from scanning practices, just 20% actually constituted CSAM.

Fewer than 10% were actionable, and no information was provided about whether any of these actionable reports led to prosecutions or convictions. At least one in 10 reports was a false alarm, flagging innocent people who had shared images or videos of children playing on a beach, or legitimate content involving adults, such as topless or nudist material.

This case challenges the European Commission’s claim that scanning technology is highly accurate and reliable, and that false reports are negligible and harmless.

Governments could drastically improve the removal of child abuse material from the internet by better funding frontline experts who already do this vital work. Increasing the capacity of hotlines that help survivors get abuse material taken down is a proven method of tackling online abuse, without violating the rights of every person who uses the internet.

EU policymakers must protect end-to-end encryption. Without online privacy, companies, law enforcement agencies and governments would be able to carry out mass surveillance of everything that everyone shares online. This would have a massive impact on freedom of expression and other fundamental rights.

In the longer term, the horrific crime of child abuse will continue unless we tackle its chillingly human origins. There is a lot of promising research into prevention, such as stopping potential abusers before they ever make contact with a child, yet such approaches are rarely pursued by governments. Our collective aim should be to address the root of this crime, not merely its online symptoms.


Any views expressed in this opinion piece are those of the authors and not of Context or the Thomson Reuters Foundation.


Tags

  • Content moderation
  • Tech regulation
  • Data rights



