Europe is dismantling its digital rights from within
European leaders speak at the EU AI Champions Initiative at a summit on Europe's technological sovereignty, in Berlin, Germany, on November 18, 2025. JOHN MACDOUGALL/Pool via REUTERS
Hollowing out some of the world’s strongest digital safeguards will harm us all. But there is still time to change course.
Itxaso Domínguez de Olazábal is a policy advisor at the European Digital Rights (EDRi)
A quiet shift is happening in Brussels. The European Commission has launched what it calls the Digital Omnibus, a sweeping rewrite of the EU’s core digital protections.
Officials present it as routine housekeeping. Streamlining. Cutting ‘red tape’ to boost Europe’s competitiveness. In reality, the Omnibus marks the most ambitious deregulation of the digital sphere the EU has ever attempted.
This move matters because Europe’s digital rulebook was not built for decoration. Laws like the General Data Protection Regulation (GDPR), the ePrivacy framework and the Artificial Intelligence Act (AI Act) protect people’s most basic rights online.
They set limits on what companies and public authorities may do with personal information, how devices can be accessed, and when algorithms may replace human judgement.
If those protections fall, everyday life changes: how we access healthcare, apply for a job, seek housing, or move through public space.
Safeguards hollowed out
The Commission’s proposals reopen all three laws at once. They tweak definitions, hollow out safeguards and create new exemptions that give powerful actors more room to collect, infer and reuse personal data.
These changes come alongside the Digital Fitness Check, which hints at future changes to nearly any law touching the digital environment.
Taken together, this is not simplification. It is a shift away from the idea that fundamental rights should set hard boundaries for the digital economy. Under the Omnibus, the definition of personal data becomes narrower and more dependent on what companies say they can identify.
Companies will be able to train AI models much more easily and with fewer safeguards. Device-level protections, once anchored in ePrivacy, are moved into a structure that allows for broad exceptions.
Companies and other entities might gain new ways to access information stored on phones and smart devices without permission.
The protections against unfair automated decision-making become weaker, even though algorithms already influence credit scoring, hiring decisions and access to essential services.
AI reinforcing biases
The picture becomes even more alarming when looking at AI. The Omnibus proposes changes to the EU’s brand-new AI Act before it has even fully taken effect.
Key transparency and oversight provisions are delayed. Developers gain the ability to exempt their own systems from strict rules with limited accountability.
Sensitive data, including biometric information, becomes easier to use for training AI models under vague claims of necessity. Combined with the weaker data protection framework, this increases the chances that AI systems will reinforce discrimination rather than reduce it.
The political narrative behind all this is familiar: Europe needs more flexibility to compete globally. However, the beneficiaries of these changes will not be the small companies invoked in Brussels’ institutional speeches.
They will be large data-driven businesses, most of them founded outside the EU, whose power depends on extracting and combining as much information as possible.
Deregulation also plays into pressure from abroad to soften Europe’s digital standards, which have long been seen as an unwelcome barrier to aggressive data exploitation.
The losers are easy to identify. Those already exposed to profiling and surveillance will be hit hardest: workers whose performance is monitored, children targeted by behavioural advertising, and minoritised communities disproportionately flagged by predictive systems.
The Omnibus proposal weakens the safeguards meant to protect them and reduces the transparency that allows journalists, researchers and activists to uncover abuse.
A chance to change course
Fundamental rights are not an inconvenience. They are a boundary line that stops the digital economy from becoming a system of unaccountable extraction and automated exclusion.
The EU often prides itself on having the world’s strongest protections for privacy and personal data and one of the first horizontal laws on AI.
That reputation was built on a simple idea: rights first, innovation only when it respects our rights. The Digital Omnibus drops the final five words.
EU governments and the European Parliament still have a chance to change course. They should reject the Omnibus package and instruct the Commission to return with proposals that strengthen, rather than weaken, the protections people depend on.
And they must be clear that some principles are not up for negotiation, however thorough the consultation:
Equal treatment. Strong limits on surveillance and intrusive data collection. Effective remedies when algorithms harm people. A digital environment governed by rights rather than opportunistic corporate claims.
Any views expressed in this opinion piece are those of the author and not of Context or the Thomson Reuters Foundation.
Tags
- Tech regulation
- Social media
- Data rights