Digital Services Act: time for Europe to turn the tables on Big Tech
People holding mobile phones are silhouetted against a backdrop projected with the Twitter logo in this illustration picture taken in Warsaw September 27, 2013. REUTERS/Kacper Pempel
The EU’s new Digital Services Act is a blueprint for forcing Facebook, YouTube, and other major online platforms to tackle the serious risks they pose to individuals and the public sphere. Its success now depends on whether EU officials will enforce the law effectively.
John Albert is the Policy & Advocacy Manager at AlgorithmWatch, and Matthias Spielkamp is the founder and executive director of AlgorithmWatch.
On March 9, a Russian airstrike tore through a maternity and children’s hospital in the Ukrainian city of Mariupol, killing at least three people and injuring several more.
As pictures of the bombing’s victims – including that of an injured pregnant woman cradling her belly as she was carried away on a stretcher, shortly before she and her baby perished – began to circulate, the Russian embassy in London took to Twitter to cast doubt on their authenticity. Though Twitter removed the embassy’s posts, false narratives such as these have continued to flood social media and find traction with audiences, propelled by the platforms’ recommender algorithms.
The information war being waged online around the conflict in Ukraine is but one of many troubling examples of how disinformation can flourish on online platforms, posing serious risks to the social and institutional fabric of the EU.
Yet Big Tech companies have continued to essentially regulate themselves when it comes to managing information flows. This has put enormous power in the hands of just a few corporations, who time and again have shown themselves to prioritise profits over the public interest by wilfully ignoring or downplaying the harms that their services cause.
EU lawmakers recognised the need to curb the power of Big Tech and hold major platforms accountable for their actions. This imperative has resulted in the Digital Services Act (DSA) – a rulebook of more than 300 pages that would force tech companies to do more to tackle illicit content on their platforms, uphold users’ rights, and address the detrimental impact they can have on people and society, or else risk billions of euros in fines.
How does the DSA propose to deliver on its central promises? Thanks in part to the collective efforts of civil society, the DSA would limit platforms’ most egregious forms of tracking-based advertising and deceptive design practices.
It also introduces important safeguards for individual rights, such as improved “notice-and-action” procedures that let users flag potentially illegal online content, as well as redress mechanisms for users to dispute platforms’ content moderation decisions.
Perhaps the DSA’s most innovative requirement is for online platforms to identify and tackle so-called “systemic risks” stemming from the design and use of their services – such as the dissemination of hate speech, the infringement of fundamental rights, and intentional manipulation that may have negative effects on civic discourse and electoral processes.
Crucially, whether platforms are adequately dealing with these systemic risks will be verified not only by independent auditors but also by regulators and external researchers, who may gain access to platform data. The kind of third-party data access the DSA introduces is essential to informing public debate about how platforms influence our public sphere.
Platforms cannot be trusted to assess themselves; we require external scrutiny to illuminate how their automated systems personalise content via recommender systems and targeted advertising, as well as how they deal with illegal content, handle user complaints, and apply their terms of service.
But as with the entirety of the DSA, the effectiveness of these rules will ultimately rest on whether EU officials can put the law into practice. Properly vetting researchers for data access and safely facilitating that access, for example, will require investment in expertise and the establishment of new protocols that work.
It will depend in part on the capacities of Digital Services Coordinators (DSCs) to be set up by EU member states, as well as of untested independent advisory bodies. It also remains to be seen to what extent platforms will invoke a “trade secrets” exemption to deny data access requests – and whether regulators will intervene.
Such enforcement will require a strong governance framework with adequate investments in staffing and capacity-building, and it must learn from past mistakes in enforcing the General Data Protection Regulation (GDPR). To this end, the DSA shifts enforcement for the largest platforms to the European Commission, and includes a “polluter pays” principle, which will require platforms to pay for their own supervision by the Commission.
Although the DSA is an imperfect document, it represents a potential paradigm shift in tech regulation. Even so, its great promise will only be realised if its rules can be implemented and enforced in practice.
Any views expressed in this opinion piece are those of the authors and not of Context or the Thomson Reuters Foundation.