Thoughts on safety and survival on Twitter from the rest of the world
As Twitter disintegrates, there is an opportunity to build solidarities that centre the experiences of marginalised communities
Nighat Dad is the executive director, Digital Rights Foundation, Pakistan.
As the world watches the disintegration of one of the biggest and most influential social media platforms, Twitter, it should give us pause regarding the larger structures in place at tech platforms and the impact these developments have on the global majority beyond the United States and Europe.
While many users and experts have suggested moving to alternatives or reducing dependence on tech platforms, many users in the global majority do not have that luxury as they have always been the recipients of global technological innovations and products, marketed to them as an afterthought.
Twitter is a complicated platform—it has always been a site for communities to find their voice and simultaneously, incredible violence. These problems did not start with Elon Musk, but have been exacerbated immensely since the gutting of resources dedicated to safety and the recent dismantling of the Trust and Safety Council. While bad actors and abusive content always found space on Twitter, the fragile systems that could offer some hope of addressing them are now gone.
The Digital Rights Foundation in Pakistan has been running the Cyber Harassment Helpline for six years, documenting, reporting and flagging harmful content on Twitter under its community guidelines. For the past few years, we have had a front-seat view of the systems in place to protect users, as well as their limitations.
We have watched content inciting violence against feminist movements, activists and journalists proliferate on the platform. We have documented inauthentic hashtags trending for hours on end before being taken down. However, we have also seen the platform give space to countless women's and trans voices; we have seen communities flourish, and narratives suppressed by the state find an audience.
We have seen the massive failure of safety mechanisms, and overbroad censorship excused as mistakes by clumsy automated systems that produce systemic bias. Yet we have worked regardless to provide safety and support to users from Pakistan and beyond, because we cannot abandon them simply because the systems are imperfect.
Concern regarding Musk’s arbitrary and misguided attempts at content moderation, which have allowed multiple bad actors to return to the platform, is warranted, all the more because it will hurt the most marginalised among us. However, we must also reflect on how we got here.
The destruction of the safety and integrity teams at Twitter has been acutely felt in the United States and Western economies, but in a cruel twist of fate, it has brought them in step with where the rest of the world has already been: abandoned and ignored by social media platforms. Resources dedicated to ensuring safety for the global majority were a small fraction of those allocated to the global north.
There needs to be reflection on the fact that Twitter had previously failed to fully develop transparent and accountable systems to ensure safety. It is precisely this lack of structure and power for users that Musk has been able to weaponise.
Other social media platforms need to ask themselves: do they have resilient and accountable systems that will survive a change of ownership? Have they empowered users or invested in oversight mechanisms that can provide guardrails against the profit-maximising tendencies of their owners and shareholders? If not, then Musk is a symptom of a larger malaise.
In this moment of crisis, there is an opportunity. For the first time in a long time, we can see a behemoth of big tech shaking. It is time to build global solidarities which centre the experiences of marginalised communities, particularly from the global majority, and learn from what can happen when the system completely fails us. It is only when we listen to these voices that we can understand the true extent of what is to come, and build systems and alternative futures.
Any views expressed in this opinion piece are those of the author and not of Context or the Thomson Reuters Foundation.
- Content moderation
- Tech and inequality
- Tech regulation
- Social media
- Data rights