Tech platforms must drop addictive features that harm young people

Iraqi girls use their phones at a cafe in Kerbala, Iraq August 6, 2019. Picture taken August 6, 2019. REUTERS/Abdullah Dhiaa Al-Deen

Predatory designs spread misinformation and pose real dangers

Viktoria Tomova is a communications and media specialist at European Digital Rights. She is also a fellow for Public Voices on Technology in the Public Interest, an initiative of the OpEd Project in partnership with The MacArthur Foundation.

As a member of Gen Z, I’ve lived and breathed the digital world since a very young age.

I use the internet to connect with my friends from across the world, hone my political views, learn new skills, plan trips, or simply binge-watch a whole show over the weekend.

But the internet does not come with an instruction manual. And, with the dominance of huge social platforms like Facebook and Instagram, I fell into the rabbit hole of social media.

Starting a job at a privacy and data protection organisation after I finished my studies changed everything for me, though. I realised the extent of the mental health issues social media platforms cause, and I no longer spend my time endlessly scrolling.

Of course, you shouldn’t have to work at a digital rights group to learn these things.

The dangers are real, and they are already impacting your young siblings and friends. Studies have linked the use of the major platforms to rising rates of depression, anxiety and insomnia, as well as to the disruption of education and daily routines.

One-fourth of the world's population could be affected by the addictive designs tech companies have used to build their services.

Social media companies construct their platforms in ways that make them addictive. Algorithms show individuals things that they think will keep them hooked for longer.

Problematic content

The European Parliament has highlighted the harmful effects of addictive online design, particularly through so-called recommender systems, which personalise a user’s experience based on their interactions with the platforms. The systems can amplify hate, disinformation, information on suicide and self-harm, among other harmful topics, even in the feeds of minors.

Research into the impact of TikTok's recommender system on young people showed that after just five to six hours on TikTok, almost half of the videos in users’ feeds were related to mental health, posing potential harm.

A report from the Irish Council for Civil Liberties last year found that, as a result of addictive design, individuals end up spending more time on the platforms, allowing the tech companies to earn more revenue by displaying ads to them.

On YouTube, more than 70% of problematic content seen by more than 37,000 test volunteers was amplified by YouTube's recommender system, according to a report from the Mozilla Foundation. Research has also shown that Facebook's recommendation algorithms have prioritised disinformation to increase engagement rates. 

All of this points to the business models of big tech firms being reliant on keeping people engaged with their platforms for as long as possible, and shows how amplifying harmful content worsens mental health issues for everyone, especially young people.

Well-being fears

A huge survey of more than half a million 11 to 18-year-olds in the UK last month found that health and well-being were their biggest concern. In the US, major tech corporations are facing lawsuits filed by school districts. And in Europe, civil society groups are also raising concerns about the effects of these companies' surveillance models.

But we need to go beyond just raising awareness. One of the key measures many experts are calling for is disabling profiling-based recommender systems by default and introducing a ban on surveillance ads to change platforms’ business incentives.

Surveillance ads are harmful because they rely on tracking people's activities online and collecting vast amounts of data. Numerous reports have already shown that such ads can polarise, discriminate, and damage the mental well-being of children, young people and adults.

The platforms do not need these systems to function technically. There are already alternatives to recommender systems such as user-selected or expert-curated content categories, and ranking content based on factors like views, author reputation, and user feedback. 

One regularly suggested solution, age verification tools, would only take away the right of young people to access legitimate online services. Instead, it is Big Tech’s exploitative business model that needs to change.

Social media can empower and excite us; it can enable us to connect with others and co-create our societies. But this is only achievable with social media that is free from addictive and predatory features.

For that to happen, we need ambitious lawmakers who can effectively enforce existing laws like the Digital Services Act in the European Union and go further by banning addictive systems that are not currently regulated. Additionally, they must empower people to control their online feeds by requiring tech corporations to turn such systems off by default.

Only then can we benefit from social media that empowers us to create positive digital spaces.

Any views expressed in this opinion piece are those of the author and not of Context or the Thomson Reuters Foundation.

