Can social media go from deadly by design to safe by design?

A girl checks her mobile phone as she stands in front of an empty Apple store during an extended lockdown as the spread of the coronavirus disease (COVID-19) continues in Frankfurt, Germany, January 21, 2021. REUTERS/Kai Pfaffenbach
opinion

There is growing pressure on social media platforms to build safety into the design of their products, protecting users and minimising harm.

Imran Ahmed, chief executive, Center for Countering Digital Hate.

You may remember an old public service announcement that asked parents, “It’s 10 pm, do you know where your kids are?” The question today would be, “It’s 10 pm, the kids are upstairs … but who are they with?” Do we really know which influencers or content creators are communicating with our children right now, on apps to which we have no real access, or that we might not fully understand?

TikTok, the Chinese-owned video-sharing app, has emerged as the leading app for young people’s hearts, minds, and screen time. It is addictive by design: at its core is the endlessly scrollable For You feed, which personalises over time, serving up video clips tailored to each user.

Two-thirds of American and British teenagers flock to the app, spending, on average, over an hour on it every day. Our new report, Deadly by Design, found that dangerous content encouraging body dysmorphia, eating disorders and self-harm is being served by TikTok’s recommendation engine directly to teenage girls.


The period from age 14 to 24, which psychologists call the transition to adulthood, is a key stage in young people’s development. It is the time when parents stop being the most important socialising influence and peers become the measure by which young people assess their own place in society. It is also the period in which malignancies like eating disorders or serious mental ill-health are most likely to emerge.

To test how TikTok’s algorithm works, researchers at the Center for Countering Digital Hate (CCDH) set up accounts at the minimum age the app allows, 13 years, in the United States, Britain, Canada, and Australia. Simulating the behaviour of a young person seeking content about mental health or body image, they watched and liked videos on those subjects and recorded the first 30 minutes of the TikTok experience served to each new account.

The results were, in short, horrifying. Within 2.6 minutes, TikTok recommended suicide content. Within 8 minutes, TikTok served content related to eating disorders. Every 39 seconds, TikTok recommended videos about body image and mental health to the teen accounts in our study.

Moreover, we found that content shared on eating disorder hashtags has amassed 13.2 billion views in total, highlighting the startling scale at which this content reaches young people.

The report underscores the urgent need for reform of online spaces. Platforms that earn revenue by maximising engagement have consistently prioritised profits over the well-being of their users. In some respects, our findings feel achingly familiar.

Earlier this year, a coroner's inquest in Britain concluded that 14-year-old Molly Russell had taken her life while “suffering the negative effects of online content.” Molly had liked, shared, or saved 2,100 posts related to suicide, self-harm, or depression on Instagram in the six months before her death.

Today TikTok, not Instagram, is the most popular platform among children, and as TikTok has grown, so has the gap in what parents, guardians, and legislators know about the platforms shaping children’s lives. Our Parents’ Guide aims to help parents better understand and mitigate the risks of social media.

One important thing we recommend parents do is talk to elected representatives and demand they do more to protect young people. When Molly died in 2017, there were few ways to hold social media companies accountable for the role they played in her death.

Today, a newfound appetite for change has resulted in meaningful measures to protect kids online. California has passed an age-appropriate design code, which forces companies to think about children’s safety while designing their products. The UK’s Online Safety Bill prioritises children’s safety online, and allows regulators to impose huge fines for non-compliance. And just this week, hundreds of organisations came out in support of the Kids Online Safety Act in the U.S. Congress.

Without intervention, TikTok will continue to profit from its opaque algorithm. Together, young people, parents, and advocates are telling Big Tech that enough is enough. Our STAR Framework argues that legislators need to go further and enact comprehensive solutions.

We want platforms to consider safety throughout the design of their products to protect all users and minimise harm. We want meaningful transparency of algorithms so that investigations like this one are possible, and we want real accountability for platforms’ failures and the harms they perpetuate. It’s time to end the age of impunity for social media firms.


Any views expressed in this opinion piece are those of the author and not of Context or the Thomson Reuters Foundation.

