Making the internet safe for children


A child looks at his mobile phone as people camp a day before the funeral of Britain's Queen Elizabeth, following her death, in London, Britain September 18, 2022. REUTERS/Marko Djurica

The algorithms that social media platforms use must be better designed and monitored to prevent young people from seeing the sort of content that no child should see

David Wright is the Director of the UK Safer Internet Centre, a partnership of Childnet International, the Internet Watch Foundation and SWGfL: Safety and Security Online.

The findings of the Molly Russell inquest should send shockwaves across the world and act as motivation for platforms to reassess how they keep our children safe online.

Molly was 14 years old when she took her own life in November 2017, with a coroner concluding that the negative effects of social media played a part in her death. Molly had viewed thousands of posts relating to self-harm and suicide across online platforms, including Instagram and Pinterest. It is unacceptable that more wasn’t done to prevent her from accessing such harrowing content.

The much-anticipated Online Safety Bill aims to protect children who have been, and continue to be, impacted by online harm in this country.


But while the Bill does improve online safety, it does not go far enough. The current provisions for ‘Impartial Dispute Resolution’ are set to be removed, taking away the essential practice of giving victims the opportunity to seek impartial recourse for the devastating impact of legal but harmful content. It takes away their right to an effective appeal, and their right to protect themselves.

Molly’s death and the inquest verdict must be a turning point for protecting our children online. The Online Safety Bill must not be stripped back or watered down; it must do all that is necessary to prevent these unnecessary and tragic deaths, and it must come into force as soon as is practicable.

The UK Safer Internet Centre was set up to raise awareness of the pitfalls of the online world, and to look at how we can best educate and protect those at risk of coming to harm as a result of what they view on the internet. It is imperative that online services learn from what Molly went through and ensure that our children are not left vulnerable to the dangers of legal but harmful content.

A key component of achieving this is ensuring that the wider public is in the best possible position to protect themselves and those they love. And we need help from online platforms to do this. They have a responsibility to our children to change their practices and to better prevent young people from seeing the sort of content that Molly saw, content that is still readily available at their fingertips.

The algorithms these platforms use must be better designed and monitored to prevent young people from seeing the sort of content that no child should see. The kind of content viewed by Molly needs to be addressed, assessed and removed - with those who fail to do so being held to account.

There can be no hesitation. There can be no delay.


Any views expressed in this opinion piece are those of the author and not of Context or the Thomson Reuters Foundation.



