Section 230 faces a reckoning

People walk across the plaza to enter the U.S. Supreme Court building on the first day of the court's new term in Washington, U.S. October 3, 2022. REUTERS/Jonathan Ernst

If the Supreme Court decides to limit Section 230 protections in any way, expect significant changes to user experience online

David Morar is a policy fellow at New America's Open Technology Institute.

The U.S. Supreme Court recently agreed to take up two cases that could end up having major consequences for the way online platforms function, and for how we all use the internet.

The two cases, Gonzalez v. Google and Twitter, Inc. v. Taamneh, call into question a small piece of the legal protections that have allowed the internet to flourish the way it has. Almost any ruling that does not dismiss the questions could remake the online world entirely.

The legal protections are provided by Section 230, a straightforward but often misunderstood law that protects any provider of “interactive computer services” - a purposefully broad term - from being sued for any third-party content hosted on their services. This includes, for example, user-generated content that makes up the bulk of what we see on social media.

Section 230's protections are understood to cover both the hosting of content itself and a platform's decision to moderate (or not moderate) that content. And, while there are certain exceptions rooted in specific laws - criminal, intellectual property, and sex trafficking - the protection is significant, and provides a reliable, predictable and friendly legal structure conducive to growth.

This has meant that most internet companies, including the Big Tech platforms, were able to build and scale websites and services without fear of the lawsuits that were common before Section 230 - sites and tools that have since become an important part of our daily lives, whether through search, messaging, video-sharing, social media or other offerings.

The cases in question do not ask whether Section 230 is unconstitutional. Instead, they tackle an adjacent question: should online companies be held liable if they host, or recommend through their algorithms, terrorist content? Both cases were brought by the families of victims of terror attacks that took place offline, who assert that the platforms are responsible for terrorist radicalization.

Online services, and platforms in particular, have woven algorithms into practically every part of their business. Algorithms could be defined broadly, encompassing simple mathematical functions and basic computing infrastructure, or more narrowly, referring to the systems that recommend content to users or govern the placement of content in a social media feed.

Whether the court decides that only certain types of algorithms fall outside Section 230 or takes a broader view, the resulting changes would likely be substantial.

It’s important to note that algorithms or recommendations wouldn’t be banned, per se. Removing Section 230 protections means that, for whatever conduct the Supreme Court specifies, these companies would be exposed to lawsuits that could not simply be dismissed.

These suits may ultimately fail because of other protections - like the First Amendment, which guarantees the right to free speech - but they would still have to be litigated. At a minimum, companies would need to devote resources to that litigation, with potential reverberations in how they choose to govern their products.

If the court decides to limit Section 230 protections in any way, we can expect significant changes to our user experience online. It is not outside the realm of possibility that a broad decision implicating many kinds of algorithms could force any online service that uses ranking and recommendation tools - like a search engine or video-sharing platform - to overhaul its systems entirely.

In the long term, this change might spur innovations in how content is presented and discovered online. In the short and medium term, it would most likely create serious disruption in the operation of the services we’re used to.

In another scenario, the Supreme Court could go beyond algorithmic recommendations and partially or fully strip away Section 230's protections. Platforms would then be left somewhere along an uneasy continuum, depending on how the court frames its ruling.

At one extreme, companies would be too fearful to moderate any content, leaving everything from death threats to graphic images online and effectively turning the internet into a lawless, anything-goes place. At the other, platforms might end up censoring everything from recipes to political statements, closely surveilling every piece of content, or disallowing user-generated content altogether.

A dismissal of the cases would likely mean that, for the time being, Section 230 remains unchanged. Otherwise, the internet as we know it is very likely to change.


Any views expressed in this opinion piece are those of the author and not of Context or the Thomson Reuters Foundation.



