Digital toxin? US health officials take aim at social media
A group of school girls look at their phones in New York September 24, 2015. REUTERS/Lucas Jackson
What’s the context?
New York is leading a push by local authorities to address concerns about the mental health impacts of platforms like Instagram
- New York City health chief draws up action plan
- States, school districts and others sue, pass laws
- Tech industry disputes youth mental health links
WASHINGTON - New York City Health Commissioner Ashwin Vasan is on a mission: protect the city's young from the potentially harmful effects of social media - taking his lead from previous public health alerts.
"Take tobacco (or) lead ... We know what to do when we see these sort of toxins in our physical environment. Why can't we use these same rubrics towards this digital toxin?" Vasan told Context by phone.
Frustrated by a lack of federal action to address growing concern about the effects of social media, Vasan and other local officials across the country are looking at what steps they can take.
Last week, 33 U.S. states sued Meta Platforms and its Instagram unit, accusing them of fueling a youth mental health crisis by making their platforms addictive and repeatedly misleading the public about the dangers.
Meta said it was "disappointed" at the lawsuit, but that it shares "the attorneys general's commitment to providing teens with safe, positive experiences online," and referred to the introduction of more than 30 tools to support teens and parents.
But in addition to litigation, health officials including Vasan in New York are considering policy measures.
New York held a national summit in June that included representatives from the Philadelphia, Los Angeles and Boston health departments, aiming to forge a long-term city strategy for addressing social media as "a potential public health threat".
Vasan said he was motivated by concern over a "very clear increase in worsening mental health, in particular in teens", adding that the COVID-19 pandemic had exacerbated a long-term trend.
"By all measures, this is an epidemic," said Vasan, who is working on a city action plan that could include regulations, public awareness campaigns and mental health research.
Big tech disputes links
A sharp increase in depression and suicide among U.S. teenagers coincided with a surge in social media use about a decade ago, though a slew of research has reached mixed conclusions about a possible connection.
Tech industry groups dispute the links and say policies that seek to penalize social media companies infringe on freedoms enshrined in the U.S. Constitution.
"There are real, constitutional solutions based in fact and reason," said Carl Szabo, vice president and general counsel at NetChoice, a group that advocates for free expression on the internet and includes Google, Meta and others as members.
That includes "educating kids about digital safety and security in the classroom and empowering law enforcement to prosecute the crimes perpetrated against young people online," he said in emailed comments.
"Policymakers in New York should start there instead of infringing the rights of their citizens."
But pressure for policy action is building, and Adriane Casalotti, chief of government and public affairs for the National Association of County and City Health Officials, said New York's effort was being watched nationally.
The association represents about 3,000 public health departments across the country, and "many of them are talking to their school systems about synergies to address mental health challenges – particularly after the pandemic," Casalotti said.
Meanwhile, more than 200 bills on online content moderation were introduced in state legislatures in recent months, according to the Computer & Communications Industry Association.
Several states have passed laws to require age verification, while others have sought to control how data is collected and stored, or to open up platforms to be sued – though many of these efforts have run into legal obstacles, as well as pushback from civil liberties groups.
In addition, about 460 school districts have joined two lawsuits against five social media platforms, accusing them of playing a "major role in causing" youth mental health problems.
"These social media companies have direct impacts on the social emotional well-being of our residents," said Patricia Socarras, communications director for Providence, Rhode Island, a party in one of the suits.
NetChoice's Szabo has called one of the lead school lawsuits an effort to find a "scapegoat" for broader societal problems instead of "trying to teach kids how to better operate online."
Several federal proposals dealing with concerns about children and the internet are pending in Congress, and in May the White House created an inter-agency task force to look into kids' online health and safety, with recommendations due early next year.
During the same month, the U.S. surgeon general published an advisory warning of "ample indicators" of the risk of harm to youth mental health.
The advisory encouraged a "safety-first" approach to the issue, said Kris Perry, executive director of Children and Screens: Institute of Digital Media and Child Development, a research organization.
Such an approach acknowledges the positive potential for kids' social media use, she said, such as offering community and support for LGBTQ+ youths, while seeking to tackle an "array of harms that cross over multiple parts of a child's development."
Public utility model
Social media has become so "ubiquitous" that it requires equally far-reaching regulations, Perry said, a position that has led some people to call for social media to be regulated in the same way as public utilities such as water companies.
The regulation of pipelines might offer a better model, said Ted Kury, energy studies director at the University of Florida's Public Utility Research Center. That could require establishing rules for the provision of content, with fines for platforms that fail to comply.
But implementation could be tricky, Kury warned, complicated by constitutional speech protections, the speed with which social media use changes, and the extent to which fines could influence companies.
Still, the rising conversation on the issue at local levels might also offer an advantage in seeking to create new regulation, he said: smaller groups such as cities may find it easier to arrive at public agreement on the goals of new regulation.
"With local groups, societal preferences may be a bit more united," Kury said.
"That's an advantage, as it might make it a bit easier to define those goals – and that's a critical first step."
(Reporting by Carey L. Biron; Editing by Helen Popper.)