Ryan E. Long warns of the US lawsuits currently targeting social media platforms
The Set-Up
Great news! Your UK-based social media dating company – “Better Than Cupid” – just got funded. It has launched in the US and is slated for release in other countries. You have just started celebrating when you get an e-mail from your general counsel: “We’ve been sued in NY. They are asking us to shut down until certain protective measures are taken.” Impossible? Think again. The City of New York recently filed suit against various social media companies, including Meta, TikTok, Google and YouTube, over their alleged negative health effects on the public. This article provides an overview of that case and the potential theories of liability that can affect UK-based social media companies doing business in the US.
The Lawsuit
Social media use has proliferated over the years, particularly in the US. Against this backdrop, the lawsuit filed by the City of New York on February 14, 2024, seeks to address “addiction” to the Defendants’ platforms among youth living in or around the city. The Complaint for Damages and Demand for Jury Trial (“Complaint”) compares the addiction to slot machines and cigarettes. It points to various alleged negative repercussions, including a 57% increase in suicide rates nationwide; a 40% increase in 2020 in sadness and loneliness among high school students; and 38% of New York City high school students reporting that they felt so sad or hopeless during the past year that they stopped engaging in their usual activities.[i]
The Complaint links this to the “environmental toxin” of unregulated social media.[ii] While alleging that some Defendants, such as TikTok, “knew it was disrupting students’ school day and their sleep,” it alleges more generally that all of them “know or should have known about the risks of social media addiction[.]”[iii] The Complaint focuses on underage users in New York schools – and specifically alleges that the Defendants violated the Children’s Online Privacy Protection Act (“COPPA”) by obtaining individually identifiable personal information without verifiable consent from users’ parents.[iv] This is pernicious given the purported links between usage and negative health effects in the Complaint, which cites various studies showing the negative biological effects of social media on the youth brain.
One expert cited, New York University professor and social psychologist Adam Alter, says:
“The minute you take a drug, drink alcohol, smoke a cigarette – when you get a like on social media . . . it’s a little bit like taking a drug. As far as your brain is concerned, it’s a very similar experience.”[v]
Other allegations in the Complaint include one from Meta’s own research team: “41% of teen users of Instagram in the U.S. who reported feeling ‘[not] attractive’ said the feeling began while using the product.”[vi] The City of New York is not alone in its view of the negative effects of social media. Separately, the U.S. Surgeon General recently proposed that some social media sites carry their own warnings, akin to those found on cigarette packets.
The Complaint alleges that the Plaintiffs have expended “significant resources” paying for this “mental health crisis, by providing services to youth across the city[.]”[vii] Whether and how these costs can be tied primarily to social media – as opposed to other habits of the users, such as diet or exercise – remains to be seen in this case. What is more, some of the studies cited concerning dopamine causation also apply to other products, such as online or offline shopping and video game playing. This is relevant for causation, something touched upon below.
The Plaintiffs seek to hold the Defendants liable under the US torts of nuisance and negligence. As for nuisance, the Plaintiffs would need to allege an activity on private property that “affects . . . any considerable number of persons,” although the effect may be unequal. For example, your neighbour’s trees growing too tall and over your side of the fence may create a fire hazard; to the extent the hazard affects others beyond you, the nuisance is a public one. Negligence in the US, meanwhile, has the following elements: duty, breach, causation and damages.
Tricky in this case will be the extent to which the Plaintiffs can allege – and prove – that the Defendants owed a duty to warn their users about some of the potential negative effects of social media. It is alleged that the Defendants knew of those effects, but it is not clear how much they knew and when. To the extent they did know, US law would tend to impose a duty to warn the consumer that the product is dangerous – a form of strict liability for a product defect. There are various allegations to support the notion that social media is in fact dangerous. However, other experts point out that there is no clear evidence linking social media and adolescent mental health issues. This is likely to be an issue for the jury.
Other Cases
This case is not alone. Others have been filed in the US against social media companies over liability arising from their algorithms. In Gonzalez v. Google, for example, the plaintiffs alleged that the defendants were secondarily liable for attacks perpetrated by ISIS in Paris.[viii] By permitting ISIS-related content on their sites, such as YouTube, the defendants allegedly aided and abetted ISIS’s acts in violation of the Anti-Terrorism Act.[ix] The Ninth Circuit Court of Appeals in California dismissed this theory on the grounds that the defendants were protected by Section 230 of the Communications Decency Act.
In Gonzalez, the majority reasoned that the algorithms “function like traditional search engines that select particular content for users based on user inputs.”[x] In other words, search engines using AI are entitled to Section 230 immunity “because they provide content in response to user inquiries ‘with no direct encouragement to perform illegal searches or to publish illegal content.’”[xi] The algorithm, in the majority’s view, was “content-neutral” – not generating the options, but steering users to their desired ones. The Ninth Circuit did, however, hold that Google could be liable for posts on which it shared revenues with the posting parties. The US Supreme Court subsequently vacated the Ninth Circuit’s ruling and remanded the case in light of Twitter v. Taamneh, declining to reach the Section 230 question.
The New York Complaint repeatedly compares social media liability to cigarette manufacturer liability. To the extent that the studies and evidence bear out causation in the case, there may be an analogy with the big tobacco settlements. A recent settlement in California included not only monetary damages but other forms of relief, such as monies to assist with the health effects of smoking. While the analogy is not perfect, the big tobacco settlements could prove to be a model.
Conclusion
Social media liability in the US is not static. Whether you are funding a social media company or running one in the US, being aware of these labyrinthine legal pitfalls is essential. While the landscape is still changing, these companies clearly face increased scrutiny in the US from both private and public plaintiffs. Due care should be taken.
Ryan is the principal of Long & Associates, a boutique intellectual property law practice, and a non-residential fellow of Stanford Law School’s Center for Internet and Society. Since 2006, he has navigated pioneering clients – including an ex-Google researcher – through tricky IP-related legal issues. He has written for, or has been interviewed by, the likes of TechCrunch, among others. His e-mail is: rlong@landapllc.com.
[i] Complaint, p. 4, line 12.
[ii] Id. at p. 5, line 15.
[iii] Id. at p. 12, line 33.
[iv] Id. at p. 20, line 63.
[v] Id. at p. 24, line 77.
[vi] Id. at p. 121, line 375(a).
[vii] Id. at p. 298, line 905.
[viii] Gonzalez v. Google LLC, 2 F.4th 871, 880 (9th Cir. 2021).
[ix] 18 U.S.C. § 2333; Id.
[x] Id. at 896.
[xi] Id. (citing Fair Hous. Council v. Roommates.com, LLC, 521 F.3d 1157, 1175 (9th Cir. 2008)).