PUBLISHED: March 12, 2025

Professor Stuart Benjamin on why lawsuits challenging social media addiction may fail


Benjamin’s expertise includes First Amendment issues surrounding social media

Professor of Law Stuart Benjamin

Litigation over social media’s addictive qualities and the harm it does to children is ramping up. Families, school districts, and dozens of state attorneys general have filed lawsuits against platforms, claiming the algorithms that decide what content users see are addictive by design and cause mental and physical harms to children.

Hundreds of those cases have been consolidated into multi-district litigation against Facebook, Instagram, Snapchat, TikTok, and YouTube. Some are being heard in state courts, and one in federal court in California.

But the plaintiffs’ legal strategies — including suing under products liability, deceptive acts and practices, and public nuisance laws — will likely fail, said Duke Law School Professor Stuart Benjamin.

“There are interesting, clever arguments for the plaintiffs to make about why their claims do not implicate the First Amendment. I just think they’re not likely to be successful,” said Benjamin, the William Van Alstyne Professor of Law and co-director of the Center for Innovation Policy at Duke Law.

“The social media companies will argue — I think successfully — that the decisions they're making on what to prioritize and what not to prioritize are speech for First Amendment purposes.”

Editorial decisions by algorithm are protected speech

The Supreme Court decided in Moody v. NetChoice that editorial discretion over third-party content — whether by humans or algorithms — is speech protected by the First Amendment.

“To the extent that social media platforms create expressive products, they receive the First Amendment’s protection,” Justice Elena Kagan wrote in the majority opinion. “In constructing certain feeds, those platforms make choices about what third-party speech to display and how to display it. They include and exclude, organize and prioritize — and in making millions of those decisions each day, produce their own distinctive compilations of expression.”

Decisions on what to include in social media newsfeeds receive the same protection as editorial decisions on what stories should appear in a newspaper, Kagan wrote. “The principle does not change because the curated compilation has gone from the physical to the virtual world.”

The ruling now creates an almost insurmountable hurdle for the multi-district litigation and other lawsuits seeking to hold social media platforms liable for damages caused by content on their sites, Benjamin said.

That’s because constitutional challenges to protected speech trigger strict scrutiny, the highest standard of judicial review. It requires the government to demonstrate a compelling interest in regulating the speech and to show that its actions are “narrowly tailored,” meaning it chose the least speech-restrictive means of achieving that end. In only one Supreme Court case has a majority applied strict scrutiny and found it satisfied, he noted.

Even the novel legal strategies being deployed by plaintiffs around the country will run up against the free speech defense.

“It doesn't matter if you violate a public nuisance law that says you can’t gather on a sidewalk if you’re gathering to protest,” he explained. “That’s clearly going to be treated as speech and the court is going to evaluate it in that context.”

Keeping users “hooked” by design won’t matter, says Benjamin

Benjamin, an expert on First Amendment law and telecommunications law, anticipated how the Supreme Court would rule on algorithmically determined content more than a decade ago in Algorithms and Speech, published when Facebook, the most popular social media platform, had fewer than half of the more than 3 billion monthly active users it has today.

“The profusion of computer algorithms designed by humans to do the work other humans once did may alter our economy, but it does not significantly change the First Amendment analysis,” he wrote. “So long as humans are making substantive editorial decisions, inserting computers into the process does not eliminate the communication via that editing.”

While society may be rightly concerned about the impact of social media on children, the government will have difficulty regulating it, Benjamin said.

“Social media has such an enormous impact on so many people's lives. I think people are going to be very surprised to hear that no matter how successful social media companies are at creating addictive material, they are likely to win on their First Amendment challenge.”

Benjamin believes the motivation behind the algorithm doesn’t matter legally.

“The plaintiffs say that the social media platforms are sending you content they think you’ll find so alluring that it keeps you hooked. And Moody says that those decisions about what to send you are speech.

"So we have a case that has clearly said that the First Amendment is going to apply to the decisions that are being challenged in these lawsuits. There's just no getting around this issue.”


“Any lawsuit or regulation seeking to penalize social media platforms for decisions — including algorithmic decisions — about what they want to present is going to be subject to the very serious scrutiny courts apply to law regulating the content of speech under the First Amendment. And it’s very unlikely that such lawsuits or regulations will satisfy strict scrutiny.”

Duke Law Professor Stuart M. Benjamin