A footnote in the majority opinion says the Court does “not deal here with feeds whose algorithms respond solely to how users act online — giving them the content they appear to want, without any regard to independent content standards.” The caveat may be largely academic: platforms typically weigh many variables beyond user behavior, and separating those variables from one another is no straightforward task.
“Because it’s so hard to disentangle all of the users’ preferences, and the guidance from the services, and the editorial decisions of those services, what you’re left with — technologically speaking — is algorithms that promote content curation. And it should be inevitably assumed then that those algorithms are protected by the First Amendment,” said Jess Miers, who spoke to The Verge before departing her role as senior counsel at center-left tech industry coalition Chamber of Progress, which receives funding from companies like Google and Meta.
“That’s going to squarely hit the New York SAFE Act, which is trying to argue that, look, it’s just algorithms, or it’s just the design of the service,” said Miers. The drafters of the SAFE Act may have presented the law as having nothing to do with content or speech, but NetChoice poses a problem, according to Miers. “The Supreme Court made it pretty clear, curation is absolutely protected.”
Miers said the same analysis would apply to other state efforts, like California’s Age Appropriate Design Code, which a district court blocked with a preliminary injunction, a ruling the state has appealed. That law required platforms likely to be used by kids to consider their best interests and to default to strong privacy and safety settings. Industry group NetChoice, which also brought the cases at issue in the Supreme Court, argued in its 2022 complaint against California’s law that it would interfere with platforms’ own editorial judgments.
“To the extent that any of these state laws touch the expressive capabilities of these services, those state laws have an immense uphill battle, and a likely insurmountable First Amendment hurdle as well,” Miers said.
Michael Huston, a former clerk to Chief Justice Roberts who co-chairs law firm Perkins Coie’s Appeals, Issues & Strategy Practice, said that any outright ban on content curation would likely be unconstitutional under the ruling. That could include a law requiring platforms to show content only in reverse-chronological order, like California’s Protecting Our Kids from Social Media Addiction Act, which would prohibit the default feeds shown to kids from drawing on any information about the user or their devices, or from recommending or prioritizing posts. “The court is clear that there are a lot of questions that are unanswered, that it’s not attempting to answer in this area,” Huston said. “But broadly speaking … there’s a recognition here that when the platforms make choices about how to organize content, that is itself a part of their own expression.”
The new Supreme Court decision also raises questions about the future of the Kids Online Safety Act (KOSA), a similar piece of legislation at the federal level that has gained significant steam. KOSA seeks to create a duty of care for tech platforms serving young users and would let those users opt out of algorithmic recommendations. “Now with the NetChoice cases, you have this question as to whether KOSA touches any of the expressive aspects of these services,” Miers said. In evaluating KOSA, a court would need to assess “does this regulate a non-expressive part of the service or does it regulate the way in which the service communicates third-party content to its users?”
Supporters of these kinds of bills may point to language in some of the concurring opinions (namely ones written by Justices Amy Coney Barrett and Samuel Alito) positing scenarios where certain AI-driven decisions do not reflect the preferences of the people who made the services. But Miers said she believes that kind of situation likely doesn’t exist.
David Greene, civil liberties director at the Electronic Frontier Foundation, said that the NetChoice decision shows that platforms’ curation decisions are “First Amendment protected speech, and it’s very, very difficult — if not impossible — for a state to regulate that process.”
Similarly important is what the opinion does not say. Gautam Hans, associate clinical professor and associate director of the First Amendment Clinic at Cornell Law School, predicts there will be at least “some state appetite” to keep passing laws on content curation or algorithms, drafted with close attention to what the justices left out.