Whether states can regulate social media content for minors is now before the Ninth Circuit Court of Appeals — and the appellate court clearly considers the issue pressing and important. The questions raised reach beyond protection of minors and could profoundly impact how social media creates personalized feeds for users. Recently, the Ninth Circuit stayed enforcement of California’s “Protecting Our Kids from Social Media Addiction Act” and expedited a tech industry appeal from a trial court order that declined to enjoin most of the law.
Social media services have relied for nearly 30 years on the federal Communications Decency Act, 47 U.S.C. § 230, as a bar against state laws that might control how user-created content is distributed online. In NetChoice v. Bonta, however, the services contend that the individualized feeds users see on social media are the platforms’ own expressive speech, and that California’s Senate Bill 976, which was to take effect on January 1, 2025, infringes on their First Amendment rights.
This is a bit of a twist from positions taken by NetChoice’s members in other litigation. In most cases that have reached the appellate courts, Facebook, YouTube, X and other platforms have contended that they are merely serving up content created by users, and state regulations are preempted by section 230, a federal law immunizing interactive computer services from liability for user postings. Numerous courts have held that the choice of content delivered to individual users involves “publisher” decisions for which they are immune from liability, and the platforms can do as they please.
The battle over SB 976 is a First Amendment battle, however, because the law regulates how the social media services operate their services, including age verification and “age gating.” SB 976 requires social media companies to (1) restrict minors’ access to certain personalized feeds, (2) refrain from sending minors notifications during certain times of the day, (3) develop settings that parents can use to control their children’s social media use, and (4) make public disclosures regarding the number of minors using their services.
U.S. District Judge Edward J. Davila for the Northern District of California denied most of a motion by NetChoice to enjoin enforcement of the law on December 31, 2024, but two days later entered an order granting an injunction pending NetChoice’s appeal. Judge Davila acknowledged in his January 2 order that “the First Amendment issues raised by SB 976 are novel, difficult, and important, especially the law’s personalized feed provisions.” He acknowledged that if NetChoice is correct in its contention that the entire law is unconstitutional, “its [industry] members and the community will suffer great harm from the law’s restriction of speech.”
While SB 976 purports to limit the negative effects of social media on children, its implementation could affect adult communications as well, and Judge Davila acknowledged this in his January 2 order: “Given that SB 976 can fundamentally reorient social media companies’ relationship with their users, there is great value in testing the law through appellate review.” The Ninth Circuit should have an opportunity to review “these weighty issues before the law goes into effect.”
The Ninth Circuit evidently agrees. It issued a one-page order enjoining California from enforcing SB 976 while the appeal is pending and expediting the appeal so oral argument will be held in April — keeping the case on an exceptionally fast track.
In his December 31 order largely denying NetChoice’s motion for a preliminary injunction, Judge Davila focused on the law’s imposition of an “age gate,” which would block minors from accessing certain features without “verifiable parental consent.” Services were required to start gating on January 1 with users actually known by the services to be minors, and, by January 1, 2027, the services would be required to gate any user the company cannot reasonably determine to be an adult. Behind the age gate would be “personalized feeds” that recommend user-generated or user-shared content based on “information provided by the user, or otherwise associated with the user or the user’s device.” Also, minors would not receive notifications at night between 12 a.m. and 6 a.m. and during school hours (8 a.m. to 3 p.m.) from September through May.
SB 976 also requires social media companies to develop settings that parents can use to control their children’s social media use, including making their accounts private, and requires companies to annually disclose to the public the number of minors using their service, and the number of minors who received parental consent to access personalized feeds.
Judge Davila found in his lengthy ruling on December 31 that much of the dispute over the law was not “ripe” for decision and an injunction was inappropriate because the most onerous aspects (for the social media companies) would not take effect until 2027, the state had not yet developed implementing regulations, and the court record lacked many necessary facts, including the current state and future of age-verification technology. He did preliminarily enjoin enforcement of the notification and mandatory disclosure provisions, however.
Most interesting about Judge Davila’s ruling is the discussion of whether personalized feeds — which are the creation of algorithms that drive particular content to individual users based on data collected about each user by the social media service — are protected speech by the social media services themselves. In Moody v. NetChoice, 603 U.S. 707 (2024), the U.S. Supreme Court found that “[d]eciding on the third-party speech that will be included in or excluded from a compilation — and then organizing and presenting the included items — is expressive activity of its own.” Meaning, the social media services may engage in speech in their selection of content for users. But Judge Davila held that Moody “stands only for the proposition that restrictions on a private speaker’s ability to compile and organize third-party speech implicate speech rights only if those restrictions impair the speaker’s own expression.” (Emphasis added). And Judge Davila said the record was unclear as to whether the personalized feeds regulated by SB 976 were the result of the social media services’ independent content standards or based “solely on how users act online” — which might not be the service’s expressive speech at all.
Judge Davila agreed that an algorithm designed to convey a message can be expressive (thereby triggering First Amendment protections), but “what if an algorithm’s creator had other purposes in mind? What if someone creates an algorithm to maximize engagement, i.e., the time spent on a social media platform?” That, the judge said, might not trigger First Amendment concerns. SB 976 allows social media platforms to continue with content moderation, such as removal of posts for violation of the services’ terms of service and community standards, which are discretionary publisher activities protected under Section 230 as well as the First Amendment.
“‘The First Amendment does not prevent restrictions directed at . . . conduct from imposing incidental burdens on speech,’” Judge Davila wrote, quoting Sorrell v. IMS Health, Inc., 564 U.S. 552, 567 (2011). “Regulating feeds that use algorithms mostly relying on non-expressive factors may not trigger First Amendment scrutiny at all because doing so only incidentally burdens any expressive component of these algorithms.”
“In short,” Judge Davila wrote in his December 31 order, “much of the First Amendment analysis depends on a close inspection of how regulated feeds actually function. Because NetChoice has not made a record that can be used to address these important questions, it has not met its burden to show facial unconstitutionality.”
How this will be decided by the Ninth Circuit — and possibly ultimately by the Supreme Court — will be vitally important to operators of online platforms and users of their services. Aggregation, ranking, and removal of content on social media might seem, at first glance, the simple exercise of editorial decision-making that the Supreme Court found to be protected fifty years ago in Miami Herald Publishing Co. v. Tornillo, 418 U.S. 241 (1974). But, quoting Judge Davila, “old precedents on editorial discretion do not fully resolve the issue at hand regarding the expressiveness of personalized feeds.”