Ninth Circuit:
Portion of New Social Media Law Likely Violates Constitution
Opinion Says Single Provision of Act Targeting Minors’ Use of Platforms, One That Requires Companies to Block Underage Users From Seeing How Many ‘Likes’ Posts Receive, Probably Fails Strict Scrutiny
By Kimber Cooley, associate editor
The Ninth U.S. Circuit Court of Appeals held yesterday that a trial judge erred in not preliminarily enjoining the enforcement of a portion of a newly enacted law—designed to address childhood addiction to social media—that requires platforms to block underage users from seeing how many “likes” or “shares” a post has received, saying that the provision is likely to be found content-based and violative of the First Amendment.
However, the court affirmed the judge’s denial of preliminary relief as to the bulk of the statute, finding that the offending provision is severable from the remainder of the law.
At issue is California’s Protecting Our Kids from Social Media Addiction Act, signed into law by Gov. Gavin Newsom last September, which aims to address childhood addiction to social media by regulating minors’ access to personalized feeds based on algorithms predicting their interests and requiring certain default settings for underage users. All of the law’s restrictions can be bypassed with parental consent.
Circuit Judge Ryan D. Nelson, writing for the court, summed up the relevant portions of the law, remarking:
“Regulating minors’ access to personalized feeds is the Act’s pièce de résistance. But it attacks the problem of minors’ social media addiction in other ways, too. California seeks to defang social media sites by regulating some design features and functionalities that it views as making these platforms especially addictive….
“Two default settings are at issue. First, covered web platforms may not show minors how many likes, shares, or other forms of feedback a post has received within a personalized recommendation feed….Second, covered platforms must make minors’ accounts private, which means their posts are visible only to friends on the platform.”
Entitled to Relief
In November, NetChoice LLC—a trade association with members that include Google (owner of YouTube), Meta (proprietor of Facebook and Instagram), and X (formerly known as “Twitter”)—filed a complaint asserting that the law violates the First Amendment and sought to enjoin its enforcement. In the pleading, the group argued:
“California is…attempting to unconstitutionally regulate minors’ access to protected online speech….The restrictions imposed by California…violate bedrock principles of constitutional law and precedent from across the nation. As the United States Supreme Court has repeatedly held, ‘minors are entitled to a significant measure of First Amendment protection.’ ”
In December, Senior District Court Judge Edward J. Davila of the Northern District of California preliminarily enjoined California from enforcing two of the law’s provisions—a restriction on sending minors notifications and a requirement that companies annually report the number of underage users accessing their platforms—but otherwise denied NetChoice’s motion for relief. Attorney General Rob Bonta did not appeal the order.
After NetChoice appealed the denial of the remainder of the relief it sought, the Ninth Circuit issued a stay preventing California from enforcing any aspect of the act until the court issued its decision in expedited proceedings. In yesterday’s opinion, joined in by Senior Circuit Judges Michael Daly Hawkins and William A. Fletcher, the court declared:
“[W]hen it comes to the like-count default setting, the district court overlooked that the regulation is…content based and thus triggers strict scrutiny….Because we conclude that NetChoice has shown a likelihood of success on the merits regarding this provision,…and…on the remaining injunction factors, we direct the district court to modify its injunction. In all other respects, we affirm the district court’s denial of a preliminary injunction.”
Like-Count Setting
Addressing the like-count default setting mandated by the law, Nelson noted that the threshold question raised on appeal is “[w]hat level of scrutiny is warranted.” He pointed out that if the provision is found to regulate speech based on its content, it is presumptively unconstitutional and subject to strict scrutiny.
Applying these principles, he opined:
“[T]he regulation of like counts…is…content based….A platform may show a post to a minor. And it may presumably tell that minor that other users have interacted with it. But it cannot tell the minor the number of likes or feedback that the post has received. Thus, whether the Act restricts a website’s description of a post turns on what message the description will communicate….That is content discrimination.”
He continued:
“As a result, strict scrutiny applies to this provision. And the like-count default setting is not the least restrictive way to advance California’s interest in protecting minors’ mental health. Normally, we would remand to the district court to conduct the strict scrutiny analysis….Here, however, on-point authority compels a single result….California could encourage websites ‘to offer voluntary content filters’ related to like counts or educate children and parents on such filters….So we conclude that NetChoice is likely to prevail on the merits of its challenge to the like-count provision as applied to its members.”
The jurist added:
“Since they apply the same way to all covered websites, NetChoice has established a likelihood of success for its facial challenge to the like-count provision, as well.”
Finding that the remaining factors relevant to the preliminary injunction analysis—irreparable harm, the balance of equities, and the public interest—weigh in NetChoice’s favor, he remarked:
“[W]e reverse the district court’s denial of an injunction as to that provision and remand to the district court with instructions to enter an order enjoining its enforcement.”
As to the default setting relating to privacy restrictions, he reasoned that the section “is agnostic as to content and therefore need only survive intermediate scrutiny” and concluded that the provision “logically serves the end of protecting minors’ mental health.”
Personalized Feeds
As to the act’s restriction on minors’ access to personalized feeds, Nelson found that the group lacked associational standing to bring an as-applied challenge because the claim required individualized proof. He wrote:
“NetChoice acknowledges that each of its members is unique. That matters because the unique design of each platform and its algorithm affects whether the algorithm at issue is expressive. For example, the more an algorithm implements human editorial directions, the more likely it is to be expressive for First Amendment purposes….
“On the other hand, an algorithm that ‘respond[s] solely to how users act online,’ merely ‘giving them the content they appear to want,’ probably is not expressive….Personalized algorithms might express a platform’s unique message to the world, or they might reflect users’ revealed preferences to them. Knowing where each NetChoice member’s algorithm falls on that spectrum reasonably requires some individual platforms’ participation.”
With respect to the group’s facial challenge, he said:
“The district court concluded that personalized feeds are not necessarily a form of social media platforms’ speech, so restricting personalized feeds does not restrict access to those platforms’ speech….This is a novel question, and we are careful not to decide more than necessary….”
He added:
“[A]ll we recognize is that some personalized recommendation algorithms may be expressive, while others are not, and that inquiry is fact-intensive….We need go no further because NetChoice ‘fails to show that any unconstitutional applications of the statute substantially outweigh its constitutional applications.’ ”
Saying that Davila correctly ruled that the plaintiff’s challenge to a portion of the act requiring companies to verify users’ ages was not yet ripe, he pointed out that this provision does not go into effect until January 2027 and requires the Attorney General to promulgate regulations before that time to define the mandated verification obligations.
The case is NetChoice LLC v. Bonta, 25-146.
The lawsuit represents the second constitutional challenge by NetChoice to recently adopted California legislation dealing with minors on the internet.
In 2022, the group sued Bonta over the California Age-Appropriate Design Code Act, which mandated that all businesses providing online products and services to children adhere to certain requirements, including the creation of a report on whether the product could cause harm.
In March, on remand from the Ninth Circuit, District Court Judge Beth Labson Freeman of the Northern District of California enjoined enforcement of that act in its entirety. Bonta has appealed the decision.
Copyright 2025, Metropolitan News Company