Wednesday, April 29, 2026
Ninth Circuit:
Action Against Meta Over ‘Myanmar Genocide’ Rightly Tossed
Opinion Says Putative Class Claims Accusing Facebook of Algorithm Defects That Allegedly Resulted in Promotion of Calls for Violence Against Muslim-Minority Group Are Barred by ‘Publisher’ Immunity
By Kimber Cooley, associate editor
The Ninth U.S. Circuit Court of Appeals held yesterday that a judge rightly dismissed a putative class action complaint against Meta Platforms Inc. accusing Facebook of design defects in a since-replaced algorithm that allegedly supercharged the spread of posts promoting genocide against the Rohingya, a Muslim-minority group in Myanmar that was the subject of a targeted campaign of violence by military officials in 2017.
Saying the Silicon Valley-based internet giant is shielded from the plaintiffs’ negligence and design-defect claims asserted under California law by immunity protections established by the Communications Decency Act, codified at 47 U.S.C. §230, relating to third-party posts, the court declared:
“Plaintiffs believe that Facebook’s design, coupled with the darker elements of human nature, caused real-world harm. But Section 230, as we have interpreted it, bars their claims, and we cannot hold Meta ‘responsible for the unfortunate realities of human nature.’ ”
Subdivision (c)(1) of the section, which provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” protects social media companies from liability for third-party posts on their platforms.
Product-Liability Claims
Circuit Judge Ryan D. Nelson authored yesterday’s opinion, joined in by Senior Circuit Judges William A. Fletcher and Marsha S. Berzon, acknowledging case law holding that products-liability causes of action against web-based platforms may fall outside the protection of §230 if the claims do not relate to the content posted by third parties. However, he declared:
“Even if the Facebook algorithm and system of third-party feedback and ‘social rewards’ encouraged the posting of content, nothing about the platform’s design contributed to what made those posts illegal or actionable.”
Berzon, joined by Fletcher, wrote separately to “urge this Court to reconsider en banc our precedent extending section 230 immunity” to claims asserting flaws in platforms’ algorithms that make content recommendations.
Nelson also penned a concurring opinion, arguing that Ninth Circuit precedent has expanded the definition of “publisher” to “include what…was never considered publishing conduct,” but saying that “this is not the case to correct our precedent.”
He noted that the algorithm challenged by the plaintiffs is a now-defunct version that boosted posts based on platform-wide interactions and that more advanced ones, making suggestions based on the users’ history or those favoring certain viewpoints, “are more clearly outside the scope of…conduct that Section 230 protects.”
Scope of Protections
The question of the scope of §230’s protections arose after a plaintiff, identified only as “Jane Doe,” filed a putative class action complaint against Meta in San Mateo Superior Court on Dec. 6, 2021, seeking to represent “[a]ll [members of the minority Rohingya group] who left…Myanmar…on or after June 1, 2012, and arrived in the United States under refugee status, or who sought asylum protection, and now reside in the United States.”
She asserted negligence and product liability claims under California law and sought “compensatory damages, in excess of $150 billion.” After Meta removed the matter to federal court in January 2022 based on diversity jurisdiction, a first amended complaint was filed in March 2023.
In the operative pleading, two “Doe” plaintiffs assert that “the introduction of Facebook into” Myanmar in 2011 “materially contributed to the…widespread dissemination of anti-Rohingya hate speech…, leading to mass killings and rape of the Rohingya people.”
They allege that “Meta encouraged…toxic and dangerous Facebook posts that directly resulted in attacks on the villages of each of the Plaintiffs, Jane Doe 1 and Jane Doe 2” and cite multiple posts purportedly calling for acts of terrorism against the Rohingya people.
The plaintiffs assert that the algorithm in place during the 2017 campaign of violence, although allegedly designed to maximize user time on the platform and exposure to revenue-generating advertisements, encouraged users to favor making negative posts because those were more likely to receive a higher volume of likes and comments due to the company “boosting” such “popular” content onto more users’ feeds than it would for benign commentary.
In January 2024, District Court Judge Yvonne Gonzalez Rogers of the Northern District of California granted Meta’s motion to dismiss, without leave to amend, declaring that the plaintiffs’ claims are “time-barred as a matter of law” and declining to address the applicability of §230.
Applicable Law
Pointing out that “Meta renews its Section 230 arguments on appeal and asserts that the district court can be affirmed on the ground” that the provision bars the plaintiffs’ claims, Nelson first addressed whether the Communications Decency Act applies to the dispute. Saying that, “[g]enerally, federal courts sitting in diversity apply the law of the forum state,” he considered whether “the rules of decision of Myanmar apply to some questions.”
Noting that “no case in Myanmar concerning a social media company” could be found by the plaintiffs’ foreign law expert, he opined:
“Myanmar’s interest in protecting its citizens from harmful attacks and misinformation on Facebook, while real, is insufficiently incorporated into the positive law of the country. Myanmar’s interest therefore does not predominate. For these reasons, even if we could or should consider Myanmar law, Section 230 applies [as California law includes federal statutes under the Supremacy Clause].”
Turning to the merits of the assertion of §230 protections, he commented that “[w]e have been wary of the ‘artful skirting’ of…immunity,” saying that “if a claim turns on a defendant’s status as a publisher or conduct of publishing—including ‘reviewing, editing, and deciding whether to publish or to withdraw from publication third-party content’—it is barred under Section 230.”
Product Design
Applying those principles, he wrote:
“Plaintiffs characterize Meta’s duty as one of product design—that Meta should not have built Facebook in a way that boosted incitements to violence. Still, the alleged defects relate to Facebook’s core design as a publishing platform, particularly how Facebook promoted or downplayed third-party posts using algorithms. Under our case law, matching users with content is publishing conduct, even when the user has not requested the content.”
Recognizing that jurisprudence has suggested that, if Facebook’s algorithm “was itself Meta’s message or content,” then “we might not have to treat Meta as the publisher or speaker of third-party content,” he said that “our case law has taken another tack on whether algorithm design represents platforms’ own content.”
Citing the 2019 decision in Dyroff v. The Ultimate Software Group, in which the court held that recommendations and notifications were content-neutral tools used to facilitate communications and so fell within §230’s scope, he reasoned:
“Following Dyroff’s lead, we conclude that Facebook’s promotion of engagement-driven content through the…algorithm must be characterized as recommending and matching conduct. We have already decided that such conduct is the work of publishers, rather than a platform’s own content or messaging…. Accordingly, each of Plaintiffs’ claims seek to hold Meta responsible in its capacity as a publisher of the third-party content.”
He added:
“Plaintiffs’ theory of social rewards does not amount to an allegation that Facebook ‘made suggestions regarding the content of potential user posts,’ either…. There is no plausible allegation in the complaint that the Facebook algorithm specifically treated anti-Rohingya content differently than any other third-party content.”
Concurring Opinions
Saying that “I…continue to think that this Court’s precedent has unduly expanded the scope of section 230 immunity,” Berzon argued that “this Court’s broad reading…permits internet service providers…to behave indecently, potentially…causing serious harm to vulnerable people.” Pointing out that other circuits have taken steps to narrow the law’s “sweep,” she concluded:
“If not bound by Circuit precedent, I would hold that section 230 does not bar the claims raised against Meta in this case because ‘websites’ use of machine-generated algorithms to recommend content and contacts are not within the publishing role immunized under section 230.’ ”
For his part, Nelson said:
“Under our precedent, Section 230 bars Plaintiffs’ claims. I write separately to address two issues. First, we have over-read Section 230, straying from the original public meaning of the statutory text and creating an all-purpose liability shield for internet platforms. Second, state choice-of-law rules can never direct application of a rule of decision contrary to federal law. Because Section 230 is supreme to state law, it controls even if California’s choice-of-law rules point to a conflicting foreign rule.”
The case is Doe 1 v. Meta Platforms, 24-1672.
According to the United Nations, over 700,000 people fled Myanmar beginning in August 2017 when military groups in the country allegedly carried out acts of violence against the Rohingya people, including reports of entire villages being burned to the ground. In 2018, Mia Garlick, Meta’s director of Asia Pacific policy, told Reuters:
“We were too slow to respond to the concerns raised by civil society, academics and other groups in Myanmar. We don’t want Facebook to be used to spread hatred and incite violence. This is true around the world, but it is especially true in Myanmar where our services can be used to amplify hate or exacerbate harm against the Rohingya.”
Copyright 2026, Metropolitan News Company