Metropolitan News-Enterprise

 

Monday, August 4, 2025

 

Page 1

 

Ninth Circuit Revives Action Against Twitter Over Child Porn

Opinion Partially Reinstates Lawsuit Seeking to Hold Social Media Giant Responsible for Distribution of Images, Says Product Liability, Negligence per se Claims Are Not Shielded by Publisher Immunity Under CDA

 

By Kimber Cooley, associate editor

 

The Ninth U.S. Circuit Court of Appeals on Friday partially reinstated a civil action seeking to hold Twitter Inc. responsible for posts containing sexually explicit images of two then-13-year-old boys who say that they were tricked into providing the material by a trafficker posing as a teenage girl, and who accuse the social media platform of failing to establish adequate safeguards against the distribution of child pornography.

The opinion by Circuit Judge Danielle J. Forrest reverses the dismissal of a product-liability claim based on an allegedly defective reporting-infrastructure design and a negligence per se claim, each arising out of the social media giant’s purported failure to inform authorities of the posts, as mandated by law. These claims, Forrest said, are not shielded by the immunity afforded to Internet providers in their roles as publishers of third-party content.

That immunity is set forth in 47 U.S.C. §230(c)(1), a part of the Communications Decency Act of 1996. It provides:

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Forrest said that the plaintiffs’ other claims, which seek to hold Twitter responsible for the distribution of child pornography as well as for alleged defects in search features that they say allow for the promotion of sexually explicit images of children and in the blocking of offending posts, were properly dismissed as barred by §230.

Complaint Filed

The question about the scope of §230’s protections arose after the plaintiffs, identified only as “John Does 1 and 2,” filed a complaint against Twitter in January 2021, asserting 13 causes of action relating to the images, which they say began to circulate in their high school in 2020.

After learning of the content, Doe 1’s mother contacted Twitter, seeking to have the posts removed from the platform. According to the plaintiffs, Twitter initially refused to act, saying that no policy had been violated by the distribution of the material.

Nine days later, the posts were removed after federal agents with the Department of Homeland Security contacted the company. At that time, Twitter also reported the content to the National Center for Missing and Exploited Children (“NCMEC”), as required by law.

During the nine days that elapsed between the initial report and the removal of the posts, the images allegedly accrued more than 167,000 views and 2,223 retweets.

Significant Repository

In the operative complaint, the plaintiffs assert that Twitter is a significant repository for child pornography, underutilizes tools to curb the spread of the offending material, and receives significant advertising revenue from hosting popular posts, including ones that depict sexually explicit images of children.

Then-Chief Magistrate Judge Joseph C. Spero of the Northern District of California, now retired, found that all of the claims were barred by §230. Spero granted defense motions to dismiss the causes of action, with prejudice, under Federal Rule of Civil Procedure 12(b)(6), for “failure to state a claim upon which relief can be granted.”

After the ensuing defense judgment, the plaintiffs appealed the dismissal of the product liability and negligence per se claims, as well as a claim seeking to hold the provider liable as a beneficiary of sex trafficking.

Friday’s opinion, joined in by Circuit Judge Gabriel P. Sanchez and Senior Circuit Judge M. Margaret McKeown, affirms the dismissal except as to the claims asserting a defect in the reporting-infrastructure system and the failure to report the posts to the NCMEC.

Immunity Protections

Forrest remarked:

“[Sec.] 230 immunity protects only: ‘(1) a provider or user of an interactive computer service (2) whom a plaintiff seeks to treat…as a publisher or speaker (3) of information provided by another information content provider.’ ”

 

Noting that whether an entity qualifies as a “publisher” is generally the key issue in §230 immunity challenges, she said:

“[I]f an interactive computer service provider is disseminating content that it created, it is functioning as a ‘content provider,’ not a publisher, and has no immunity under § 230….But if the provider is disseminating content created by others, it is functioning as a publisher and is immune from liability related to that content.”

Addressing the product defect claims, she explained that the plaintiffs assert that Twitter makes it difficult to report child pornography on the website because a party wishing to flag such content is not able to use the “easily-accessible” feature available for reporting other offending posts and is instead directed to use a special form.

She said that, according to the plaintiffs, “that form has disadvantages” because it “does not allow a user to report child pornography sent via private messaging, it requires reporters to supply an email address, and it requires a person to have and be logged into a Twitter account.”

Alleged Defects

As to the alleged defect, she opined:

“Twitter could fulfill its purported duty to cure reporting infrastructure deficiencies without monitoring, removing, or in any way engaging with third-party content….This claim thus does not seek to hold Twitter responsible as a publisher or speaker….Increased removal of third-party content may well be the outcome of a more robust reporting structure. But a claim alleging a duty that does not treat a defendant as a publisher is not barred by § 230, even if that legal duty ‘might lead a company to respond with monitoring or other publication activities.’ ”

Addressing the other defects asserted by the plaintiffs, namely a failure to block reported child pornography pending an investigation and search-suggestion and hashtag features that they say amplify the reach of offending posts, Forrest concluded that such theories are barred by §230.

She said:

“Distinguishing between innocent #ParisOlympics-type hashtags and the more nefarious ones would require Twitter to act as a publisher. Notwithstanding Plaintiffs’ allegation that ‘Twitter has the ability to, and in fact does, block certain hashtags,’ deciding when to take that step is a publisher decision.”

Negligence Per Se

Forrest noted that “[u]nder 18 U.S.C. § 2258A(a)(1)(A)(i), electronic-communication services are required to file a report with NCMEC” as soon as is “reasonably possible” after “obtaining actual knowledge” of violations of sex-trafficking laws involving children, and wrote:

“Plaintiffs do not claim that Twitter must scour its platform for content triggering its NCMEC-reporting duty. They do not even claim that Twitter must review reported child pornography. Rather, they allege that once Twitter has obtained actual knowledge of such content, as evidenced by its representation that it had ‘reviewed the content,’ it had a legal duty to promptly report that content to NCMEC.”

Agreeing with the plaintiffs, she remarked:

“Because that duty neither requires Twitter to monitor content nor take any action associated with publication (e.g., removal) once it learns of the objectionable content, § 230 does not immunize Twitter from Plaintiffs’ negligence per se claim.”

Trafficking Beneficiary

The plaintiffs also claimed that Twitter was a beneficiary of sex trafficking in violation of 18 U.S.C. §§1591 and 1595, statutes that allow victims to file a civil action to recover damages from anyone who “knowingly” benefits by “receiving anything of value” from a venture profiting from child exploitation. They argued that the claim is covered by a statutory carveout to immunity for those who engage in the proscribed activities.

Saying the contention “runs headlong” into recent Ninth Circuit decisions holding that a party asserting that the carveout applies must allege more than that the company ignored third parties’ misdeeds, Forrest declared:

“While we understand the logic of Plaintiffs’ argument that continuing to make available known child pornography is tantamount to facilitating sex trafficking, that reasoning fails under our prior holding that merely turning a blind eye to illegal revenue-generating content does not establish criminal liability under §1591.”

The case is Doe 1 v. Twitter Inc., 24-177.

 

Copyright 2025, Metropolitan News Company