Last week, in Doe #1 et al. v. MG Freesites Ltd. et al., No. 7:21-cv-00220-LSC, 2022 WL 407147 (N.D. Ala. Feb. 9, 2022), the Northern District of Alabama permitted plaintiffs to proceed with a proposed class action against Pornhub for allegedly profiting from child sex trafficking and exploitation videos uploaded to its platform. The claims were brought under the Trafficking Victims Protection Reauthorization Act (“TVPRA”) and federal child sexual abuse material statutes. U.S. District Judge L. Scott Coogler rejected Pornhub’s defense that section 230 of the federal Communications Decency Act shields it from liability for plaintiffs’ claims. Section 230 generally immunizes “interactive computer service[s]” from liability for content provided by third parties. 47 U.S.C. § 230(c).
Instead, the court agreed with plaintiffs’ argument that Pornhub acted as an “information content provider” and as such, was not entitled to section 230 immunity. Citing the Ninth Circuit’s ruling in Fair Housing Council v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008), which the court noted has been adopted by other circuits, Judge Coogler held that section 230 immunity is limited to “interactive computer service[s]” that display content created by third parties, and does not extend to platforms acting as “information content providers” that are “responsible, in whole or in part, for the creation or development of the offending content.” Doe #1, 2022 WL 407147, at *15 (citing Roommates, 521 F.3d at 1162 (internal quotation marks omitted)). In Roommates, the Ninth Circuit set forth the following test for determining whether a website acts as an information content provider: “[a] website helps to develop unlawful content and thus falls within the exception to section 230 [immunity] if it contributes materially to the alleged illegality of the conduct.” Roommates, 521 F.3d at 1168.
Applying that test, the Alabama court noted the plaintiffs’ allegations that Pornhub itself “generate[s] tags, categories, and keywords” to help users seeking child sexual abuse material (“CSAM”) locate such material, sometimes even using coded tags to make CSAM visible to users who search for it. Doe #1, 2022 WL 407147, at *16. The court also pointed to plaintiffs’ allegations that Pornhub creates at least some of the CSAM on its platform, including thumbnails of videos as well as timelines that allow viewers to jump to labeled scenes in videos, and that Pornhub controls the content on its platform by giving uploaders “extensive instructions” about what content to create and upload. The district court’s ruling indicates that platforms could face liability if they go beyond “merely encourag[ing] the posting of CSAM by providing a means for users to publish what they created” and instead “materially contribute to it by designing their platform for an illicit purpose.” Id.