FOSTA-SESTA is getting a new real-world test. The Verge reports that an anonymous woman has filed a lawsuit against Reddit for allegedly letting her ex-boyfriend post sexual images and videos of her that he took without consent when she was 16. The plaintiff reached out to moderators but was told she’d have to wait “several days” before the content was removed, and she had little success obtaining a truly permanent ban. According to the suit, she had to “spend hours” combing through subreddits to find the offending material and report it to Reddit.
The woman claimed Reddit’s reported inaction violates FOSTA-SESTA, an amendment to Section 230 of the Communications Decency Act that strips online services of safe harbor protections for sex trafficking content. Reddit supposedly violated the law by running ads that made the imagery a “commercial sex act,” and by knowingly tolerating this kind of material on its service.
This Jane Doe is pursuing class-action status to include anyone similarly affected. Reddit denied any tolerance in a statement, saying that child sexual abuse has “no place” in its community and that it goes “above and beyond” the law in cracking down on this material using a combination of automation and human moderation. The company also stressed that it removed the content, banned users, and reported offenders to law enforcement.
The case could help define how broadly FOSTA-SESTA applies. The amendment was meant to curb direct trafficking on sites like Backpage. In this case, the woman argued that it covers child sexual abuse material no matter how it was obtained or whether the producer demanded payment. There’s a real chance the court will reject claims this amounted to sex trafficking, but a successful expansion of that definition could open the door to many other legal challenges.
This lawsuit also illustrates the limits of Reddit’s partly community-driven moderation. Subreddit moderators are frequently invited in by existing mods, with Reddit itself rarely involved. While many mods are up to the task, the approach can leave users with few options if neither community overseers nor Reddit’s automated tools quickly catch an offender.