Facebook just quietly adopted a policy that could push thousands of innocent people off the platform. The new “sexual solicitation” rules forbid pornography and other explicit sexual content (which was already functionally banned under a separate policy), but they don’t stop there: they also ban “implicit sexual solicitation,” including the use of sexual slang, the solicitation of nude images, discussion of “sexual partner preference,” and even expressing interest in sex. That’s not an exaggeration: the new policy bars “vague suggestive statements, such as ‘looking for a good time tonight.’” It wouldn’t be a stretch to think that asking “Netflix and chill?” could run afoul of this policy.
The new rules come with a baffling justification, seemingly blurring the line between sexual exploitation and plain old doing it:
[P]eople use Facebook to discuss and draw attention to sexual violence and exploitation. We recognize the importance of and want to allow for this discussion. We draw the line, however, when content facilitates, encourages or coordinates sexual encounters between adults.
In other words, discussion of sexual exploitation is allowed, but discussion of consensual, adult sex is taboo. That’s a classic censorship model: speech about sexuality is permitted only when sex is presented as dangerous and shameful. It’s especially concerning since healthy, non-obscene discussion about sex—even about enjoying or wanting to have sex—has been a component of online communities for as long as the Internet has existed, and has for almost as long been the target of governmental censorship efforts.
Until now, Facebook has been a particularly important place for groups who aren’t well represented in mass media to discuss their sexual identities and practices. At the very least, users should get the final say about whether they want to see such speech in their timelines.
Overly Restrictive Rules Attract Trolls
Is Facebook now a sex-free zone? Should we be afraid of meeting potential partners on the platform or even disclosing our sexual orientations?
Maybe not. For many users, life on Facebook might continue as it always has. But therein lies the problem: the new rules put a substantial portion of Facebook users at risk of violating them. Fundamentally, that’s not how platform moderation policies should work: with rules this sweeping, online trolls can take advantage of reporting mechanisms to punish groups they don’t like.
Combined with opaque and one-sided flagging and reporting systems, overly restrictive rules can incentivize abuse from bullies and other bad actors. It’s not just individual trolls either: state actors have systematically abused Facebook’s flagging process to censor political enemies. With these new rules, organizing that type of attack just became a lot easier. A few reports can drag a user into Facebook’s labyrinthine enforcement regime, which can result in having a group page deactivated or even being banned from Facebook entirely. This process gives the user no meaningful opportunity to appeal a bad decision.
Given the rules’ focus on sexual interests and activities, it’s easy to imagine who would be the easiest targets: sex workers (including those who work lawfully), members of the LGBTQ community, and others who congregate online to discuss issues relating to sex. What makes the policy so dangerous to those communities is that it forbids the very things they gather online to discuss.
Even before the recent changes at Facebook and Tumblr, we’d seen trolls exploit similar policies to target the LGBTQ community and censor sexual health resources. Entire harassment campaigns have been organized around using payment processors’ reporting systems to cut off sex workers’ income. When online platforms adopt moderation policies and reporting processes, it’s essential that they consider how those policies and systems might be weaponized against marginalized groups.
A recent Verge article quotes a Facebook representative as saying that people sharing sensitive information in private Facebook groups will be safe, since Facebook relies on reports from users. If there are no tattletales in your group, the reasoning goes, then you can speak freely without fear of punishment. But that assurance rings hollow: in today’s world of online bullying and brigading, the question isn’t if your private group will be infiltrated by trolls, but when.
Did SESTA/FOSTA Inspire Facebook’s Policy Change?
The rule change comes a few months after Congress passed the Stop Enabling Sex Traffickers Act and the Allow States and Victims to Fight Online Sex Trafficking Act (SESTA/FOSTA), and it’s hard not to wonder if the policy is the direct result of the new Internet censorship laws.
SESTA/FOSTA opened online platforms to new criminal and civil liability at the state and federal levels for their users’ activities. While ostensibly targeted at online sex trafficking, SESTA/FOSTA also made it a crime for a platform to “promote or facilitate the prostitution of another person.” The law effectively blurred the distinction between adult, consensual sex work and sex trafficking. The bill’s supporters argued that forcing platforms to clamp down on all sex work was the only way to curb trafficking, never mind the growing chorus of trafficking experts arguing the very opposite.
As SESTA/FOSTA was debated in Congress, we repeatedly pointed out that online platforms would have little choice but to over-censor: the fear of liability would push them not to stop at sex trafficking, or even at sex work, but to take much more restrictive approaches to sex and sexuality in general, even in the absence of any commercial transaction. In EFF’s ongoing legal challenge to SESTA/FOSTA, we argue that the law unconstitutionally silences lawful speech online.
While we don’t know if the Facebook policy change came as a response to SESTA/FOSTA, it is a perfect example of what we feared would happen: platforms would decide that the only way to avoid liability was to ban a vast range of discussions of sex.
Wrongheaded as it is, the new rule should come as no surprise. After all, Facebook endorsed SESTA/FOSTA. Regardless of whether one caused the other, both reflect the same vision of how the Internet should work: a place where certain topics simply cannot be discussed. Like SESTA/FOSTA, Facebook’s rule change may have been intended to fight online sexual exploitation. But like SESTA/FOSTA, it will do nothing but push innocent people offline.