EFF and more than 100 civil society organizations across the globe recently wrote directly to Mark Zuckerberg demanding greater transparency and accountability for Facebook's content moderation practices. A key step, we told Facebook, is the implementation of a robust appeals process that gives all users the power to challenge and reverse the platform's content removal decisions.
Facebook responded earlier this month in a private letter to EFF, which we made public this week. In it we see a win for users fighting to get their content back up. Facebook has extended the appeals option to cover almost all categories of speech alleged to violate platform rules, including content deemed to be spam, dangerous, or terrorist propaganda. According to the letter, people who report content that violates Facebook's community standards but remains on the platform will also be able to challenge Facebook's inaction, though it's not clear when that option will arrive.
Facebook said user appeals will be reviewed by people other than those who originally decided to remove the content. In addition, Mr. Zuckerberg recently announced plans to launch an “independent body” next year that will hear and decide certain appeals. In its letter to EFF, Facebook said it would consult with stakeholders and external groups, including signatories to the Zuckerberg letter, as it makes decisions about the independent body. We will hold Facebook to its word to ensure that geographically and culturally diverse voices are heard.
Independent review of content removal is an important element of the Santa Clara Principles, a set of minimum content moderation standards grounded in human rights that EFF and its partners created. We called on Facebook, Twitter, and other social media platforms to implement the standards when they were announced in May. We're still waiting for full adoption, but it's encouraging to see Facebook implement a key element of the principles.
Facebook wrote, and users are confirming, that it’s providing more details about alleged violations of community standards when notifying users about content removal. The company also says it has upped its data reporting game. Two days after the global coalition wrote to Mr. Zuckerberg demanding more transparency about content moderation, Facebook published a new report with more information about the volume of content removals. Facebook said in its response that future transparency reports will include data on how quickly it removed content in violation of its rules and how often content was restored following appeals.
Unfortunately, Facebook's efforts to be more transparent about content moderation only go so far. That has to change. The company wrote that it doesn't disclose the format of the content it removes (text, photos, or video), or whether a government or an individual reported or requested the removal. “We don't believe this data provides critical information to our users or civil society about our content review practices,” Facebook wrote to EFF. We and over 100 civil society organizations strenuously disagree. Facebook has removed legitimate speech at the behest of governments seeking to suppress voices in marginalized communities. We want to know whether speech is removed because of government requests, algorithms, or decisions by employees.
We will continue to work with our partners around the world to push for more accountability and transparency from Facebook. Our open letter of demands to Mr. Zuckerberg was an unprecedented effort to collectively advocate for the hundreds of millions of Facebook users who rely on the platform to share, communicate, and create. We know that sloppy, inconsistent, and unfair content moderation practices hurt people and have real-world consequences, especially for the most vulnerable communities that struggle to be heard. That's why we'll keep the pressure on in 2019 and beyond.