Facebook took a step toward greater accountability this week, expanding the text of its community standards and announcing the rollout of a new system of appeals. Digital rights advocates have been pushing the company to be more transparent for nearly a decade, and many welcomed the announcements as a positive move for the social media giant.
The changes are certainly a step in the right direction. Over the past year, following a series of controversial decisions about user expression, the company has begun to offer more transparency around its content policies and moderation practices, such as the “Hard Questions” series of blog posts offering insight into how the company makes decisions about different types of speech.
The expanded community standards released on Tuesday offer a much greater level of detail about what’s verboten and why. Broken down into six overarching categories—violence and criminal behavior, safety, objectionable content, integrity and authenticity, respecting intellectual property, and content-related requests—each section comes with a “policy rationale” and bulleted lists of “do not post” items.
But as Sarah Jeong writes, the guidelines “might make you feel sorry for the moderator who’s trying to apply them.” Many of the items on the “do not post” lists are incredibly specific—just take a look at the list contained in the section entitled “Nudity and Adult Sexual Activity”—and the carved-out exceptions are often without rationale.
And don’t be fooled: The new community standards do nothing to increase users’ freedom of expression; rather, they will hopefully provide users with greater clarity as to what might run afoul of the platform’s censors.
Facebook’s other announcement—that of expanded appeals—has received less media attention, but for many users, it's a vital development. In the platform’s early days, content moderation decisions were final and could not be appealed. Then, in 2011, Facebook instituted a process through which users whose accounts had been suspended could apply to regain access. That process remained in place until this week.
Through Onlinecensorship.org, we often hear from users of Facebook who believe that their content was erroneously taken down and are frustrated with the lack of due process on the platform. In its latest announcement, VP of Global Policy Management Monika Bickert explains that over the coming year, Facebook will be building the ability for people to appeal content decisions, starting with posts removed for nudity/sexual activity, hate speech, or graphic violence—presumably areas in which moderation errors occur more frequently.
Some questions about the process remain (Will users be able to appeal content decisions while under temporary suspension? Will the process be expanded to cover all categories of speech?), but we congratulate Facebook on finally instituting a process for appealing content takedowns, and encourage the company to expand the process quickly to include all types of removals.