In the wake of the Google+ Nymwars, the events of the Arab Spring, and discussion surrounding the Computer Fraud and Abuse Act (CFAA), there is a growing need for both companies and users to better understand how terms of service (ToS) and community policing methods affect online speech. Social networking platforms like Facebook, Twitter, and Google+, along with video and photo-sharing sites, are increasingly playing the role of the public sphere, and policies around content removal and account deactivation can have chilling effects on free expression.
Today, the Berkman Center for Internet & Society and the Center for Democracy and Technology (CDT) released a joint paper offering best practices for companies and users in dealing with account deactivation and content removal (full disclosure: I am a co-author of the paper, which was first drafted when I was at the Berkman Center). The paper emerged from the Global Network Initiative's ongoing learning series on the subject.
The paper puts forth two sets of recommendations, one for companies and one for users.
The authors suggest that companies:
- Offer clear, consistent guidelines
- Provide clear methods of contact with support teams
- Develop robust appeals processes
- Embed human rights considerations into their platform design
The authors recommend that users:
- Develop a better understanding of platform rules
- Increase their engagement with companies
- Use tags and other cues to provide context to content
- Back up content stored on any social platform or cloud service
The importance of each point is apparent in recent incidents from the Arab Spring. Take, for instance, the case of Hossam Hamalawy, an Egyptian activist who uploaded a set of photos to Flickr, only for the company to remove them on the basis that the photos were not his. The photos had been retrieved by activists from Egyptian state security offices, and Hamalawy had been explicit about their origins, prompting Flickr to enforce its guidelines, which advise users to upload only content they have created themselves. While Hamalawy argued that "Flickr is full of accounts with photos that people did not take themselves," Flickr responded by acknowledging its own struggles with enforcing the rules evenly. In this case, both company and user could have benefited from the recommendations put forth in the paper.
As privately owned online social spaces increasingly play the role of the public sphere, companies must take into account the various ways in which users employ their platforms. And while Facebook and Google+ may be reluctant to identify as "activist platforms," the events of the Arab Spring have made it apparent that this is exactly what they are, whether they like it or not.
At the same time, users have a responsibility to understand the rules and regulations of these online spaces, though research indicates that most users don't read license agreements. Users should also feel empowered to stand up to companies when they deem rules or processes unfair. Or, as Rebecca MacKinnon advocated in her recent TED talk, users must "take back the Internet" and become more engaged in policy, whether at the government or corporate level.
Ultimately, however, the power resides with companies, and it is incumbent upon them to implement rules and processes that take human rights into account. As CDT put it in its announcement of the paper today, "By giving greater thought and attention to these issues, these companies can have a significant impact on user rights and user satisfaction." We couldn't agree more.