In the wake of the mass shootings at two mosques in Christchurch, New Zealand, that killed fifty-one people and injured more than forty others, the New Zealand government has released a plan to combat terrorist and violent content online, dubbed the Christchurch Call. The Call has been endorsed by more than a dozen countries, as well as eight major tech companies.
The massacre, committed on March 15 by an Australian living in New Zealand who had ties to white supremacist groups in various countries, was intentionally live-streamed and disseminated widely on social media. Although most companies acted quickly to remove the video, many New Zealanders—and others around the world—saw it by accident on their feeds.
Just ahead of the Call's release, the New Zealand government hosted a civil society meeting in Paris. The meeting included not only digital rights and civil liberties organizations (including EFF), but also those working on countering violent extremism (CVE) and against white supremacy. In the days prior to the meeting, members of civil society from dozens of countries worked together to create a document outlining recommendations, concerns, and points for discussion for the meeting (see PDF at bottom).
As is too often the case, civil society was invited late to the conversation, which rather unfortunately took place during Ramadan. That said, New Zealand Prime Minister Jacinda Ardern attended the meeting personally and engaged directly with civil society members for several hours to understand our concerns about the Call—a rather unprecedented move, in our experience.
The concerns raised by civil society were as diverse as the groups represented, but there was general agreement that content takedowns are not the answer to the problem at hand, and that governments should be focusing on the root causes of extremism. PM Ardern specifically acknowledged that in times of crisis, governments want to act immediately and look to existing levers—which, as we’ve noted many times over the years, are often censorship and surveillance.
We appreciate that recognition. Unfortunately, however, the Christchurch Call released the following day is a mixed bag that contains important ideas but also endorses those same levers.
The good:
- The first point of the Christchurch Call, addressing government commitments, is a refreshing departure from the usual. It calls on governments to commit to “strengthening the resilience and inclusiveness of our societies” through education, media literacy, and fighting inequality.
- We were also happy to see a call for companies to provide greater transparency regarding their community standards or terms of service. Specifically, companies are called upon to outline and publish the consequences of sharing terrorist and violent extremist content; describe policies for detecting and removing such content; and provide an efficient complaints and appeals process. This ask is consistent with the Santa Clara Principles and a vital part of protecting rights in the context of content moderation.
The not-so-good:
- The Call asks governments to “consider appropriate action” to prevent the use of online services to disseminate terrorist content through loosely defined practices such as “capacity-building activities” aimed at small online service providers, the development of “industry standards or voluntary frameworks,” and “regulatory or policy measures consistent with a free, open and secure internet and international human rights law.” While we’re glad to see the inclusion of human rights law and concern for keeping the internet free, open and secure, industry standards and voluntary frameworks—such as the existing hash database utilized by several major companies—have all too often resulted in opaque measures that undermine freedom of expression.
- While the government of New Zealand acknowledged to civil society that their efforts are aimed at social media platforms, we’re dismayed that the Call itself doesn’t distinguish between such platforms and core internet infrastructure such as internet service providers (ISPs) and content delivery networks (CDNs). Given that, in the wake of attacks, New Zealand’s ISPs acted extrajudicially to block access to sites like 8Chan, this is clearly a relevant concern.
The ugly:
- The Call asks companies to take “transparent, specific measures” to prevent the upload of terrorist and violent extremist content and prevent its dissemination “in a manner consistent with human rights and fundamental freedoms.” But as numerous civil society organizations pointed out in the May 14 meeting, upload filters are inherently inconsistent with fundamental freedoms. Moreover, driving content underground may do little to prevent attacks and can even impede efforts to do so by making the perpetrators more difficult to identify.
- We also have grave concerns about how “terrorism” and “violent extremism” are defined, and by whom. Companies regularly use blunt measures to determine what constitutes terrorism, while a variety of governments—including Call signatories Jordan and Spain—have used anti-terror measures to silence speech.
New Zealand has expressed interest in continuing the dialogue with civil society, and has acknowledged that many rights organizations lack the resources to engage at the same level as industry groups. So here's our call: New Zealand must take its new role as a leader in this space seriously and ensure that civil society has an early seat at the table in all future platform censorship conversations. Online or offline, “Nothing about us without us.”
UPDATED May 16, 2019: This post was edited to correct the nationality of the shooter in the Christchurch massacre.