Democracies are in many ways like the internet. In both cases, it may take a thousand cuts to demolish their foundation, and each cut contributes to the erosion. One such cut exists in the Digital Services Act (DSA) in the form of drastic and overbroad government enforcement powers.
But first, the good news. By rejecting the idea of a made-in-Europe "filternet" and by focusing on user rights and better transparency, the version of the DSA that the European Parliament approved in January 2022 got many things right. The DSA strengthens users' right to online anonymity and private communication (Article 7), and its preamble explains that users should have the right to use and pay for services anonymously wherever reasonable (Recital 28).
However, we are worried about how enforcement is envisioned. The draft bill disturbingly lets government agencies order a broad range of providers to remove allegedly illegal content (Article 8). Another provision grants governments alarming powers to uncover data about anonymous speakers, and everyone else, without adequate procedural safeguards (Article 9). Finally, a new mechanism gives special privileges to "trusted flaggers" of content, a group that can include law enforcement agencies. All of this adds up to enforcement overreach that the Parliament should have rejected.
Administrative Agencies and Direct Orders for Content Takedowns
Article 8 empowers national judicial or administrative authorities to issue mandatory takedown orders directly to "intermediary services" such as social media networks. In its recent vote, the Parliament rejected EFF's suggestion and the proposal by the Civil Liberties Committee to limit these powers to independent courts. Instead, the Parliament followed the Commission's proposal and allowed a broad category of non-independent authorities to exercise this power.
Taking down user communication is a highly intrusive act that interferes with the right to privacy and threatens the foundation of a democratic society. This is why only judicial authorities should be authorized to issue such orders: they are best placed to determine with certainty whether the material at issue is unlawful. By contrast, non-independent administrative authorities often operate under the supervision of the political executive and don't necessarily consider the legitimate interests of all parties involved, including the protection of their fundamental rights.
On a positive note, instead of endorsing global content takedowns, the EU Parliament partly followed EFF's suggestion to limit takedown orders to the territory of the issuing member state, an approach known as "geo-blocking." However, where "rights at stake require a wider territorial scope," cross-border and even worldwide removal orders remain possible, which could also encourage companies to preemptively block content in countries where it may not be illegal at all.
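For readers curious about the mechanics, here is a minimal sketch in Python of what a territorially scoped takedown might look like on a platform's side. All names and the data model are our illustrative assumptions, not anything specified in the DSA text itself.

```python
# Minimal sketch of territorially scoped takedowns ("geo-blocking").
# All names here are hypothetical; real platforms resolve a viewer's
# country via IP geolocation, which is itself only an estimate.

from dataclasses import dataclass, field

@dataclass
class ContentItem:
    content_id: str
    blocked_in: set[str] = field(default_factory=set)  # ISO country codes
    removed_globally: bool = False

def apply_takedown_order(item: ContentItem, issuing_state: str,
                         worldwide: bool = False) -> None:
    """Apply a takedown order, scoped to the issuing state by default."""
    if worldwide:
        # The Parliament text still permits this where "rights at stake
        # require a wider territorial scope".
        item.removed_globally = True
    else:
        item.blocked_in.add(issuing_state)

def is_visible(item: ContentItem, viewer_country: str) -> bool:
    """Decide whether a viewer in a given country can see the item."""
    if item.removed_globally:
        return False
    return viewer_country not in item.blocked_in

# Example: an order from Germany hides the item for German viewers only.
post = ContentItem("post-123")
apply_takedown_order(post, issuing_state="DE")
assert not is_visible(post, "DE")
assert is_visible(post, "FR")
```

The worrying part is the `worldwide` branch: once an order escapes the issuing state's territory, the same removal applies in countries where the content was never illegal.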
Mandatory Disclosure of User Data to Authorities
Under the Parliament's version of the DSA, the same broad range of intermediaries can be compelled to unmask their users ("individual recipients of a service") or disclose their data in response to a government agency's order. Under Article 9, authorities can order a broad range of intermediaries to turn over "a specific item of information" about the user receiving a particular service, such as the background information of a user who has set up a social media account on Facebook. While the specificity required in the data request is good for privacy, the DSA fails to explain which particular "item[s] of information" about internet users these orders can cover. For instance, some online service providers may be able to identify a user's likely physical location through their IP address or even cell tower data.
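To illustrate how easily a single "item of information" like an IP address becomes location data, here is a short Python sketch using the real geoip2 package with MaxMind's GeoLite2 City database (an assumption: the .mmdb file has been downloaded separately). This is an illustration of the privacy stakes, not anything the DSA prescribes.

```python
# Why an IP address is a sensitive "item of information": paired with an
# off-the-shelf geolocation database, it yields an approximate physical
# location. Requires `pip install geoip2` and MaxMind's GeoLite2-City
# database file, obtained separately.

import geoip2.database
import geoip2.errors

def approximate_location(ip_address: str,
                         db_path: str = "GeoLite2-City.mmdb") -> str:
    """Return a coarse city/country/coordinates estimate for an IP."""
    with geoip2.database.Reader(db_path) as reader:
        try:
            rec = reader.city(ip_address)
        except geoip2.errors.AddressNotFoundError:
            return "no location on record for this address"
        return (f"{rec.city.name}, {rec.country.name} "
                f"(~{rec.location.latitude}, {rec.location.longitude})")

# Accuracy varies from street to country level, but even a rough fix
# can help unmask an otherwise anonymous speaker.
print(approximate_location("203.0.113.50"))  # documentation-only example IP
```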
Even though the Recitals of the DSA explain that such orders should be issued in compliance with the GDPR and respect the prohibition of mandated monitoring, we are concerned that, without additional safeguards, the DSA could nevertheless undermine users' right to privacy and legitimize surveillance measures by national authorities. When it comes to surveillance measures, the UN Human Rights Committee has made clear that any interference with the right to privacy has to comply with the principle of legality: it must not only be authorized by law, but the law must be "sufficiently precise and specify in detail the precise circumstances" in which any such interference may be permitted. The European Court of Human Rights has likewise stressed the need for effective legal protection against arbitrary interference with the right to private life under Article 8 of the European Convention on Human Rights.
Any system is vulnerable to capture, especially one that lacks checks and balances. That is why only independent courts should have the authority to issue such orders, whether they concern illegal content or access to information. And since there will always be attempts to bypass this process, the regulation needs clear, predictable, proportionate, and consistent thresholds. Law enforcement should neither be singled out as an exceptional case nor allowed to shortcut the accountability route.
Reasonable alternative text versions were on the table ahead of the vote, including a suggestion to limit this enforcement power to the investigation of serious crimes, but they were ultimately rejected by a majority of Members of Parliament. Instead, the Parliament opted for a third option: adding procedural safeguards, such as a reference to requirements under national administrative law and the rights of platforms and users to seek an effective remedy against such orders. This is certainly an important improvement over the slim text presented by the EU Commission, which barely went beyond requiring a "legal basis" and a "statement of reasons why the information is required."
However, a right to appeal is not good enough. The availability of legal remedies is, of course, necessary as a corrective measure, but what precedes an appeal matters even more. One trend that shows up over and over in tech policy is that users struggle with systems of private adjudication, whether because of their complexity or the resources they demand. Having checks and balances in place not only preserves the integrity of the appeals process but also ensures that users continue to enjoy their fundamental rights. The Parliament's compromise does not achieve this. Disclosing people's identities and intimate details of their lives is a highly intrusive act that should be subject to strict legality, necessity, and proportionality requirements.
Trusted Flaggers
Like other organizations, EFF is worried that the current draft of the DSA will open the door to enforcement overreach and, in the process, assign trust to entities that don't deserve it. Whereas enforcement has traditionally been subject to certain checks and balances, the DSA lifts some of them and tilts the scales towards a system prone to abuse. This will have a significant impact on users' rights, in particular the right to privacy.
One place where abuse can occur is in how the DSA mandates "trusted flaggers." Trusted flaggers are not new: YouTube, for instance, adopted its own system in 2012 to clamp down on violent content, and various companies use them for copyright enforcement. In a nutshell, trusted flaggers are individuals or entities with particular expertise whom hosting services rely on to "flag" illegal content. Trusted flaggers have typically been a voluntary measure for platforms, and some have questioned their effectiveness.

Under the DSA, however, the status of trusted flagger is awarded by the "Digital Services Coordinator," the national authority responsible for supervising the services of online platforms. The DSA allows law enforcement agencies and profit-seeking industry organizations to apply for trusted flagger status, and their notices must be treated with priority. The draft legislation sets out conditions under which the status will be awarded, but these still lack appropriate safeguards for users, opening the door to abuse of the notice-and-action system and, with it, to human rights violations.

In practical terms, this means that law enforcement agencies (the DSA specifically mentions the European Union Agency for Law Enforcement Cooperation, "Europol") will have a privileged channel for going to platforms and requesting the swift removal of content. Does law enforcement deserve this amount of trust? We highly doubt it, especially considering the recent decision against Europol's unlawful hoarding of user data issued by the European Data Protection Supervisor (EDPS). The data, which amounted to "billions of points of information," could be used by Europol indiscriminately to surveil European citizens. The trusted flagger system will only increase this power.
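To make that "privileged channel" concrete, here is a minimal Python sketch of how a platform might implement priority handling for trusted flagger notices. The two-tier queue and all names are our illustrative assumptions; the DSA itself only requires that such notices be treated with priority.

```python
# Sketch of a "priority channel" for trusted flaggers: notices from
# trusted sources jump the moderation queue. The two-tier scheme and
# all names are illustrative assumptions, not the DSA's text.

import heapq
import itertools

TRUSTED, ORDINARY = 0, 1          # lower number = reviewed first
_counter = itertools.count()      # tie-breaker keeps FIFO order per tier

queue: list[tuple[int, int, str]] = []

def submit_notice(content_id: str, from_trusted_flagger: bool) -> None:
    tier = TRUSTED if from_trusted_flagger else ORDINARY
    heapq.heappush(queue, (tier, next(_counter), content_id))

def next_notice() -> str:
    """Pop the next notice for review; trusted-flagger notices first."""
    return heapq.heappop(queue)[2]

submit_notice("post-1", from_trusted_flagger=False)
submit_notice("post-2", from_trusted_flagger=True)  # e.g. a Europol notice
assert next_notice() == "post-2"  # the trusted notice is reviewed first
```

The design choice worth noticing is that priority is assigned by the flagger's status, not by the severity of the content, which is precisely why the question of who gets that status matters so much.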
The same goes for the police. With each member state assigning different rights and responsibilities to its police forces, we should expect an environment that is highly unpredictable and inconsistent and that could encourage forum shopping. Worse, the abuse this system could enable in countries like Poland and Hungary, both on record for their anti-human-rights positions, must not be taken lightly. Hungary, for instance, has already been found by the European Court of Human Rights to have violated an Iranian transgender man's right to respect for private life; similarly, Poland ranks lowest in the EU on respect for human rights, in particular those of the LGBTQ+ community. How comfortable are we with this type of privatized online enforcement?
It's crucial that these issues are addressed before the DSA becomes law. For example, policymakers should ensure that trusted flaggers act independently of commercial entities and law enforcement agencies, which often pursue efficiency above all else. Trusted flaggers should have the collective interests of the public and the protection of fundamental rights in mind.
What’s Next? Negotiations.
As the DSA continues its path towards finalization, EFF hopes that these issues will be addressed in the ongoing negotiations between the EU institutions. We recommend that the European Union, which seeks to advance the rule of law on the internet, take a hard look at the potential for enforcement overreach and act now to address it.