The company publicly announced last week that it was shutting down its Partner Categories program to “help improve people’s privacy on Facebook.” What it didn’t mention was that the move is actually part of the company’s efforts to comply with the General Data Protection Regulation (GDPR), the new EU data protection law going into effect in May, which imposes consent requirements that make using third-party data more difficult.
While it’s nice to see Facebook implementing this EU-mandated privacy change across the globe, it would be missing much of the larger picture to interpret it as a completely voluntary, privacy-protective measure taken wholly in response to Cambridge Analytica. Beyond the stark fact of legal compliance, this isn’t even a move that is likely to hurt Facebook’s bottom line: the company may actually stand to benefit from it, in the form of boosted profits and solidified market dominance.
Even combined with the repackaged account settings also announced last week, winding down Partner Categories is nowhere near the level of deep, structural change Facebook needs in order to protect privacy in the wake of the Cambridge Analytica scandal.
Partner Categories is a program that has allowed advertisers to use data from seven third-party data broker “partners,” layered on top of Facebook data, to target users with ads. Facebook purchases the third-party data, on behalf of an advertiser, directly from the data broker. Contrary to what some articles have suggested, the third-party data broker partners do not get Facebook data via this program, but they do get a cut of the advertising sale. Cutting off the program means advertisers will be serving you ads based only on your Facebook data (plus any data the advertisers themselves have), which means Facebook gets to keep all of the advertising money rather than sharing it with data brokers.
Facebook’s greatest data source for targeting users is the data it has itself amassed, and continues to amass, about all of its users. Websites like Facebook now have so much data about users that they are able to help advertisers reach the exact sliver of the population needed to effectively manipulate our opinions, whether about politics or consumer goods. And as an anonymous Facebook director told Marketing Land, while third parties have historically had the most shopper and in-market behavioral data, Facebook has been catching up. As Facebook’s ability to track users across the Web continues to grow via the proliferation of tools like Facebook Login and the Facebook pixel (a snippet of code that website owners embed in their pages so Facebook can track visitors and measure ad performance), Facebook can make even more accurate inferences about users. Even though Facebook will soon stop using third-party data to serve us ads, the fact remains that it is collecting more data about us than ever.
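To make the tracking mechanism concrete, here is a rough, illustrative sketch of how a tracking pixel generally works. This is not Facebook’s actual code, and the endpoint and parameter names are hypothetical; the pattern, though, is simple: a site embeds a small script that sends a request to the tracker’s servers carrying the page URL and an identifier, which is all the tracker needs to log that this user visited this page.

```typescript
// Illustrative sketch only: a generic tracking-"pixel" pattern, not Facebook's real code.
// The endpoint (tracker.example.com) and parameter names are hypothetical.

function fireTrackingPixel(trackerId: string, eventName: string): void {
  // Describe this visit: who is tracking, what happened, and which page
  // the user is on. Real pixels also attach a cookie-based user identifier.
  const params = new URLSearchParams({
    id: trackerId,             // the site owner's account with the tracker
    ev: eventName,             // e.g. "PageView" or "Purchase"
    url: window.location.href, // the page the user is currently visiting
    ts: Date.now().toString(), // timestamp of the visit
  });

  // Requesting a 1x1 image is the classic delivery mechanism: the browser
  // sends the request (along with any of the tracker's cookies) to the
  // tracker's domain, which is enough for the tracker to log the visit.
  const img = new Image(1, 1);
  img.src = `https://tracker.example.com/collect?${params.toString()}`;
}

// A site owner would call this on every page load:
fireTrackingPixel("123456789", "PageView");
```

Multiply that snippet across the millions of sites that embed it, and the tracker can assemble a detailed picture of each user’s browsing across the Web.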
Unlike the company’s public announcement, Facebook’s email to advertising partners about the end of the Partner Categories program did mention the GDPR, which applies to all companies that process any personal data of EU residents. As Facebook explained in this email, Partner Categories will shut off in the EU on May 25, 2018 (the day the GDPR goes into effect) and at the end of September for the rest of the world.
Companies doing business in Europe have been slow to adapt to the GDPR’s requirements, and until EU regulators start acting in earnest to deal with violations, no one knows exactly how it will be enforced. At this point, experts say that after the GDPR goes into effect, companies won’t be able to rely on most existing third-party data collected on EU residents, because that data won’t meet the law’s stringent consent requirements. And the law’s consent rules will make the use of third-party data much more complicated going forward.
The GDPR requires consent from EU residents that is explicit, informed, freely given, and verifiable prior to the collection, correlation, and use of their data. It has no “grandfather” provision allowing the use of third-party data collected without GDPR-level consent before May 2018. Facebook does not technically have to get rid of the program for the rest of the globe, but having one global policy certainly helps simplify compliance with the GDPR’s complicated rules for the use of third-party data, in addition to likely benefiting the company financially and solidifying its place at the top of the advertising market.
Facebook has big incentives to not mess this up. Fines for GDPR violations can reach 4 percent of a company’s global revenue. For Facebook, with a reported annual revenue of $40 billion, that means fines of up to $1.6 billion. And regulators have made it clear that they intend to go after high-profile violators.
The GDPR goes into effect next month, and Facebook says it has been preparing for it for nearly two years, “supported by the largest cross-functional team in Facebook’s history.” It seems absurd to imagine that the company had not yet started planning how it would address the use of non-GDPR-compliant third-party data via its Partner Categories program. This change, or at least aspects of it, was almost certainly already in the works before Cambridge Analytica hit the headlines. Last week was just a particularly handy time to tell everyone about it, now that Facebook desperately needs to regain people’s trust.
Don’t get us wrong: Data brokers are creepy. We have been critical from the start of Facebook’s decision to serve turbocharged targeted ads based on data from data brokers. Now that the GDPR is blocking the use of existing third-party data and imposing new rules for the use of any third-party data going forward, Facebook is finally cutting off its Partner Categories program. That’s great, but it doesn’t mean the company should act as if this had nothing to do with the GDPR, or with its own bottom line.
Facebook’s announcement about winding down Partner Categories is at best evidence of an attempt to comply with the EU’s privacy law, and at worst a misleading attempt to save face, via changes that will actually benefit the platform.
This is not what proactive privacy protections look like. Privacy protections that are concrete and user-focused will require the company to forgo extracting as much value as possible from each and every user, and to do some real, hard thinking about its business model. If Facebook wants to demonstrate that it cares beyond legal compliance, it needs to make far broader changes.