Why should you care about data brokers? Reporting this week on a Substack publication that outed a priest using location data from Grindr shows once again how easy it is for anyone to exploit data brokers’ stores of personal information to cause real harm.
This is not the first time Grindr has been in the spotlight for sharing user information with third-party data brokers. The Norwegian Consumer Council singled it out in its 2020 "Out of Control" report, specifically warning that the app’s data-mining practices could put users at serious risk in places where homosexuality is illegal, and the Norwegian Data Protection Authority fined Grindr earlier this year.
But Grindr is just one of countless apps engaging in this exact kind of data sharing. The real problem is the many data brokers and ad tech companies that amass and sell this sensitive data without anything resembling real consent from users.
Apps and data brokers claim they are only sharing so-called “anonymized” data. But that’s simply not possible. Data brokers sell rich profiles with more than enough information to link sensitive data to real people, even if the brokers don’t include a legal name. In particular, there’s no such thing as “anonymous” location data. Data points like one’s home or workplace are identifiers themselves, and a malicious observer can connect movements to these and other destinations. In this case, that includes gay bars and private residences.
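To see why, consider a minimal sketch (in Python, with invented coordinates and a hypothetical likely_home helper, not any broker’s actual pipeline): a location trace labeled only with an ad ID still gives away a home address, because home is simply wherever the device sits overnight.

```python
# Illustrative sketch only, with made-up data: why "anonymized" location
# traces are not anonymous. The points carry no name, just a device's ad ID,
# yet the most common overnight location is almost always someone's home.
from collections import Counter
from datetime import datetime

# Hypothetical records: (ad_id, latitude, longitude, ISO timestamp)
records = [
    ("a1b2-c3d4", 38.8977, -77.0364, "2021-07-01T02:14:00"),
    ("a1b2-c3d4", 38.8977, -77.0364, "2021-07-02T01:47:00"),
    ("a1b2-c3d4", 38.9072, -77.0369, "2021-07-02T13:05:00"),
    ("a1b2-c3d4", 38.8977, -77.0364, "2021-07-03T03:30:00"),
]

def likely_home(points, night_hours=range(0, 6)):
    """Return the most frequent rounded location seen during night hours."""
    overnight = Counter()
    for _, lat, lon, ts in points:
        if datetime.fromisoformat(ts).hour in night_hours:
            # Round to roughly a city block so nearby fixes cluster together.
            overnight[(round(lat, 3), round(lon, 3))] += 1
    return overnight.most_common(1)[0][0] if overnight else None

print(likely_home(records))  # (38.898, -77.036): a single street address,
                             # which is itself an identifying data point
```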
Another piece of the puzzle is the ad ID, a so-called “anonymous” label that identifies a device. Apps share ad IDs with third parties, and an entire industry of “identity resolution” companies can readily link ad IDs to real people at scale.
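Here is an equally simplified sketch of how that linkage works, again with invented records and a hypothetical identity_graph structure: because the same ad ID appears in many datasets, it functions as a join key between “anonymous” app events and a vendor’s file of real identities.

```python
# Illustrative sketch only, with invented records: why an ad ID is not
# anonymous. Because the same ID shows up in many datasets, it works as a
# join key between "anonymous" app events and a vendor's identity file.

# Events an app shared with its ad-tech partners, keyed only by ad ID.
app_events = [
    {"ad_id": "a1b2-c3d4", "place": "nightclub", "time": "2021-07-02T23:10"},
    {"ad_id": "a1b2-c3d4", "place": "residence", "time": "2021-07-03T07:45"},
]

# What an "identity resolution" vendor sells: ad IDs mapped to real people.
identity_graph = {
    "a1b2-c3d4": {"name": "J. Doe", "email": "jdoe@example.com"},
}

# A single dictionary lookup links the "anonymous" events to a named person.
for event in app_events:
    person = identity_graph.get(event["ad_id"])
    if person:
        print(f'{person["name"]} was at {event["place"]} at {event["time"]}')
```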
All of this underlines just how harmful a collection of mundane-seeming data points can become in the wrong hands. We’ve said it before and we’ll say it again: metadata matters.
That’s why the U.S. needs comprehensive data privacy regulation more than ever. This kind of abuse is not inevitable, and it must not become the norm.