Do Not Track (not to be confused with EFF's recently launched browser policy) makes the most of the documentary web series format: it is a fully immersive, interactive experience that puts cat gifs to good use in explaining the technologies that track us across the web.
To tell the truth, I was a bit taken aback when the first thing it did was ask me for my email address—wasn’t this supposed to be about privacy? But in order to explain the world of cookies, trackers, and the mining of Facebook likes, director Brett Gaylor opted to show rather than tell. I temporarily disabled Privacy Badger (hat tip to EFF’s tech team—the site doesn’t work if you leave it on), read through the site’s privacy policy, and settled in for the ride.
Over the course of seven episodes, Gaylor, with the help of prominent experts in this space like danah boyd, Julia Angwin, Emily Bell, Ethan Zuckerman, Gilad Lotan, Harlo Holmes, and Kate Crawford, builds out a powerful explanation of the privacy invasions we experience on a day-to-day basis.
Because advertising is the dominant business model of the Internet, many of the technologies we interact with daily collect information about us in order to monetize it. Bit by bit, these incursions have accumulated into what Gaylor calls the “tracking universe,” making it nearly impossible to regain control over the digital traces we leave behind with every interaction.
He traces the history of the tracking universe from Ethan Zuckerman's creation of the pop-up ad to the ruling in the DoubleClick case, explaining how we ended up with the web we have today and, importantly, how it could have gone another way.
Gaylor embeds interactive experiences throughout the material to illustrate how the tracking universe works in practice—showing me the data advertisers can glean from my Facebook account, the number of trackers on my favorite news sites, and the strange conclusions that can be drawn from the seemingly unconnected data points I leave behind as I travel across the web.
Gaylor shows how the algorithms deployed by social networks influence the information we receive about the outside world. According to a Pew survey cited by the doc, 30% of Americans report getting their news through Facebook—a system optimized to give us the content we already like. The documentary stresses that social networks create a version of the truth highly influenced by those we are connected to; we’re much less likely to run across news we don’t like when it’s curated by Facebook’s EdgeRank algorithm. “Are we opting for clicks and traffic or for an informed public?” it asks. As the documentary emphasizes, this is not just engineering: these are editorial decisions.
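For readers curious about the mechanics, the gist of EdgeRank was described publicly by Facebook engineers: each interaction ("edge") linking you to a story is scored as your affinity for its source, times a weight for the interaction type, times a time decay, and the sums decide what surfaces in your feed. The sketch below, with invented weights and a made-up one-day half-life, illustrates roughly how such a score might be computed; it is not Facebook's actual code.

```python
# Minimal sketch of an EdgeRank-style feed score, based on the formula
# Facebook once described publicly (sum over edges of affinity * weight * decay).
# All names, weights, and the half-life are hypothetical illustrations.
import math
import time

# Hypothetical weights per interaction type: richer interactions count more.
EDGE_WEIGHTS = {"comment": 4.0, "like": 1.0, "click": 0.5}

def edge_score(affinity, edge_type, age_seconds, half_life=86_400):
    """Score one interaction: affinity to its author, weighted by type,
    decayed exponentially by age (here with a one-day half-life)."""
    decay = math.exp(-age_seconds * math.log(2) / half_life)
    return affinity * EDGE_WEIGHTS.get(edge_type, 0.0) * decay

def story_score(edges, now=None):
    """A story's rank is the sum of its edge scores; higher floats to the top.
    Each edge is a dict with keys "affinity", "type", and "timestamp"."""
    now = now or time.time()
    return sum(
        edge_score(e["affinity"], e["type"], now - e["timestamp"]) for e in edges
    )
```

The point the documentary makes falls straight out of a scheme like this: content from people you already interact with heavily is exactly the content that scores highest.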
These decisions can also have a discriminatory impact on the social fabric of our communities. For example, if tracking data is used, as it increasingly is by data brokers and credit monitors, to assess the economic potential of individuals, the result is a bleak and rather dystopian picture: one in which, as a female, I’m assessed as a high-risk candidate by health insurers, and my high ranking on the “openness index” created from my Facebook likes makes it harder for me to get a loan.
A dark reality underlies the analysis of social networks—by matching profiles with similar users and looking at the data of those we’re connected to, creditors and advertisers can place bets on our likely future behavior. We can be assessed not only for our own actions, but also for those of our entire social universe. For example, Facebook recently patented a technology that would allow lenders to discriminate in offering loans based on the average credit scores of others in a user's social network. It's unclear whether and how this technology will be used, but financial start-ups like Affirm already use data collected through social media to assess borrowers' risk.
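To make that concrete, here is a minimal sketch of the gating logic the patent describes: an application proceeds only if the average credit score of the applicant's connections clears the lender's minimum. The function name, threshold, and scores are all invented for illustration; this is the idea, not any real lender's system.

```python
# Hypothetical sketch of the lending logic described in the Facebook patent:
# average the credit scores of an applicant's social connections and let the
# application proceed only if that average clears a minimum. All names and
# numbers here are invented.
from statistics import mean

def network_clears_threshold(friend_scores, minimum=620):
    """Return True if the applicant's connections' average credit score
    meets the lender's minimum; the applicant's own record never enters in."""
    if not friend_scores:
        return False  # no network data: in this sketch, the application stalls
    return mean(friend_scores) >= minimum

# Two applicants with identical finances can diverge purely on whom they know:
print(network_clears_threshold([700, 680, 720]))  # True
print(network_clears_threshold([540, 560, 580]))  # False
```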
Because we tend to associate with those similar to us, this could mean that people who are already well off will be far more likely to be deemed a safe bet by lenders than those who are poor. Algorithms already create invisible forms of redlining that exacerbate socioeconomic divides: for example, by showing advertisements for higher-paying jobs more frequently to men than to women. danah boyd puts it well in a question at the end of the series: "I think it's high time we take a step back to look at the costs of our gains on other people and ask, is this a society that we’re okay with?”
The world we live in at present does not have to define the future, Gaylor suggests. He helpfully illustrates a number of directions in which we could head, and the steps we can all take to create a more privacy-friendly future: protecting our own data, lobbying to end government surveillance, and advocating for changes to the invasive corporate practices that make this surveillance possible. EFF is directly involved in that effort: just this week, we launched a privacy-friendly Do Not Track web browsing standard and a new version of Privacy Badger, a browser add-on that blocks spying ads and invisible trackers.
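Mechanically, Do Not Track is simple: a participating browser sends the HTTP request header DNT: 1 with each request, and a compliant site refrains from tracking that visitor. Below is a minimal sketch of a server honoring the signal, assuming Flask; the route and tracker URL are hypothetical.

```python
# Minimal sketch of a server honoring the "DNT: 1" request header.
# Assumes Flask; the route and tracker URL are invented for illustration.
from flask import Flask, request

app = Flask(__name__)

TRACKER_SNIPPET = '<script src="https://tracker.example/analytics.js"></script>'

@app.route("/article")
def article():
    # Browsers with Do Not Track enabled send the request header "DNT: 1".
    visitor_opted_out = request.headers.get("DNT") == "1"
    page = "<h1>Hello, reader</h1>"
    if not visitor_opted_out:
        # Only embed the third-party tracker when the visitor hasn't opted out.
        page += TRACKER_SNIPPET
    return page

if __name__ == "__main__":
    app.run()
```

The header is only a request, of course; tools like Privacy Badger exist precisely because sites are free to ignore it, which is why EFF's new standard pairs the signal with a policy sites can commit to.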
Curbing the pervasive surveillance of the tracking universe may be challenging, but it's a battle well worth fighting. When I finished the series, I deleted my data from the site, thinking: at least there’s one place on the Internet where I can.