Through her leaks and Congressional testimony, Frances Haugen, the “Facebook Whistleblower,” revealed a great deal about how Facebook operates. Much of what she revealed we've long suspected, but now we have proof: Facebook focuses on growth—of users and time spent on its platforms—to the exclusion of everything else. For Facebook, growth trumps all, even the health and safety of its most vulnerable users.
In her testimony, Haugen explained that at Facebook, metrics are king. Facebook’s “growth division” works to increase “user engagement,” and it succeeds. This is a circular process: Facebook identifies content that users “engage” with and promotes it, leading to more engagement. Facebook's automated systems don't evaluate what is being engaged with; they just identify and rank materials by engagement itself. So, according to Haugen, the automated scoring system will rank successful bullying as “engaging” alongside anything else that garners a lot of attention. Politicians who make extreme statements get more engagement, and are therefore ranked higher by Facebook, and are therefore seen by more Facebook users.
It’s not like Facebook could discriminate between “good” and “bad” content even if it wanted to. Haugen says the “AI” Facebook uses to evaluate content performs poorly on posts in English and even worse on posts in other languages. Facebook “focused on scale over safety” and “chooses profit over safety.”
These priorities aren't mere rhetoric: they are reflected in the incentives Facebook offers to its engineers, designers, and product managers, whose bonuses are tied to the quantity of “meaningful social interactions” (AKA “engagement”) their products generate.
What’s more, Facebook isn't content to milk its existing, aging user base for “engagement.” Facebook’s on-again/off-again plan for an “Instagram for kids” is a bid to grow its user base by habituating people to its products at an early age, normalizing this kind of engagement-maximization as an intrinsic element of social interaction, including on play-dates. Haugen doesn’t believe Facebook’s “pausing” of this plan is permanent; she believes the company is just waiting for the heat to die down.
For Facebook, the heat never dies down. The company is always in the middle of one spectacular scandal or another. Haugen’s testimony confirms what we long suspected – Facebook's neverending crises are the result of a rotten corporate culture and awful priorities.
Haugen told Congress that she thinks Facebook should be reformed, not broken up. But Facebook’s broken system is fueled by a growth-at-any-cost model: the sheer number of Facebook users and the ever-deepening data it gathers about them are its biggest selling points. In other words, Facebook’s badness is inextricably tied to its bigness.
EFF’s position is that when a company’s badness is inseparable from its bigness, it's time to consider breaking that company up.
Much of this latest Facebook controversy concerns Instagram ads—specifically, which ads it shows to young people, and what effect these have on their mental health.
Remember, though: Facebook didn’t build Instagram. It bought it, explicitly to neutralize a competitor. That raises the question of whether that merger should have been permitted in the first place, and whether it should be unwound today.
Facebook bought Instagram because it was a “threat”: Instagram was growing by attracting the younger users who were leaving Facebook, and Facebook's own research showed that young users viewed Facebook as a service for older people. Facebook's dwindling attractiveness caused friction after the merger, as Facebookers seethed with jealousy of their Instagram colleagues. Facebook's corporate suspicion of Instagram eventually forced Instagram's founders out of the company, leaving everything about Instagram up to Facebook. Facebook’s focus on engagement, its insularity, its need to pull all services under the umbrella of the core Facebook app—all of that is rooted in its growth-at-any-cost mentality.
For most companies, the goal is to maximize profit. Without meaningful checks, that impulse can run amok, leading to unethical, abusive and, eventually, illegal conduct. The Facebook story, with its repeat offenses despite record fines, consent decrees, and market pressure, shows that these remedies simply do not do the trick.
By establishing breakups as a serious possibility that companies must reckon with, we can discipline them into policing themselves better, and we can open up space for more creative regulatory solutions. And if that doesn't succeed, we can break them up, creating more competition that will discipline their behavior.
Breakups aren't and never will be the first line of defense for every problem with tech. They can be complicated and expensive, and history has shown that when a breakup is not followed by enforcement, a monopoly’s splintered parts can simply reconstitute themselves. The 1984 breakup of AT&T was the result of nearly two decades of work by the Department of Justice, and it led to a radical diversification of the market. But in the two decades that followed, lax merger review and deregulation allowed the telecom market to concentrate into just a handful of big players once again.
We can and should pursue multiple strategies that will get us to a place where we don’t have to worry every morning about what Facebook is doing to us today.
Breakups are a powerful tool, but for them to be effective we need other tools, too—a whole toolbox full of ways to keep companies broken up and ensure a healthy supply of innovative competitors. That means enhancing merger reviews, removing barriers to interoperability, and enacting well-crafted privacy laws that protect consumers and level the playing field.