The Facebook whistleblower revealed her identity last night, along with her plans to reform the embattled social media company from the outside. Frances Haugen, a data scientist by training and a veteran of Google and Pinterest, had been recruited to Facebook in 2018 to help the platform prepare for election interference. When she quit in May, she took with her a cache of tens of thousands of documents that now underpin a sweeping congressional investigation into Facebook’s practices.
But Haugen’s turning point came months earlier, on December 2, 2020, less than a month after the presidential election, when the company disbanded the Civic Integrity team she worked on.
“They told us, ‘We’re dissolving Civic Integrity.’ Like, they basically said, ‘Oh good, we made it through the election. There wasn’t riots. We can get rid of Civic Integrity now.’ Fast forward a couple months, we got the insurrection,” Haugen told CBS’s 60 Minutes, referring to the January 6 insurrection at the US Capitol. “And when they got rid of Civic Integrity, it was the moment where I was like, ‘I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.’”
The next day, she used encrypted messaging to contact a Wall Street Journal reporter who had reached out to her earlier, the Journal reported. Part of her motivation, she said, was her fear that the genocide in Myanmar, where the military used Facebook to launch the pogrom, would repeat itself elsewhere.
“When we live in an information environment that is full of angry, hateful, polarizing content, it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other. The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world,” she told CBS.
Until last night, Haugen’s identity had been kept secret by media and politicians alike. She broke her anonymity in an interview with 60 Minutes and a lengthy profile in The Wall Street Journal. Haugen’s leaked documents formed the basis of The Wall Street Journal’s investigation into Facebook’s ills, and they also sparked an ongoing congressional inquiry. She is scheduled to testify before the US Senate on Tuesday.
Haugen traces Facebook’s recent problems to a significant change the company made in 2018 to the News Feed algorithm, which determines which content is shown to users. Those changes, she said, pushed divisive content to users because that’s what drove engagement and profits. “Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money,” she told CBS.
The company rolled out new safety systems for the 2020 election, but Haugen said they were turned off shortly thereafter. “As soon as the election was over, they turned them back off or they changed the settings back to what they were before to prioritize growth over safety,” she said. “And that really feels like a betrayal of democracy to me.”
Facebook says that some of the systems it put in place for the election remain active.
Facebook’s vice president of policy and global affairs, Nick Clegg, sent a lengthy memo to employees in advance of Haugen’s interview, claiming that social media in general and Facebook in particular are not responsible for rising political polarization in the US and elsewhere. “The idea that Facebook is the chief cause of polarization isn’t supported by the facts,” Clegg wrote. The New York Times has published the internal memo in its entirety.
From the outside
Being on the Civic Integrity team meant that Haugen was intimately familiar with much of Facebook’s research into the various problems its platforms faced, from disinformation to incitements to violence and more. While some of the research dates back a few years, other studies were published this year, including one that says, “We estimate that we may action as little as 3-5% of hate and about 6-tenths of 1% of V & I [violence and incitement] on Facebook despite being the best in the world at it.”
After the Civic Integrity team was disbanded, Haugen began to consider how Facebook might be reformed. “I knew what my future looked like if I continued to stay inside of Facebook, which is person after person after person has tackled this inside of Facebook and ground themselves to the ground,” she said.
Once she knew that she couldn’t fix the company from the inside, she decided to copy as many files as she could that related to problems like violence, hate speech, and mental health. “At some point in 2021, I realized, ‘Okay, I’m gonna have to do this in a systemic way, and I have to get out enough that no one can question that this is real.’”
Haugen’s leaked documents form the basis of eight complaints filed by her attorneys with the Securities and Exchange Commission. They allege Facebook breached its fiduciary duty by not revealing information that was material to investors. The filings give Haugen some protection as a whistleblower, but because she also leaked the documents to the press, she may not enjoy complete immunity from legal action on the part of Facebook.
Before she left Facebook in May, Haugen retrieved tens of thousands of documents from Facebook Workplace. As she trawled the company’s internal social network, she was expecting to get caught, she told the WSJ, because the company logs workers’ activities on the site. “She said that she began thinking about leaving messages for Facebook’s internal security team for when they inevitably reviewed her search activity,” the Journal wrote.
The company apparently either trusted its employees implicitly or had questionable operations security—or both. Haugen said that she found draft versions of presentations intended for CEO Mark Zuckerberg, complete with notes from executives. Documents marked with attorney-client privilege were available seemingly to anyone. She said that nearly all of Facebook’s 60,000-plus employees could have found the files she copied and that she kept gathering more until her last hours with the company.
After reviewing study after internal study, Haugen came to a conclusion—automated recommendation tools may never be safe. What Facebook has built, she told the WSJ, is fatally flawed. “As long as your goal is creating more engagement, optimizing for likes, reshares, and comments, you’re going to continue prioritizing polarizing, hateful content,” she said.
At Facebook, that optimization was pursued aggressively, Haugen said. “The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money,” she told CBS.
“I’ve seen a bunch of social networks, and it was substantially worse at Facebook than anything I’d seen before.”