Frances Haugen is the name on everybody’s lips following her 60 Minutes reveal as the Facebook whistleblower.
A former Facebook product manager, Haugen was hired to help secure the social network ahead of elections and battle misinformation. It was an attractive role in part because she’d lost a friend to online conspiracies. After the 2020 election, however, Haugen says Facebook dissolved her team, known as the Civic Integrity unit, which made Haugen think the company was not “willing to actually invest what needs to be invested to keep Facebook from being dangerous.”
On Twitter, Guy Rosen, Facebook’s VP of Integrity, disputes Haugen’s account, saying Civic Integrity was not dissolved but merged into “a larger Central Integrity team.”
Still, the 37-year-old data scientist left the company in May after nearly two years, and she took a few things with her. “At some point in 2021, I realized, ‘OK, I’m going to have to do this in a systemic way, and I have to get out enough that no one can question that this is real,’” Haugen tells 60 Minutes’ Scott Pelley. Her tactic: secretly copying tens of thousands of pages of internal Facebook research from Facebook Workplace threads, documents that, according to 60 Minutes, demonstrate the social network’s lack of progress against hate, violence, and misinformation.
The documents were provided to the Wall Street Journal, which published several stories as part of a series it dubbed The Facebook Files. Last night, the Journal confirmed that Haugen is indeed the Facebook whistleblower.
Before taking a job at Facebook in 2019, Haugen worked for various tech titans like Google, Yelp, and Pinterest. “I’ve seen a bunch of social networks and it was substantially worse at Facebook than anything I’d seen before,” she says.
Optimizing for Hate
The root of the company’s problem, she explains, is the change Facebook made to its ranking algorithm in 2018, which adjusted what users see in their news feed in an attempt to “improve mental health.” The algorithm began showing more posts from family, friends, and groups, and less content from businesses, brands, and media organizations. The change, Facebook expected, would mean people spending less time on the app, but time that was “more valuable.”
“One of the consequences of how Facebook is picking out that content today is, it is optimizing for content that gets engagement, or reaction,” Haugen says. “But its own research is showing that content that is hateful, that is divisive, that is polarizing… It’s easier to inspire people to anger than it is to other emotions.
“Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money,” she adds.
In a statement to 60 Minutes, Facebook defended its decisions, noting that “every day, our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.”
One of the internal studies Haugen released to The Wall Street Journal suggests that among teens who have had suicidal thoughts, 13% of British users and 6% of American users placed the blame on Instagram. Late last month, Facebook VP and head of research Pratiti Raychoudhury disputed that characterization, arguing that part of the firm’s effort to “minimize the bad on our platforms and maximize the good” requires identifying problems.
“We live in an information environment that is full of angry, hateful, polarizing content; it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other,” Haugen says. “The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world,” as in Myanmar in 2018, where the military used the social network to incite genocide.
“Facebook, over and over again, has shown it chooses profit over safety,” the whistleblower adds. “I’m hoping that this will have had a big enough impact on the world that they get the fortitude and the motivation to actually go put those regulations in place. That’s my hope.”
Haugen is scheduled to testify before the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security on Tuesday, Oct. 5, starting at 10 a.m. ET.