Facebook Whistleblower Comes Forward in 60 Minutes Interview: ‘Over and Over Again…It Chooses Profit over Safety’ (UPDATED)

 
Facebook whistleblower Frances Haugen on 60 Minutes.

Screenshot via CBS.

Frances Haugen was hired by Facebook to join the social network’s “Civic Integrity” team, and was distraught when the group was dissolved right after the 2020 election. She began gathering internal documents and communications, eventually quitting in May and turning over a trove of tens of thousands of pages of evidence to federal investigators. On Sunday, she publicly revealed her identity for the first time, granting an interview to 60 Minutes’ Scott Pelley.

Haugen, 37, is a data scientist with an undergraduate degree in computer engineering and a master’s degree in business from Harvard. Over her career, she worked for Google and Pinterest before being recruited to join Facebook in 2019.

“The thing I saw at Facebook over and over again,” Haugen told Pelley, “was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money.”

Comparing Facebook to other social networks, she said the problems at Facebook were “substantially worse” than anything she had seen before.

Pelley asked her why she didn’t just quit, and Haugen replied that she had seen others try to fix things within Facebook, only to get “ground down.” She decided instead to gather evidence “in a systemic way” and “get out enough that no one can question that this is real.”

Among the internal research Haugen gathered, she says, is proof that the company is lying to the public about its efforts to combat hate, violence, and misinformation.

“The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world,” said Haugen, citing the 2018 genocide in Myanmar, which the country’s military organized in part on Facebook.

The issue of misinformation is personal for Haugen, who told 60 Minutes that she had lost a friend to online conspiracy theories. “I never wanted anyone to feel the pain that I had felt,” she said, citing the “high stakes” in making sure Facebook carried high-quality information.

Haugen recalled how her work with Civic Integrity had focused on risks to elections, including misinformation, and how it had been a turning point when the company decided to dissolve the group after the election — only for the Jan. 6 insurrection to happen just a few months later.

“When they got rid of Civic Integrity, it was the moment where I was like, ‘I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous,'” she said.

The root of the problem, Haugen explained, was the changes Facebook made in 2018 to the algorithms that decide what content appears in users’ news feeds.

Facebook is “optimizing for content that gets engagement, or reaction,” she said, “but its own research is showing that content that is hateful, that is divisive, that is polarizing — it’s easier to inspire people to anger than it is to other emotions.”

“Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money,” said Haugen.
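What Haugen describes amounts to engagement-weighted ranking: a model predicts how many reactions, comments, and reshares a post will draw, and the feed surfaces the highest scorers. A minimal, purely hypothetical sketch of that dynamic is below; the Post class, weights, and function names are illustrative assumptions, not Facebook’s actual system.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_likes: float      # model-estimated reactions
    predicted_comments: float   # model-estimated comments
    predicted_reshares: float   # model-estimated reshares

# Hypothetical weights: an engagement-optimized ranker values
# strong reactions (comments, reshares) far more than likes.
WEIGHTS = {"likes": 1.0, "comments": 5.0, "reshares": 30.0}

def engagement_score(post: Post) -> float:
    # Weighted sum of predicted engagement signals.
    return (WEIGHTS["likes"] * post.predicted_likes
            + WEIGHTS["comments"] * post.predicted_comments
            + WEIGHTS["reshares"] * post.predicted_reshares)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest predicted engagement first: nothing in this objective
    # distinguishes outrage from substance, which is the failure
    # mode Haugen describes.
    return sorted(posts, key=engagement_score, reverse=True)
```

Under an objective like this, a divisive post that provokes angry comments and reshares will reliably outrank a calmer one, which is why Haugen argues that making the algorithm “safer” would mean accepting less time on site and less ad revenue.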

Haugen also criticized Facebook for turning off the safety systems designed to reduce misinformation during the 2020 election as soon as the election was over, “to prioritize growth over safety.”

Making those changes only temporary, said Haugen, “really feels like a betrayal of democracy to me.”

In a written statement to 60 Minutes, Facebook said that some of those safety systems did remain in place, and defended its content management strategies.

“Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place,” said Lena Pietsch, Facebook’s director of policy communications. “We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.”


This article has been updated with additional information. 
