A Facebook whistleblower who brought internal documents detailing the company's research to The Wall Street Journal and the U.S. Congress unmasked herself ahead of an interview she gave to "60 Minutes," which aired Sunday night.
Frances Haugen, a former product manager on Facebook's civic misinformation team, according to her website, revealed herself as the source behind a trove of leaked documents alleging the social media giant knew its products were fuelling hate and harming children's mental health.
Haugen, who has also worked for companies including Google and Pinterest, said in the interview that Facebook was "substantially worse" than anything she had seen before.
She accused the company of choosing profit over safety and called for it to be regulated.
"Facebook over and over again has shown it chooses profit over safety. It is subsidising, it is paying for its profits with our safety," Haugen said.
She alleged that the version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.
In the "60 Minutes" interview Haugen explained how the company's News Feed algorithm is optimised for content that gets a reaction.
The company's own research shows that it is "easier to inspire people to anger than it is to other emotions," Haugen said.
"Facebook has realised that if they change the algorithm to be safer, people will spend less time on the site, they'll click on fewer advertisements, they'll make less money."
During the 2020 US presidential election, she said, the company realised the danger that such content presented and turned on safety systems to reduce it.
But "as soon as the election was over they turn them back off, or they change the settings back to what they were before, to prioritise growth over safety, and that feels like a betrayal of democracy to me," she said.
"No one at Facebook is malevolent," she said, adding that co-founder and CEO Mark Zuckerberg did not set out to make a "hateful" platform. But, Haugen said, the incentives are "misaligned."
Facebook's vice president of policy and global affairs, Nick Clegg, also vehemently pushed back against the assertion that its platforms are "toxic" for teens, days after a tense congressional hearing in which US lawmakers grilled the company over its impact on the mental health of young users.
He also disputed reporting in an explosive Wall Street Journal series that Facebook's own research warned of the harm that photo-sharing app Instagram can do to teen girls' well-being.
"It's simply not borne out by our research or anybody else's that Instagram is bad or toxic for all teens," Clegg told CNN but added Facebook's research would continue.
Facing pressure, the company had previously announced it would suspend, but not abandon, the development of a version of Instagram meant for users younger than 13.