Facebook’s seven-hour outage on Monday came just hours after an explosive exposé by a former employee who had publicly identified herself and divulged details about the social media company’s allegedly harmful algorithms.
“The thing I want everyone to know is that Facebook is far, far more dangerous than anyone knows, and it is getting worse. We can’t expect it to fix itself on its own,” Frances Haugen, 37, said in a television interview with 60 Minutes on Sunday.
Haugen, who has worked at tech companies including Google and Pinterest over a career spanning 15 years, left Facebook this May.
Before leaving the company, Haugen, a former product manager on Facebook’s “civic integrity” team, had copied thousands of pages of internal research and communications, which she shared with Congress, the Securities and Exchange Commission and The Wall Street Journal.
Explaining why Facebook’s algorithms were dangerous, Haugen said the social networking site had deliberately declined to change the algorithm to a safer form, fearing it would drive people away from the site, leading to fewer clicks on ads and less revenue.
“I’ve seen a bunch of social networks and it was substantially worse at Facebook than what I had seen before. Facebook, over and over again, has shown it chooses profit over safety,” she told 60 Minutes.
She said the way the platform’s algorithms ranked content amplified “angry” and divisive material, evidence of which has been found in the company’s own research papers.
“When you have a system that you know can be hacked with anger, it’s easier to provoke people into anger. And publishers are saying, ‘Oh, if I do more angry, polarising, divisive content, I get more money.’ Facebook has set up a system of incentives that is pulling people apart,” she said.
She said Facebook had changed its algorithm in 2018 to promote “meaningful social interactions” through “engagement-based ranking”. While content with reactions, comments and shares was found to have a wider reach, Facebook’s internal research corroborated that “angry content” is more engaging.
She alleged that while Facebook assured the public it was doing everything to fight fake news, misinformation and hate speech around the 2020 presidential election, in reality it promoted such content, which might have contributed to the storming of the US Capitol on January 6, 2021.
“Facebook has publicised its work to combat misinformation and violent extremism relating to the 2020 election and insurrection. In reality, Facebook knew its algorithms and platforms promoted this type of harmful content, and it failed to deploy internally recommended or lasting countermeasures,” she said.
In another interview with The Journal on Monday, Haugen said Facebook’s algorithms are designed in such a way that they don’t allow an idea to die down and instead constantly remind the user about it, a pattern she said is especially dangerous for teenagers.
“And that’s part of the danger for like teenagers, right? Part of the reason why these teen girls are getting eating disorders is they one time look up weight loss and the algorithm’s like ‘Oh great. We’ll keep showing you more and more extreme weight loss things’,” she said.
Haugen, who identified herself publicly after applying for federal whistleblower protection, has a team of lawyers assisting her with the legal repercussions of the exposé.
Haugen, however, has suggested that dismantling Facebook is not the solution; hiring more people to audit and moderate content on the platform would, she said, be a more appropriate approach.
The data scientist and Harvard MBA graduate is slated to testify before Congress this week.