Facebook whistleblower says company engaged in ‘betrayal of democracy’
A former Facebook employee who is sounding the alarm regarding the company’s practices said the platform engaged in a “betrayal of democracy” when it scaled back anti-misinformation efforts following the 2020 presidential election.
Frances Haugen, who quit her job at Facebook in May, told CBS’s Scott Pelley during an interview on “60 Minutes,” which aired on Sunday, that the company implemented safety systems ahead of the 2020 election in an effort to reduce misinformation, but many of them ended up being temporary.
She said the company ultimately prioritized growth over safety.
“And as soon as the election was over, they turned them back off or they changed the settings back to what they were before to prioritize growth over safety. And that really feels like a betrayal of democracy to me,” Haugen told Pelley.
An internal Facebook task force reportedly found that the company failed to take proper action against users who used the platform to coordinate the “Stop the Steal” rally that turned into an insurrection at the Capitol on Jan. 6.
The revelation of Haugen’s identity and her explosive interview with “60 Minutes” come after The Wall Street Journal published a bombshell report last month detailing the tech giant’s response to a number of issues, including anti-COVID-19 vaccine rhetoric, the effect of Instagram on younger users and internal concerns over human traffickers and drug cartels using the platform.
Facebook has since pushed back on the newspaper’s reporting, saying it “contained deliberate mischaracterizations of what we are trying to do, and conferred egregiously false motives to Facebook’s leadership and employees.”
The five-part series was based on internal documents provided by Haugen. She covertly copied tens of thousands of pages of the company’s internal research before leaving her post.
Haugen filed complaints against Facebook with federal law enforcement last month.
She told Pelley that Facebook “chooses profit over safety,” arguing that the company is “paying for its profits with our safety.”
She said the version of the company that exists today, 17 years after its founding, is “tearing our societies apart and causing ethnic violence around the world.”
Facebook, in a statement to The Hill, said protecting the company’s community “is more important than maximizing our profits” and pointed to a number of investments it makes for safety and security.
“To say we turn a blind eye to feedback ignores these investments, including the 40,000 people working on safety and security at Facebook and our investment of $13 billion since 2016,” Facebook said.
“The growth of people or advertisers using Facebook means nothing if our services aren’t being used in ways that bring people closer together — that’s why we are investing so much in security that it impacts our bottom line,” the company added.
Haugen said Facebook’s main problem traces back to a change the company made to its algorithm in 2018, which she said determines the content users see based on what will receive the most engagement or spark the strongest reaction.
According to research, however, that type of content is often hateful, divisive and polarizing, Haugen said.
“It’s easier to inspire people to anger than it is to other emotions,” Haugen said. She added that angry content is “very enticing” and keeps users glued to the platform.
“Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money,” she said.
Facebook, however, defended the 2018 change, telling The Hill in a statement that “the goal of the Meaningful Social Interactions ranking change is in the name: improve people’s experience by prioritizing posts that inspire interactions, particularly conversations, between family and friends — which research shows is better for people’s well-being — and deprioritizing public content.”
The company also cited research indicating that polarization in the U.S. had been growing even before platforms like Facebook were created, and that it has been declining in nations that have seen an increase in internet and Facebook use.
“We have our role to play and will continue to make changes consistent with the goal of making people’s experience more meaningful, but blaming Facebook ignores the deeper causes of these issues — and the research,” the company added.
Haugen is scheduled to testify before a Senate subcommittee on consumer protection Tuesday as part of a hearing titled “Protecting Kids Online: Testimony from a Facebook Whistleblower.”
Facebook declined to be interviewed for the “60 Minutes” piece, but told The Hill in a statement, “Every day our teams have to balance protecting the ability of billions of people to express themselves openly with the need to keep our platform a safe and positive place.”
“We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true,” the company added.
In a statement to “60 Minutes,” Facebook’s director of policy communications, Lena Pietsch, said in part, “If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago.”