Facebook exec in 2016 warned taking down misinformation would ‘disproportionately affect conservatives’: report
A Facebook executive in 2016 warned the company that taking down all instances of political misinformation would unfairly target conservative users of the social network, The Washington Post reported on Thursday.
The reported warning came in the weeks following the 2016 presidential election, when Facebook was in the early stages of developing “Project P” — for propaganda — in an effort to streamline how misinformation about American elections and politics was identified and removed from the site.
The program quickly identified dozens of pages that had spread false information about the election and its candidates in the weeks leading up to the vote. Most of those pages, it found, were based overseas, had financial motives and showed a significant conservative bias, the Post reported.
When higher-ups in the company moved to have all of the pages removed, Joel Kaplan, now head of Facebook’s Washington office and the highest-ranking Republican at the company, reportedly pushed back.
“We can’t remove all of it because it will disproportionately affect conservatives,” said Kaplan, who worked in former President George W. Bush’s White House, according to the Post.
When others involved in the discussion stated that the pages were producing “fake news,” Kaplan reportedly replied: “They don’t believe it to be fake news.”
In the end, the worst of the pages were taken down, but most remained, according to the Post report.
Throughout President Trump’s first term, Facebook has faced criticism from both the White House and GOP lawmakers that the site is unfairly biased against conservatives.
These factors have influenced how the social media giant has shaped its policy in the lead-up to the 2020 elections, the Post reported.
Facebook’s new policy, announced in January, introduced little change.
Campaigns can still post and promote political messages that are blatantly false, all while using the site’s powerful advertising tools to target specific audiences.
Instead, Facebook gave users the ability to better control how often political ads appear.
“We’re updating our Ad Library to increase the level of transparency it provides for people and giving them more control over the ads they see,” Facebook director of product management Rob Leathern said in a blog post at the time.
“We recognize this is an issue that has provoked much public discussion — including much criticism of Facebook’s position. We are not deaf to that and will continue to work with regulators and policy makers in our ongoing efforts to help protect elections,” Leathern added.
The Hill has reached out to Facebook for comment.