Facebook eyeing tools used in Sri Lanka, Myanmar in case of election unrest in US: report
Facebook officials are reportedly considering deploying tools developed for so-called at-risk countries in case of political instability surrounding the upcoming U.S. election.
The Wall Street Journal reported Monday, citing people familiar with Facebook’s decision-making, that the company could deploy a range of responses, including settings that would stifle the reach of some viral posts or lower the bar for what constitutes dangerous content that should be removed from the platform.
A Facebook spokesperson told the Journal that the company “spent years building for safer, more secure elections,” but did not specify what options the company is considering.
“We’ve applied lessons from previous elections, hired experts, and built new teams with experience across different areas to prepare for various scenarios,” the spokesperson said.
Many of the platform’s tools against disinformation and misleading viral political content were developed for countries where material on the platform is thought to have contributed to violence against minority groups, such as Myanmar, the newspaper noted.
“There’s some content that needs to be removed, and there’s some content that needs to not have access to the Facebook tools that create virality,” a consultant who worked with Facebook on human rights issues following violence against Muslims in Sri Lanka, where the company’s role was also examined, told the Journal.
The platform has been the target of widespread criticism over its handling of political content in the U.S., both from those on the left who say the company needs to do more to fight disinformation and from conservatives who argue Facebook and other tech giants censor right-leaning speech.
Earlier this month, both Facebook and Twitter found themselves in hot water over a New York Post report about Hunter Biden that faced criticism over its sourcing and accuracy. Twitter blocked the article, and Facebook officials said its distribution would be reduced while third-party fact-checkers evaluated it, though its spread on the platform was not meaningfully limited.