
Facebook admits enabling violence in Myanmar following report

Facebook admitted that it failed to stop its platform from being used to exacerbate violence in Myanmar, following the release of a new study it commissioned from an independent group.

“The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence,” Alex Warofka, a Facebook product policy manager, wrote in a post about the report on Monday. “We agree that we can and should do more.”

The report, conducted by Business for Social Responsibility (BSR) between May and September 2018, found that Facebook often failed to prevent its platform from being used as a vehicle for violence in the already severely embattled country.

In many cases, Facebook was oblivious to the damage its platform was causing and did not devote enough resources to curbing violence in Myanmar stemming from misinformation and hate spread on the site.

When Facebook entered the Myanmar market, it became the de facto source of online information in a country still grappling with the consequences of decades of strict censorship. The ongoing ethnic cleansing of the country’s Rohingya Muslim population only compounded the problem.

BSR made a number of recommendations in its report, including that Facebook improve enforcement of its community standards and “establish a separate policy that defines our approach to content moderation with respect to human rights.”

Warofka wrote that Facebook was looking into both measures.

Facebook sought to shift some of the blame for its difficulty moderating Burmese-language content onto the fact that Myanmar is the “only country in the world with a significant online presence that hasn’t standardized on Unicode — the international text encoding standard.”

“This lack of a single standard poses some very real technical challenges for us and others. It makes automation and proactive detection of bad content harder, it can weaken account security, it means less support for languages in Myanmar beyond Burmese, and it makes reporting potentially harmful content on Facebook more difficult,” Warofka wrote.
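The problem Warofka describes stems from the widespread use of the legacy Zawgyi encoding in Myanmar, which represents Burmese text with different code points than standard Unicode, so a filter that recognizes a phrase in one encoding can miss the same phrase in the other. As a rough illustration only, and not a description of Facebook’s actual systems, the sketch below normalizes likely-Zawgyi text to Unicode before running a simple keyword check; it assumes Google’s open-source myanmar-tools package and the Zawgyi-my transliterator that ships with ICU, and the probability threshold and flag_for_review helper are hypothetical choices for the example.

    # Illustrative sketch only -- not Facebook's moderation pipeline.
    from myanmartools import ZawgyiDetector   # pip install myanmar-tools
    from icu import Transliterator            # PyICU; Zawgyi-my rules ship with recent ICU

    detector = ZawgyiDetector()
    zawgyi_to_unicode = Transliterator.createInstance('Zawgyi-my')

    def normalize_burmese(text: str, threshold: float = 0.9) -> str:
        """Convert likely-Zawgyi text to standard Unicode; leave other text alone."""
        if detector.get_zawgyi_probability(text) > threshold:
            return zawgyi_to_unicode.transliterate(text)
        return text

    def flag_for_review(text: str, banned_terms: set) -> bool:
        """Toy keyword check: only reliable once text is normalized to one encoding."""
        normalized = normalize_burmese(text)
        return any(term in normalized for term in banned_terms)

Without a normalization step of this kind, automated detection, user reporting and even account-security checks all have to cope with two incompatible representations of the same language, which is the technical difficulty Warofka points to.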

A United Nations report published earlier this year similarly found Facebook negligent in how it has handled its platform’s role in Myanmar.

Numerous Facebook officials, including CEO Mark Zuckerberg, have publicly acknowledged the company’s failings in Myanmar, and the company has introduced new rules and policies in an attempt to temper its role in the spread of violence.

Facebook previously had only two Burmese-speaking moderators assessing content from Myanmar. It has since expanded that number to 60 and plans to add 40 more by the end of the year.

Civil society groups and activists in Myanmar have welcomed the changes but argue that they are not enough. Critics have noted that the company has yet to hire staff on the ground in Myanmar, where employees would have the most direct and up-to-date knowledge of the country.

Myanmar is far from the only country where Facebook’s platform has contributed to violent conflict; the company has also been accused of fanning the flames of violence in South Sudan and Sri Lanka.