Violence in Myanmar poses major test for Facebook

Facebook officials tell The Hill they are making a serious effort to prevent their platform from being used to stir ethnic violence in Myanmar after mounting criticism.

Critics have long complained the company failed to take quick action to block hate speech they say incites genocide. Now Facebook says it’s working to address these concerns and make sure its platform is no longer used to help spread violence.

With human rights groups watching, the situation in Myanmar is becoming a public test of the company’s ability to respond in real time and prevent its platform from being hijacked for political or divisive ends.

Sara Su, a product manager on Facebook’s News Feed team, spoke to The Hill on Tuesday about a trip she took last month to Myanmar with other high-profile company executives, led by public policy vice president Simon Milner.

Su said the team met with “civil society groups on the ground” and others “to better understand how abuse of our platform can lead to real-world harm and how we can act more urgently to address it.”


The trip was just the latest step as Facebook scrambles to come to grips with the scope of the problem.

Myanmar has been mired in violence between the country’s Buddhist majority and the Rohingya, a Muslim minority ethnic group. U.N. investigators have called the conflict a “textbook example of ethnic cleansing.”

After decades under a brutal military dictatorship, the country held open elections in 2015 and gradually began to open to the world, including embracing social media.

But international observers say the ethnic turmoil has only gotten worse with the arrival of social media, which allows people to spread misinformation and hateful rhetoric and organize violence against the Rohingya.

Su said Facebook is taking steps to curb such abuse in the country.

The company’s first big step was banning high-profile Buddhist nationalists who have called for violence against ethnic minorities.

Another big hurdle the company is now trying to address is the language barrier. Because support for rendering Burmese script to the Unicode standard remains inconsistent on the devices and platforms most people in Myanmar use, many use the app in English. Facebook said that has contributed to misunderstandings and made it harder for users navigating the platform in a nonnative language to report offensive content.

Su said the company is now making sure it has well-functioning font converters within the Facebook app to address language issues.

“One of the purposes of this trip was to engage with local telcos and font experts to figure out what’s a good short-term solution while we also focus on longer-term efforts,” she said.

Those changes will hopefully help local users communicate more clearly and flag troublesome content more easily.
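For readers curious about the technical side: much Burmese text online is typed in the legacy Zawgyi encoding rather than standard Unicode, and one common fix is to detect probable Zawgyi text and transliterate it to Unicode before displaying it or running moderation checks. The sketch below is illustrative only and is not a description of Facebook’s own converter; it assumes the open-source myanmartools and PyICU Python packages, ICU’s “Zawgyi-my” transform, and a hypothetical normalize_burmese helper with an arbitrary confidence threshold.

# Illustrative sketch: detect legacy Zawgyi-encoded Burmese text and convert it
# to standard Unicode. Assumes the open-source `myanmartools` and `PyICU`
# packages are installed; this is not Facebook's actual converter.
from myanmartools import ZawgyiDetector
from icu import Transliterator

detector = ZawgyiDetector()
zawgyi_to_unicode = Transliterator.createInstance('Zawgyi-my')

def normalize_burmese(text: str, threshold: float = 0.95) -> str:
    """Return text in standard Unicode, converting from Zawgyi when the
    detector is sufficiently confident the input is Zawgyi-encoded."""
    score = detector.get_zawgyi_probability(text)
    if score > threshold:
        return zawgyi_to_unicode.transliterate(text)
    return text

if __name__ == "__main__":
    sample = "မ္း"  # renders differently under Zawgyi and Unicode
    print(normalize_burmese(sample))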

Su said these changes are part of Facebook’s efforts to “get to baseline” in Myanmar as it explores longer-term solutions.

“We’re making sure that we can fix those basic problems and then we can start to use the integrity infrastructure that we’re building globally to act on hate speech and misinformation as fast as possible, especially when lives are at stake,” she said.

Facebook appears to be taking the advice of critics who urged it to bring in local help to understand the situation. The company said it is staffing up on experts with knowledge of the language.

“We all feel really strongly that we’ve been slow to act,” Su added.

Her remorse echoes sentiments from officials across the company, including CEO Mark Zuckerberg; Adam Mosseri, Facebook’s vice president of product management; and Chris Cox, chief product officer.

Cox told The Hill last month that the platform’s misuse in Myanmar has been a chief concern he is working to address.

The stakes are high for Facebook.

The United Nations has singled the company out, accusing its platform of playing a role in the spread of violence.

The chief of a U.N. fact-finding mission in March said social media had “substantively contributed to the level of acrimony and dissension and conflict.”

And the official added, in stark language, that “as far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media.”

The following month Zuckerberg called those concerns “real issues” and vowed to “take this really seriously.”

The issue also has the attention of Congress. Zuckerberg was questioned by senators, including Sen. Patrick Leahy (D-Vt.), about the platform’s role in the spread of hate speech in Myanmar when he testified before Congress on the company’s data practices in April.

People on the ground and Rohingya activists say they are cautiously optimistic about Facebook’s new efforts.

“There has been some movements within the company, which is positive and certainly unprecedented,” said Victoire Rio, a Yangon, Myanmar-based social media researcher. Rio worked on the “Dear Mark” effort, which used a hashtag to draw Zuckerberg’s attention to the problem, and has helped coordinate nongovernmental organizations and other groups engaging with Facebook over its role in the country.

“We’re still looking to see this translate into concrete improvements,” she added.

Nay San Lwin, a Rohingya activist based in Germany, was similarly cautious, but acknowledged that it seems as though “now they’re taking serious action.”

But there are lingering questions about Facebook’s approach.

One thing the company still hasn’t done is provide detailed answers to groups about how it will better detect and block problematic content in Myanmar.

Rio told The Hill that the company has yet to give “concrete data” in response to questions that groups asked Facebook in a letter they sent this past April, which was published by The New York Times.

Facebook said it believed sharing unreliable data would paint an inaccurate picture.

In answers Facebook provided to Congress after Zuckerberg’s testimony in April, the company gave lengthy but vague responses.

It avoided detailing the specific number of content reviewers it has for Myanmar. When asked why it did not quickly delete a post targeting a Muslim journalist in Myanmar — an incident Leahy asked Zuckerberg about — the company said it was “unable to respond without further information.”

Critics, though, say the problematic post was widely known.

Lawmakers in the U.S. have made it clear that they will keep pressure on Facebook.

Sen. John Kennedy (R-La.) said in April that the company needs to do a better job making sure that its platform doesn’t aid genocide in Myanmar.

An aide for one of Facebook’s toughest critics in Congress, Sen. Mark Warner (D-Va.), said the lawmaker is aware of the issue and has discussed it with staff.

All sides acknowledge that Myanmar won’t be the last time Facebook must address such issues.

A U.N. report in November also noted that groups have been using Facebook to promote violence in South Sudan.
