Facebook has struggled to curb hateful content in India, according to documents from a former Facebook employee obtained by The Associated Press.
The documents were part of files handed over to Congress and provided to the Securities and Exchange Commission by Frances Haugen, a former Facebook employee and whistleblower who testified before a Senate panel earlier this month.
Haugen testified about the ways the content on Facebook-owned Instagram impacts children’s mental health and alleged that Facebook places profits over implementing safeguards to minimize harmful content.
The internal company documents date from 2019 through March 2021 and show Facebook has not been able to keep up with misinformation and hateful content in India.
The wire service reported that the Silicon Valley giant has known about the issues for years, and the documents raise questions about whether enough has been done to mitigate such content online.
One of the issues detailed by the AP report was that Facebook did not have enough moderators who spoke Hindi or Bengali to stop misinformation or hateful content. The company labeled the languages as priorities for “automation on violating hostile speech.”
One employee in 2019 created a completely new Facebook account in India to see what content the platform would recommend to a user with no prior history, according to the AP.
The employee created the page after a militant attack that killed 40 Indian soldiers.
He said the account, which was live for three weeks, became “a near constant barrage of polarizing nationalist content, misinformation, and violence and gore.”
The recommended content the employee received was reportedly Islamophobic and contained fake news.
In other incidents, misinformation about Muslims gained traction on the platform but was not taken down for days, according to the AP.
The Indian government has previously raised concerns with Facebook over its moderation policies.
In September, a panel of New Delhi lawmakers summoned the company’s top executive in India to respond to claims that the platform did not properly implement its hate speech rules following a riot that left several dead.
The Wall Street Journal reported in August that a Facebook employee opposed enforcing the platform’s hate speech rules on a member of Prime Minister Narendra Modi’s Bharatiya Janata Party (BJP).
Facebook has “invested significantly in technology to find hate speech in various languages, including Hindi and Bengali,” the company told The Associated Press, which “reduced the amount of hate speech that people see by half” in 2021.
“Hate speech against marginalized groups, including Muslims, is on the rise globally. So we are improving enforcement and are committed to updating our policies as hate speech evolves online,” the Facebook spokesperson said.
The Hill has reached out to Facebook for comment.