Leaked documents shed light on how Facebook screens for and handles sensitive content such as abuse, hate speech and pornography on its platform, according to a Guardian report.
The documents obtained by the outlet show how Facebook grapples with users livestreaming self-harm and why it won’t automatically take down suicide streams, among other issues on its platform.
“We don’t want to censor or punish people in distress who are attempting suicide,” one of the documents reads. “Experts have told us what’s best for these people’s safety is to let them livestream as long as they are engaging with viewers.”
The company said, however, that the footage would be taken down “once there’s no longer an opportunity to help the person,” with some exceptions.
The Guardian said it has seen more than 100 internal training manuals that Facebook uses for guidance in situations as far-reaching as match-fixing and cannibalism.
Moderators expressed concern about their ability to sift through such content correctly, according to the report. The volume of work means they often have just 10 seconds to review a piece of content and make a decision about it.
“Facebook cannot keep control of its content,” one source told the paper. “It has grown too big, too quickly.”
Moderators noted that issues like revenge porn are among the most difficult to evaluate because of Facebook’s complex and confusing rules on the matter.
“Keeping people on Facebook safe is the most important thing we do,” Facebook’s head of global policy management, Monika Bickert, said in a statement on Monday. Bickert did not comment on the authenticity of the documents.
“We work hard to make Facebook as safe as possible while enabling free speech. This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously.”