Facebook unveils new plans to combat anti-vaccine content
Facebook announced Thursday that it is taking steps to limit the circulation of anti-vaccine content on its platform.
The move comes after weeks of pushback from critics who said the social media giant was not doing enough to stem the spread of vaccine misinformation and conspiracy theories.
Lawmakers and public health advocates have been pushing Facebook, as well as YouTube and Amazon, to take stronger action against anti-vaccine content, arguing that the proliferation of medical misinformation has bolstered the “anti-vax” movement of people who choose not to vaccinate themselves or their children.
Health experts have attributed recent measles outbreaks in the U.S. to an increasing number of people not getting vaccinations, warning that the movement largely uses social media to promote its views.
Under the new plan, Facebook will no longer promote anti-vaccine groups and pages in search results, and will not surface them in users’ newsfeeds.
Facebook-owned Instagram’s search and “explore” features will also no longer promote posts that spread anti-vaccine content.
The announcement of the changes comes after Rep. Adam Schiff (D-Calif.) in recent weeks pressed Facebook CEO Mark Zuckerberg, as well as Google CEO Sundar Pichai, over the spreading of anti-vaccine misinformation through those platforms.
Schiff slammed Facebook and Google, which owns YouTube, over reports that their algorithms were recommending and surfacing anti-vax content.
On Thursday, Schiff’s office released Facebook and Google’s response letters.
Facebook in its letter wrote that it will confer with leading health organizations, including the World Health Organization and the U.S. Centers for Disease Control and Prevention, to identify “verifiable hoaxes” on vaccines.
“An example of a claim that has been widely disproven by these organizations is the assertion that vaccines cause autism,” Facebook wrote to Schiff’s office.
Previous reports found that Facebook and Instagram were allowing anti-vaccination content to be shown to users who were not seeking it out.
Facebook on Thursday also announced that it will reject all ads that include “misinformation about vaccines.”
Previously, Facebook offered advertisers the option to target people interested in categories including “vaccine controversies” with anti-vaccine content.
Now, the company says it will no longer allow that kind of targeting.
The company says it might disable ad accounts that continually violate the new anti-vax policy.
YouTube has taken similar steps in recent months, announcing last year that it would no longer recommend videos promoting blatant conspiracy theories, including those about vaccines.
The platform also moved last month to target channels that promote anti-vaccination content.
Google, which owns YouTube, laid out those steps in a letter to Schiff released on Thursday as well.
Schiff in recent weeks sent a letter to Amazon scrutinizing the anti-vax books and documentaries sold on its online retail store.
According to Schiff, Amazon plans to answer the letter in the coming days.
“I’m pleased that all three companies are taking this issue seriously and acknowledged their responsibility to provide quality health information to their users,” Schiff said in a statement.
“The crucial test will be whether the steps outlined by Google and Facebook do in fact reduce the spread of anti-vaccine content on their platforms, thereby making it less likely to reach users who are simply seeking quality, fact-based health information for their children and families,” he continued.
Facebook said it is now exploring options to share more medically sound information about vaccines with its users.
— Updated at 6:21 p.m.