Facebook Bans Ads Discouraging Vaccines, In Latest Misinformation Crackdown
Updated at 7:05 p.m. ET
Facebook said on Tuesday it will ban anti-vaccination ads, following widespread pressure on the social network to curb harmful content.
Under the new global policy, the company will no longer accept ads discouraging people from getting vaccines; ads that portray vaccines as unsafe or ineffective; or ads claiming the diseases vaccines prevent are harmless.
"We don't want these ads on our platform," Facebook officials Kang-Xing Jin and Rob Leathern wrote in a blog post. "Our goal is to help messages about the safety and efficacy of vaccines reach a broad group of people, while prohibiting ads with misinformation that could harm public health efforts."
The new policy also applies to Instagram. Facebook said enforcement will begin in the next few days.
The move is a reversal of Facebook's longstanding reluctance to police what people are allowed to say, a stance the company has justified as encouraging free speech. Public health groups and other critics have urged it to stamp out widespread misinformation on its platforms, which have more than 3 billion monthly active users.
This year the company has cracked down on hoaxes and falsehoods about the coronavirus. It had already banned vaccine misinformation and hoaxes identified by global health organizations like the World Health Organization and the Centers for Disease Control and Prevention.
The new policy is not a total ban on opposition to vaccines. The company will still accept ads that advocate against government vaccination policies. But those ads cannot explicitly discourage vaccination or make false claims, Facebook spokesperson Devon Kearns told NPR.
The ban also only applies to paid ads, not to the much larger volume of regular posts, including by users with large followings and posts in private groups where vaccine misinformation has flourished.
That means it may have limited effect on the spread of false claims about vaccines among people and groups that are already skeptical, Renée DiResta, who studies misinformation at the Stanford Internet Observatory, told NPR.
"This is important in preventing people from using ads to target new audiences that have not been pulled into the rabbit hole," she said. "But we shouldn't expect this to disrupt people who are already in that rabbit hole."
The ad ban is the latest in a series of steps Facebook has taken to limit harmful content, in some cases after years of pressure from critics. In recent weeks it has banned posts that promote Holocaust denial, the baseless QAnon conspiracy theory and armed militia groups. It also announced a temporary ban on political advertising after polls close on Election Day.
DiResta said the coronavirus pandemic appears to have changed the way Facebook thinks about health misinformation in particular.
"For a long time researchers and activists were trying to point out that harm was not only linked to imminent calls to violence," she said. "COVID is a global issue. We're operating in a time in which outbreaks are not locally confined. And they've had to take a much broader view of how to define what an authoritative, reputable health source is."
Facebook said the new stance toward anti-vaccination ads was part of its routine evaluation of its policies.
"We regularly refine our approach around ads that are about social issues to capture debates and discussions around sensitive topics happening on Facebook. Vaccines are no different. While we may narrow enforcement in some areas, we may expand it in others," Jin and Leathern wrote in Tuesday's post.
Facebook also said on Tuesday it would direct people to information about the flu vaccine and where users in the U.S. can get it. And it said it was working with the WHO and UNICEF on messaging campaigns to increase immunization rates.
Editor's note: Facebook is among NPR's financial supporters.
Copyright 2021 NPR. To see more, visit https://www.npr.org.