Facebook said Thursday that it will begin removing false claims about coronavirus vaccines that have been debunked by public health officials.
The move is an extension of Facebook's coronavirus misinformation policy and comes as experts worry that conspiracy theories and baseless claims about vaccines could limit the number of people who get them.
Facebook said in a blog post that it would potentially take action against "false claims about the safety, efficacy, ingredients or side effects of the vaccines."
"For example, we will remove false claims that COVID-19 vaccines contain microchips, or anything else that isn’t on the official vaccine ingredient list," the company said in the blog post. "We will also remove conspiracy theories about COVID-19 vaccines that we know today are false: like specific populations are being used without their consent to test the vaccine’s safety.”
Previously, Facebook's policies banned misinformation about the coronavirus that "contributes to the risk of imminent violence or physical harm." Facebook said it will roll out the new enforcement on vaccine misinformation gradually.
The company has historically struggled to handle anti-vaccine misinformation on its platform. In the wake of a measles outbreak in the US nearly two years ago, Facebook promised to take action on anti-vaccine misinformation, including making it less prominent in the news feed and not recommending related groups. But even then, anti-vaccine content remained easily searchable on Facebook-owned Instagram.
Facebook recently booted a large private group dedicated to anti-vaccine content, but many groups devoted to railing against vaccines remain. A recent cursory search by CNN Business found at least a dozen Facebook groups advocating against vaccines, with memberships ranging from a few hundred to tens of thousands of users. At least one group was specifically centered on opposition to a Covid-19 vaccine.