New York (CNN Business) —

YouTube will remove videos spreading misinformation about any approved vaccine, not only those aimed at preventing Covid-19, the company announced in a blog post Wednesday.

In a statement provided to CNN Business, YouTube also confirmed it would remove the channels of “several well-known vaccine misinformation spreaders” under the new policy, including one belonging to the Children’s Health Defense Fund, a group affiliated with controversial anti-vaccine activist Robert F. Kennedy, Jr.

Users who post misinformation about any “currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO” will have their videos taken down and will be subject to YouTube’s strike policy, which could lead to their channels being removed, the company said in the blog post.

“This would include content that falsely says that approved vaccines cause autism, cancer or infertility, or that substances in vaccines can track those who receive them,” YouTube said, adding that the policy applies to specific immunizations like those for measles, as well as general statements about vaccines.

Kennedy pushed back on YouTube’s decision, telling CNN Business in a statement: “There is no instance in history when censorship has been beneficial for either democracy or public health.”

The Google (GOOGL)-owned platform previously introduced a policy during the pandemic prohibiting misinformation related to Covid-19, including false claims about treatment and prevention. That policy had already led to actions against some high-profile figures, including Kentucky Senator Rand Paul, who was suspended for seven days in August for making false claims about the effectiveness of masks. (Paul criticized YouTube’s decision and called the suspension a “badge of honor.”)

YouTube said Wednesday that misleading claims have spilled over into other areas of medicine.

The announcement comes as the United States and other countries around the world have struggled to tackle misinformation that experts say contributes to vaccine hesitancy. The global pace of Covid-19 vaccinations has recently fallen to around 26 million doses per day. It also comes as Covid-19 vaccines for children are expected to be approved in the coming months.

YouTube’s action is potentially significant because of its impact on the misinformation ecosystem. “A lot of the vaccine misinformation you see on other platforms links to YouTube videos,” said Lisa Fazio, an associate professor of psychology and human development at Vanderbilt University who has studied misinformation. “It was a major loophole in our information ecosystem that it was so easy to post blatantly false information about vaccines on YouTube and have it gain large audiences.”

In an effort to evade the previous YouTube bans on Covid-19 misinformation, bad actors had pivoted to posting more general anti-vaccine content to sow confusion and distrust in inoculations more broadly, Fazio said.

Some social media platforms have been criticized for not doing enough to address vaccine misinformation. The White House in July called on tech companies to ban the “disinformation dozen,” a list of 12 people, including Kennedy, who were identified by the nonprofit Center for Countering Digital Hate as being leading spreaders of vaccine misinformation. Multiple people on that list were among those YouTube said it took action against Wednesday.

Facebook said in August that it had removed dozens of pages and groups tied to the disinformation dozen. Kennedy’s account on Facebook-owned Instagram was shut down earlier in the year for posting Covid-19 misinformation.

Under YouTube’s new rules, users who post vaccine misinformation will be subject to its strike policy, which allows up to three strikes within a 90-day period for content that violates its policies. A third strike leads to the user being permanently suspended. The company also says it may remove users after a single severe violation, or when a channel is dedicated to violating the policy.

YouTube also said Wednesday that there will be exceptions to its new vaccine misinformation guidelines. It will allow, for example, content about new vaccine trials and historical vaccine successes or failures.

“Personal testimonials relating to vaccines will also be allowed, so long as the video doesn’t violate other Community Guidelines, or the channel doesn’t show a pattern of promoting vaccine hesitancy,” YouTube said.