YouTube bans all content containing vaccine misinformation

YouTube has banned all videos containing misinformation about vaccines that are currently administered and have been approved by local health authorities or the World Health Organization. The measure expands a policy that previously covered only COVID-19 vaccines.

The service says users can't, for instance, post videos claiming that vaccines cause chronic side effects (beyond rare side effects that health authorities have acknowledged), content alleging that vaccines don't reduce transmission or contraction of disease, or videos containing inaccuracies about vaccine ingredients.

There are some exceptions. YouTube "will continue to allow content about vaccine policies, new vaccine trials and historical vaccine successes or failures." Users can also share scientific discussions of vaccines and personal testimonials about their experiences, as long as they don't have a history of promoting vaccine misinformation and their video complies with YouTube's other rules. Posting videos that "condemn, dispute or satirize misinformation" that violates YouTube's policies should be okay too.
YouTube told the Washington Post that it's taking down channels linked to prominent anti-vaccine advocates, including Joseph Mercola and Robert F. Kennedy Jr. It says it didn't ban all anti-vaccine content sooner because it was focused on COVID-19 vaccine misinformation.
“Developing robust policies takes time,” YouTube’s vice president of global trust and safety Matt Halprin told the publication. “We wanted to launch a policy that is comprehensive, enforceable with consistency and adequately addresses the challenge.”
YouTube, like Facebook and Twitter, banned COVID-19 misinformation in the early days of the pandemic in the spring of 2020. YouTube has removed more than 130,000 videos that broke the COVID-19 vaccine rules it announced last October, and more than a million videos in total that included coronavirus misinformation.
Meanwhile, Facebook has been working to reduce the spread of anti-vaccine content since at least 2019. It formally banned vaccine misinformation in February.
