YouTube bans vaccine misinformation


YouTube announced Wednesday that it will no longer allow videos that suggest vaccines certified by health authorities are hazardous or ineffective, in a new effort to combat anti-vaccine propaganda.
Anti-vaccine accounts such as Joseph Mercola’s channel and the Robert F. Kennedy Jr.-linked Children’s Health Defense have been banned from the site.

In 2019, YouTube removed adverts from anti-vaccination content, and in October 2020, it said that it would remove videos that spread disinformation about the COVID-19 vaccine.
The new policy now also covers other immunizations, such as the flu shot, the HPV vaccine, and the measles, mumps, and rubella (MMR) vaccine.
Videos that falsely claim, for example, that the MMR vaccine causes autism or that the flu shot causes infertility would be prohibited.

There are some exceptions: YouTube will still allow videos in which people share their personal experiences with vaccination, though it will remove that content if the hosting channels “demonstrate a pattern of promoting vaccine misinformation.” The guidelines also permit videos containing claims that would otherwise violate the policy if the video provides additional context, such as statements from medical experts.

Along with the new policy, YouTube is also terminating the channels of major anti-vaxxers, a YouTube spokesperson confirmed to The Verge. Those include Joseph Mercola, Children’s Health Defense, Erin Elizabeth, and Sherri Tenpenny. Channels for two other major figures, Rashid Buttar and Ty and Charlene Bollinger, were terminated a few months ago, the spokesperson said.

Those anti-vaccine figures are all part of the “Disinformation Dozen,” a group identified by the Center for Countering Digital Hate as responsible for the bulk of misleading claims about vaccines on social media.

YouTube expanded its vaccine policies after noting that misinformation around all vaccines could contribute to mistrust around the COVID-19 vaccine, Matt Halprin, YouTube’s vice president of global trust and safety, told The Washington Post. Over the past few months, the backlash to COVID-19 vaccinations has been expanding to target other vaccines: the Tennessee Department of Health temporarily suspended outreach around childhood vaccinations this summer, and a Florida state senator said he wants to “review” school vaccination requirements.

In February, Facebook similarly expanded its vaccine misinformation policy to prohibit claims that the shots are hazardous.