YouTube has announced a major crackdown on videos that spread vaccine misinformation on its platform, banning both the content itself and high-profile anti-vaccine activists who make spurious claims about any commonly used vaccines.
In an announcement on Wednesday, the Google-owned video-sharing platform said it’s expanding its policy around medical misinformation to include tighter restrictions on misleading and false information about all vaccines. Previously it had blocked only COVID-19 anti-vaccine misinformation.
More specifically, YouTube said it will remove content that “falsely alleges that approved vaccines are dangerous and cause chronic health effects, claims that vaccines do not reduce transmission or contraction of disease, or contains misinformation on the substances contained in vaccines.” This includes claims that vaccines cause autism, cancer, and infertility, or that technology in vaccines can monitor people who receive them.
The crackdown will also include banning accounts held by high-profile anti-vaccine figures, including Robert F. Kennedy Jr., Joseph Mercola, and Sherri Tenpenny, according to the New York Times. Research has indicated that well-connected individuals with big followings play an integral role in the anti-vax ecosystem. For instance, one report from McGill University found that up to two-thirds of anti-vaccine content shared on Facebook and Twitter between February 1 and March 16, 2021, could be attributed to just 12 individuals (including Kennedy, Mercola, and Tenpenny).
YouTube had previously barred content containing false claims about COVID-19 vaccines under its COVID-19 misinformation policy. Since the pandemic began, the site says it has removed over 130,000 videos that promoted misinformation about the COVID-19 vaccines. Its new policy goes further, removing misleading content about other vaccines as well, such as immunizations for measles or hepatitis B. Alongside this move, YouTube says it will aim to promote authoritative and accurate health information from reputable sources.
"Crafting policy around medical misinformation comes charged with inherent challenges and tradeoffs," YouTube said in its announcement. "Scientific understanding evolves as new research emerges, and firsthand, personal experience regularly plays a powerful role in online discourse. Vaccines, in particular, have been a source of fierce debate over the years, despite consistent guidance from health authorities about their effectiveness.
"Working closely with health authorities, we looked to balance our commitment to an open platform with the need to remove egregious harmful content," the statement continued. "We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we're now at a point where it's more important than ever to expand the work we started with COVID-19 to other vaccines."
Other social platforms have taken similar steps over the past year. In February, Facebook announced it would remove posts with false claims about COVID-19 vaccines following consultations with leading health organizations, including the World Health Organization. Twitter also rolled out a five-strikes ban policy for accounts that repeatedly spread vaccine misinformation.
The accounts of Mercola and Kennedy remain active on both platforms, however, although Instagram has banned Kennedy.