YouTube Continues To Promote Anti-Vax Videos As Facebook Prepares To Fight Medical Misinformation

While Facebook vows to stop recommending anti-vaccination content to users, YouTube is still promoting videos like “You’ll Be Glad You Watched This Before Vaccinating Your Child!”

Facebook, under pressure from lawmakers and in the midst of a national measles outbreak, says it’s working to prevent anti-vaccination content from being algorithmically recommended to users. But on YouTube, an increasingly popular source of health information, vaccine-related searches such as “should i vaccinate my kids” frequently return search results and recommendations for videos that describe vaccines as dangerous and harmful.

For example, last week, a YouTube search for “immunization” in a session unconnected to any personalized data or watch history produced an initial top search result for a video from Rehealthify that says vaccines help protect children from certain diseases. But YouTube’s first Up Next recommendation following that video was an anti-vaccination video called “Mom Researches Vaccines, Discovers Vaccination Horrors and Goes Vaccine Free” from Larry Cook’s channel. He is the owner of the popular anti-vaccination website StopMandatoryVaccination.com.

After playing one pro-vaccine video from Rehealthify, YouTube recommended an anti-vaccination parent testimonial video from StopMandatoryVaccination.com:

Left: YouTube | Right: YouTube / Via YouTube.com

In the video, a woman named Shanna Cartmell talks about her decision to stop vaccinating her children after she read a book that said leukemia is a side effect of vaccination.

(While some studies have found a lower risk of leukemia in kids exposed to infections, that doesn’t mean vaccination increases the risk of leukemia; in fact, a 2017 study found that vaccinations were associated with reduced leukemia risk. Despite the growing popularity of the anti-vaccination movement, the scientific consensus is that vaccines are safe, and do not cause autism.)

“I wasn't always that person who was going to not vaccinate, but it has to start somewhere. If you go down a road, follow the road, and see where it leads,” Cartmell says in the video. “Unless you know for sure that your child will be 100% safe, do you want to play that game? If you can’t say ‘yes’ right now, pause.”

“YouTube is a place where we see users not only come for entertainment. They come to find information.”
—Google CEO Sundar Pichai

YouTube’s promotion of misleading testimonials like Cartmell’s is concerning precisely because people do turn to YouTube for health information — and Google knows that. During a Google earnings call earlier this month, CEO Sundar Pichai said, “YouTube is a place where we see users not only come for entertainment. They come to find information. They’re coming to learn about things. They’re coming to discover, to research.” A recent Pew Research survey, which found that more than half of YouTube users turn to the site for news and information, confirms Pichai’s statement.

Last week, California Rep. Adam Schiff sent a letter to both Facebook and Google asking that each company address the anti-vaccination issue. In his letter to Pichai, Schiff expressed his “concern that YouTube is surfacing and recommending messages that discourage parents from vaccinating their children, a direct threat to public health, and reversing progress made in tackling vaccine-preventable diseases.”

But while Facebook responded last week by saying it would take “steps to reduce the distribution of health-related misinformation on Facebook,” so far, YouTube hasn’t responded publicly.

The US House of Representatives Committee on Energy and Commerce plans to hold a hearing next week addressing concerns about the reemergence of diseases that can be prevented by vaccine. “It’s unconscionable that YouTube’s algorithms continue to push conspiracy videos to users that spread disinformation and, ultimately, harm the public health,” Rep. Frank Pallone, the committee’s chairman, told BuzzFeed News. “We have a hearing next week on the measles outbreak concentrated in the Pacific Northwest and will be sure to discuss this with the public health experts who are testifying.”

YouTube, which did publish a blog post earlier this month announcing that it would be tweaking its recommendation algorithm to better handle conspiracies following a BuzzFeed News report, said it’s working on reducing the spread of harmful misinformation.

“Over the last year we’ve worked to better surface credible news sources across our site for people searching for news-related topics, begun reducing recommendations of borderline content and videos that could misinform users in harmful ways, and introduced information panels to help give users more sources where they can fact check information for themselves,” a spokesperson told BuzzFeed News via email. “Like many algorithmic changes, these efforts will be gradual and will get more and more accurate over time.”

“First-person narratives and testimonials are very powerful, and people go to YouTube for that experience.”

YouTube’s algorithm doesn’t necessarily favor anti-vaccination videos. In an initial test on Feb. 14, a little over two-thirds of searches for queries like “is it safe to vaccinate my kids” or “are vaccines safe” resulted in a top search result from a professional medical outlet, like this video from Johns Hopkins or this video from the Mayo Clinic. Those videos were followed by recommendations for a video in which celebrity doctor Mike Varshavski discusses another popular video about the vaccine debate uploaded by an entertainment channel called Jubilee. That Dr. Mike video, which was posted on Feb. 6 and currently has more than 2 million views, was consistently followed by recommendations for more of Dr. Mike’s popular videos.

But while in some cases search queries like “should i vaccinate my kids” led to authoritative sources and entertainment, in other cases, the exact same search led down a misleading path. In one instance, after a search for “should i vaccinate my kids,” YouTube played a pro-vaccine video called “Why Are Vaccines Required Before My Child Goes to School?” from the Children’s Hospital of Philadelphia but followed it with a recommendation for “Mom Researches Vaccines, Discovers Vaccination Horrors and Goes Vaccine Free.” That was followed by “You’ll Be Glad You Watched This Before Vaccinating Your Child!” from iHealthTube, which was followed by three videos in a row featuring anti-vaccination activists Dr. Sherri Tenpenny and Dr. Suzanne Humphries.

Just two recommendations after a pro-vaccine video from a children’s hospital, YouTube recommended content from VAXXED TV:

Left: YouTube | Right: YouTube / Via YouTube.com

In a second set of tests run by BuzzFeed News (also in fresh search sessions with no personalized data) four days later, results like these were even more consistent. In 16 searches for terms including “should i vaccinate my kids” and “are vaccines safe,” whether the top search result we clicked on was from a professional medical source (like Johns Hopkins, the Mayo Clinic, or Riley Hospital for Children) or an anti-vaccination video like “Mom Gives Compelling Reasons To Avoid Vaccination and Vaccines,” the follow-up recommendations were for anti-vaccination content 100% of the time. In almost every one of these 16 searches, the first Up Next recommendation after the initial video was either the anti-vaccination video featuring Shanna Cartmell (currently at 201,000 views) or “These Vaccines Are Not Needed and Potentially Dangerous!” from iHealthTube (106,767 views). These were typically followed by a video of anti-vaccination activist Dr. Suzanne Humphries testifying in West Virginia (currently 127,324 views).

A channel called VAXXED TV posted the video of Humphries testifying; in every one of these 16 searches, YouTube ultimately ended up recommending VAXXED TV videos on repeat. The VAXXED TV channel has 54,000 subscribers and describes itself as a promotional channel for a 2016 documentary directed by infamous anti-vaxxer Andrew Wakefield that claims to reveal “how the CDC destroyed data on their 2004 study that showed a link between the MMR vaccine and autism.” That film, produced by Polly Tommey and Brian Burrowes, was yanked from the 2016 Tribeca Film Festival and got one of the filmmakers banned from Australia after a promotional tour. Nonetheless, the film developed a popular following, and its creators have so far raised $86,000 via Indiegogo to produce a sequel.

The information panel on this VAXXED TV YouTube video says, “The MMR vaccine is a vaccine against measles, mumps, and rubella” and links to the MMR vaccine Wikipedia page.

Some of the VAXXED TV videos most frequently recommended by YouTube’s algorithm following vaccination-related searches included “MERCK'S DIRTY LITTLE SECRET - BY DR. SUZANNE HUMPHRIES,” “It Almost Smelled Chemical,” “Nobody Warned Me!” and “They Know.” Two of those videos (“Nobody Warned Me!” and “It Almost Smelled Chemical”) specifically mention the measles, mumps, and rubella vaccine, or MMR. Because the video descriptions mention MMR specifically, YouTube added an information panel that explains what the MMR vaccine does and links to more information on Wikipedia. YouTube does this on videos explicitly mentioning common conspiracies, such as climate change denial or the idea that the moon landing was fake, but doesn’t do it for anti-vaccination content more generally.

In 25 additional tests on Feb. 19 and 20, YouTube didn’t recommend VAXXED TV videos, instead overwhelmingly recommending the same entertainment videos from Dr. Mike and Jubilee (in addition to some outliers, like “Autism Didn’t Exist Before 1943?” from the Real Truth About Health channel). Jubilee, which has 2.5 million subscribers, says its videos aim to “bridge people together, challenge conventional thinking, and inspire love.” Jubilee’s video on the vaccine debate, “Pro-Vaccine vs Anti-Vaccine: Should Your Kids Get Vaccinated?,” popped up frequently in BuzzFeed News’ queries and has drawn 1.9 million views since it was posted two weeks ago. It also doesn’t definitively say whether vaccines are safe or not.

“There’s always been asymmetry of passion on social platforms: The most compelling content is the most sensational,” Renee DiResta, a computational propaganda researcher with the Mozilla Foundation, told BuzzFeed News. “We’re in a media environment where these first-person narratives and testimonials are very powerful, and people go to YouTube for that experience. The most popular vloggers are looking into the camera and telling you about their day or a product — it’s the kind of content that does well with people.”

Anti-vaccination videos on YouTube fit these criteria. They are highly emotional, often featuring parents talking about their sick kids. They also have an element of conspiracy — the theory that the government and medical profession have conspired to keep the relationship between vaccines and certain diseases a secret — that draws the viewer in and provokes debate. This makes these videos like catnip to YouTube’s recommendation algorithm, which seeks out and feeds on user engagement.

That kind of organic algorithmic lift can be difficult to compete with, even with better content. There is no organic community of regular people creating pro-vaccine videos, no enormous Facebook group where parents whose children died of vaccine-preventable diseases gather to talk and share links.

if a platform has incentivized a broad style of content for *years,* its catalog assumes its character. you can't just redirect youtube users to pro-vaccine content, because it barely exists. creating pro-vox content for youtube would have been a weird thing to do! https://t.co/3VW1cOVl7C

Some of the anti-vax videos on YouTube are genuine conspiracies, and those are likely to be affected by YouTube’s recent tweaks. But it’s the steady stream of videos of crying parents who genuinely believe vaccines harmed their children that keeps the movement boosted via YouTube’s algorithm — and it’s not clear what YouTube plans to do about those.

“I think that’s going to be something [YouTube’s] policy team has to determine,” DiResta said. “What kind of content it’s willing to recommend.”
