YouTube moves to block and remove all anti-vaccine misinformation
Company has already taken down 130,000 videos with false info about vaccines
YouTube is moving to block and remove all content that spreads misinformation about vaccines against COVID-19 and other illnesses, such as measles, hepatitis and chickenpox.
The Google-owned online video company said in a blog post on Wednesday that any content that "falsely alleges that approved vaccines are dangerous and cause chronic health effects" will be removed.
"This would include content that falsely says that approved vaccines cause autism, cancer or infertility, or that substances in vaccines can track those who receive them."
Google says it has taken down 130,000 videos since 2020 for violating its COVID-19 vaccine policies, and it is now stepping up those efforts.
"We're expanding our medical misinformation policies on YouTube with new guidelines on currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO," the company said.
The company will remove individual videos from some users and, as first reported by the Washington Post, will go as far as taking down the accounts of serial spreaders of misinformation entirely. Those include Joseph Mercola, an American doctor who had more than half a million subscribers, and Robert F. Kennedy Jr., son of the late senator and presidential candidate Robert F. Kennedy, who was a vocal critic of vaccines long before the pandemic began.
Claims about vaccines that are still being tested will continue to be allowed. Personal stories about reactions to a vaccine will also be permitted, as long as they do not come from an account with a history of promoting vaccine misinformation.
Tim Caulfield, the Canada Research Chair in Health Law and Policy at the University of Alberta, told CBC News in an interview Wednesday that the move is long overdue.
As to why it took so long, "the cynical explanation is that these videos are extremely popular and that they do drive traffic," he said. Social media algorithms tend to value whatever is drawing attention, he said, and "there's a lot of research that tells us that misinformation gets a lot of traction on these platforms — in fact, travels further and faster than the truth."
The move comes as YouTube and other tech giants such as Facebook and Twitter have been criticized for not doing enough to stop the spread of false health information on their sites. Twitter recently started flagging more of its content as misleading or "manipulated media" and has increased the number of accounts it bans, either temporarily or permanently.
Caulfield says there is some danger that YouTube's move could simply fuel the conspiracy theory, spread by many online pedlars of misinformation, that big technology companies are trying to silence the truth. But on balance, the move will do far more good than harm, he said.
"It probably won't have as much impact on those hardcore deniers, but we know it's very difficult to change their minds anyway," he said. "The movable middle that are complacent or are only marginally hesitant — that should always be your target."
With files from the CBC's Meegan Read