Marketplace flagged over 800 social media posts with COVID-19 misinformation. Only a fraction were removed

Claims that vaccines cause sterility, fake COVID treatments among posts Facebook hadn't removed as of March 29

Despite promises to tackle COVID-19 misinformation from social media companies like Facebook, Twitter, YouTube, and Instagram, a Marketplace investigation still found plenty online. (CBC)

The world's social media giants promised to crack down on harmful COVID-19 misinformation that has proliferated since the pandemic began, but a CBC Marketplace investigation found that when problematic posts were flagged, most weren't labelled or removed. 

Between Feb. 3 and Feb. 16, Marketplace producers combed through Facebook, Instagram, YouTube and Twitter, using each platform's built-in reporting tools to flag more than 800 posts that breach company policies covering, among other things, misinformation.

The result: 12 per cent of the posts were labelled with warnings or taken down entirely. That number jumped to 53 per cent only after Marketplace journalists identified themselves and shared the findings directly with the companies.

WATCH | Full Marketplace report on COVID-19 misinformation:

COVID-19 Conspiracies | Sugar Shock (Duration: 22:30)
Inside one of the world's most dangerous COVID-19 conspiracy movements; Canada's food labels fail to disclose added sugar content, which makes some packaged foods appear healthier than they are.

"Facebook, Twitter, YouTube and Instagram have become the primary superspreaders of misinformation in our world," said Imran Ahmed, founder of the Centre for Countering Digital Hate (CCDH), a non-profit based out of Washington, D.C., which Marketplace collaborated with on this project. "That is a shocking failure to act on misinformation that was handed to them on a silver platter."

This post, presented as a study, claims ‘masks provide no benefit’ and ‘vaccines are inherently dangerous.’ It was one of the few posts that was taken down shortly after Marketplace reported it. (CBC)

Of the 832 posts Marketplace flagged, 391 came from Facebook, 166 from Instagram, 173 from Twitter and 102 from YouTube. The posts had a combined 1.5 million likes and 120,000 comments and covered a range of COVID-19-related topics, but generally circled back to a few central themes: vaccines are dangerous, COVID-19 isn't, and don't trust authorities. 

Partly fuelled by social media, partly fuelled by the COVID-19 conspiracy movement's effective persuasion tactics, misinformation has contributed to anti-lockdown sentiment, COVID-19 denial and vaccine hesitancy, said Ahmed.

Ahmed says companies such as Facebook are motivated to keep users sharing more content, not less. The more you scroll and the more users consume, the more these companies make from advertisements, which is where most of their revenue is generated, he said.

Imran Ahmed, the founder of the Centre for Countering Digital Hate, says social media companies have become the primary superspreaders of misinformation online. (CBC)

'Incredibly dangerous'

Marketplace was interested in seeing whether the social media giants had made improvements since a 2020 CCDH study, which found the companies acted on only five per cent of the misinformation it reported. The CCDH cross-referenced and analyzed CBC's data to ensure the problem posts did breach company policies for Facebook, Instagram, YouTube and Twitter.

Facebook, which owns Instagram, took action on about 18 per cent of the posts flagged on both platforms. That number jumped to about 67 per cent after Marketplace shared its findings. 

One post still up on Facebook weeks later shows a picture of Bill Gates with the headline: "New vaccine causes sterility in 97% of women!" There is no evidence linking coronavirus vaccines to sterility.

As of March 29, this post remains on Facebook, even though Marketplace reported it and subsequently shared the findings with the company. (CBC)

Another post shows a homeopathic product that purportedly offers "enhanced immunity" against COVID-19 and promises "reduced frequency and shorter duration of symptoms." It sells for $49.99 US.

There are no homeopathic remedies that can cure or alleviate COVID-19 symptoms.

"Completely ridiculous and a little bit infuriating," Timothy Caulfield, a health law and policy expert at the University of Alberta, said after he was shown the post. "Homeopathic is an easy one because it's completely scientifically implausible. That one is so clearly wrong and harmful it should be taken down immediately."

This homeopathic remedy, which purports to prevent COVID-19 symptoms, was flagged but remains on Facebook. There are no homeopathic remedies that can cure COVID-19. (CBC)

Caulfield says self-reporting tools on social media must lead to action, or people will stop using them, but he understands the difficulty of monitoring platforms with billions of users.

"The numbers of messages that have to be evaluated are just huge so I think that is one of the great challenges of social media: how can you meaningfully monitor all of these posts, but we know we need to," said Caulfield. "The challenge is there but the harm is real."

Over the course of Marketplace's test, Facebook did take down a number of prominent accounts on its platforms, including Robert F. Kennedy Jr.'s Instagram account, which had close to a million followers, as a result of a new policy introduced in February that outright prohibited posting any anti-vaccination or COVID-19 misinformation. RFK Jr.'s Facebook account, and the Facebook and Instagram accounts of his group, Children's Health Defense, with a combined following of close to 700,000, are still up.

The company disputed that some of the posts Marketplace flagged violated its protocols, and said in an emailed statement that it had "removed millions of pieces of content on Facebook and Instagram that violate our COVID-19 and vaccine misinformation policies — including two million since February alone."

YouTube, Twitter performed worst

Of the four platforms Marketplace tested, Twitter and YouTube took the least action.

Twitter initially left up all but two of the 173 posts Marketplace reported — including one by a prominent anti-vaccination leader that called the COVID-19 vaccine a "military-grade, deadly bio-weapon." The post yielded more than 2,100 likes and 1,400 retweets. 

This Twitter post claims the COVID-19 vaccine is a ‘military-grade, deadly bio-weapon.’ Marketplace reported it but it still remained online as of March 28. (CBC)

While Twitter has since removed 18 per cent of the posts Marketplace reported, the company would not say why it initially left up the majority of flagged posts and said it doesn't "directly comment on third-party studies." It pointed to its updated policies, which include a five-strike system for users that would lead to an account deletion.

YouTube didn't take down any of the flagged videos until Marketplace shared its findings. After that, it took down 34 per cent of the reported videos.

But many still remain, including one from a known conspiracist telling his audience that people are sending him information "telling me causes of [COVID] death have been altered." He said he is also receiving information about "hospitals that are completely dead, nothing happening in there," referencing a viral trend early in the pandemic where people would record videos of empty hospitals to try to back up their claims that COVID-19 wasn't real.

The video has over 700,000 views.

This video showing a prominent conspiracist talking about COVID-19-related deaths being altered is still up online, despite Marketplace reporting the video. (CBC)

YouTube said in a statement that only some of the videos Marketplace reported violated its policies, and said that since February 2020, it had "removed more than 800,000 videos for violations of our COVID-19 misinformation policies."

Ahmed says CBC's results suggest YouTube, Twitter and Facebook may not pay close attention to misinformation until news organizations or legislators put them under the microscope. 

"What's really great about this study is that this tells us what they're doing when they think no one is watching."

  • Watch full episodes of Marketplace on CBC Gem, the CBC's streaming service.

With files from Jade Prevost-Manuel and Dexter McMillan
