Quirks and Quarks

The internet is full of misinformation. That's by design, experts say

We are surrounded by information, and yet 43 per cent of Canadians feel it's getting harder to separate truth from fiction. Several recent studies look at why the information age is so confusing.

New studies find misinformation running rampant on TikTok, search engines, and even books listed on Amazon

A study of videos on TikTok about ADHD found that half of the top 100 videos were promoting false information about the disorder. The researchers say that's because TikTok's algorithm prioritizes attention over accuracy. (Chris Delmas/Getty Images)

The era we're living in has been called the Information Age, and it's easy to see why.

This year, the global amount of data generated is expected to reach 181 zettabytes — or 181 trillion gigabytes — up from just two zettabytes in 2010. Some studies say that there is now more data out there than there are stars in the observable universe.

But all this information comes at a cost.

"We're living at a time that I've categorized as a knowledge crisis," says University of Alberta law professor Tim Caulfield.

"We have access to more information now than ever before in human history ... despite that, we've never been more misled, more confused."

A 2023 Statistics Canada survey found that 43 per cent of Canadians feel it's getting harder to decipher what's true and what's fake online — and that was even before the rise of AI and deepfakes.

Several recent scientific studies have attempted to quantify just how much of this information is actually real.

"Our information environment is completely manipulated, and often people don't realize the degree to which that is the case," said Caulfield.

Analyzing TikTok, Amazon, and search engines

In a 2024 study published in the Journal of Medical Internet Research, Caulfield and his colleagues looked at the quality of books about cancer on Amazon.

"We found 49 per cent of the books had misleading content in it, and some of it was just completely atrocious," said Caulfield.

They also found that on the first page of results on Amazon, 70 per cent of the content was misleading. 

"And once again, sometimes just hardcore bunk," he adds. 

This photo shows a Facebook 'military interest' page that misrepresented old photos and videos of army operations to falsely claim that Washington was helping its ally Manila prepare for war. It is an example of misinformation that prompts an emotional response to get the reader's attention. (Jam Sta Rosa/Getty Images)

A study published in March, led by University of British Columbia PhD student Vasileia Karasavva, took aim at health information presented on TikTok.

The researchers analyzed the top 100 TikTok videos by view count that mentioned ADHD, and shared them with clinical psychologists working with ADHD patients, who reported that half the videos contained some sort of misinformation.

"We sort of saw that a lot of this information didn't match up with the diagnostic criteria," said Karasavva.

"They were presenting things that have more to do with normal human behaviour as symptoms of ADHD."

The team also looked at the creators themselves, and found that half of them stood to gain financially from this content, posting direct sales links to supposed cures.

Everybody is fighting for your attention, and that makes all of us vulnerable. - University of British Columbia psychologist Friedrich Götz

In another study from March, Tulane University assistant professor Eugina Leung investigated how results from search tools like Google, Bing and ChatGPT differ depending on the search terms used.

"What we find is that people tend to use search terms that lean toward what they already believe is true," said Leung.

"Imagine, we asked participants to search for health effects of caffeine. If they believe that caffeine is quite likely to be harmful, then they're more likely to come up with search terms like dangers of caffeine, caffeine side effects, caffeine health risks." 

These narrow search terms, Leung said, mean that the users are just receiving results that are tied to their beliefs.

"When we try to search for information online, a lot of times we actually are looking to learn something new," said Leung. 

"With the design of narrow search engines and also our tendency to come up with narrow search terms, the combination of this means that we're often not actually learning something new."

'We are all susceptible'

A 2024 paper by computer scientist Boleslaw Szymansky, published in the journal Nature Human Behaviour, argues that we should consider our information space as part of our natural environment — and acknowledge just how badly it's being polluted by this "data smog."

This pollution is limiting people's capacity to evaluate information and make timely decisions, the authors write, citing research that suggests it costs the U.S. economy over a trillion dollars annually in lost productivity.

"We live in a time where the attention economy is dictating a lot of our experiences. So everybody is fighting for your attention," says University of British Columbia psychologist Friedrich Götz. "And that makes all of us vulnerable." 

In his research, Götz wanted to understand who was most susceptible to false information online. In a global study published in the journal Personality and Individual Differences in March, Götz and his team asked over 66,000 participants to discern between fake headlines and real ones.

Most people did poorly.

"The biggest takeaway is that nobody is immune," he said. 

But the study found certain groups were more susceptible than others. These include women, people with lower levels of education, people who lean conservative politically, and members of Gen Z.

He points to education, and specifically the development and practice of critical thinking skills, as a defining factor in who fell for fake news more often.

By 2025, the global data sphere is expected to reach 181 zettabytes — or 181 trillion gigabytes — up from just two zettabytes in 2010. (Shutterstock / THICHA SATAPITANO)

Caulfield echoes the need for more critical thinking skills to help people navigate information in the attention economy, and suggests something as simple as taking a moment to stop and process information can often help. 

"I think that's because it creates this break between your initial emotional response to the content and it allows, even for that moment, your rational mind to kick in."

Researchers also point to other mitigation strategies in the works, such as the University of Cambridge's Bad News Game and other programs that walk people through the mechanics of manipulation. There's also Concordia University's SmoothDetector, which harnesses AI and algorithms to parse through the data smog and find misinformation.

"We are now at a stage where I think the interventions could be implemented at scale. They have been tested," said Götz. 

"I think where we are at right now is the need for goodwill on the part of the powerful actors who could actually help us implement that."

ABOUT THE AUTHOR

Amanda Buckiewicz is an award-winning science journalist at CBC Radio's legendary science show, Quirks & Quarks. Before landing at CBC, she travelled the world producing science documentaries for Discovery Channel, BBC Earth, Smithsonian, and Amazon Prime.