
Why passing laws to curb online disinformation is so difficult, according to tech and legal experts

While many have looked to elected officials and oversight bodies for measures to address disinformation online, some tech and legal experts say government intervention comes with its own challenges.

Determining intent behind posts and who decides what is true or false among challenges, says law professor

A person uses their smartphone as apps are shown on an iPad in Ontario in 2017. According to tech and legal experts, passing laws to address disinformation on social media is not that simple, and in some cases can lead to other problems. (Nathan Denette/Canadian Press)

In September, Maria Ressa was in downtown Vancouver, speaking to a sold-out theatre about the consequences of online disinformation.

"Facts form our shared reality," the veteran Filipino American journalist — and previous co-recipient of the Nobel Peace Prize — said.

"If you have no facts, you have no truth. Without truth, you can't have trust. Without any of these three, we don't have a shared reality. How can we begin to solve the existential problems that we already have?"

Ressa reiterated the need for "guardrails" on social media platforms to stem the spread of disinformation — false information disseminated with the intent to deceive.

It's a call many others have made in recent years.

Disinformation and foreign interference, former commissioner of Canada elections Yves Côté said in June, are two of the biggest threats facing the country's electoral system. Similar concerns have been raised by Canada's chief electoral officer, Stéphane Perrault, and by Elections B.C.

WATCH | Yves Côté on the challenges Canada's electoral system faces:

Disinformation and foreign interference are key election challenges, commissioner says

Outgoing Commissioner of Canada Elections Yves Côté talks about key challenges facing Canada's electoral system.

While many look to elected officials in calling for measures to deal with the problem, some tech and legal experts say government intervention is complex.

"The ... first reaction of a lot of people is that there ought to be a law against this," said Vivek Krishnamurthy, a law professor at the University of Ottawa and director of the Samuelson-Glushko Canadian Internet Policy and Public Interest Clinic.

"But it is really, really, really hard to find effective laws." 

Fake news laws 'almost always' passed by dictatorships

One of the immediate challenges of addressing disinformation is assessing intent, says Krishnamurthy.

"As opposed to misinformation, which is when people might spread things that are false but without intent to deceive, they may or may not know better," he said. 

Another challenge is determining who defines what is true or false. 

Earlier this year, Canadian Heritage Minister Pablo Rodriguez and Justice Minister and Attorney General of Canada David Lametti convened an expert advisory group on legislative and regulatory frameworks for addressing harmful content online.

One of the points the experts — whose backgrounds ranged from law and public policy to ethics and advocacy — agreed on is that the government cannot be an arbiter of truth, or of intent.

"There are countries that have tried to ... pass fake news laws, et cetera — those are almost always dictatorships, where fake news is stuff that is criticizing the government," said Krishnamurthy, who was also a member of the advisory group. 

The experts noted that some forms of disinformation can be easier to identify, such as foreign interference, where the actors and their intent are clear and pose national security threats.

The advisory group also explored holding companies liable for content on their platforms, but most agreed that doing so is "neither practical nor justifiable," as it would undermine core internet principles and run counter to international trade agreements.

Technological 'black boxes'

Nevertheless, many experts agree that companies need to be more transparent about how their platforms work.

This includes the use of algorithms, which determine what content is seen or recommended to users.

"Algorithms in general are regarded as 'black boxes,'" said Ahmed Al-rawi, associate professor at Simon Fraser University's school of communication and director of the Disinformation Project.

"We know nothing about them. We know very little about how they function, how they are written and who wrote them."

In a peer-reviewed study of Google's autocomplete search function, for example, Al-rawi and his colleagues found that the search engine's autocomplete suggestions describe well-known conspiracy theorists with labels that are "neutral, positive, but never negative."

Researchers say social media platforms need to be more transparent about the workings of their algorithms, which determine how content and information are prioritized and presented to users. (Chinnapong/Shutterstock)

This is problematic, according to the authors, because it "mainstreams conspiracies within larger social, cultural, and political structures."

"How are people described when it comes to Google searches? What about the ranking of the sites?" Al-rawi said.

"We need more transparency so that we can understand how they function. And in this way we can raise awareness and inform the public about how to search, what to search."

A 'special duty to the public'

When its work concluded in June, the federal advisory group recommended that the government focus legislation on the harmful effects of disinformation, or on behaviours associated with it, such as co-ordinated manipulation using bots.

The group also pointed to non-legislative tools, such as media literacy programs.

In July, the Canadian Heritage Ministry announced $24 million in funding through its Digital Citizen Initiative for projects that give Canadians tools to identify online disinformation in the context of the COVID-19 pandemic and Russia's invasion of Ukraine.

Asked about efforts to address disinformation on social platforms through legislation, the ministry did not answer directly, but pointed instead to Bill C-18.

The bill, known as the Online News Act, would require companies like Facebook and Google to compensate Canadian news organizations for using their content, although it does not specifically address the spread of disinformation.

As for leaders in office, Krishnamurthy says they, too, have a responsibility. 

"I am pretty alarmed that we see a growth in Canada of political figures who are in the mainstream and now are in positions of real influence who peddle [false information]," he said. 

"Those elected officials owe a special duty to the public, especially when they're using technology for political communications. They should be held to a higher standard of laws, but certainly by all of us in society, about the way that they conduct themselves."

ABOUT THE AUTHOR

Johna Baylon

Journalist

Johna Baylon is a copy editor based in Vancouver who also contributes as a reporter. Email her with story tips at johna.baylon@cbc.ca.

With files from Elizabeth Thompson and Peter Zimonjic