The Current

The mind-blowing future of mind reading (which may be closer than you think)

Reading thoughts and extracting information from our brains may soon be a science reality, but some researchers say we need 'neurorights' to protect the privacy of our minds.
Natasha Drobotenko takes part in a University of Toronto Scarborough study on mind reading. (Danielle Carr/CBC)


A scientific way to read minds may soon become a reality, and it's giving privacy advocates a headache.

Some are even saying we need new 'neurorights,' a set of freshly minted human rights to protect against the invasion of our minds.

In a recent University of Toronto Scarborough study, subjects were shown images of faces while their brain activity was monitored.

Researchers then used the EEG data they collected to recreate those faces with a high degree of accuracy.

"It's crazy. It looks very similar to the face I was looking at," says Natasha Drobotenko, who participated in the study.

"I could pick it out of a line-up."

Adrian Nestor, co-author of the study, says this opens the door to recreating images from memories, meaning that a scan of your brain could eventually produce an image of something you're thinking about.

"It's a matter of time, not a matter of if," he says.

"Because we have the right technology and because we have a much better understanding of how we can connect patterns of brain activity to image structures, such as those presenting in a human face."

Nestor's next experiment will focus on recreating words and text from people's minds.

He hopes the technology will one day help with eyewitness cases as well as with communication for people who are non-verbal.

Dan Nemrodov, co-author of the study, prepares Natasha Drobotenko for the experiment. (Danielle Carr/CBC)

Neurorights

While these technologies have the potential to help humanity, some researchers say they could also do great harm.

Marcello Ienca, a researcher with the Health Ethics & Policy Lab at the Swiss Federal Institute of Technology in Zurich, is part of a team making the case for neurorights.

He wants to protect the privacy of our own thoughts from being extracted, downloaded, or decoded without our consent.

"I'm against any type of outright ban against this type of technological development, because I think that the clinical benefits of this can be extremely important, but I also think that we have to try to minimize the risks before it becomes pervasively distributed in our society."

Ienca says neurotechnology is advancing rapidly and researchers have already shown it's possible to extract sensitive information from our brainwaves.

The four new human rights that Marcello Ienca proposes are:

The right to mental privacy: the right to your own thoughts and protection from having your mind involuntarily read

The right to psychological continuity: preserving one's identity and personality

The right to mental integrity: protection from neural interventions

The right to cognitive liberty: freedom of thought

You can already buy devices which interface directly with your brain. As this technology progresses, so does the risk of these devices being hacked — or brainjacked.

In 2012, a University of Oxford study monitored participants using these kinds of devices.

"[They presented] research participants with certain visual stimuli that could trigger brain responses that correlate with privacy sensitive information," Ienca says.

"By doing that, they were able to... reconstruct the first digits of the participants' PIN codes, or determine whether the participants' had known a certain person, or where their home address was."

The researchers collected EEG data, which was then used to reconstruct an image of what the participant had been looking at. (Danielle Carr/CBC)

Do we need new laws?

Not everyone agrees that we need new human rights.

Ann Cavoukian, the former privacy commissioner of Ontario, says our current laws already protect us.

"With due respect, we don't need new laws, but let me be clear it's an extremely important area," says Cavoukian, who is currently leading the Privacy by Design Center of Excellence at Ryerson University,  

"Public concern over privacy is at an all time high, in the ninetieth percentile."

Cavoukian says the key to ensuring privacy as technology changes is remaining vigilant and ensuring an open public dialogue.

"None of this happens under the covers. So that kind of accountability and transparency will be critical, and I believe we're going to do that."

Nestor says he agrees that the ethics around neuro-applications will become increasingly important as the technology evolves, particularly since, once a technology is out there, anyone can potentially turn it to any use.

"There's always a temptation to use technology in ways in which the experimenters or their designers did not intend," he says.

Listen to the full conversation at the top of this page.


This segment was produced by The Current's Danielle Carr.