
Quit using The Terminator as an example of AI gone wrong, argues BBC Reith Lecturer

In part one of the BBC's Reith Lectures, Stuart Russell, founder of UC Berkeley's AI lab, says it's useless to worry about a Terminator-like future where AI-controlled robots and planes wage war on humanity. He argues we need to worry instead about tiny intelligent drones silently assassinating targets around the world.

'It makes people think that autonomous weapons are science fiction. They are not. You can buy them today.'

BBC Reith Lecturer Stuart Russell argues that our current approach to Artificial Intelligence is wrong, and that if we continue down this path, we will have less and less control over AI even as it has an increasing impact on our lives. (bbc.co.uk)

When journalists and editors need a way of illustrating an article about the risks of Artificial Intelligence, one pop culture source comes up again and again: the 1984 film The Terminator.

Stuart Russell, professor of computer science at UC Berkeley and founder of the university's AI lab, wishes they would stop.

"I've tried to convince journalists to stop using this image for every single article about autonomous weapons, and I've failed miserably."

Russell is this year's speaker for the BBC Reith Lectures. In his series, he warns that as AI technology spreads rapidly through the worlds of business, warfare, and our personal lives, there is a risk of losing control. But Skynet, the evil AI from the Terminator franchise, is the wrong metaphor.

"Many films such as The Terminator would have you believe that spooky, emergent consciousness is the problem. If we can just prevent it, then the spontaneous desire for world domination and the hatred of humans can't happen," he told IDEAS.

The autonomous cyborg in the 1984 film The Terminator was conceived as an indestructible soldier, infiltrator, and assassin. (Paul Gilham/Getty Images)

Russell has three main problems with The Terminator metaphor.

"This Terminator picture is wrong for so many reasons. First of all, the Terminators fire a lot of bullets that miss their targets. Why do they do that? Secondly, it makes people think that autonomous weapons are science fiction. They are not. You can buy them today,"  Russell explained.

"Third, it makes people think that the problem is SkyNet, the global software system that controls the Terminators. It becomes conscious, it hates humans, and it tries to kill us all."

Black Mirror closer to reality

The threat of AI weapons is not that they might turn against us, but that they'll be extremely good at doing what we ask of them, according to Russell.

"SkyNet never was the problem. If you want a better picture from science fiction, think about the TV series Black Mirror and specifically the robot bees from the episode Hated in the Nation. They aren't conscious. They don't hate people. They are precisely programmed by one person to hunt 387,0046 specific humans, burrow into their brains, and kill them."

From the Black Mirror episode ‘Hated in the Nation,’ autonomous bees swarm before killing thousands of people. (Netflix)

Russell says Black Mirror's drones are far more likely to become reality than Skynet.

"A lethal air-powered quadcopter could be as small as a tin of shoe polish. And this is where the shape charges and explosively formed penetrators come in. About three grams of explosive are enough to kill a person at close range. A weapon like this could be mass produced very cheaply. A regular shipping container could hold a million lethal weapons, and because by definition, no human supervision is required for each weapon they can all be sent to do their work at once.

"And if we know anything about computers, it's that if they can do something once, they can do it a million times. So the inevitable endpoint is that autonomous weapons become cheap, selective weapons of mass destruction."

Slaughterbots pushes for a ban on autonomous weapons

In 2017, Russell and others working for a ban on autonomous weapons systems produced a film called Slaughterbots to show a more realistic scenario.

"It had two storylines. One: a sales pitch by the CEO of an arms manufacturer demonstrating the tiny quadcopter and its use in targeted mass attacks. The other, a series of unattributed atrocities, including the assassination of hundreds of students at the University of Edinburgh. The reactions elsewhere were mostly positive. The film had about 75 million views on the web, and I'm pleased to say that CNN called it the most nightmarish, dystopian film of 2017."

There have been several attempts to negotiate international treaties around the use of autonomous weapons. At the most recent meeting of the UN's Convention on Certain Conventional Weapons in Geneva, most members supported new laws limiting the use of autonomous weapons. The United States and Russia blocked attempts at a ban, instead calling for the creation of a code of conduct. 

In the BBC Reith Lectures, Russell reiterates his calls for a ban on autonomous weapons systems.

"With all due respect, there are eight billion people wondering why you cannot give them some protection against being hunted down and killed by robots. If the technical issues are too complicated, your children can probably explain them."

Listen to all four of Stuart Russell's BBC Reith Lectures, in which he examines the impact of AI on jobs, military conflict, and human behaviour. He also shares his suggestions for a way forward, based on a new model for AI: machines that learn about and defer to human preferences.

Episode 1: The Biggest Event in Human History

Episode 2: AI in Warfare

Episode 3: AI in the Economy

Episode 4: A Future for Humans


*This IDEAS episode was produced by Matthew Lazin-Ryder.
