The ethics of self-driving cars
This week, a group of tech experts floated a plan to ban human drivers from a 230-kilometre stretch of highway linking Seattle and Vancouver. The proposal suggests we're ever closer to the reality of self-driving cars, but as engineer and philosopher Jason Millar tells Day 6 host Brent Bambury, the ethical questions that come with autonomous vehicles are far from answered.
"We're shifting responsibility from a human driver to some programmed algorithm," says Millar, who is a postdoctoral research fellow at the University of Ottawa Faculty of Law, and a member of the Open Roboethics Initiative.
Consider a sample scenario: a driverless car with four passengers is heading down the road when a child runs out in front of it. Should the car swerve into an adjacent wall, likely injuring its passengers? Or should it hit the child, which would injure fewer people?
"From an ethical perspective, there doesn't seem to be a correct answer here. You could equally justify designing the car to swerve into the wall, or designing the car to hit the child, it would seem, from the ethical perspective."
Millar notes that this raises a very real design problem for engineers and manufacturers.
"First of all, how do you program the car? Do you program it to go straight or to swerve?" asks Millar. "The other question it raises — which is really interesting and I think a very important question — is who gets to decide how the car reacts? Is it the manufacturer's responsibility to decide how the car should be reacting in these cases? Is it the user's responsibility or the passenger's responsibility? Do they get a say, morally speaking?"
The trolley test
Think of the vigilance and the thousands of split-second decisions an autonomous car has to make just to operate. Choices about when to change lanes and when to pass a slower car are second nature to experienced drivers, but those choices have to be programmed into self-driving vehicles.
The trolley test is an ethical thought experiment. Similar to the scenario mentioned earlier, the situation the experiment presents involves an out-of-control train barrelling down the tracks. You are standing beside the tracks and notice that there are five people tied to the tracks unable to move. You can pull a lever and divert the train to a different track, but there is also a person on that track unaware that a train may be heading toward them. So do you do nothing? Or do you pull the lever and save five people while killing one?
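Stripped of its ethical weight, the choice the trolley test poses can be reduced to a crude harm-minimizing rule. The sketch below is purely illustrative (the function name and injury counts are hypothetical, not any manufacturer's actual logic); it shows how a naive utilitarian controller might simply compare expected injuries, which is precisely the kind of hard-coded moral judgment Millar is questioning:

```python
def choose_action(outcomes):
    """Pick the action with the fewest expected injuries.

    `outcomes` maps each possible action to the number of people
    likely to be harmed. This one-line utilitarian rule quietly
    bakes a moral judgment into the code.
    """
    return min(outcomes, key=outcomes.get)

# The trolley test: do nothing (five people harmed) vs. pull the lever (one).
print(choose_action({"do_nothing": 5, "pull_lever": 1}))          # pull_lever

# The driverless-car scenario: swerve (four passengers) vs. hit the child (one).
print(choose_action({"swerve_into_wall": 4, "hit_child": 1}))     # hit_child
```

Note that the same rule that gives the intuitive answer in the trolley case ("pull the lever") tells the car to hit the child in the second case, because it counts only the number of people harmed. That discomfort is the design problem: someone has to decide what the rule optimizes for, and who that someone should be is exactly Millar's open question.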
"These things all land on a gradient, so decisions have to be made one way or another in very sort of ordinary driving scenarios," says Millar. "So how do we go about making those decisions? How do we go about programming the things into the cars that we all do as drivers, even though they're breaking laws and violating different types of norms and rules that we have?"
More scrutiny over safety concerns
On Wednesday, U.S. Transportation Secretary Anthony Foxx said safety is top of mind as the country moves to aggressively shape the emergence of autonomous cars.
"It's in their vested interest to be as upfront and as clear and transparent as possible because there's market risk to putting a product out there that doesn't meet the expectations of the public," Foxx said.
Millar notes that the possibilities are endless in terms of what these vehicles could detect, as are the ethical questions surrounding those possibilities.
"I've been in conversations where people are asking the question, is it ethical for us to be able to pick out, say, old versus young people with our sensors? Should we even be designing the vision systems on these cars to be able to discriminate between objects that way?"
How soon will we see driverless cars?
When asked how quickly these ethical issues might be addressed before driverless cars are in common use, Millar acknowledges there is still a lot of work to be done.
"There are many technical hurdles that still have to be overcome before fully self-driving cars are on the road," says Millar.
He points to a recent story about a Tesla operating on Autopilot that crashed, killing the driver.
"The ethical questions about how you should be informing the user about how the car responds in certain cases, what are the risks, what are the benefits. These types of considerations, I think, manufacturers, passengers and regulators need to start sorting through today," says Millar. "And I think that's why we're seeing movement in regulations south of the border."
Millar offers a hopeful message when it comes to engineers, manufacturers and regulators addressing the many concerns about self-driving cars.
"I'm optimistic that these systems will actually deliver on some of those promises and so the ethical concerns are very much about building systems that are trustworthy, building systems that people accept into the world, that do the things they expect them to do and that are safe," says Millar.