Training Driverless Cars To Be Ethical
Just as Uber and Google ramp up testing of driverless cars on public streets, Mercedes-Benz and BMW announced at the Frankfurt Auto Show that they will develop autonomous cars. Other manufacturers, including Toyota, are already doing research in the robot-car market, projected to be worth $42 billion by 2025. According to newscientist.com, London plans to deploy driverless cars by the end of the year.
But ethicists such as Patrick Lin on Wired.com say, hold on. If there's a crash, how will the driverless car know what to avoid? Dr. Lin, director of the Ethics + Emerging Sciences Group at California Polytechnic State University, is doing research into crash-optimization algorithms for robot cars, funded by the National Science Foundation.
He writes, "As a matter of physics, you should choose a collision with a heavier vehicle that can better absorb the impact of a crash, which means programming the car to crash into the Volvo" (a Volvo SUV rather than a Mini Cooper). But this kind of thinking runs into ethical challenges. "Volvo and other SUV owners may have a legitimate grievance against the manufacturer of robot cars that favor crashing into them over smaller cars, even if physics tells us this is for the best."
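The physics-only rule Lin describes can be stated very simply, which is part of what makes it troubling. Here is a purely illustrative sketch of that heuristic; the function name, data, and masses are hypothetical and not drawn from any real vehicle system:

```python
# Illustrative sketch of the "crash into the heavier vehicle" heuristic.
# All names and numbers are hypothetical; real crash-optimization research
# involves far more factors than mass alone.

def choose_collision_target(targets):
    """Pick the target whose mass best absorbs the impact.

    This encodes only the physics argument; it deliberately ignores the
    fairness objection Lin raises, which code alone cannot resolve.
    """
    # A heavier vehicle absorbs more impact energy, so the heuristic
    # always favors it on physics grounds.
    return max(targets, key=lambda t: t["mass_kg"])

targets = [
    {"model": "Volvo SUV", "mass_kg": 2100},
    {"model": "Mini Cooper", "mass_kg": 1200},
]
print(choose_collision_target(targets)["model"])  # Volvo SUV
```

The sketch makes the grievance concrete: the Volvo is selected every time, purely because it is heavier.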
What about bicyclists or motorcyclists? Noah Goodall, a scientist at the Virginia Center for Transportation Innovation and Research and a Cal Poly consultant, presents another scenario. "You have two motorcyclists; one with a helmet and one without a helmet. Well, technically it's safer to crash into the one with a helmet because they are more likely to survive, but that seems to be really unfair."
Chris Gerdes of Stanford's School of Engineering discusses the challenges in a video accompanying this story.
Engineer and philosopher Jason Millar posed this scenario to participants in an online poll and asked who should decide (passengers, lawmakers, or car designers) what the driverless car hits: "You're driving along a mountain road, approaching the entrance to a tunnel, when suddenly a child stumbles into the road, and the car has to make a decision about whether to swerve into the wall of the tunnel and likely injure or kill you, or continue going straight and hit the child."
Millar, a PhD candidate at Queen's University, had a back-and-forth on Wired with Patrick Lin. He doesn't want the decision left strictly to car engineers; he wants the public to have some say, and the people participating in the poll apparently agreed.
"People overwhelmingly picked anything but the designer. So the developers were kind of the least favorite choice among the options that we gave, which indicated to us that there is at least something to the question of who it is that should be sorting these algorithms out."
The poll found 44% chose passengers, 33% said lawmakers should make the decisions, and just 12% chose car designers.
Ethical research into autonomous cars is still in its very early stages, but there are safeguards.
Goodall says the person in the car can still take control. "There are still steering wheels; federal regulations require that you be able to take over at a moment's notice. The other question is: do you have any say over your car's behavior? Can you tell your car, I'd rather you prioritize pedestrians differently than cars? Do you have any say in these settings? I've seen some patents where you can select your car's aggression level."
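The settings Goodall describes would amount to a small, validated configuration the owner could adjust. The sketch below is purely speculative; the field names, value ranges, and defaults are invented for illustration and do not come from any real vehicle or the patents he mentions:

```python
# Hypothetical owner-adjustable behavior settings, in the spirit of the
# patents Goodall mentions. Every field and range here is invented.

from dataclasses import dataclass

@dataclass
class DrivingPreferences:
    aggression_level: int = 1         # 0 = most cautious ... 3 = most assertive
    pedestrian_priority: float = 1.0  # weight given to pedestrians vs. other cars

    def validate(self):
        # Reject settings outside the (invented) allowed ranges.
        if not 0 <= self.aggression_level <= 3:
            raise ValueError("aggression_level must be between 0 and 3")
        if self.pedestrian_priority <= 0:
            raise ValueError("pedestrian_priority must be positive")

# An owner who wants a cautious car that weights pedestrians heavily:
prefs = DrivingPreferences(aggression_level=0, pedestrian_priority=2.0)
prefs.validate()
```

Even this toy version surfaces the open policy question: should regulators bound those ranges, or should owners set them freely?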
Scientists studying these potential problems say that in the long run, driverless cars will make roads safer, with fewer accidents and cheaper infrastructure.
According to Patrick Lin, "... problems don’t mean we should stall the progress of autonomous cars, only that we need to anticipate and prepare for the potholes ahead."
This story originally aired September 21, 2015.