September 07, 2014
Moral Dilemmas For Autonomous Vehicles
An essay in Wired points out that a robotic car can be faced with a choice between killing its occupants and killing pedestrians. If the car suddenly finds pedestrians in front of it on the road, in some situations fatalities can be unavoidable: the choice could be between hitting the pedestrians or hitting a tree or wall.
Should autonomous vehicles have a moral-dilemma configuration page where you decide what choice the car should make in these situations?
Suppose the car is going to hit a tree but the computer has a choice: hit the tree with the left or the right side, and therefore do more damage to either the driver or the passenger. Which choice should it make?
What percentage of the public will configure their robotic vehicle to assign the most value to protecting the vehicle's owner? If a car company doesn't give me that choice I will opt to buy from another vendor.
There's another moral tuning issue with autonomous vehicles: very rare causes of accidents versus traveling speed. For example, suppose it is nighttime and you are on a curve of a mountain highway that prevents you from seeing very far ahead. Should your vehicle slow down to a speed that would enable a full stop if a stopped car or pedestrian suddenly appeared in the road? That condition could be extremely rare: 99.99999% of people could drive their entire lifetimes and never encounter it. Slow everyone down for that rare case?
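The trade-off here is easy to make concrete. A back-of-the-envelope sketch: if the car must be able to stop within its sight distance, its maximum speed falls out of the standard stopping-distance formula. The friction coefficient (0.7, roughly dry asphalt) and the example sight distance are assumptions, not figures from the essay.

```python
import math

def max_safe_speed(sight_m, mu=0.7, react_s=0.0, g=9.81):
    """Highest speed (m/s) from which the car can stop within
    sight_m metres of visible road.

    Stopping distance: d = v*react_s + v**2 / (2*mu*g).
    We solve the quadratic  v**2/(2*mu*g) + react_s*v - sight_m = 0
    for v (the positive root).
    """
    a = 1.0 / (2.0 * mu * g)
    b = react_s
    c = -sight_m
    return (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)

# A blind mountain curve with, say, 50 m of visible road:
v = max_safe_speed(50.0)
print(f"{v:.1f} m/s = {v * 3.6:.0f} km/h")
```

With 50 m of sight distance the limit comes out near 26 m/s (about 94 km/h); halve the visible road and the safe speed drops by roughly 30%, which is exactly the slowdown the comment is asking whether everyone should accept.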
Randall Parker, 2014 September 07 12:12 PM
Autonomous vehicles should have RETROROCKETS. Just enough delta-V to bring the vehicle to a stop from the maximum legal speed, enough thrust to do this in the shortest distance which won't kill the occupants.
You wouldn't trust a human driver with the power to bring their car from 70mph to a dead stop in 40 feet. But you might trust a computer to do it.
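The 70 mph / 40 feet figure implies a fairly specific deceleration. Assuming constant braking force, the kinematics identity v² = 2ad gives the answer in a few lines; the numbers below just restate the comment's scenario in SI units.

```python
MPH_TO_MS = 0.44704   # miles per hour -> metres per second
FT_TO_M = 0.3048      # feet -> metres
G = 9.81              # standard gravity, m/s^2

v = 70 * MPH_TO_MS    # ~31.3 m/s
d = 40 * FT_TO_M      # ~12.2 m
a = v**2 / (2 * d)    # constant deceleration from v^2 = 2*a*d
print(f"{a:.1f} m/s^2 = {a / G:.1f} g")
```

That works out to roughly 4 g of sustained deceleration, far beyond what tires on pavement can deliver (hence the retrorockets) but within what a restrained, forward-facing occupant can briefly survive.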
Oh, and the air bags? They really ought to be on the outside of the vehicle, triggered by computer when a collision is unavoidable, but before impact. There's no good reason the "crush zone" has to be inside the vehicle.
It's a new kind of car, think outside the box.
Autonomous cars could send information to each other saying, "No people or animals on or near the road here," allowing them to travel faster than any individual car can "see". And of course road surveillance cameras (or drones) are an option.
Who squishes whom is a matter of public health, and here at least any overt attempt to favour a car owner's life over others is unlikely to be permitted. After all, we wouldn't want autonomous taxis favouring the life of the taxi company owner over the lives of taxi passengers. If the taxi company owner's life had priority, then when she experienced chest pains she could not only give her own taxi preference over other people on the road, including those also experiencing medical emergencies, but actually slow them down to ensure that the medical professionals would be free to treat her when she arrived. That sort of thing is currently frowned upon where I live, both socially and legally.
You know, there's going to be a lot of truck drivers out of work. Driving's a good job for a lot of folks in our country, and a lot of truck drivers are not going to retrain as computer engineers. What's the point here, concentrating wealth even more than at present? Eliminating the self-respect that comes from having a job and working every day?
JP, I'm in favour of the government creating at least part time jobs for anyone who wants one as well as some form of guaranteed income. This will cost money, but my country is rich, almost as rich as the US, and can afford it. If we don't do this, then I think the cost in human suffering will be considerable.