May 04, 2013
Debate On Self-Driving Cars

The Economist has an interesting debate between Paul Saffo and Andrew Bergbaum on whether self-driving cars will move into normal use in the foreseeable future.

One problem I see: once we reach the point where, say, driverless cars are half as dangerous as human drivers, they'll still cause accidents. So then who is to blame? If an accident occurs in a situation that would normally result in criminal charges against a human driver, then who gets charged? People are legally responsible agents. They can be jailed, sentenced to death, or put on probation.

Another problem: how do we tell when the cars are safe enough? That is hard to do in anything besides real-world driving conditions.

To maximize lives saved it would make sense to switch the most dangerous drivers to driverless cars first. The youngest, the oldest, and those with assorted disabilities or bad driving records will become more dangerous than self-driving cars sooner than the most skilled drivers will. Also, driving oneself is more dangerous when tired, distracted, drunk, or sick. So it would make sense to require some people to give up the wheel before others.
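
A rough back-of-the-envelope sketch of that prioritization argument, using made-up crash rates (not real statistics): the gain from switching a group to automation is proportional to how far that group's rate exceeds the automated system's rate, and it goes negative for drivers who are already safer than the machine.

```python
# All rates below are invented for illustration, not measured data.
hypothetical_rates = {        # fatal crashes per 100 million miles (invented)
    "impaired or fatigued": 4.0,
    "teen drivers":         3.0,
    "elderly drivers":      2.5,
    "average drivers":      1.1,
    "skilled drivers":      0.6,
}
automated_rate = 0.9          # assumed rate for an early self-driving system

# Riskiest groups first: they gain the most from switching.
for group, rate in sorted(hypothetical_rates.items(), key=lambda kv: -kv[1]):
    benefit = rate - automated_rate   # lives saved per 100 million miles switched
    print(f"{group:22s} {benefit:+.1f} lives per 100M miles if switched")
```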

Another consideration: self-driving vehicles have a higher hurdle to clear because they aren't going to be competing with purely human-driven cars. They'll be competing with computer-assisted human drivers. Electronic stability control, adaptive cruise control, collision avoidance, and other systems warn drivers and selectively take over control to make human drivers less dangerous. These computer-assisted systems will get better every year.
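
As a minimal sketch of the "warn first, intervene only when urgent" style of driver assistance: the rule and thresholds below are illustrative assumptions, not any manufacturer's actual logic.

```python
def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact at current speeds; infinite if not closing on the lead car."""
    if closing_speed_mps <= 0:
        return float("inf")
    return gap_m / closing_speed_mps

def assist_action(gap_m: float, own_speed: float, lead_speed: float) -> str:
    """Warn the driver first; auto-brake only when the situation is urgent (thresholds invented)."""
    ttc = time_to_collision(gap_m, own_speed - lead_speed)
    if ttc < 1.5:
        return "auto-brake"
    if ttc < 3.0:
        return "warn driver"
    return "no action"

# Closing at 10 m/s on a car 20 m ahead -> 2 s to collision -> warn the driver.
print(assist_action(gap_m=20.0, own_speed=30.0, lead_speed=20.0))
```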

I would especially like to see faster development and deployment of collision avoidance technologies for long haul truckers. Their vehicles are harder to control and more dangerous once out of control. Plus, the drivers are often tired from very long hours behind the wheel.

Note that aircraft auto-pilot systems have been around for decades and yet human pilots still land the aircraft. We should see fully automated landing and take-off of aircraft before fully automated cars because the road environment in neighborhoods and cities is much more complex. Bicyclists, kids, dogs, oblivious pedestrians, and other vehicles make road navigation and control a much tougher problem.

Randall Parker, 2013 May 04 07:03 PM


Comments
Kudzu Bob said at May 4, 2013 11:06 PM:

John von Neumann really could have used a self-driving car.

Ronald Brak said at May 5, 2013 1:54 AM:

Self-driving cars and pilotless planes are not directly comparable.

1. Unlike the average car driver, a pilot is someone whose entire job is focused on flying safely and dealing with emergency situations. The roads would be a lot safer if everyone were screened for alcohol and many other drugs, forbidden to drive while fatigued, required to spend many hours each year training to deal with emergency situations, and lost their job if they ever got a ticket or had a minor accident. A trained pilot has a lot more to offer in terms of increased safety than the average driver.

2. An out-of-control plane can kill thousands and is almost certain to kill the people on board if it comes in contact with any part of the ground. Most of the time an out-of-control car will kill no one.

3. There's not a lot of money in self-flying passenger planes. The cost of paying for two pilots is an insignificant part of the cost of a plane flight, but the opportunity cost of time spent driving is the largest cost of driving a car for most people.

It seems likely pilots will be kept in planes as long as they have anything to offer in terms of increased safety, while they will have increasing levels of technical assistance designed to prevent pilot error.

Phil said at May 5, 2013 4:32 AM:

There are two related problems with self-driving cars.

First is that of deskilling. We remain skilled drivers (if we ever manage to attain that level) by continued repetition of the task. When the car takes over, we're going to put all our trust in it.

Then, when its control systems fail in a non-safe manner (even fail-safe systems fail), we'll be ill-prepared and ill-skilled to take appropriate action.

Oh, sorry officer, I was cat-napping whilst the car was driving me, I didn't notice...

Brett Bellmore said at May 5, 2013 7:23 AM:

The biggest problem for self-driving cars is that they come *after* our transition to a regulatory dystopia. Under current regulatory conditions, if DRIVEN cars were invented tomorrow, they'd be outlawed. Airplanes, too.

There are whole ranges of products and services we enjoy today just because they were already widespread before the regulatory and tort state achieved criticality. We're coasting now, enjoying the fruits of a different sort of society. Over-the-counter drugs that would be prescription-only. (Aspirin wouldn't be approved under current rules; it does too many things at once!) Technologies in wide use that would have been regulated into stillbirth.

If nuclear power had been invented in the 1800s (and it could have been; 1800s technology would have been up to it if they'd had a clue what to do), home furnaces would be nuclear piles, cars would be powered by radioisotope heat sources driving steam engines, and the world would be completely different. And a bit of radiation would faze us about as much as the stink near the expressway. I'm not saying there would be no downside, but we're missing some phenomenal upsides from new technologies being killed in the cradle.

Orion was probably the first casualty of this transition, so you can blame the fact that we don't have colonies all over the solar system on this, too.

john personna said at May 5, 2013 8:23 AM:

I think there is a weird psychology around debates over "dream technologies." Perhaps the debate becomes part of the dreaming. I submit that self-driving cars and 3D-printed guns could be given "rational inattention" for some time to come. They each need significant development before they reach the future reality being debated.

On the other hand we have, and you cover, significant current tech questions. Diet and exercise today, and not too many margaritas for Cinco de Mayo, are more rational objects of attention.

john personna said at May 5, 2013 8:26 AM:

BTW, I agree that active safety systems for human drivers will come first and save lives. That's current tech, with back-up warnings and auto-braking. Gibson's "future not evenly distributed."

Deltac said at May 5, 2013 8:53 AM:

Dear Sir,

Why so much debate about vehicles that are COMPLETELY autonomous? There are varying degrees of autonomy that can be tremendously beneficial.

Besides...

Aren't there already vehicles on the roads that self-correct or automatically brake when in danger of collision? Do we sue the manufacturer if the brakes fail? What about a flat tire that causes an accident? There must already be some kind of policy in place regarding liability issues. The groundwork is already in place.
A possible way to solve the issue would be to pass laws saying that drivers are still responsible for controlling the vehicle even in semi-autonomous mode (no sleeping, facing forward, etc.). The driver must be able to take control at a moment's notice. The driver can take on a supervisory role until all the technical issues are ironed out.

Engineer-Poet said at May 6, 2013 6:55 AM:

The problem with mandating that a driver who is out of the loop must remain responsible for all exceptions to the situations that the automatic driving system can handle is that it won't work. It can't work, because a human not acting as the driver will require a significant amount of time to recognize that the algorithms have failed. If it takes three seconds to recognize the need for an action which must be accomplished in two seconds, it's not going to get done. It doesn't even get done in airplane cockpits with autopilots (excuse me, "flight directors"); it won't happen on our roads.
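
A quick sketch of that time budget, using illustrative numbers (the takeover time and hazard distance are assumptions, not measured values):

```python
# Back-of-the-envelope check of the "3 seconds to notice, 2 seconds available" problem.
speed_mps = 31.0            # roughly 70 mph
hazard_distance_m = 60.0    # assumed distance at which the hazard appears
takeover_time_s = 3.0       # assumed time for an out-of-the-loop human to re-engage

time_available_s = hazard_distance_m / speed_mps   # about 1.9 s before reaching the hazard
print(f"time available: {time_available_s:.1f} s, time needed: {takeover_time_s:.1f} s")
print("human takeover in time?", takeover_time_s <= time_available_s)
```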

Self-driving cars mean that accidents truly ARE accidents.  Nobody is criminally responsible.  Someone is likely to be liable, so there will still be insurance, but it will cost less as there will be less damage overall.  Algorithm designers will compete to lower their accident rate and thus the insurance charges for people who use their products.

Phillep Harding said at May 6, 2013 6:04 PM:

Develop the computer driven car in gated communities with low traffic densities? College campuses?

destructure said at May 7, 2013 1:47 AM:

That would be fine for constant conditions. But it might not work so well for variables, road hazards, etc. I'm thinking ice on a bridge, potholes, something falling off a truck, or a dog, deer, or even a kid running into the street. Even so, I suspect net accidents and fatalities would be much lower with self-driving cars at some point. It might be a while but I expect fully automated driving eventually. That would be great for people who have trouble driving now, e.g. the elderly, epileptics, the blind, the deaf, etc. People could work while they commuted. Children could even go places without their parents.

The best part of all would be fewer traffic jams. A major cause of slow traffic is people following too close and too fast. They punch the accelerator and then punch the brakes. The ripple effect across thousands of cars produces stop-and-go rush hour traffic. If people would just back off a little and slow down then traffic would move more smoothly and faster. Self-driving cars could be programmed for that. For those wanting to know how to drive in slow, heavy traffic -- try to maintain a more constant speed and not use your brakes as much. You'll notice that when you do that a lot of the cars behind you start doing the same, and it becomes obvious that traffic behind you is moving a lot better. The problem is that you can't move faster than the clowns in front who are still driving stop-and-go. There should be a law against excessive braking during rush hour, similar to the law against excessive lane changes.
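
A toy comparison of the two driving styles, with made-up numbers: a driver who holds the average speed of a stop-and-go wave covers the same distance as one who matches the wave, while braking far less. The 10-second blocks and speeds are illustrative assumptions.

```python
stop_and_go = [25, 5, 25, 5, 25, 5, 25, 5]                      # speed (m/s) per 10 s block
steady = [sum(stop_and_go) / len(stop_and_go)] * len(stop_and_go)  # constant 15 m/s

def distance(profile, block_s=10):
    """Distance covered over the whole period, in meters."""
    return sum(v * block_s for v in profile)

def decelerations(profile):
    """Number of blocks where the driver had to slow down."""
    return sum(1 for a, b in zip(profile, profile[1:]) if b < a)

print(f"distance: {distance(stop_and_go):.0f} m vs {distance(steady):.0f} m")
print(f"decelerations: {decelerations(stop_and_go)} vs {decelerations(steady)}")
```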

Randall Parker said at May 8, 2013 8:15 PM:

destructure,

I think automated vehicles would outperform humans in most difficult conditions. Sensors could recognize icy conditions. Cameras pointed in many directions could detect motion from a dog or deer and react to it much faster than a human could.

Another thought: it would be useful to put cameras on, say, 100,000 heavily driven cars and record everything those cars react to, including crashes. Such video and other sensor recordings would let automated vehicle developers see what they need to program for.
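
One way such a fleet recorder could work is event-triggered capture: keep a rolling buffer of recent frames and save a clip whenever something interesting happens, such as hard braking. The frame rate, buffer length, and braking threshold below are all hypothetical.

```python
from collections import deque

FPS = 10                  # assumed frames per second
BUFFER_SECONDS = 20       # assumed length of the rolling buffer
HARD_BRAKE_MPS2 = -4.0    # assumed threshold for an "interesting" deceleration

recent_frames = deque(maxlen=FPS * BUFFER_SECONDS)
saved_clips = []

def on_frame(frame, longitudinal_accel_mps2):
    """Called once per camera frame; a frame could bundle an image with other sensor readings."""
    recent_frames.append(frame)
    if longitudinal_accel_mps2 <= HARD_BRAKE_MPS2:
        saved_clips.append(list(recent_frames))   # snapshot the 20 s leading up to the event

# Example: 300 ordinary frames, then one hard-braking frame triggers a saved clip.
for i in range(300):
    on_frame(f"frame-{i}", -0.5)
on_frame("frame-300", -5.0)
print("clips saved:", len(saved_clips), "| frames in clip:", len(saved_clips[0]))
```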

destructure said at May 9, 2013 2:45 AM:

Concept simple. Implementation difficult.
