According to the article, Google is now operating so-called autonomous cars in California and Nevada, and last week at the annual Consumer Electronics Show in Las Vegas, Toyota and Audi unveiled prototypes for self-driving cars to sell to ordinary car buyers. (Google co-founder Sergey Brin said last year that he expects his company to have them ready for the general public within five years.)
But as the momentum for self-driving cars grows, one question is getting little attention: should they even be legal? And if they are, how will the laws of driving have to adapt? All of our rules about driving, from who pays for a speeding ticket to who is liable for a crash, are based on having a human behind the wheel. That is going to have to change.
Related Blog Post: The Dangers of Distracted Driving
There are some compelling reasons to support self-driving cars. Regular cars are inefficient: the average commuter spends 250 hours a year behind the wheel. They are also dangerous. Car crashes are the leading cause of death for Americans ages 4 to 34. About 33,000 people are killed every year in the U.S., and worldwide the toll is approximately 1.2 million. Car crashes cost some $300 billion a year. This technology, as it evolves, could eliminate traffic accidents. The car has lasers and radar mounted on its roof, sensing all of the objects around the vehicle, even some a human driver wouldn't notice or can't see. These cars never get tired or distracted; they don't drink and drive; they don't text. Google and other supporters believe that self-driven cars can make driving more efficient and safer by eliminating distracted driving and other human error. Their safety record is impressive so far: in the first 300,000 miles, Google reported that its cars had not had a single accident. Last August, one got into a minor fender-bender, but Google said it occurred while someone was manually driving it.
As self-driven cars become more common, there will be a flood of new legal questions. If a self-driving car gets into an accident, the human who is "co-piloting" may not be fully at fault; he may even be an injured party. Whom should someone hit by a self-driving car be able to sue? The human in the self-driving car or the car's manufacturer? New laws will have to be written to sort all of this out. How involved, and how careful, are we going to expect the human "co-pilot" to be? As a Stanford Law School report asks, "Must the 'drivers' remain vigilant, their hands on the wheel and their eyes on the road? If not, what are they allowed to do inside or outside the vehicle?"
Read More: What Consumers Need to Know About Changes to Florida PIP Law
What happens if you pull up to a four-way stop in your car around the same time as another driver, you both proceed into the intersection, and crunch, fenders are bent? Then you learn the other driver wasn't really driving. In fact, no human was controlling the car; it was a computer-managed "autonomous car," driving itself. So, who's at fault, man or machine? No one has squarely addressed that, but the Nevada and California regulations say that the operator, the person who pushes start, is responsible. Thus, as far as tort liability is concerned, you are now strictly liable even though there's no fault on your part. The standard could evolve so that liability falls to the operator regardless of the accident's cause, and if the cause was a programming malfunction, the operator could try to shift the liability to the manufacturer.
As you can tell, issues surrounding liability and who is ultimately responsible when robots take the wheel are likely to remain contentious. Stay tuned – this should be interesting.
Read more: NPR: California green lights self driving cars but legal kinks linger