California To License Self-Driving Cars
DevotedSkeptic writes "Californian senators have passed a bill that looks set to make the state the second in the US to approve self-driving cars on its roads. The bill was passed unanimously by state senators, and now hits the desk of governor Jerry Brown, who's expected to sign it into law. It calls on the California Department of Motor Vehicles to start developing standards and licensing procedures for autonomous vehicles. 'This bill would require the department to adopt safety standards and performance requirements to ensure the safe operation and testing of 'autonomous vehicles', as defined, on the public roads in this state,' it reads."
Should be done in upstate new york, too (Score:4, Insightful)
Re:Not safe (Score:5, Insightful)
Just look at the wonders of automated flight. Most airline accidents that aren't due to terrorism or mechanical malfunction are due to pilots overriding the autopilots.
Re:what about stuff like code review and liability (Score:5, Insightful)
There are a lot of interesting legal implications for these self-driving cars, but all that aside, I dream of the day when a drunk can stumble out of the bar, fall into the back of his car, and wake up in the driveway of his home the next morning.
Anyone who seriously moves to prevent self-driving cars from becoming reality, regardless of how safe they are, is simply against saving lives. I'm sure most people will wonder how anyone could be flat-out against self-driving cars, but people like that do exist. At some point, when mass adoption starts looking likely, this will move from a legal issue to a political issue, and those people will come out.
Re:Should be done in upstate new york, too (Score:5, Insightful)
Considering half the drivers there don't seem to be paying attention to their driving, self-driving cars would probably be a huge improvement.
I got a ticket about 10 years ago and had to go to driving school. Maybe 50-60 people were packed into a room. The first two things the instructor asked were how close you could legally follow another car, and who had right of way in a simple merge situation and in a lane change. About 75% of the people, by show of hands on a multiple-choice answer set, got the wrong answer. Which means 3/4 of the people on the road don't understand the simplest rules of driving.
Couple that with being able to get a handful of questions wrong on the driving test and rarely, if ever, re-testing; throw in some distraction, since driving a two-ton killing machine just isn't that interesting after you've done it for a couple of months, and you have driving problems and accidents.
The car knows the rules of the road. It isn't distracted. It won't change lanes every 5 seconds in heavy traffic, where all lane changing does is increase the likelihood of an accident. It won't tailgate. It won't drive drunk. It's not texting continuously. It won't speed 20 mph over the limit so as to arrive home 1.5 minutes earlier. In short, it won't do any of the 95,000 things that human drivers do, usually at considerable risk and little to no gain.
Maybe if people actually read and retained the rules of the road, and didn't drive like they were playing a video game with no downsides and no risk, along with unlimited lives...we wouldn't need this.
But...we do.
Good on California legislators for reacting quickly to a potential source of licensing revenue. While they may go for years without addressing serious problems and safety issues, or doing complex things like resurfacing roads...they're pretty quick to respond to an increase in the revenue stream that allows them to continue spending billions on pork every year.
Now I just have to figure out how to trick them into thinking it's fun to spend money on roads and schools.
Re:Not safe (Score:5, Insightful)
With reports of Google's self-driving car crashing left [cnet.com] and right [jalopnik.com], how could anyone want to be in one of these vehicles? They just aren't safe. If something happens while you're driving, at least it's your fault and you could do something about it, but not in a self-driving car.
Was this meant to be sarcastic? Both of those posts refer to the same accident. These cars have logged hundreds of thousands of miles with ONE accident (which may well have been human error). That's far, far safer than the average human driver. If you're in the driver's seat of a self-driving car, you CAN take control of it should you feel the need, too.
However, realistically that's not going to be useful. The car will be better at accident avoidance than you are; it's not that big a programming challenge to achieve that. People don't like to admit it, since it bruises their delicate little egos, but the car knows *exactly* how fast every car around it is moving and how hard it's accelerating, and can put itself exactly where it wants to be, every time. No delayed reactions due to inattention, no slight overreaction due to panic.
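To put a number on the "no delayed reactions" point, here's a toy sketch (my own illustration, not anything from Google's actual software) of the kind of time-to-collision check a controller could run continuously, with hypothetical gap and speed values:

```python
# Toy illustration of a time-to-collision (TTC) check, the sort of
# calculation an autonomous controller could run many times per second.
# Values and function are hypothetical, for illustration only.

def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if the closing speed stays constant.

    Returns infinity when the gap is opening (or holding steady).
    """
    if closing_speed_mps <= 0:
        return float("inf")
    return gap_m / closing_speed_mps

# A 30 m gap closing at 10 m/s leaves 3 seconds to act. A human needs
# roughly a second or more just to notice and react; a computer tracking
# the gap continuously can begin braking almost immediately.
print(time_to_collision(30.0, 10.0))   # 3.0
print(time_to_collision(30.0, -5.0))   # inf (the car ahead is pulling away)
```

The point of the sketch: the machine never stops doing this arithmetic, while a texting human isn't doing it at all.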
Yes, self driving cars will be involved in accidents, and will be at fault, from time to time. This does not make them less safe - it's inevitable, particularly when human drivers are involved as well. Human drivers, on the other hand, are extremely unsafe. Everyone wants to think that they are special, and unlike everyone else they're awesome drivers, but the reality remains that human drivers are in accidents extremely regularly.
Don't get me wrong. I'd hate to be in a robotically driven car. Logically, I know I'd be much safer than with a human driver, but I'd be enormously squirrelly about the whole process. And, of course, I love driving; I'd never be comfortable giving that up to a machine. I consider myself a good driver, too (like everyone else), and I've never been in an accident for which I was at fault, but I can acknowledge that there have definitely been times I've driven under far less than ideal circumstances. Distraction, emotional distress, tiredness, ill health, the list goes on and on. In all those cases, I'm less than 100%.
autopilots acting on bad data or coding issues??? (Score:2, Insightful)
Autopilots acting on bad data, or on coding issues, have led to crashes.
What about that air show crash, where the report had lines like:
"Thus he may not have heard these warnings (and thus any other warning or alarm, as they sound in the cockpit and not always in the headset)."
And there were claims that the black boxes had been tampered with (maybe to cover up Airbus issues with its autopilot).
In the month prior to the accident, Airbus had posted two Operational Engineering Bulletins (OEBs) indicating possibilities of anomalous behavior in the A320 aircraft. These bulletins were received by Air France, but were not sent out to pilots until after the accident:
A320 crashes
http://www.airdisaster.com/cgi-bin/view_details.cgi?date=03221998&reg=RP-C3222&airline=Philippine+Airlines [airdisaster.com]
The aircraft overran runway 4 while landing. A malfunction of the onboard flight computers prevented power from being reduced to idle, which inhibited the thrust reversers and spoilers from being used. The offending engine was shut down and the brakes were applied, but the aircraft was unable to stop before the end of the runway.
Re:Not safe (Score:4, Insightful)
It's not safe for the simple reason that the automatic cars will drive the speed limit, and cause accidents because everybody else is going 20 over.
Beginning of the end for driving jobs. (Score:5, Insightful)
Re:Not safe (Score:5, Insightful)
Re:Not safe (Score:5, Insightful)
Where are you getting that the average human driver has at least one accident every few hundred thousand miles? I wouldn't call this "far, far safer" yet. It has the potential to be.
Also, most of the tests have been in still fairly controlled environments. Meaning, the car wasn't woken up in the middle of the night to get a pregnant woman to the hospital quickly over dirt roads, past nighttime street-racers, etc... Loads of "special cases" exist in the world of cars. It will be quite a long time before we have a really solid understanding of their viability. Right now, a "typical commute" is probably the safest use, or even for standard-route delivery vehicles without a high time sensitivity. Even better if certain roads / routes / lanes get set aside for autonomous vehicles only, which would make them even safer and more efficient.
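For what it's worth, here's the back-of-envelope arithmetic behind that question, using round ballpark numbers (my own assumed figures, in the spirit of NHTSA-style national totals, not sourced statistics):

```python
# Back-of-envelope: miles per police-reported crash for human drivers
# vs. Google's fleet. All figures are assumed round numbers for
# illustration, not official statistics.

annual_vmt = 3.0e12        # assumed: ~3 trillion vehicle-miles traveled per year (US)
annual_crashes = 6.0e6     # assumed: ~6 million police-reported crashes per year

miles_per_crash_human = annual_vmt / annual_crashes
print(f"Humans: one reported crash per ~{miles_per_crash_human:,.0f} miles")
# -> Humans: one reported crash per ~500,000 miles

google_miles = 300_000     # assumed ballpark for the fleet's logged mileage
google_crashes = 1
print(f"Google fleet: one crash per ~{google_miles / google_crashes:,.0f} miles")
# -> Google fleet: one crash per ~300,000 miles
```

On those assumed numbers, the fleet's record is in the same ballpark as the human average, not obviously "far, far safer", which is exactly why the sample is still too small to settle the argument.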
Re:So who's going to insure these things? (Score:5, Insightful)
Re:Get used the idea, I'm afraid (Score:5, Insightful)
Re:Get used the idea, I'm afraid (Score:4, Insightful)
Re:what about stuff like code review and liability (Score:4, Insightful)
Nobody. Same as now.
Re:Not safe (Score:4, Insightful)
Self-driving cars will eventually be the majority.
Driving 20 over the speed limit may make you get there more quickly, but not having to focus on the road for the whole trip will make the trip more enjoyable and will make it feel like you get there more quickly.
Re:Not safe (Score:4, Insightful)
If all cars are self-driving, then we can happily increase the speed limit -- and probably by a lot!! We might even get a scenario where one speed limit applies to humans, and another (higher) one applies to computer-controlled vehicles.
Re:Not safe (Score:5, Insightful)
They don't have to be safe; nothing, not even lying in bed, is completely safe. They just have to be safer than what exists now. That is a pretty low bar to reach.
Re:Not safe (Score:3, Insightful)
Your theory isn't holding up in the face of the data. Google's cars have logged hundreds of thousands of miles with one accident, and that one was caused by human error.
Slow vehicles driving significantly below the prevailing speed cause accidents for other vehicles, while seldom getting hit themselves. They cause chain-reaction fender benders two or three cars back, which they are seldom even aware of, and drive away, never to show up in accident statistics.
At least that's the theory put forth by those who perpetually drive over the speed limit.
Re:Beginning of the end for driving jobs. (Score:4, Insightful)
Unions aren't designed to protect people's jobs from automation but to handle collective bargaining issues and represent workers in the face of often arbitrary, hostile, and incompetent management.
The forces that prevent government change (or force it upon us) are the corporations that benefit most from it. I'm guessing a well-known search engine had something to do with the ability to get a law passed that benefits.... them?
And when lawsuits arise around self-driving cars, a well-known search engine will hire a high-powered PR firm to astroturf a lobby of "citizens for self-driving robot car rights," and we'll hear politicians railing about how small businesses will fail if they have to pay minimum wage to a human driver, and how the right to own and (autonomously) operate a self-driving car is the American Way.
Politicians have been destroying the power of unions for decades and never really wanted them in the first place. And that's almost certainly because politicians are the dogs and the corporations are the masters who pull their chains (running dogs of capitalism no less!).