California To License Self-Driving Cars

DevotedSkeptic writes "Californian senators have passed a bill that looks set to make the state the second in the US to approve self-driving cars on its roads. The bill was passed unanimously by state senators, and now hits the desk of governor Jerry Brown, who's expected to sign it into law. It calls on the California Department of Motor Vehicles to start developing standards and licensing procedures for autonomous vehicles. 'This bill would require the department to adopt safety standards and performance requirements to ensure the safe operation and testing of 'autonomous vehicles', as defined, on the public roads in this state,' it reads."
  • Considering half the drivers there don't seem to be paying attention to their driving, self-driving cars would probably be a huge improvement.
  • Re:Not safe (Score:5, Insightful)

    by O('_')O_Bush ( 1162487 ) on Sunday September 02, 2012 @11:57AM (#41206021)
    Not safe right now... the difference being that we can continually make self-driving cars safer, since driving only requires a set of rules and environmental awareness. Humans will never become safer, in general, because they are inherently mistake-prone due to fatigue, poor judgement, distractions, intoxication, and many other factors.

    Just look at the wonders of automated flight. Most airline accidents that aren't due to terrorism or mechanical malfunction are due to pilots overriding the autopilots.
  • by Dyinobal ( 1427207 ) on Sunday September 02, 2012 @12:05PM (#41206067)

    There are a lot of interesting legal implications for these self-driving cars, but all that aside, I dream of the day when a drunk can stumble out of the bar, fall into the back of his car, and wake up in the driveway of his home the next morning.

    Anyone who seriously moves to prevent the self-driving car from becoming reality, regardless of how safe they are, is simply against saving lives. I'm sure most people will wonder how anyone could be flat-out against self-driving cars, but people like that do exist. At some point this will move from a legal issue to a political issue, once it starts looking like mass adoption might happen, and those people will come out.

  • by Cute Fuzzy Bunny ( 2234232 ) on Sunday September 02, 2012 @12:06PM (#41206075)

    Considering half the drivers there don't seem to be paying attention to their driving, self-driving cars would probably be a huge improvement.

    I got a ticket about 10 years ago and had to go to driving school. Maybe 50-60 people packed into a room. The first two things the guy asked were how close you could legally follow another car, and who had right of way in a simple merge situation and in a lane change. About 75% of the people, by show of hands on a multiple-choice answer set, got the wrong answer. Which means 3/4 of people on the road don't understand the simplest of rules regarding driving.

    Couple that with being able to get a handful of questions wrong on the driving test, and rarely if ever re-testing; throw in some distraction, since driving a two-ton killing machine just isn't that interesting after you've done it for a couple of months, and you have driving problems and accidents.

    The car knows the rules of the road. It isn't distracted. It won't change lanes every 5 seconds when there's heavy traffic and all lane changing does is increase the likelihood of an accident. It won't tailgate. It won't drive drunk. It's not texting continuously. It won't speed 20mph over the speed limit so as to arrive home 1.5 minutes earlier. In short, it won't do any of the 95,000 things that human drivers do, usually at considerable risk and low to no gain.

    Maybe if people actually read and retained the rules of the road, and didn't drive like they were playing a video game with no downsides and no risk, along with unlimited lives...we wouldn't need this.

    But...we do.

    Good on California legislators for reacting quickly to a potential source of licensing revenue. While they may go for years without addressing serious problems and safety issues, or doing complex things like resurfacing roads...they're pretty quick to respond to an increase in the revenue stream that allows them to continue spending billions on pork every year.

    Now I just have to figure out how to trick them into thinking it's fun to spend money on roads and schools.

  • Re:Not safe (Score:5, Insightful)

    by wintersdark ( 1635191 ) on Sunday September 02, 2012 @12:07PM (#41206089)

    With reports of Google's self-driving car crashing left and right, how could anyone want to be in one of these vehicles? They just aren't safe. When something happens when you're driving then it's at least your fault and you could do something about it, but not in self-driving cars.

    Was this meant to be sarcastic? Both of those posts referred to the same accident. These cars have logged hundreds of thousands of miles, with ONE accident (which may well have been human error). That's far, far safer than the average human driver. If you're in the driver's seat of the self-driving car, you CAN take control of it should you feel the need, too.

    However, realistically that's not going to be useful. The car will be better at accident avoidance than you are - it's not that big a programming challenge to achieve that. People don't like to admit it - it bruises their delicate little egos - but the car knows *exactly* how fast every car around them is moving, their acceleration, and can put itself exactly where it wants to be every time. No delayed reactions due to inattention, no slight overreaction due to panic.
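    The parent's point about exact state knowledge is easy to illustrate: given the gap to the car ahead and the closing speed, both of which the car's sensors report directly, time-to-collision is a single division. A minimal sketch with made-up numbers, not any real car's control logic:

    ```python
    def time_to_collision(gap_m, closing_speed_mps):
        """Seconds until contact at a constant closing speed; None if the gap is opening."""
        if closing_speed_mps <= 0:
            return None
        return gap_m / closing_speed_mps

    # A car 30 m ahead being approached at 5 m/s leaves 6 seconds to brake or steer:
    ttc = time_to_collision(30.0, 5.0)  # 6.0
    ```

    A computer evaluates this continuously for every tracked vehicle; a human estimates it by eye, late, and only for whatever they happen to be looking at.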

    Yes, self driving cars will be involved in accidents, and will be at fault, from time to time. This does not make them less safe - it's inevitable, particularly when human drivers are involved as well. Human drivers, on the other hand, are extremely unsafe. Everyone wants to think that they are special, and unlike everyone else they're awesome drivers, but the reality remains that human drivers are in accidents extremely regularly.

    Don't get me wrong. I'd hate to be in a robotically driven car. Logically, I know I'd be much safer than with a human driver, but I'd be enormously squirrelly about the whole process. And, of course, I love driving - I'd never be comfortable giving that up to a machine. I consider myself a good driver, too (like everyone else), and I've never been in an accident for which I'm at fault, but I can acknowledge that there have definitely been times I've driven under far less than ideal circumstances. Distraction, emotional distress, tiredness, ill health, the list goes on and on. In all those cases, I'm less than 100%.

  • by Joe_Dragon ( 2206452 ) on Sunday September 02, 2012 @12:12PM (#41206105)

    Autopilots acting on bad data, or with coding issues, have led to crashes.

    What about that air show crash, where you had things like:

    "Thus he may not have heard these warnings (and thus any other warning or alarm, as they sound in the cockpit and not always in the headset)."

    and claims that the black boxes had been tampered with (maybe to cover up Airbus's issues with its autopilot)?

    "In the month prior to the accident, Airbus had posted two Operational Engineering Bulletins (OEBs) indicating possibilities of anomalous behavior in the A320 aircraft. These bulletins were received by Air France, but were not sent out to pilots until after the accident."

    Other A320 crashes:
            "The aircraft overran runway 4 while landing. A malfunction of the onboard flight computers prevented power from being reduced to idle, which inhibited thrust reverse and spoilers from being used. The offending engine was shut down, and brakes applied, but the aircraft was unable to stop before the end of the runway."

  • Re:Not safe (Score:4, Insightful)

    by AmazingRuss ( 555076 ) on Sunday September 02, 2012 @12:12PM (#41206109)

    It's not safe for the simple reason that the automatic cars will drive the speed limit, and cause accidents because everybody else is going 20 over.

  • by physburn ( 1095481 ) on Sunday September 02, 2012 @12:12PM (#41206113) Homepage Journal
    I expect a lot of political trouble from trucking unions, etc. Driving is many people's livelihood.
  • Re:Not safe (Score:5, Insightful)

    by Dyinobal ( 1427207 ) on Sunday September 02, 2012 @12:15PM (#41206129)
    Your theory isn't holding up in the face of the data. Google's cars have logged hundreds of thousands of miles and have one accident, caused by human error.
  • Re:Not safe (Score:5, Insightful)

    by Atryn ( 528846 ) on Sunday September 02, 2012 @12:17PM (#41206141) Homepage

    These cars have logged hundreds of thousands of miles, with ONE accident. That's far, far safer than the average human driver.

    Where are you getting that the average human driver has at least one accident every few hundred thousand miles? I wouldn't call this "far, far safer" yet. It has the potential to be.

    Also, most of the tests have been in still fairly controlled environments. Meaning, the car wasn't woken up in the middle of the night to get a pregnant woman to the hospital quickly over dirt roads, past nighttime street-racers, etc... Loads of "special cases" exist in the world of cars. It will be quite a long time before we have a really solid understanding of their viability. Right now, a "typical commute" is probably the safest use, or even for standard-route delivery vehicles without a high time sensitivity. Even better if certain roads / routes / lanes get set aside for autonomous vehicles only, which would make them even safer and more efficient.
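    The comparison the parent is questioning comes down to accident rates per mile driven, which is simple arithmetic. The figures below are placeholders for illustration only, not actual NHTSA or Google statistics:

    ```python
    def accidents_per_million_miles(accidents, miles):
        """Normalize an accident count to a rate per million miles driven."""
        return accidents / miles * 1_000_000

    # Hypothetical numbers: one incident in 300,000 autonomous test miles,
    # versus an assumed human rate of one crash per 165,000 miles.
    robot = accidents_per_million_miles(1, 300_000)   # ~3.3 per million miles
    human = accidents_per_million_miles(1, 165_000)   # ~6.1 per million miles
    ```

    With these assumed inputs the robot rate is lower, but only by a factor of two or so; "far, far safer" needs many more logged miles before the comparison is statistically meaningful.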

  • by king neckbeard ( 1801738 ) on Sunday September 02, 2012 @12:36PM (#41206269)
    I would suspect the first waves of cars would be big companies like Google running tests. In that case, they could meet the legal requirements for insurance themselves. After that, we'll probably have enough data to calculate the risks with far greater accuracy than human drivers.
  • by tomhath ( 637240 ) on Sunday September 02, 2012 @12:39PM (#41206289)
    Think how easy it would be for a personal injury lawyer to wheel a child who was injured in front of a jury and get them all crying because the driver didn't use the proven safe self-driving mode. What will a few mega-million dollar suits do to your insurance?
  • by king neckbeard ( 1801738 ) on Sunday September 02, 2012 @12:42PM (#41206317)
    I'm sure there will at least be a market that opens up for drivers who want to personally drive a car. A good stretch of private road and a few boilerplate waivers and we'll all be driving in that same setting car commercials take place in.
  • by ColdWetDog ( 752185 ) on Sunday September 02, 2012 @12:52PM (#41206399) Homepage

    Nobody. Same as now.

  • Re:Not safe (Score:4, Insightful)

    by theedgeofoblivious ( 2474916 ) on Sunday September 02, 2012 @12:53PM (#41206401)

    Self-driving cars will eventually be the majority.

    Driving 20 over the speed limit may make you get there more quickly, but not having to focus on the road for the whole trip will make the trip more enjoyable and will make it feel like you get there more quickly.

  • Re:Not safe (Score:4, Insightful)

    by burisch_research ( 1095299 ) on Sunday September 02, 2012 @12:55PM (#41206419)

    If all cars are self-driving, then we can happily increase the speed limit -- and probably by a lot!! We might even get a scenario where one speed limit applies to humans, and another (higher) one applies to computer-controlled vehicles.

  • Re:Not safe (Score:5, Insightful)

    by O('_')O_Bush ( 1162487 ) on Sunday September 02, 2012 @12:59PM (#41206451)
    Even if you were to combine accidents from software bugs, driving the speed limit, or some other factor, I'd absolutely bet that they would total far fewer than accidents by drunk drivers, falling asleep at the wheel, using cell phones, talking to passengers in the car, highway hypnosis, misunderstanding street signs, or lack of knowledge about right-of-way. Pick one.

    They don't have to be safe, as nothing, not even lying in bed, is completely safe. They just have to be safer than what exists now. That is a pretty low bar to clear.
  • Re:Not safe (Score:3, Insightful)

    by icebike ( 68054 ) * on Sunday September 02, 2012 @01:26PM (#41206675)

    Your theory isn't holding up in the face of the data. Google's cars have logged hundreds of thousands of miles and have one accident, caused by human error.

    Slow vehicles driving significantly below the prevailing speed cause accidents for other vehicles, while seldom getting hit themselves. They cause chain-reaction fender benders two or three cars back, which they are seldom even aware of, and drive away, never to show up in accident statistics.

        At least that's the theory put forth by those who perpetually drive over the speed limit.

  • by 0-9a-zA-Z_.+!*'()123 ( 266827 ) on Sunday September 02, 2012 @01:58PM (#41206875) Homepage Journal

    Unions aren't designed to protect people's jobs from automation, but to represent collective-bargaining issues and to represent workers in the face of often arbitrary, hostile (and incompetent) management.

    The forces that prevent government change for something (or force it upon us) are the corporations that benefit most from it. I'm guessing a well-known search engine had something to do with the ability to get a law passed that benefits.... them?

    And when lawsuits arise around self-driving cars, a well-known search engine will hire a high-powered PR firm to astroturf a lobby of "citizens for self-driving robot car rights," and we'll hear politicians railing about how small businesses will fail if they have to pay minimum wage to a human driver, and how the right to own and (autonomously) operate a self-driving car is the American Way.

    Politicians have been destroying the power of unions for decades and never really wanted them in the first place. And that's almost certainly because politicians are the dogs and the corporations are the masters who pull their chains (running dogs of capitalism, no less!).
