Google Car Loses Its Cherry

The autonomous Google car has had its first at-fault accident. You can see the accident report here, and the intersection where the accident occurred here.

The fact set is straightforward. The car was in the rightmost of three through lanes approaching an intersection where it intended to make a right turn. There is no formal right turn lane, but the rightmost lane is amply wide enough to also accommodate parked cars. As there were none near the corner, the right-hand side of the lane acted as a de facto right turn lane.

As the Google car approached the intersection the light was red and there was a line of cars in the lane ahead of it.  It moved into the far right side of the lane and continued toward the corner. It turned out that its way was blocked by a pile of sand bags around a storm drain. It therefore waited for the light to turn green and the cars lined up in the lane to clear out. It then merged back into the main lane. Sadly, it mistakenly assumed that the bus coming through in the same lane would stop for it. Why, I cannot imagine. Wackiness ensued. Fortunately, this was all pretty low speed, so the wackiness was of limited extent.

I am not a lawyer. I am but a lowly paralegal, and therefore have no legal opinions whatsoever. I also work on traffic accident cases all the time. Were I to have a legal opinion, it would be that the Google car is unambiguously at fault.


Richard Hershberger is a paralegal working in Maryland. When he isn't doing whatever it is that paralegals do, or taking his daughters to Girl Scouts, he is dedicated to the collection and analysis of useless and unremunerative information.

Please do be so kind as to share this post.

53 thoughts on “Google Car Loses Its Cherry”

  1. Time for a software patch. Can’t say for Cali, but in my state, oncoming traffic has the right of way. Merging traffic must yield.


    • According to the article I read, the software patch will be to tell the Google-car that buses and other big vehicles won’t yield, but smaller vehicles probably will**, for some reason.

      As Richard says, I have no idea why Google thinks anyone would yield in this scenario. They must be waaay more polite in Mountain View than in every city I’ve ever driven in.

      **

      Google said it has reviewed this incident “and thousands of variations on it in our simulator in detail and made refinements to our software. From now on, our cars will more deeply understand that buses (and other large vehicles) are less likely to yield to us than other types of vehicles, and we hope to handle situations like this more gracefully in the future.”

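For what it's worth, the quoted fix (treating large vehicles as less likely to yield) can be caricatured in a few lines. This is purely illustrative and not Google's actual logic; the vehicle classes, yield priors, and thresholds below are all invented:

```python
# Illustrative sketch only: one way "large vehicles are less likely to
# yield" could enter a merge decision is as a per-class prior gated by
# a threshold. All numbers here are made up.

YIELD_PRIOR = {
    "car": 0.80,
    "motorcycle": 0.60,
    "truck": 0.35,
    "bus": 0.30,
}

def should_merge(vehicle_class: str, gap_seconds: float,
                 min_gap: float = 2.0, min_yield: float = 0.5) -> bool:
    """Merge if the gap is safe on its own, or if the oncoming vehicle
    belongs to a class judged likely enough to yield."""
    if gap_seconds >= min_gap:
        return True
    return YIELD_PRIOR.get(vehicle_class, 0.0) >= min_yield
```

On these invented numbers the car would nose in ahead of a passenger car but not a bus; the flaw in the actual incident was, in effect, a bus prior set too high.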

      • I assume the idea is to “find a large enough space” that it’s appropriate to merge in. Not wait until the entire freaking road is completely devoid of any traffic for miles…

        They really ought to account for the idea that buses are heavier, and less likely to stop on a dime.


        • The car also is probably programmed not to exceed the speed limit or aggressively accelerate to quickly move into the lane and turn right. Something a person would likely do.


    • Wonder how long a decision-making history the car keeps.

      My wife tells a story about the early days of cruise missiles, when the missile her firm was developing was on a ground-hugging test flight. First mountain, the missile climbs nicely over the top and returns to low-level flight. Second mountain, same thing. Third mountain, the missile flies smack into the side of the mountain at 300 knots. Some programmer reportedly observed, “Gonna have a problem recovering the trace log from that one.”


    • Assuming everything was programmed correctly, what is a reasonable expected “glitch” rate? I mean, these things are basically just computers, right? Isn’t there an old joke about how if car technology progressed like computer technology, they’d all get 500 miles to the gallon, cost $2,000, and explode once a week?

      I assume (perhaps wrongly?) that a glitchless computer system of any type is impossible. So what is the best we can expect?

      And even assuming the computer-error rate was less than the human-error rate, where do you think it will need to be for people to accept it as “preferred”?


      • Air traffic control systems are pretty nearly glitchless. Certain governmental stuff is also pretty nearly glitchless. We landed people on the moon, for chrissakes! We can do glitchless. (defined as 99.9999999999% uptime?)

        Glitchless is very, very expensive. And occasionally means the “people we pay to find at least X number of errors in the codeblock” wind up calling the programmer and asking, “Please, can you add some errors in? We can’t approve this without finding 3 errors!”
        **yes, this really happened to a friend of mine. He’s much better at programming than I am.



      • Five Nines is considered a good error rate (correct 99.999% of the time). Now that does not mean that an AV will have an accident or incident every 100,000 decisions, for the simple reason that most accidents/incidents that result in damage or injury are not the result of a single bad choice, but of a series of (typically 3 or more) bad choices that results in a cascade failure.

        In this case, I’d bet that the problem was a result of the software not yielding the right of way to a bus (either because it didn’t know to do so, or because it failed to recognize the bus), not recognizing fast enough that the bus was changing lanes on top of it, not recognizing the limits a bus has when it comes to braking & maneuverability (such data was not programmed in, or again it failed to recognize the bus for what it was), etc. Couple that with the AI’s insistence on obeying the law, and what I suspect is a deficient ability to predict the behavior of others, and the car is in a bad spot.

        Honestly, I’d program in a module that allows the car to commit minor violations to avoid accidents, and do a better job of expanding the vehicle’s situational awareness.

        That’s funny!

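The cascade point above is worth putting numbers on. A small sketch, assuming (unrealistically, since real failures are correlated) that decision errors are independent, of how requiring a run of k consecutive bad decisions stretches the expected time to failure:

```python
# Back-of-the-envelope sketch. Assume each driving decision is wrong
# with probability p, errors are independent, and a crash requires a
# run of k bad decisions in a row. The expected number of decisions
# before such a run is the classic waiting time for k consecutive
# "successes" in Bernoulli trials: (1 - p^k) / ((1 - p) * p^k).

def mean_decisions_until_cascade(p: float, k: int) -> float:
    """Expected decisions until k consecutive errors, error rate p."""
    return (1 - p**k) / ((1 - p) * p**k)

# Five nines (one bad decision in 100,000): single error vs. 3 in a row.
single = mean_decisions_until_cascade(1e-5, 1)   # about 1e5 decisions
cascade = mean_decisions_until_cascade(1e-5, 3)  # about 1e15 decisions
```

Under this toy model, requiring a three-error cascade turns one failure per hundred thousand decisions into one per quadrillion, which is why single-glitch rates don't map directly onto accident rates.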

        • Thanks.

          How much easier will all this be if ALL road vehicles are self-driving and connected/in communication versus a mix of self-driving and human driving? If the answer is that it is MUCH easier and, therefore, considerably safer, I wonder if there’d be a push to ban human driving. I mean, if a fully automated system could reduce automobile deaths from 30K+ a year to 3K a year*, that’d be a compelling argument to make. And to say nothing of increased efficiency and productivity.

          * That’d represent a 90% reduction. I have no idea how realistic that is but my gut says it might be on the low end. We might see an even greater drop, methinks.


          • I can’t see a ban on human driving occurring within my lifetime, and probably not within my kids’. A modern car has a useful lifespan of about twenty years. This means that for twenty years after the point where effectively all new cars are autonomous, there is going to be a substantial constituency still driving. When (or if) we start seeing cars with no human controls, and when (or if) they become the norm for new cars, then the problem will take care of itself.

            I am, for the same reason, skeptical of pollyanna descriptions of how it will be when all the cars are autonomous, communicating with each other.


            • Spike gas prices, and you won’t need 20 years. “All cars are autonomous” is easy to get to — just ban non-commercial driving on interstates from 1AM to 8AM or so. Trucks then can speed to efficient levels.


            • Easier than you think. While I personally think Uber is an object lesson in how most Americans don’t understand depreciation, they (or someone snagging their model) are almost certainly the future of cars.

              Marry Uber to a car rental agency with self-driving cars? You end up with a world in which it’s cheaper to call a car (which will drive itself to you) and do whatever you need than to own one. (It’s not just the cost of the car, but insurance costs you’re saving).

              Now the middle class might still own their own self-driving cars (and get cheaper insurance because in the end, they’re still going to be safer than human drivers 99% of the time, and actuaries are going to note the savings. The rare exception is unlikely to add up to the expensive blunders human drivers make regularly — from drunk driving to falling asleep at the wheel), but I suspect regular cars to phase out quickly.

              Folks owning 20 year old cars are, by and large, owning expensive 20 year old cars because they can’t get the credit or down payment to buy a better one — and end up sinking more into keeping that POS running than they’d pay for a new one. A self-driving car service would likely be cheaper, day to day, for them. And not require loans.


              • Also, I suspect once we start seeing self driving cars on the road, the auto industry will (if they aren’t working on this already) develop a communications standard and deploy transponders in all cars that continuously broadcast (and listen for) realtime vehicle information such as position, velocity vector, acceleration, type/size, who’s driving, etc. Even older cars without the more modern computers (like GrandPa’s ’68 ‘Cuda) can be fitted with such a thing (hell, any decent smartphone can already figure out position, acceleration, and velocity within a decent measure of accuracy, so making something that is fixed to the car and many times more accurate would not be difficult or expensive).

                I suspect such a device will become required equipment on any registered vehicle at some point so that AVs can be more effective & safer.

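As a rough illustration of the kind of beacon the comment above imagines, here is a hypothetical message layout. Every field name is invented for illustration; the closest real-world analogue would be something like the SAE J2735 Basic Safety Message broadcast over short-range radio, not this JSON toy:

```python
import json
import time
from dataclasses import dataclass, asdict

# Hypothetical retrofit-transponder message; fields are invented.

@dataclass
class VehicleBeacon:
    vehicle_id: str
    lat: float           # degrees
    lon: float           # degrees
    speed_mps: float     # metres per second
    heading_deg: float   # 0 = north, clockwise
    accel_mps2: float
    length_m: float      # coarse size cue for other planners
    timestamp: float

    def encode(self) -> bytes:
        return json.dumps(asdict(self)).encode()

# Even GrandPa's '68 'Cuda could carry a box broadcasting one of these
# a few times per second:
beacon = VehicleBeacon("cuda-68", 37.3894, -122.0819,
                       11.2, 90.0, 0.0, 5.4, time.time())
packet = beacon.encode()
```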

      • It’s more complicated than that. There are (quite rare) hardware glitches. The only really serious study I’ve seen of that concerned Motorola 68020 processors (which tells you how old it is), and found that there was a bad bit that could be detected on the external pins about once per 30 days. In software, there are all sorts of logic errors. Some are obvious, like code that says “turn(RIGHT)” rather than “turn(LEFT)”. Some, such as race conditions where there are many processors running in parallel, are incredibly subtle. Since this is real-time control software, the more important questions are how glitches are detected and how quickly the system returns to a known good state.

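The "return to a known good state" idea above is commonly implemented with a watchdog. A minimal sketch, assuming a hypothetical control loop; the safe state and deadline here are invented:

```python
import time

# Sketch of "detect the glitch, fall back to a known good state".

SAFE_STATE = {"throttle": 0.0, "brake": 1.0}  # e.g. brake to a stop

class Watchdog:
    """Trips if the loop fails to check in before its deadline."""
    def __init__(self, deadline_s: float):
        self.deadline_s = deadline_s
        self.last_kick = time.monotonic()

    def kick(self) -> None:
        self.last_kick = time.monotonic()

    def tripped(self) -> bool:
        return time.monotonic() - self.last_kick > self.deadline_s

def control_step(watchdog: Watchdog, compute_command) -> dict:
    """One loop iteration; on any detected fault, command SAFE_STATE."""
    try:
        if watchdog.tripped():
            raise RuntimeError("control loop missed its deadline")
        command = compute_command()
        watchdog.kick()
        return command
    except Exception:
        return SAFE_STATE  # the known good state
```

The design choice is that any unhandled fault, whether a crashed computation or a missed deadline, degrades to the same conservative command rather than propagating garbage to the actuators.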

        • Then you probably agree with the OP when he says “Were I to have a legal opinion, it would be that the Google car is unambiguously at fault.” To which I say, pshaw.

          Who else here has actually driven on El Camino Real in Mountain View?* If you got stuck in a lane near an intersection, it could be hours before you had an opportunity to safely merge into another lane. Googlecar was just showing the aggression necessary to keep traffic flowing!

          Granted, Googlecar should have picked a different vehicle than a bus to cut off. Pick one with a shorter brake distance, and more visibility through the windshield to confirm that it isn’t being tailgated.

          * The DMV report has significant whiteout on it — in particular the time of afternoon when the collision occurred. 4:00 p.m. – 7:00 p.m. in Mountain View? Ugh. Thanks, but I’ll take Los Angeles traffic instead. At least there, someone will eventually let you in, if only because they’ve misjudged the distance to the intersection while engaged in texting and thus braked seventy feet too early.


          • From the report, it sounds like all the Google car had to do was wait for the light, and it could have turned right after the cars in front of it went straight. Instead, it drove into space used for parking, which is not actually a lane, and got stuck. That must be a violation, and since it is, the accident must be the fault of the Google car, right? I don’t know the law, so I can’t say for certain, but that seems like a reasonable interpretation.


            • The second link in the OP goes to a Google Street View of the intersection. Parallel street parking is allowed in the extra-wide right-hand lane. You need to move the street view forward a click or two, and pan right a bit, and you’ll see the storm drain (it’s small, after all, it never rains here in California so it doesn’t need to be big) less than ten feet before the marking on the pedestrian crosswalk. Imagine that with sandbags around it (for whatever reason, again, it’s not like there was any rain going on).

                After that, draw your own conclusions. (In sobriety, I agree with the OP about liability here. Googlecar cut off the bus. Don’t do that.)


              • Yeah, I definitely think it was irresponsible to try to go around cars in that lane, if not illegal.

                If that’s what Google cars are programmed to do, some new programming is in order.

                Here it is not uncommon to see cops pull people over for going around cars in the rightmost lane in order to turn right. And I suspect that when that happens, many cheer as I do.


                • Around here, driving in the parking lane (when no one’s parked there) isn’t at all illegal (and is often used for making right hand turns, as it significantly speeds up traffic). Now, driving into a parking lane where there are significant quantities of parked cars? that’s stupid (but, I don’t think that’s what happened).


              • I’m not sure how dickish drivers are in Mountain View. In most cities, were I caught in that situation I would, as Chris notes, wait for the light to turn red. I might be able to zip into the main lane immediately thereafter, since the oncoming cars are coming to a stop anyway. Failing that, I would ease the nose of my car out toward the lane, with my turn signal on, and reasonably expect that the next car would let me in. Since my momma raised me right, I would acknowledge this with a polite wave.

                Were I in a city with dick drivers I would do pretty much the same thing, while imparting body language through my car that I was going to do this regardless of what the next guy back did, and I would skip the polite wave.

                This all does seem rather nuanced for a computer.


                • And I’d be 6 inches from the car in front of me as you tried to wedge yourself into that space while we decide who was going to yield first….because in my state, YOU don’t have the right of way….

                  :)


                  • If only driving was about actually following the rules, rather than trying to intimidate everyone on the road.

                    The credo of the teenager: “My car is shittier than yours. I’m not yielding, so don’t crash into me.”


                  • My uncle once had a slow-motion crash exactly like this (he was the one merging, and was therefore technically at fault. The cop gave both parties a ticket for driving-while-asshole.)

                    Driving like that will gain you 1 or 2 cars behind you instead of in front of you most days, and a huge hassle on the days when you gamble and lose at Chicken.

                    How much is your time worth, even if you’re not at fault? (Also, I wouldn’t assume the cop will see things your way every time)


                    • I’ve been rear ended by a speeding drunk who slurred her words to the cop that stopped. He directed us to “get out of the drive lanes and exchange insurance info” and was gone. Cops don’t stop for minor fender benders.


              • I live in Mountain View. I go through that intersection all the time. At certain times, yes, it is kind of a nightmare. And that is exactly the sort of condition the Google Car needs to get trained for.

                Every storm drain in Mountain View has had sandbags this winter. I’m not certain why that is, but those drains can get clogged up with leaves very easily in a good hard rainstorm. And we’ve had a few, and anticipated worse.


  2. The accident rate of Google self-driving cars is higher than that for human-controlled vehicles, and the accidents are being caused by the Google cars. This is because Google cars drive in a way that, when humans do it, is called “brake-checking”.

    You just don’t hear it reported that way, because when a stopped vehicle is rear-ended by another, the vehicle that was in motion is adjudged to be “at fault”.

