Google says it's the other drivers' fault
September 1, 2015 10:31 AM

Since 2009, Google cars have been in 16 crashes, mostly fender-benders, and in every single case, the company says, a human was at fault. Researchers in the fledgling field of autonomous vehicles say that one of the biggest challenges facing automated cars is blending them into a world in which humans don’t behave by the book. These robots "have to learn to be aggressive in the right amount, and the right amount depends on the culture.” (SLNYT)
posted by RedOrGreen (87 comments total) 12 users marked this as a favorite
 
Today's roads and traffic patterns will look as unfamiliar to our grandchildren as the trolley- and horse-clogged streets of the 19th century look to us.
posted by Rock Steady at 10:38 AM on September 1, 2015 [4 favorites]


> and in every single case, the company says, a human was at fault.

Reminds me of economists criticizing people for not behaving economically rationally, when it's their job to study human behavior, not people's job to behave like they predict.
posted by benito.strauss at 10:52 AM on September 1, 2015 [28 favorites]


“The real problem is that the car is too safe,” said Donald Norman, director of the Design Lab at the University of California, San Diego, who studies autonomous vehicles.

This is a sentence that nobody should ever have said. The article says Google cars have been involved in 16 crashes since 2009, even though for years the company insisted there had been no collisions. Either they are now radically less safe than they have been, or, surprise surprise, a for-profit company has yet again been lying about the safety of a product under development and freely externalizing public risks. The last number I saw (from last year) said 700,000 miles driven by Google cars; at 16 accidents, that's one per 43,750 miles of driving. NHTSA's 2013 traffic safety overview [PDF link] shows one accident in a little over 500,000 miles of driving - more than ten times safer. (Although not every Google car crash was necessarily at the police report level.)
posted by Homeboy Trouble at 10:54 AM on September 1, 2015 [14 favorites]
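(A quick back-of-the-envelope check of those figures, as a minimal Python sketch; the numbers are simply the ones quoted in the comment above, not independent data.)

    # Figures quoted in the comment above
    google_miles = 700_000            # reported Google test miles as of last year
    google_crashes = 16               # crashes reported since 2009
    nhtsa_miles_per_crash = 500_000   # commenter's reading of the 2013 NHTSA overview

    google_miles_per_crash = google_miles / google_crashes
    print(google_miles_per_crash)                          # 43750.0 miles per crash
    print(nhtsa_miles_per_crash / google_miles_per_crash)  # ~11.4, i.e. "more than ten times safer"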


KEEP SUMMER SAFE
posted by prize bull octorok at 11:00 AM on September 1, 2015 [11 favorites]


Error-Prone - the Oddly Addictive Game That Shows You Stink at Driving (Wired)


“The real problem is that the car is too safe,” said Donald Norman, director of the Design Lab at the University of California, San Diego, who studies autonomous vehicles.

This is a sentence that nobody should ever have said.


I agree; it comes across as someone arguing on technicalities, overlooking the human costs. I would suggest they say "The real problem is that the car is too focused on the rules of the road." The unspoken continuation of that sentence is then "the car is not aware enough of the behavior of other drivers or other erratic events that occur in everyday driving," which makes the car sound less safe. Still, better than "this car is just too damned good."

Although not every Google car crash was necessarily at the police report level

I believe that could quite possibly be the case. I know in the US, crash records are focused on fatal crashes and major injuries, and I've heard here at work that it is sometimes more or less standard procedure to get cars off the road when the crash is minor and traffic is heavy, or there's serious weather, because of the concern about more serious subsequent crashes. In rural areas, I've heard there just aren't enough police to report on minor fender-benders.

In short, I could imagine Google actually reporting significantly more crashes than police would report, for a variety of reasons.
posted by filthy light thief at 11:07 AM on September 1, 2015 [2 favorites]


If everyone you meet is an asshole bad driver... you are the asshole bad driver.
posted by Cosine at 11:09 AM on September 1, 2015 [5 favorites]


I still suspect that we are further from self-driving cars than we are being told.

Currently the Google cars cannot drive at night, at dawn, at dusk, in the rain, beside bicycle lanes, past driveways, in busy downtown streets...

Almost all the safe driving boasts were for kilometers driven in specific test conditions on closed courses.

I could be wrong though, it happens.
posted by Cosine at 11:12 AM on September 1, 2015 [6 favorites]


Currently the Google cars cannot drive at night, at dawn, at dusk, in the rain, beside bicycle lanes, past driveways, in busy downtown streets...

What? The entire point is that they deal with city traffic perfectly fine. The rain? Yeah, well, the thing about California is...
posted by GuyZero at 11:32 AM on September 1, 2015 [5 favorites]


The last number I saw (from last year) said 700,000 miles driven by Google cars; at 16 accidents, that's one per 43,750 miles of driving.

From the August 2015 Google Self-Driving Car Project Monthly Report:
Thousands of minor accidents happen every day on typical American streets, 94% of them involving human error, and as many as 55% of them go unreported. (And we think this number is low; for more, see here.) In the six years of our project, we’ve been involved in 16 minor accidents during more than 2 million miles of autonomous and manual driving combined. Not once was the self-driving car the cause of the accident.
At this point Google might have more and better data on minor accident rates than anyone else.

Almost all the safe driving boasts were for kilometers driven in specific test conditions on closed courses.

"We’re currently averaging ~10,000 autonomous miles per week on public streets."

Currently the Google cars cannot drive at night, at dawn, at dusk, in the rain, beside bicycle lanes, past driveways, in busy downtown streets.

One of the monthly reports implies night driving:
That cyclist then took a sudden left turn, coming directly at us in our lane. Our car was able to predict that cyclist’s path of travel (turquoise line with circles) so we stopped and yielded. This happened at night, when it would have been very difficult for a human driver to see what was unfolding.
Where are you getting your information that they can't drive past driveways? Or on busy downtown streets?
posted by jjwiseman at 11:33 AM on September 1, 2015 [17 favorites]


Where are you getting your information...

This is about driving, ergo it is about emotion and personal self-worth, not information -- unless that information can be half-remembered and deployed years later in service of an unrelated anecdote.
posted by aramaic at 11:36 AM on September 1, 2015 [20 favorites]


People are just bad drivers. My hell isn't other people, it's other people driving around me, poorly.
posted by Atreides at 11:37 AM on September 1, 2015 [4 favorites]


http://www.technologyreview.com/news/530276/hidden-obstacles-for-googles-self-driving-cars/

The article is a year old, perhaps all issues are fixed.
posted by Cosine at 11:43 AM on September 1, 2015 [3 favorites]


As somebody who spends more time on a bike than in a car, I will feel much safer around cars controlled by robots than I do when they're being driven by angry humans who can run me over, say I "came out of nowhere" while simultaneously being "in their way" and get off without even a traffic citation.
posted by kevin is... at 11:57 AM on September 1, 2015 [25 favorites]


This is about driving, ergo it is about emotion and personal self-worth, not information -- unless that information can be half-remembered and deployed years later in service of an unrelated anecdote.
aramaic

Or it's about the problems the MIT piece Cosine links to brings up.

But far be it from any of us to question our Benevolent Lord Google. Their own internal reports say they're perfect!
posted by Sangermaine at 12:04 PM on September 1, 2015 [4 favorites]


I would love to hear from the actuaries.
posted by IndigoJones at 12:26 PM on September 1, 2015 [1 favorite]


I live about ten minutes from Google HQ. Swarms of self-driving cars putter all around the neighborhood at all hours of day and night. Once on a busy Saturday I saw one navigating the teeming Costco parking lot on Rengstorff with complete aplomb.

Imperfect though they may be, I can tell you on the basis of daily exposure that they're very, very good companions on the road.
posted by tangerine at 12:26 PM on September 1, 2015 [19 favorites]


This is about driving, ergo it is about emotion and personal self-worth, not information -- unless that information can be half-remembered and deployed years later in service of an unrelated anecdote.
aramaic

Or it's about the problems the MIT piece Cosine links to brings up.


Which Cosine linked to after aramaic rightfully called out the half-remembered details deployed a year later in service of a barely related anecdote.
posted by Etrigan at 12:27 PM on September 1, 2015 [2 favorites]


Self driving cars don't have to be perfect, they just have to be better than the average human driver.
posted by dinty_moore at 12:28 PM on September 1, 2015 [14 favorites]


I live about ten minutes from Google HQ. Swarms of self-driving cars putter all around the neighborhood at all hours of day and night.

After I posted that quote from the monthly report, I began to wonder. The way they described the car sensing the bicyclist at night, but then said "we stopped," made me question whether the car was actually in autonomous mode. I haven't been able to find any other definitive references to the Google car driving in autonomous mode at night. But maybe you've actually seen it? Assuming you can look and tell whether or not a person is driving...
posted by jjwiseman at 12:32 PM on September 1, 2015


These robots "have to learn to be aggressive in the right amount"

Yes, well, that's what they said about SkyNet.
posted by chavenet at 12:45 PM on September 1, 2015 [3 favorites]


I've been half-joking for years that we should ban human drivers, but really, we should ban human drivers, and I'm thrilled to see any acknowledgement of the fact that humans are terrible fucking drivers.

If anything else on earth killed as many people as driving does, we would have bombed it back to the Stone Age by now.
posted by Itaxpica at 1:13 PM on September 1, 2015 [13 favorites]


kevin is...: "As somebody who spends more time on a bike than in a car, I will feel much safer around cars controller by robots"

Just don't confuse them with a track stand.
posted by exogenous at 1:30 PM on September 1, 2015 [1 favorite]


Thousands of minor accidents happen every day on typical American streets, 94% of them involving human error, and as many as 55% of them go unreported. (And we think this number is low; for more, see here.) In the six years of our project, we’ve been involved in 16 minor accidents during more than 2 million miles of autonomous and manual driving combined. Not once was the self-driving car the cause of the accident.

Thanks for finding these updated numbers; they enable a more rigorous conclusion than the one I posted earlier. If there are 16 collisions in 2 million miles, that's one per 125k miles. If "as many as" 55% of collisions go unreported, the one per 500,000 mile number I calculated above should be adjusted to 1 in 225k miles. That's still almost twice as bad as the average driver. I have a hard time believing that you can get in twice as many collisions over a broad expanse of time as the average driver and still none of them could possibly be your fault.

For comparison purposes, one population that is typically about twice as likely to get into collisions as the average driver is older drivers in their late 70s.

In the long run, there are absolutely substantial safety benefits that will occur due to increased vehicle automation. (We seem to largely have given up as a society on the substantial safety benefits that occur due to decreased vehicle usage.) I'm not surprised that self-driving cars are twice as bad as human drivers; that's a step on the way to making them better.

I'm just fucking incensed that public streets have been used in this experiment, led by a company that has been less than forthcoming with the truth about how dangerous their cars are right now.

PS to Google: They aren't accidents.
posted by Homeboy Trouble at 1:30 PM on September 1, 2015 [3 favorites]
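(A small sketch of that adjustment, assuming, as the comment does, that 55% of ordinary crashes go unreported while Google counts every scrape.)

    reported_baseline = 500_000                  # miles per police-reported crash, from the earlier estimate
    unreported_share = 0.55                      # "as many as 55%" of crashes go unreported
    all_crash_baseline = reported_baseline * (1 - unreported_share)   # 225,000 miles per crash of any kind

    google_rate = 2_000_000 / 16                 # 125,000 miles per (minor) crash
    print(all_crash_baseline, all_crash_baseline / google_rate)       # 225000.0, 1.8 -> "almost twice as bad"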


I haven't been able to find any other definitive references to the Google car driving in autonomous mode at night. But maybe you've actually seen it? Assuming you can look and tell whether or not a person is driving...

It's hard to tell when the Lexus ones are in autonomous mode, but the smaller ones are a little easier since they don't have a steering wheel and have a weird joystick-looking thing instead. Also, I'm higher up than they are. I've seen those driving around at night with nobody's hands on anything.
posted by dogwalker at 1:33 PM on September 1, 2015


If "as many as" 55% of collisions go unreported, the one per 500,000 mile number I calculated above should be adjusted to 1 in 225k miles. That's still almost twice as bad as the average driver. I have a hard time believing that you can get in twice as many collisions over a broad expanse of time as the average driver and still none of them could possibly be your fault.

I think at this point that Google may have a better idea of the baseline minor accident rate than the National Highway Traffic Safety Administration does, and they're saying that they think more than 55% go unreported (which would certainly fit my personal experience). So I tend to think that a likely answer to the contradiction you describe is that they have, so far, been safer than human drivers and been in fewer collisions.

That said, whatever the current level of safety is, it's absolutely inevitable that autonomous cars will become much safer than human drivers, and I think that driving on public roads is a necessary part of developing systems that can handle the extremely wide range of real world conditions and still be safer than humans.
posted by jjwiseman at 1:47 PM on September 1, 2015 [5 favorites]


If there are 16 collisions in 2 million miles, that's one per 125k miles. If "as many as" 55% of collisions go unreported, the one per 500,000 mile number I calculated above should be adjusted to 1 in 225k miles. That's still almost twice as bad as the average driver.

Genuine question: does the fact that we're only accounting for the miles driven by the Google cars make up any of that difference? That is, if the Google cars have driven 2 million miles and had 16 collisions, how do we account for the amount that the other drivers in those 16 collisions have driven -- should we consider it to be 16 collisions over 4 million miles?
posted by Etrigan at 1:50 PM on September 1, 2015 [4 favorites]


Genuine question: does the fact that we're only accounting for the miles driven by the Google cars make up any of that difference? That is, if the Google cars have driven 2 million miles and had 16 collisions, how do we account for the amount that the other drivers in those 16 collisions have driven -- should we consider it to be 16 collisions over 4 million miles?

It seems to me that if you assumed that a collision involves on average two cars (maybe this is reasonable?) then two drivers will be involved in a police-reported crash every 500k miles, so an average driver should expect to be involved in one per 250k.
posted by value of information at 2:08 PM on September 1, 2015


So I could imagine a car having defective braking systems, making it prone to rear-end other vehicles, or to get T-boned at intersections. But of the 16 collisions the google car has been involved in, 11 were caused by a human driver rear-ending the google car. What kind of defect in the google car would make it be at fault for being rear-ended so much?
posted by rustcrumb at 2:12 PM on September 1, 2015 [6 favorites]


The fact that it follows the actual speed limit?
posted by Hatashran at 2:16 PM on September 1, 2015 [5 favorites]


"The fact that it follows the actual speed limit?"

It would surprise me if they did this. If traffic is flowing on the highway at 10 miles per hour above the speed limit, it's not exactly safe driving to follow the speed limit. It is safer to follow the speed of traffic without going significantly faster than the other cars on the road.

I'd be curious about how they handle this though, now that you brought it up.
posted by el io at 2:20 PM on September 1, 2015


The Guardian also just published a sceptical piece on autonomous cars. Its major beef was that the things won't work in Devon, because of the narrow roads with high hedges that prevent drivers from knowing what's coming in time to use passing places, resulting in complex negotiations at a human level as to who'll do what between packets of cars...

As someone who's spent no little time driving on exactly those roads, I can think of any number of ways to get around that - and boy, would they make a better job of this than many of the local drivers (AKA maniacs) who threatened the young Devonian's life and limb on lots of occasions, killed one of his friends and put another two in hospital. And at the point when you can reasonably expect most cars to know where each other is at a distance, the game changes dramatically for the better whether they're automatic or human-driven. Even cars that have no sort of transponder (or app on Farmer Blogg's smartphone) will be noticed quickly as such by the other cars, and their location/disposition made known.

Anyway. I don't believe Google is low-balling its accident figures, because why would it? It can have no expectation of going uncorrected, and it will be absolutely focussed on this factor as the single most important data set for the viability of the whole project. Google does bad things, sometimes deliberately, sometimes haphazardly, but this is a remarkably public project. The first time someone is injured or killed by one of its cars, the degree of scrutiny on the project will make your average supernova look like an LED nightlight.

By the way, here's a quite-useful low-noise blog on automated cars, which I look in on occasionally to see what's cooking.
posted by Devonian at 2:24 PM on September 1, 2015 [6 favorites]


Homeboy Trouble: NHTSA's 2013 traffic safety overview [PDF link] shows one accident in a little over 500,000 miles of driving - ten times safer.

I'm not seeing where you got the 500,000 number from in the report. I'm probably just missing something. Where are you seeing that or how are you calculating that?
posted by fremen at 2:26 PM on September 1, 2015


A more prosaic explanation for the disparity might just be where these cars are driving. If they are always just tooling around busy local roads near the Google campus, and rarely venturing out on to the freeway, then you would expect a hugely elevated rate of minor collisions like rear-enders when stopped at a light and a substantially decreased risk of serious or fatal collisions, simply by virtue of the kind of street and traffic the driving is being done in.

I mean, in my neighbourhood, which is very dense and urban, I witness FAR more collisions than you would expect for every 500k miles. I would say I actually witness a collision happen once a week or so, maybe once for every two or three hours I am out on the streets of Parkdale and surrounds.

I wouldn't be surprised if there was at least a minor collision for every 5,000 miles driven near where I live, not counting the freeway to the south that feeds impatient drivers into a dense urban area that most want to flee as fast as they can.
posted by [expletive deleted] at 2:30 PM on September 1, 2015 [2 favorites]


fremen, I did the math, and I think what HT is calculating is 2,998 billion miles / 5,678,000 accidents = 528,002 miles per accident in 2013. This data is in Table 1 and Table 2 of the PDF. If you assume that this represents only 45% of actual accidents, then you arrive at a figure of 237,601 miles per accident.
posted by rustcrumb at 2:31 PM on September 1, 2015 [1 favorite]
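(Spelled out as a sketch, using the Table 1 and Table 2 figures rustcrumb cites; the 45% is just the complement of the "as many as 55% unreported" estimate.)

    vmt_2013 = 2_998e9               # vehicle miles travelled in 2013 (Table 1)
    crashes_2013 = 5_678_000         # police-reported crashes in 2013 (Table 2)

    miles_per_reported_crash = vmt_2013 / crashes_2013        # ~528,000
    miles_per_any_crash = miles_per_reported_crash * 0.45     # ~237,600 if only 45% of crashes are reported
    print(round(miles_per_reported_crash), round(miles_per_any_crash))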


So I could imagine a car having defective braking systems, making it prone to rear-end other vehicles, or to get T-boned at intersections. But of the 16 collisions the google car has been involved in, 11 were caused by a human driver rear-ending the google car. What kind of defect in the google car would make it be at fault for being rear-ended so much?

I suspect it is because the google car is behaving erratically, in ways not normally expected for other cars. For example suppose the google car suddenly slams on its brakes as it approaches a green light intersection because it isn't sure if it can proceed. Or if the google car starts up when the light goes green but suddenly slows again in indecision midway through the intersection.

In both of these cases the rear-ending driver is technically at fault, but it is because the google car is behaving in ways irrational to human drivers. It's like trying to play chess against a computer.

If this is the case, these sorts of accidents should become rarer as human drivers become more accustomed to the idiosyncrasies of self-driving cars.
posted by JackFlash at 2:47 PM on September 1, 2015 [2 favorites]


Etrigan: “Which Cosine linked to after aramaic rightfully called out the half-remembered details deployed a year later in service of a barely related anecdote.”

'It doesn't matter that you turned out to be right all along – you didn't cite your source precisely right off the bat, so you lose this round!'
posted by koeselitz at 2:54 PM on September 1, 2015 [1 favorite]


Also, let's be clear: Google themselves have admitted that their so-called "self-driving cars" rely on hyper-precise mapping down to the millimeter (which would cost trillions of dollars if applied across the US) and an extraordinary amount of advance knowledge about roads and conditions and potential blockages and such that just isn't even conceivably available.

So when the New York Times publishes an effusive piece about how Google's biggest obstacle is human error, maybe we should see it for what it is: just another PR fluff piece.

"Self-driving cars" are a pipe dream. Before they happen, significant and expensive changes will have to be made to the way American roads work. And seeing as how we haven't even begun to talk about what that will mean, we aren't likely to see them in our lifetimes.
posted by koeselitz at 2:58 PM on September 1, 2015 [1 favorite]


Something a self-driving car could easily do is illuminate its brake lights a half second before it actually applies the brakes, giving trailing cars a little extra warning. I will do this on the freeway sometimes, tapping the brake lightly at first to give a tailgater some warning or even pulsing my brake lights rapidly to get their attention if I'm doing a rapid stop. This would be trivial for a self-driving car which is generally looking far ahead.

It isn't just about making self-driving cars safer. There are things that self-driving cars can do to make human drivers safer. The self-driving car could even indicate how hard it is applying the brakes by a string of brake lights.
posted by JackFlash at 3:03 PM on September 1, 2015
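(A minimal, entirely hypothetical sketch of what that early brake-light signal might look like in a planner; the function and the half-second lead are illustrative, not anything Google has described.)

    BRAKE_LIGHT_LEAD_S = 0.5   # light the lamps this long before braking actually begins

    def brake_light_plan(seconds_until_braking, planned_decel_g):
        """Return (seconds until the lamps go on, lamp intensity 0..1) for a planned stop."""
        lamp_delay = max(0.0, seconds_until_braking - BRAKE_LIGHT_LEAD_S)
        # Scale intensity (or the number of segments lit) with how hard the stop will be,
        # so trailing drivers can read the severity at a glance.
        intensity = min(1.0, planned_decel_g / 0.8)
        return lamp_delay, intensity

    print(brake_light_plan(2.0, 0.4))   # lamps on 1.5 s from now, at half intensity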


Google themselves have admitted that their so-called "self-driving cars" rely on hyper-precise mapping down to the millimeter (which would cost trillions of dollars if applied across the US)

Street View imagery gets updated from time to time. Google already mapped the entire US from scratch once and they update it incrementally, I doubt doing it again is that big a challenge.
posted by GuyZero at 3:09 PM on September 1, 2015 [4 favorites]


"Self-driving cars" are a pipe dream. Before they happen, significant and expensive changes will have to be made to the way American roads work.

I don't think that will be necessary (even though it is exactly the kind of crazy-scale problem that Google loves). Even if it is necessary now (and I don't believe it is), this is an area with a lot of activity and development, and it's progressing quickly. Give a robot car a bunch of cameras, laser scanners, radars and smart algorithms and eventually it will drive better and more safely than any human, with better situational awareness. Adding as much mapping data as you have makes the task easier, but it's not what makes it possible.
posted by jjwiseman at 3:31 PM on September 1, 2015 [1 favorite]


Also, let's be clear: Google themselves have admitted that their so-called "self-driving cars" rely on hyper-precise mapping down to the millimeter

That's simply not true. The entire reason why this is so difficult is that the cars need to recognize and adapt to rapidly-changing traffic and road conditions. Google's cars are currently able to navigate construction zones and recognize and correctly interpret handheld stop/slow signs.

If the car can do that, I don't really know why a millimeter-level basemap would be necessary (or even be a reasonable thing that one could create -- road conditions change way too rapidly for that sort of survey to remain valid for any reasonable length of time).
posted by schmod at 3:36 PM on September 1, 2015 [4 favorites]


My posting history will show me to be an unabashed advocate and enthusiast of self-driving cars.

That being said...

"If this is the case, these sorts of accidents should become rarer as human drivers become more accustomed to the idiosyncrasies of self-driving cars."

Muahahhahhaha... Hehehehheeh. Um, yeah, (a percentage of) human drivers will continue to be dangerous asshats regardless of whether they are sharing the road with humans or robots.
posted by el io at 3:38 PM on September 1, 2015


Also, let's be clear: Google themselves have admitted that their so-called "self-driving cars" rely on hyper-precise mapping down to the millimeter.

Yup, and they do this in real time (as schmod alludes to).
posted by el io at 3:39 PM on September 1, 2015 [1 favorite]


I'm not seeing where you got the 500,000 number from in the report. I'm probably just missing something. Where are you seeing that or how are you calculating that?

rustcrumb beat me to replicating the calculation above. These sort of standard collision frequencies are at the population level, so the whole allocation thing is beside the point; if I drive a million miles without problems and you drive a million miles without problems and then we crash into each other, it's 2 million miles, one collision, so one collision per 2 million Vehicle Miles Travelled (VMT).

Anyway. I don't believe Google is low-balling its accident figures, because why would it?

Historical reporting on this issue has had Google at 300,000 miles, then 700,000 collision-free miles. You tell me why a company developing a technology would have any interest at all in understating the danger their technology poses, particularly as they lobby for laws expanding its use?
posted by Homeboy Trouble at 3:51 PM on September 1, 2015


If you haven't listened to it yet, 99% Invisible recently did an episode on self-driving cars, which was far more nuanced than any other reporting that I've heard on the subject.

It's fantastic. Go listen to it.

They also interviewed a few other AI experts who have been working on this sort of thing for a very long time, and noted that the far more likely scenario is that automation will be gradually phased in over time, and will initially most likely focus on telling a human driver how to drive the car, rather than doing the driving by itself.

This was a very percipient segue from 99% Invisible's previous episode about aircraft automation, which noted that automation works best when it works in conjunction with a human operator, rather than in lieu of one. When highly-automated systems degrade, the experience is often confusing and dangerous to the operator, even in cases like Air France 447, where the cause of the degradation and remediation should have been extremely obvious (loss of airspeed indication in the case of AF447).

In the case of automated cars, I also thought it was interesting that 99% Invisible noted that one of Google's competitors has gone to some fairly extreme lengths to ensure that the operators of its autonomous vehicles are paying extremely close attention to the road while the computer "drives," going as far as to implement eye and hand-tracking devices (and also observed that some of Google's employees seemed to become very distracted while the automated car did its thing).

Even in the absence of automation technologies, something like this eye-tracking technology could go a long way toward improving automotive safety.

The conclusions that 99% Invisible drew are hard to argue with. We're not going to have a quantum leap that leads us to self-driving cars. We're going to automate the easy cases first, and we're going to introduce a number of assistive/safety features along the way -- this is already happening. After that, similar to the evolution of aircraft automation, we're going to progressively introduce more automation that largely focuses on allowing a human operator to make informed decisions about how to operate their vehicle. We'll need to make a conscious effort to avoid the paradox of automation, and, like Airbus, despite these efforts, we're probably going to create some dangerous edge cases that future drivers will have difficulty dealing with.

After that, maybe we'll gradually cut the driver out entirely. Maybe. We haven't managed to do that for aircraft yet, even though the technology to do it exists. It's unclear if or when we'll be able to make that leap for cars.
posted by schmod at 3:58 PM on September 1, 2015 [5 favorites]


Schmod, not so sure about automation being supervised by humans. I think when autonomous vehicles arrive, they will have to be fully autonomous.

With aircraft, "supervised automation" works because when something goes wrong with the plane, even if it's something totally catastrophic like the tail falling off or all the engines failing at once, it's going to take a few minutes for the plane to fall all the way to the ground. In any emergency, there's time for the human operators to wake themselves out of their nap, look up from their iPad or whatever, and respond to the emergency.

In a car, problems happen way too fast. The car guidance system can't just hand the controls over to the human driver when it goes into a skid on a patch of ice, or detects an obstacle on the highway while driving at 75mph. The time scale is too short. Even people who are fully in control of their vehicle often fail to respond correctly in these scenarios. Asking them to monitor the operation of the driving system for hours at a time is a non-starter; people are going to get bored and inattentive. People get bored and inattentive even when they are driving manually, and it would be even worse to be passively watching the car drive itself.
posted by rustcrumb at 4:14 PM on September 1, 2015 [5 favorites]


schmod, I forgot about that pair of 99% Invisible episodes, and you're right, they're great.

One thing that I think is different about the role of automation in aircraft and ground vehicles is that we have very different tolerances for risk in each. Commercial aviation is so safe that human factors related to automation, or human skills atrophied by automation, have possibly become a significant contributing factor to the total number of accidents, injuries and fatalities (a point I think is made in that podcast episode). Driving, on the other hand, is so much more dangerous that, for a while at least, automation may be almost pure upside in terms of safety.

(A book I like on this topic is Digital Apollo. One of the points it makes is that in aviation & aerospace, automation became almost inescapable in the 60s. With Apollo specifically, even though astronauts talked about manually joysticking a landing or whatever, that simply wasn't possible in absolute terms. Autonomy is a spectrum, but the 60s are when we passed the point where a computer is always mediating and making decisions based on the inputs from the human.)
posted by jjwiseman at 4:15 PM on September 1, 2015 [1 favorite]


Historical reporting on this issue has had Google at 300,000 miles, then 700,000 collision free miles.

I don't understand your objection here; 300K miles in 2012, 700K miles in 2014, more than 2M miles in 2015. Do you think there's something wrong with those numbers?
posted by jjwiseman at 4:25 PM on September 1, 2015


On the subject of what drivers can learn from the pilot experience with respect to autonomous navigation, there was an interesting article in Quartz recently pointing out that despite the fact that the autopilot flies the thing 85% of the time, human pilots still tend to get 100% of the blame when something goes wrong. So we'll still be necessary as moral crumple zones for insurance purposes. Hurrah.
posted by Diablevert at 4:28 PM on September 1, 2015 [1 favorite]


schmod: "Google's cars are currently able to navigate construction zones and recognize and correctly interpret handheld stop/slow signs."

This is not actually true. See the article Cosine posted above. Construction sites and handheld signs are miles beyond what these cars are capable of, which is why Google has avoided them strenuously when testing their cars. They have made some inroads, yes - they have hopes that their cars can recognize "almost all" temporary signs, for example - but the main problem here is that Google's system is not designed to be an adaptive system at its core; so when it is forced to be adaptive, the car has to slow down dramatically. Which is - yes - probably why there are so many people rear-ending them.

And this is in areas where Google has done the obsessive mapping necessary for their system to work. A car driving outside that hyper-mapped area wouldn't even be able to get around at anything above 10 mph, if that.

This is what I was talking about: there are obstacles here that Google is not prepared to overcome, and Google knows it. The "self-driving car" project was only ever an exercise in public relations and in stretching mapping capabilities.

You know what we actually need? Better roads, and better public transportation. Those are the things we should be spending our time and energy working toward.
posted by koeselitz at 4:32 PM on September 1, 2015 [2 favorites]


rustcrumb: "Asking them to monitor the operation of the driving system for hours at a time is a non-starter, people are going to get bored and inattentive."

I might have buried it in my post, but that was almost exactly my point. (You said it better)

Until the automation is nearly 100% perfect, it's vastly preferable to implement a form of automation that provides humans with enhanced feedback about how they should be driving the car in cases where human-scale response/reflex times are adequate.

IMO, there probably aren't a huge number of accidents that could have been prevented with superhuman reflexes, particularly if we had systems that could warn drivers of a hazard (eg. a deer off in the woods) well in advance.

ABS is a good example of a system where this is not the case. I think that you'd need to analyze accident data to determine the root causes, and whether or not a fully-autonomous driver can actually do better than a human with better information could.

Unless the system can be trusted 100%, humans need to be kept in the loop. You're absolutely right that we cannot trust drivers to continuously monitor a system that operates fully autonomously 99% of the time, and have them be prepared to jump in for that other 1%.
posted by schmod at 4:36 PM on September 1, 2015 [2 favorites]


koeselitz: "This is not actually true. See the article Cosine posted above. Construction sites and handheld signs are miles beyond what these cars are capable of"

From the article you're mentioning:
Google’s cars can detect and respond to stop signs that aren’t on its map, a feature that was introduced to deal with temporary signs used at construction sites. But in a complex situation like at an unmapped four-way stop the car might fall back to slow, extra cautious driving to avoid making a mistake. Google says that its cars can identify almost all unmapped stop signs, and would remain safe if they miss a sign because the vehicles are always looking out for traffic, pedestrians and other obstacles.
The podcast I linked to (actually a Planet Money excerpt within that podcast) also discusses this. They (fairly IMO) mention that many of these situations are ones that human drivers also have difficulty dealing with.
posted by schmod at 4:39 PM on September 1, 2015 [4 favorites]


Yeah, that's exactly the excerpt I was referring to. Note that it doesn't say the cars handle construction sites successfully - only that they can handle temporary signs okay - and it suggests that they can't do unmapped 4-way stops very well.
posted by koeselitz at 4:47 PM on September 1, 2015


(Also, "almost all" unmapped stop signs is... Well, it's not exactly awesome.)
posted by koeselitz at 4:48 PM on September 1, 2015


What about a police officer directing traffic?
posted by Drinky Die at 4:51 PM on September 1, 2015


Exactly - to be clearer here:

If a "construction site" is a place where traffic just slows down a bit, or even where a temporary stop sign has been installed, it sounds like these cars will usually be fine, because the mapping stays pretty much the same, and the only thing the car has to respond to is a stop sign and slowed traffic, which it would respond to anyway. But where a "construction site" includes multiple lanes diverted using complicated signage and maybe a police officer directing traffic as DD says and the same human understanding that is used at impromptu four-way stops, things are not going to work so well. The further we go from the mapped road, the less likely the car is going to handle things effectively. And unfortunately the latter is an incredibly common occurrence in the United States.
posted by koeselitz at 4:53 PM on September 1, 2015


There's a vast gulf between highway driving and city driving - I suspect the initial automation is going to be the former, not the latter. Highway driving is a much simpler problem set - mostly a matter of cruising forward with the pack, changing lanes, and avoiding other fast-moving objects with the benefit of reactions thousands of times faster than any human.

Once you've reached the end of the highway portion of the trip, beep a few times starting a minute or two in advance, and then begin steadily slowing down/pulling over to "encourage" the human to begin driving. Let them handle the city stuff.

Next phase is the AI continues the city stuff, but begins slowing down/pulling over any time it runs into a situation it can't readily parse.

What about a police officer directing traffic?
posted by Drinky Die


That is precisely the correct time for an automated system to gracefully bow out and say "some kind of fucking monkey thing, hell if I know. I'm gonna have a cigarette while you primates sort this shit out."
posted by Ryvar at 4:53 PM on September 1, 2015 [7 favorites]


That is precisely the correct time for an automated system to gracefully bow out and say "some kind of fucking monkey thing, hell if I know. I'm gonna have a cigarette while you primates sort this shit out."

That won't work if the person in the car has no license to operate the vehicle. I was yelled at for my insistence that for the foreseeable future a self-driving car would require someone at the wheel licensed to operate the thing.
posted by Justinian at 5:09 PM on September 1, 2015 [3 favorites]


Etrigan: “Which Cosine linked to after aramaic rightfully called out the half-remembered details deployed a year later in service of a barely related anecdote.”

'It doesn't matter that you turned out to be right all along – you didn't cite your source precisely right off the bat, so you lose this round!'


Let's check how "right all along" it was against the source:
Currently the Google cars cannot drive at night

No mention of this in the article.

at dawn, at dusk

One mention of this in a particular case: "The car’s video cameras detect the color of a traffic light; Urmson said his team is still working to prevent them from being blinded when the sun is directly behind a light."

in the rain

They haven't been tested in heavy rains, yes: "Among other unsolved problems, Google has yet to drive in snow, and Urmson says safety concerns preclude testing during heavy rains." But that's different from "cannot drive in the rain", if only by degree.

beside bicycle lanes

Zero mentions of "bicycle", "bike", "cycle"...

past driveways

Eh, maybe: "Google often leaves the impression that, as a Google executive once wrote, the cars can 'drive anywhere a car can legally drive.' However, that’s true only if intricate preparations have been made beforehand, with the car’s exact route, including driveways, extensively mapped."

in busy downtown streets...

No indication of this in the article.

So out of seven claims, four of them are partially supported by the article. Are you still so certain of how "right all along" that comment was?
posted by Etrigan at 5:12 PM on September 1, 2015 [4 favorites]


That won't work if the person in the car has no license to operate the vehicle. I was yelled at for my insistence that for the forseeable future a self-driving car would require someone at the wheel licensed to operate the thing.
posted by Justinian


Christ I hope it wasn't me doing the yelling because that's an incredibly stupid thing for someone to yell at you.

The point of automated driving isn't, for the first few decades at least, to handle all conveyance from point A to point B. It's to allow me to put my feet on the dashboard and work on a laptop while commuting to my job so The Company can squeeze an extra bit of productivity out of me.

Put differently: the point of automated driving IS, for the first few decades at least, to help the 1% exploit the 99% 1% more effectively for 99% of their commutes. The tricky 1% of the commute remaining is 100% the commuter's problem.
posted by Ryvar at 5:20 PM on September 1, 2015


Construction sites and handheld signs are miles beyond what these cars are capable of, which is why Google has avoided them strenuously when testing their cars.

In the Google video on this page, you see

0:30 - Car navigates a construction zone with signs and cones blocking a lane.
1:05 - Car detects a bicyclist's hand signal.

From 2014, "The First Look at How Google's Self-Driving Car Handles City Streets":
We stopped at a construction worker holding a temporary STOP sign and proceeded when he flipped it to SLOW — proof the car can read and respond to dynamic surroundings, making it less reliant on pre-programmed maps.
I can't find a video I saw this morning, of the car recognizing a police officer's "stop" and "go" hand signals.

The cars aren't done yet, and there are still some big questions, but they're not decades away from having the capabilities they need, and they're not snake oil and pure PR.

On the subject of "big questions", this recent RAND report "Using Future Internet Technologies to Strengthen Criminal Justice" has a short, interesting section on unmanned vehicles and law enforcement:
Imagine a law enforcement officer interacting with a vehicle that has sensors connected to the Internet. With the appropriate judicial clearances, an officer could ask the vehicle to identify its occupants and location histories. The officer then could use Semantic Web technologies to review the criminal histories and search for any outstanding warrants of the occupants across dozens or even hundreds of local, state, and federal repositories—even repositories that do not contain data in a traditional “compatible data format” (but that are semantically tagged). Or, if the vehicle is unmanned but capable of autonomous movement and in an undesirable location (for example, parked illegally or in the immediate vicinity of an emergency), an officer could direct the vehicle to move to a new location (with the vehicle’s intelligent agents recognizing “officer” and “directions to move”) and automatically notify its owner and occupants.
posted by jjwiseman at 5:26 PM on September 1, 2015 [1 favorite]


“They have to learn to be aggressive in the right amount, and the right amount depends on the culture.”

And that's just it. "The culture" considers tens of thousands of deaths a year to be an acceptable amount of collateral damage. If Google's project goes along with that, it's liable. If it doesn't, it will always be hamstrung.
posted by alexei at 5:32 PM on September 1, 2015


And that's just it. "The culture" considers tens of thousands of deaths a year to be an acceptable amount of collateral damage. If Google's project goes along with that, it's liable. If it doesn't, it will always be hamstrung.

I'd say it's the exact opposite. We consider the deaths acceptable because, as has been pointed out, 90-odd percent of accidents are caused by human error, and I drive fine, it's all the other assholes who don't know what they're doing.

You can see their strategy at the top of this very page --- they program the car to obey the law; if they get hit because the human driver in the other car expected them to bend the rules the way a human would, liability falls on the human driver of the other car. They were the one who broke the law. And Google will be able to prove it, to the millisecond, with nine different camera angles. You ever see those dudes who show up at traffic court with paperboard blow-ups diagramming the intersection and construction paper cutouts of the cars? Well, they ain't got nothin' on Chris Urmson.

Ultimately this all will come down to insurance, really. I think that's going to be the fascinating question. If automation has the same effect on the car accident rate that it did on the plane accident rate, and very few people even own their own cars, will the car insurance industry cease to exist? State Farm's starting to get a little freaked out about this. Then again, pretty much every American homeowner is forced to buy title insurance, and that's a completely useless boondoggle; maybe the same will happen to car insurance.

It's the transitional period that may determine which way the monkey jumps --- imagine a world in which, say, 50 percent of cars are autonomous, and every time there's an accident between a human-driven and an autonomous car, the insurance company lawyers show up to court to see the nice man from Google and his 3D holographic slo-mo projection and supplementary charts showing why the accident was very definitely, 100%, down to the millisecond, their client's fault. Will insurance companies start forcing their clients to install similar tracking software or else refuse to issue a policy on the car? They're already offering discounts for this. Will they merely charge usurious rates if you decline to be tracked? If the tracking software proves an accident was your fault, will you be able to get insurance on your next car? Will you have to buy a car with full auto-driving features, the way some drunks are forced to have breathalyzers installed on the ignition?
posted by Diablevert at 6:18 PM on September 1, 2015 [2 favorites]


That is precisely the correct time for an automated system to gracefully bow out and say "some kind of fucking monkey thing, hell if I know. I'm gonna have a cigarette while you primates sort this shit out."

Yeah, like, you can program the car to recognize hand signals but can it tell the difference between some kid flashing signals to fuck with it and a legitimate source? Remember, we are part of a species in which some of our fellow people point lasers at cockpits just to fuck with pilots.
posted by Drinky Die at 7:13 PM on September 1, 2015 [1 favorite]


As a cyclist all I can say is bring on the driverless cars.
posted by photoslob at 7:22 PM on September 1, 2015 [1 favorite]


but can it tell the difference between some kid flashing signals to fuck with it and a legitimate source?

Yeah, this is pretty much why the NHTSA specifically calls out the Google Car (and more recently, Freightliner Inspiration) as examples of what it calls Level 3 automation - occasional driver involvement within several seconds of handoff notification is expected, the aim of the system is more to handle the seriously-a-trained-dog-could-do-this parts of your commute / freight route.

Total automation (Level 4) would require Turing-level AI or something very like it for exactly the reason you're describing. Goddamn primates and their banana nonsense. If anyone wants to start a betting pool on whether we'll see Minority Report-style commuter pods on rails as the dominant form of transportation before then, well...I pretty much left the cognitive science undergrad program I was in the moment I realized anything approaching that grade of machine intelligence falls firmly into the "long after we're all dead" category.
posted by Ryvar at 8:00 PM on September 1, 2015 [3 favorites]


Metafilter: Goddamn primates and their banana nonsense.
posted by Drinky Die at 8:07 PM on September 1, 2015 [3 favorites]


I get a strong sense that the deep sceptics of driverless cars are a bit like those people saying "horseless carriages? They'll never catch on! Too noisy, too unreliable!". Sure, Google isn't just about to crack this, it won't be a big thing for a while yet, but I'd bet dollars to doughnuts that all or nearly all cars will be driverless while I'm still in need of personal vehicular services (I'm 43). I still expect my 9-year-old son to get a driver's licence; I don't expect him to spend the majority of his life in primary control of a vehicle.
posted by wilful at 8:21 PM on September 1, 2015 [3 favorites]


I think once we get 90+% of the way there and semi-automated cars become more common, there will be a big push to put in support infrastructure to do the last 10%. Things like requiring transponders in all licensed cars, road edge markers, etc.
posted by fings at 8:22 PM on September 1, 2015 [1 favorite]


Put differently: the point of automated driving IS, for the first few decades at least, to help the 1% exploit the 99% 1% more effectively for 99% of their commutes. The tricky 1% of the commute remaining is 100% the commuter's problem.

The other point is for the 1% to act more like the .01% and sip martinis in the back seat on the way to work without all of the costs of paying your own driver a living wage or disrupting the cab economy or whatever. As someone who enjoys being drunk a lot more than I enjoy driving I think you can reasonably guess my opinion on driverless cars.

And this goal is exactly why I think any situation where you need humans to be in control in extreme situations will result in, if anything, elevated risks of drunk driving. Someone who pays double for a car that can get him home from the bar isn't going to be able to take the wheel. Humans make poor decisions, and humans with an (incredibly common) addiction even more so.
posted by sandswipe at 8:37 PM on September 1, 2015 [3 favorites]


there will be a big push to put in support infrastructure to do the last 10%. Things like requiring transponders in all licensed cars, road edge markers, etc.

The unspoken implication there is that unlicensed cars will simply not be permitted on the roads. And cyclists, and people...

without all of the costs of paying your own driver a living wage or disrupting the cab economy or whatever

If and when driverless cars arrive, the cab economy won't just be disrupted-- it'll be unrecognizable.
posted by alexei at 9:37 PM on September 1, 2015


We don't even have a sustainable or even workable national transportation infrastructure, much less an effective intercity and interstate public transportation network, and we're dreaming about self-driving cars. This is nonsense; it's a transportation engineer in 1948 being asked about the decline of the railway system, and responding "yes, but flying cars"
posted by koeselitz at 10:03 PM on September 1, 2015 [2 favorites]


(And - I should say, yeah, we can do multiple things at once. But for every "Maybe we could fix public transit?" article I see, there are at least a dozen "OMFG CARS THAT DRIVE THEMSELVES!!!" articles. And the whole thing about the idea of self-driving cars is that they're really a tiny, insignificant step forward where a lot of this stuff is concerned. We, as American humans, are still selfishly really just hoping to be individually transported in luxury wherever we want to go, society and ecology be damned. We need to let go of this. As others have said, cars that drive themselves would change everything anyway. Maybe we could be ahead of the curve, and think non-selfishly about what we want that change to look like, for once in our lives? Instead of driving change forward based solely on our desire to see ourselves and our companies profit?)
posted by koeselitz at 10:09 PM on September 1, 2015


koeselitz: News is event driven. And local transit stories are not globally distributed and consumed. I live in a city that is in the midst of a multibillion dollar public transit infrastructure project; stories about this project dominate the local paper. And these stories aren't reported on much at all in the city an hour's drive from me. Because it doesn't affect those people.

The university here is also doing some research into some automated vehicle technology. I've seen a single story about this.

Which is a good focus for the local paper to have - write a lot of stories about the project that will impact everyone in the area and then write a small story about a research project the university is doing.

It makes sense that transportation projects (unless they fuck up on an epic scale - I'm looking at you, Boston dig, and Seattle shitstorm) will only be reported on locally, while technology advances or news are widely reported on globally.

Despite my continuing interest in self-driving cars, 95% of my travel is either walking or on public transit.
posted by el io at 10:45 PM on September 1, 2015


it's a transportation engineer in 1948 being asked about the decline of the railway system, and responding "yes, but flying cars"

Maybe we could be ahead of the curve, and think non-selfishly about what we want that change to look like, for once in our lives? Instead of driving change forward based solely on our desire to see ourselves and our companies profit?

koeselitz: I sympathize with what you're saying, but what you're proposing boils down to "yes, but flying pigs"

I am a utopian socialist at heart and yet I know damned well that my statements re: limits of AI above notwithstanding, I will be fucking chauffeured around by a contrite Skynet before people start behaving that way.
posted by Ryvar at 10:54 PM on September 1, 2015 [1 favorite]


Koeselitz, "we" indeed.
posted by wilful at 1:09 AM on September 2, 2015


Well, however we get there, I just can't wait for the driverless car to revitalize the car-centric suburb model that made America great. Got an awful, soul-crushing 2 hour commute to [BIG CITY]? No problem! Watch TV while the computer takes care of everything!
posted by indubitable at 5:24 AM on September 2, 2015 [1 favorite]


The estimates above that have the average accident rate for humans at one per 500,000 miles seem completely wrong to me. There are various organizations that give commercial truckers an award if they drive a million miles without an accident. See here for an example: a guy who drove 2.2 million miles over 32 years.

It doesn't seem at all likely to me that the average of all human drivers comes out to only slightly worse than the very best professional commercial truck drivers. I mean, 500,000 miles is a 20 mile roundtrip commute, on weekdays, for nearly a hundred years. (Or a 40 mile commute for fifty years, or a 100 mile commute for twenty years.) And only getting into a single fenderbender-or-worse accident in all those miles. Or to think of it another way, it's going through the ownership of five cars, putting 100,000 miles on each of them, with only a single accident amongst them all. The average of all drivers can't be anywhere near that low.
posted by XMLicious at 6:12 AM on September 2, 2015 [6 favorites]
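(The commute arithmetic checks out; a quick sketch assuming roughly 250 commuting days a year.)

    daily_roundtrip_miles = 20
    commute_days_per_year = 250
    miles_per_year = daily_roundtrip_miles * commute_days_per_year   # 5,000
    print(500_000 / miles_per_year)   # 100.0 years of commuting to cover 500,000 miles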


It's really weird to watch people claiming rational skepticism while doing the math backwards to count human-caused accidents against Google, not to mention failing to use the right metric (collisions per hour) or control for environment (rural interstates have more miles without collisions than the city driving Google has focused on).

I don't think this is just routine fear of change, however, but rather fear of continuity. Most stories like this will get a comment like indubitable's worrying that self-driving cars will revive the suburban lifestyle's appeal. That's a valid concern, although I think less powerful than many fear since the absolute time, health, and fuel costs will not change, but it's important enough to raise directly rather than hiding behind safety concerns. Trying to spin scant evidence into “these things are unsafe” is going to kill credibility a lot more effectively than the suburb.
posted by adamsc at 6:37 AM on September 2, 2015 [1 favorite]


How about instead of worrying about whether the suburbs have a resurgence we worry about whether people with mobility impairments can have some independence in their lives and access a mode of transportation that's affordable.

Paratransit in most major cities is terrible and cabs are expensive. Self-driving cars are going to remake the world for some people.
posted by GuyZero at 7:51 AM on September 2, 2015 [6 favorites]


As long as we're throwing around anecdotal evidence, I was rear-ended 3 times in one month at the same intersection. It was a right turn lane that had a stop sign. Each time I was hit, I would stop at the stop sign and the person behind me would smack into my bumper because they were used to rolling through the stop.

I finally had to ignore the stop sign myself to avoid being hit. A self-driving car doesn't have to be erratic to be rear-ended, it just has to be a stickler for stopping when it is supposed to.

Automation is coming much faster than people realize. You can already buy a car that: 1) adjusts the speed of your cruise control to match the car in front of you, 2) stops your car automatically if you are about to hit something, 3) warns when you are drifting out of your lane, 4) warns when another car is in your blind-spot, 5) can automatically parallel park your car.

Items 1 through 4 are already damn close to handling a highway route autonomously. I'll be shocked if there isn't a fully automatic cruise control offered for highway driving within the next five years.
posted by Eddie Mars at 7:54 AM on September 2, 2015


I've been half-joking for years that we should ban human drivers, but really, we should ban human drivers, and I'm thrilled to see any acknowledgement of the fact that humans are terrible fucking drivers.

If anything else on earth killed as many people as driving does, we would have bombed it back to the Stone Age by now.


I for one am super ready for when robot cars become scientifically sensible and cost efficient, and also "The Government wants to take our cars!!!" blowback that will certainly come with it. I wonder what will happen first in America, widespread robot driving or a serious level of gun control that would make other first world nations not cringe when they look at our gun situation.
posted by DynamiteToast at 1:01 PM on September 2, 2015


also "The Government wants to take our cars!!!"

Today, literally today, the government will prohibit some number of people from driving and a small fraction of those will be banned from driving for life.

So what is going to change exactly?
posted by GuyZero at 1:28 PM on September 2, 2015


I'm curious whether “The Government wants to take our cars!!!” will actually prove to be a significant barrier. I'm expecting the insurance companies to make most of the change happen as soon as it's a clear savings: starting with discounts and eventually penalties for manual operation. If that happens, the showdown between “drivers rights” activists calling for legislation to prevent that versus top-tier lobbyists will be interesting…
posted by adamsc at 3:29 PM on September 2, 2015


Every time we talk about self-driving cars I'm reminded of Inkoate's comment in the sideways-elevator thread:

Oh, thank god for Metafilter. I'm sure the designers hadn't thought of failsafes yet. We can only hope and pray that one of their engineers reads this page before a tragic accident occurs.
posted by ethand at 5:30 PM on September 2, 2015 [6 favorites]


Diablevert: "insurance company lawyers show up to court to see the nice man from Google and his 3D holographic slo-mo projection and supplementary charts showing why the accident was very definitely, 100%, down to the millisecond, their client's fault."

Should be interesting when self driving cars get moving violations. Google sure as heck isn't going to indemnify users against it.
posted by Mitheral at 9:59 AM on September 4, 2015


Should be interesting when self driving cars get moving violations.

Or, an even more hilarious scenario: Cops quit writing tickets and start writing bug reports.
posted by el io at 11:47 AM on September 4, 2015 [1 favorite]




This thread has been archived and is closed to new comments