Autonomous vehicles and fatal accidents
July 1, 2016 1:08 PM   Subscribe

On May 7th, 2016, Joshua Brown, 40, of Canton, Ohio, was driving his 2015 Model S with its Autopilot feature engaged in Williston, FL, when it collided with a tractor-trailer making a left turn. This marks the first known fatal accident involving an autonomous or semi-autonomous vehicle, and it is being investigated by NHTSA. Tesla declined to answer whether it will disable Autopilot, noting that “...[t]his is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles.” In light of recent accidents (Wired, Geekwire), who's to blame when a self-driving car crashes? And how should autonomous vehicles respond to an impending collision (Science, open access)?

From the Science article:
We found that participants in six Amazon Mechanical Turk studies approved of utilitarian AVs (that is, AVs that sacrifice their passengers for the greater good) and would like others to buy them, but they would themselves prefer to ride in AVs that protect their passengers at all costs.
Writer Steve Hanley predicted a fatal accident involving a self-driving car, based on statistics and on Elon Musk’s own assessment that autonomous vehicles are twice as safe as human drivers. Writers for the National Law Review argue that some liability may reside with the automaker, but that automakers do not need to be shielded from lawsuits.

Autonomous vehicles previously, previously, previously, previously, previously, and more.
posted by Existential Dread (126 comments total) 20 users marked this as a favorite
 
""...[t]his is the first known fatality in just over 130 million miles where Autopilot was activated."

I'm uncomfortable at the fact that they collect/retain that kind of data to begin with.
posted by schmod at 1:11 PM on July 1, 2016 [1 favorite]


Owners & lessees are asked to send data to Tesla when they buy the car.
posted by Monochrome at 1:12 PM on July 1, 2016 [2 favorites]


Does that mean that the car tracks and reports your movements too?
posted by Zedcaster at 1:14 PM on July 1, 2016


I wrote this in the 6 Principles thread regarding Tesla's statement on the crash:

Interesting description of the liability avoidance measures of the Tesla AutoPilot system:
When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” and that “you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver's hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.
So the cars have the ability to verify that the operator's hands are on the wheel, and the feature requires you to acknowledge that you will keep your hands on the wheel. Yet Tesla doesn't keep a running check that you are keeping your hands on the wheel, instead only checking at unspecified intervals. Why enable behaviour (with additional programming to boot, for the timer) that allows incorrect use of your system? Maybe because YouTube videos showing drivers with their hands on the wheel won't garner millions of hits?
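
(A minimal sketch, purely to illustrate the distinction being complained about here: a running hands-on check with escalating responses versus an occasional one. The sensor, alert, and drivetrain interfaces and all the thresholds are hypothetical, not Tesla's actual software.)

```python
import time

# Assumed thresholds, purely for illustration.
HANDS_OFF_ALERT_S = 5       # warn after 5 s without hands detected
HANDS_OFF_SLOWDOWN_S = 15   # begin gradually slowing after 15 s

def monitor_hands(torque_sensor, alerts, drivetrain):
    """Continuously poll for hands-on, escalating as hands-off time grows."""
    hands_off_since = None
    while True:
        if torque_sensor.hands_detected():
            hands_off_since = None
            alerts.clear()
        else:
            if hands_off_since is None:
                hands_off_since = time.monotonic()
            elapsed = time.monotonic() - hands_off_since
            if elapsed > HANDS_OFF_SLOWDOWN_S:
                drivetrain.reduce_speed_gradually()   # per Tesla's description above
            elif elapsed > HANDS_OFF_ALERT_S:
                alerts.visual_and_audible()           # per Tesla's description above
        time.sleep(0.1)   # a running check, not an occasional one
```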
posted by Mitheral at 1:16 PM on July 1, 2016 [10 favorites]


I think there's a comparison to be made to general aviation, where manufacturers are not shielded from lawsuits. In fact, the 1980s were so bad for lawsuits against small plane manufacturers that it almost completely killed the GA industry. Even now, a fatal crash just about guarantees a slew of lawsuits against the airframer, avionics manufacturer, maintenance shops, aircraft owner, and anyone else remotely related to the purchase and upkeep of the aircraft.

I'm not saying that autonomous vehicle manufacturers need protection from lawsuits, but it did take an act of Congress to revitalize GA after the 1980s to limit liability and reduce lawsuit action and costs. We'll probably see much more civil legal action (probably resulting in the destruction of many companies) before the autonomous vehicle market stabilizes.
posted by backseatpilot at 1:22 PM on July 1, 2016 [10 favorites]


In the past, Elon Musk, the Tesla chief executive, has praised the company’s self-driving feature, introduced in the Model S last fall, as “probably better than a person right now.”

But in its statement on Thursday, the company cautioned that it was still only a test feature and noted that its use “requires explicit acknowledgment that the system is new technology.”

It noted that when a driver activated the system, an acknowledgment box popped up, explaining that the autopilot mode “is an assist feature that requires you to keep your hands on the steering wheel at all times.”
Hi! Please click this box to acknowledge that you are an unpaid beta tester for a technology whose malfunction is likely to kill you! Thanks!
posted by indubitable at 1:23 PM on July 1, 2016 [32 favorites]


I think it's interesting that the vehicle didn't see the truck with its various systems (and apparently neither did the driver, as the brakes weren't applied either autonomously or manually).

Even in the case where the driver does not see it (or is not paying attention), I think this indicates a significant gap in the vehicle's sensor package. Surely a radar would have detected it?
posted by chimaera at 1:23 PM on July 1, 2016 [7 favorites]


There's some reports that the driver was watching a DVD at the time of the crash. Kind of the definition of contributory negligence.
posted by T.D. Strange at 1:23 PM on July 1, 2016 [11 favorites]


Also, "Who's to blame" is a terrible question to be asking.

Aviation has an excellent track record, largely because the issue of blame/liability takes a back-seat to identifying root-causes, and determining remediations to prevent similar accidents from recurring.

Simply put, "Human error" isn't an acceptable finding in an accident report. No pilot wants to die, and no airline wants its planes to crash. Figure out why the error was made, and make changes to prevent it in the future (including sweeping and systemic reforms if necessary).

The fact that commercial aviation is as safe as it is is a massive triumph of science, and it's insane that we don't apply the same methodologies to automotive travel.

Not only have we internalized and accepted the fact that cars are insanely lethal, but when things do go wrong, we seem determined to throw somebody in prison instead of working to correct the systemic issues that led to the casualty.

My $0.02 is that the current generation of automation places us in a very hazardous middle-ground, where drivers need to pay full attention, but only in a small subset of cases that the automatic systems cannot handle. This is particularly alarming, because drivers won't necessarily be accustomed to making manual inputs, won't be aware of all of the scenarios where the automatic systems won't work properly, and can easily suffer from mode-confusion when those systems disengage or degrade.

Tesla's autopilot feature has the potential to significantly improve safety, but the current implementation and marketing of the feature are grossly irresponsible, and may very well lead to the opposite happening.

Pointing fingers does not make us safer.

We need to use science to determine the parameters that allow vehicle automation to improve safety, and hold manufacturers accountable for manufacturing vehicles that only operate inside of those parameters.
posted by schmod at 1:24 PM on July 1, 2016 [114 favorites]


These numbers:
".[t]his is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles."
are obviously not comparing the same thing because AutoPilot can only be engaged on non-residential streets with a centre divider. IE: pretty well the safest driving people already do and certainly much safer than the all driving average. Really I wouldn't be surprised if 1 death in 130 million miles on divided, non-residential streets is worse than humans.
posted by Mitheral at 1:25 PM on July 1, 2016 [24 favorites]


Yeah, this is my biggest problem with autopilot. Cameras are fine, but should not be the only sensor on the car that can see more than 25 feet or so.

That said, wtf truck driver? Did you have as much trouble seeing the Tesla as it did you?
posted by wierdo at 1:27 PM on July 1, 2016 [3 favorites]


contributory negligence

In this case, it's possibly applicable (but, again, Tesla seem to be marketing the system for exactly this kind of use), but in general, contributory negligence is often horribly abused, sometimes making it impossible for a victim who is 1% negligent to recover any damages.

Some of these laws probably seemed like a good idea at the time, because they kept most auto insurance claims out of the courts (and things generally balanced out evenly for the insurance companies), but the general concept is indistinguishable from victim-blaming.
posted by schmod at 1:27 PM on July 1, 2016 [5 favorites]


Wouldn't this accident have been considered the truck driver's fault? So really, the autopilot didn't so much cause an accident as fail to avoid an avoidable one.

Also, let's not ignore the chronic problems in the trucking industry that probably contributed to the deficiencies on that end (overwork, sleep deprivation, drugs, high turnover, etc.)
posted by Mitrovarr at 1:31 PM on July 1, 2016 [9 favorites]


I agree with schmod. The "Autopilot" feature as it currently exists is named misleadingly and encourages over-confidence. This is Level 2 automation at best, and people are acting like we're already at Level 3 or 4.

In this situation, there's no indication that the system actually caused the accident (i.e. by steering into opposing traffic), though I'm sure that's inevitable at some point as well—after all, non-autonomous cars are involved in over 100 fatal accidents every day in the U.S.

The system simply failed to prevent an accident. The question is whether the presence of the system encourages drivers to pay less attention than they otherwise would. This may be discouraged officially, but unofficially kind of seems like the whole point of having something called Autopilot.
posted by designbot at 1:42 PM on July 1, 2016 [5 favorites]


As a point of fact, lawsuits have essentially killed the general aviation industry. It required the General Aviation Revitalization Act to get anyone to even consider staying in the game. And the industry is now reduced to a fraction of the size it would otherwise be, because lawsuits remain an all-too-routine part of any accident aftermath.

It isn't at all clear that lawsuits have in any way improved anyone's safety. And this is a similar moral situation: if self-driving cars are twice as safe as human drivers, there will still be an enormous number of fatalities, even though that is better than the alternative of maintaining human control.

Right now there is enormous potential for hard cases to make bad law.
posted by meinvt at 1:44 PM on July 1, 2016 [9 favorites]


[These numbers] are obviously not comparing the same thing because AutoPilot can only be engaged on non-residential streets with a centre divider. IE: pretty well the safest driving people already do and certainly much safer than the all driving average. Really I wouldn't be surprised if 1 death in 130 million miles on divided, non-residential streets is worse than humans.

Not that simple - highway driving involves much higher speeds, and even if accidents per mile are lower, fatalities per accident are likely to be higher. Anyone have actual data?
posted by kleinsteradikaleminderheit at 1:45 PM on July 1, 2016 [5 favorites]


If the tractor trailer was turning left, it seems like the Tesla had the right of way. Can they tell if this was avoidable by a human driver in the Tesla's position?
posted by advicepig at 1:54 PM on July 1, 2016 [2 favorites]


I'll take my chances on a road filled with "autoautos" every time rather than on a road where ten percent of the drivers are busy talking, texting and day dreaming.
posted by notreally at 1:56 PM on July 1, 2016 [8 favorites]


I would say Tesla's Autopilot is a quite advanced level 2 autonomous vehicle, but that is exactly the wrong level of autonomy. It is autonomous enough that it seems to the user like it is fully autonomous (despite "Text text text text answer yes to do cool stuff: Yes/No"), yet only autonomous enough to still demand that the user be ready to take over at any moment. This, of course, leads to the user becoming distracted and not paying attention to the road, while the vehicle drives along until it gets itself into a situation it can't handle.

I am all for active safety systems in cars and for autonomous vehicles, but you can't gradually develop the former until it becomes the latter. If the vehicle allows the driver to not pay attention, it should allow the driver to never pay attention and still act as a better-than-average driver.
posted by Stood far back when the gravitas was handed out at 1:56 PM on July 1, 2016 [15 favorites]


I was so pissed off at Tesla's framing of the accident I wrote a blog post about it. It is completely disingenuous for them to claim the driver didn't see the truck. They have no idea what the driver saw; the driver was decapitated. They say they think the driver didn't apply the brakes, but that could mean all sorts of things. (It's also framing to call the victim "the driver", when apparently he wasn't driving at all in the moments before his death.)

Note also this accident happened two months ago. My guess is Tesla waited until the last possible moment to disclose it, when the NHTSA report was about to be announced. Gave them plenty of time to get their story straight.

The frustrating thing is I'm a fan of self-driving cars, I think in the near term they will be much, much safer than human-driven cars. But this current autopilot "driver assist" is dangerous; once this much driving is automated you can't rely on the driver to take over in emergency situations. I've experienced this phenomenon frequently when flying airplanes with autopilot; it's very difficult to get "back into the plane" after 30 minutes of zoning out in straight and level flight. I experience it every time I drive on cruise control and need to quickly slow down because some yahoo pulled in front of me. It takes longer than if I'd had my foot on the accelerator all the time. It's an acceptable risk IMHO but it is a risk. A measurable one.

The solution to all this is fully autonomous cars. Ideally without driver controls at all so that the human can't leap in and do something stupid.
posted by Nelson at 2:01 PM on July 1, 2016 [25 favorites]


Mitheral, that's an interesting point, and I'm having trouble finding current data for interstate travel vs other travel. It clearly exists, because I am seeing various news stories saying such and such state has a high interstate fatality rate, but the various DOT sources I can find aren't breaking it out by type of road. I found an old table, but that's from 20 years ago, and there have been considerable car safety improvements since, so I don't know if it's close to what it is today.

From the chart, it looks like a) the truck driver was primarily at fault, because you aren't supposed to make a left turn unless you can clear traffic, and b) it was a pretty horrible autopilot failure. It's not like the truck had just barely moved into range; the sensors should have had several seconds to apply the brakes. So maybe not Tesla's fault per se, but a pretty huge hit on their autopilot being consumer-safe. There are all sorts of scenarios where the truck could have been across the travel lane without it being the truck driver's fault -- jackknifed trying to avoid another vehicle, hit by another vehicle and gone out of control, etc.
posted by tavella at 2:03 PM on July 1, 2016


We found that participants in six Amazon Mechanical Turk studies approved of utilitarian AVs (that is, AVs that sacrifice their passengers for the greater good) and would like others to buy them, but they would themselves prefer to ride in AVs that protect their passengers at all costs.

Randian automobiles were prototyped but were rejected when they refused to expend any energy hauling meaty looters around.

ABS is ABS. No contradiction.
posted by delfin at 2:06 PM on July 1, 2016 [5 favorites]


Not that simple - highway driving involves much higher speeds, and even if accidents per mile are lower, fatalities per accident are likely to be higher. Anyone have actual data?

Here is a relevant document which states (on pg. 165) that over half of fatal crashes occur on arterial roads. But according to this website, far more miles are traveled by the average person on arterial roads than on non-arterial roads. On balance, the fatality rate per mile for vehicles driving autonomously seems comparable to that for vehicles driven by a human. (I am assuming that we can equate arterial roads with the roads on which AutoPilot can be activated.)
posted by Abelian Grape at 2:11 PM on July 1, 2016 [2 favorites]


This case was both inevitable and, because of the loss of life, extremely sad. I will say that I was pretty surprised that this feature is standard on Teslas -- I thought that all autonomous driving (beyond, say, predictive braking to avoid rear-ending people) was still happening under test conditions.

Note also this accident happened two months ago. My guess is Tesla waited until the last possible moment to disclose it, when the NHTSA report was about to be announced. Gave them plenty of time to get their story straight.

If, as Tesla stated, they reported this to the NHTSA immediately, I see no problem with them waiting before going public. Partly out of respect for the family of the deceased, and also because it helps make a better discussion if more of the facts are known.

Regardless of who had the right of way, or whether the car or the truck was at fault or what the victim was doing when this happened, it's clear that Tesla's sensor technology has a literal blind spot. Maybe it was something that they had predicted in the past but thought too unlikely to ever happen, or maybe this was just a worst case scenario that slipped through their validation testing. The best thing that Tesla can do now is figure out how this case was missed, and be transparent with the NHTSA about their findings and how they intend to fix it.
posted by sparklemotion at 2:12 PM on July 1, 2016 [5 favorites]


Really I wouldn't be surprised if 1 death in 130 million miles on divided, non-residential streets is worse than humans.

The rate (~10⁻⁹ per mile) is low enough that this is really not a big enough sample size to tell.
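
To put numbers on that: an exact Poisson confidence interval around a single observed event is enormous. A rough sketch, assuming the standard chi-square (Garwood) interval and the mileage figures quoted upthread:

```python
from scipy.stats import chi2

observed = 1      # fatalities observed with Autopilot engaged (per Tesla)
miles = 130e6     # miles driven with Autopilot engaged (per Tesla)

# Exact 95% confidence interval for the expected number of events,
# given that we observed exactly one.
lower = chi2.ppf(0.025, 2 * observed) / 2
upper = chi2.ppf(0.975, 2 * (observed + 1)) / 2

print(f"95% CI, fatalities per mile: {lower / miles:.1e} to {upper / miles:.1e}")
print(f"US average for comparison:   {1 / 94e6:.1e}")
# The interval runs from roughly 2e-10 to 4e-8 per mile, comfortably
# containing the 1-per-94-million-mile benchmark: no conclusion either way.
```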
posted by en forme de poire at 2:13 PM on July 1, 2016 [10 favorites]


all this, because driving in Florida is really boring
posted by eustatic at 2:23 PM on July 1, 2016 [1 favorite]


Agreed that it was foolish of Tesla to make a claim about what the driver saw, based I guess purely on inference from brake pedal activity. That really hurts their credibility.
posted by chinston at 2:37 PM on July 1, 2016


Zedcaster: "Does that mean that the car tracks and reports your movements too?"

Welcome to surveillance capitalism.
posted by boo_radley at 2:38 PM on July 1, 2016 [4 favorites]


Some reports say this dude was watching a Harry Potter movie on his iPad. Bad way to go.
posted by Coda Tronca at 2:40 PM on July 1, 2016 [1 favorite]


Agreed that it was foolish of Tesla to make a claim about what the driver saw, based I guess purely on inference from brake pedal activity.
I believe all Tesla said was, "Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied." Just curious, why do you feel it was foolish to note that?
posted by ArmandoAkimbo at 2:45 PM on July 1, 2016 [1 favorite]


Right, either the driver didn't see the trailer or he decided to drive under it. I think the first is a reasonable assumption.
posted by Justinian at 2:47 PM on July 1, 2016 [1 favorite]


I had trouble with the wording because it implies the truck was unseeable. The driver not seeing it was probably the driver not paying attention.
posted by Mitheral at 2:47 PM on July 1, 2016 [4 favorites]


Perhaps the driver did notice, and was filled with abject terror, frantically scrambling to uncross his ankles and stab at the brake pedal, but couldn't manage that while traveling at 70mph.

They can say that the driver did not apply the brake, but they cannot say what the driver noticed. They've presented a comforting scenario that happens to be the best one for them.
posted by pwinn at 2:48 PM on July 1, 2016 [23 favorites]


Wouldn't this accident have been considered the truck driver's fault? So really, the autopilot didn't so much cause an accident as fail to avoid an avoidable one.

From a legal-responsibility point of view, I imagine you're right. But if we're examining the situation in hopes of preventing a repeat, there's a lot more to look at. I think there are at least three major factors to consider here.

1. Unless we reach a point where even pedestrian movements are controlled, other road users are always going to make dangerous mistakes. Dealing with them as well as possible is part of everyone's responsibility as a driver (or cyclist, or pedestrian). I don't know of any statistics around accidental and deliberate risky behaviors, but I don't get the sense that they're improving: even though my city is fairly uncrowded and my commute is much shorter than it used to be, I still encounter "Jesus! Are you trying to kill us both?" situations at least weekly when driving, and much more often when on foot. Improving driver training, teaching techniques to maintain focus, and enforcement against distracted driving and failure to yield could help, but I don't see any near-term opportunity for radically reducing the frequency with which any given road user needs to deal with another road user's dumb-ass stunt (like turning left in front of traffic).

2. This automated car's ability to detect and deal with a dangerous situation is much, much worse than I'd thought, and I was already damned skeptical. It overlooked a semi-trailer because it was white and in front of a bright sky!? How in heaven's name can it possibly hope to deal with stuff that's actually hard, like pedestrians in dark clothes, white vehicles in heavy snow, cyclists ahead of you when driving into a low sun, or distinguishing between a moderate-sized animal and a small child in situations where panic-braking is dangerous? This situation can and must be improved, by a lot, but the fact that they allowed people to use a system that could make a mistake this bad is deeply disturbing.

3. The human driver didn't take preventative action either, and may have been massively distracted. Not any kind of a surprise, unfortunately: distracted driving is a huge problem even for unassisted human drivers. In response to this situation the manufacturer seems to claim that the human driver should always be monitoring the situation as closely as if the automation didn't exist, and ready to react as quickly as if they were in control. Unfortunately, both their marketing and their interface design convey a very strong message that this isn't necessary.

Given what we now know about the quality of the sensors, this level of focus seems like it really would be the only way the system could be used safely. Unfortunately, it's basically impossible. The human brain is really bad at dealing with low-frequency, high-significance events. Monitor an automated system that's almost always right for even a few weeks, and your brain will come to trust it and let your attention wander almost no matter what you do. Watch it successfully handle a couple of emergent situations and it doesn't matter what you consciously tell yourself about the possibility of failure; behavioral you has been trained to trust the system and wants you to focus on something else.

I don't think there's any practical way to solve this problem at all, until the automation gets so good that it never needs humans to take over. Clearly it's nowhere near that good yet. Even in airplane cockpits, where everything people can think of is done to make this "humans as backup to an automatic system" process work, and where the needed response times are usually not in the 0-3 second range like on the road, we see really notable failures. The idea that the average driver is going to be able to manage an "almost good enough" autopilot safely is absurd. Even real pilots can't, and they get explicit training on keeping their focus, recurring training in simulated failures, a strong and explicit culture of caution, an assistant to spot what they don't, and usually much, much more time to respond.

Automated cars will be great once they're good enough to drive by themselves in every emergency. Obviously they're nowhere near that good yet. But because real-world human drivers have a very limited ability to backstop automation, for all practical purposes they're already "flying solo" and will continue to do so.
posted by CHoldredge at 3:03 PM on July 1, 2016 [21 favorites]


Tesla's quote about the driver noticing the truck is deliberately written by a masterful PR person. It's not some accidental sentence that just happens to be misinterpreted; it is constructed to make the reader think the driver didn't see the truck. Overtly they are making an argument that the accident isn't really the autopilot's fault: the dumb driver screwed up. Subliminally this is also reassuring, that the driver never saw the death that was coming.

But Tesla has zero evidence about what the driver saw. For evidence, Tesla says their telemetry says the "brake was not applied". Let's accept that at face value, and the victim never touched the brake. How do we jump from that fact to knowing what the driver saw?

It's also weirdly specific to call out the "white side of the tractor trailer against a brightly lit sky", as if that were a well known blind spot in human vision systems. It most certainly isn't.
posted by Nelson at 3:05 PM on July 1, 2016 [36 favorites]


Tesla's autopilot is a system licensed from Mobileye. They put out a statement that, among other things, says their vision system is not capable of avoiding this kind of collision.
posted by Nelson at 3:06 PM on July 1, 2016 [11 favorites]


Not that simple - highway driving involves much higher speeds, and even if accidents per mile are lower, fatalities per accident are likely to be higher. Anyone have actual data?

Using VMT (vehicle miles travelled) for 2010 from the 2013 FHWA Conditions and Performance report (exhibit 2-8) and 2012 fatality rates from the NHTSA Traffic Safety Facts report [PDF] (table 108) yields the following results (yes the years are different, but the rates are depressingly constant):
Facility type              Fatalities   VMT (x10^6)   Million vehicle miles/fatality
Rural interstate               1,814      246,109            135.7 
Urban interstate               2,160      482,726            223.5 
Other freeway/expressway       1,137      241,505            212.4 
Other principal arterial       8,582      666,714             77.7 
Minor arterial                 6,488      529,355             81.6 
Collector                      6,425      412,386             64.2 
Local                          6,626      406,301             61.3 

All roads                     33,232    2,985,096             89.8 
Freeways only                  5,111      970,340            189.9 
Freeways and princip. art.    13,693    1,637,054            119.6 
Minor arterial roads don't usually have centre dividers, while major arterials may or may not. Freeways and interstates do have either dividers or wide medians. So the freeway and principal arterial numbers seem the best to me as a comparison point.
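
For anyone who wants to check or extend the arithmetic, a short sketch reproducing the miles-per-fatality column above from the raw figures quoted in this comment (the aggregate rows are just the same division over summed columns):

```python
# (fatalities, VMT in millions of miles), from the FHWA/NHTSA sources cited above
data = {
    "Rural interstate":         (1814, 246_109),
    "Urban interstate":         (2160, 482_726),
    "Other freeway/expressway": (1137, 241_505),
    "Other principal arterial": (8582, 666_714),
    "Minor arterial":           (6488, 529_355),
    "Collector":                (6425, 412_386),
    "Local":                    (6626, 406_301),
}

for facility, (fatalities, vmt_millions) in data.items():
    print(f"{facility:26s} {vmt_millions / fatalities:6.1f} million miles per fatality")

# Aggregate comparison point used above: freeways only
freeways = ["Rural interstate", "Urban interstate", "Other freeway/expressway"]
fat = sum(data[k][0] for k in freeways)
vmt = sum(data[k][1] for k in freeways)
print(f"Freeways only: {vmt / fat:.1f} million miles per fatality "
      f"vs. Tesla's quoted 130 million on Autopilot")
```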

While it's not broken out in the sources I used (except for freeways), rural roads have much higher fatality rates, I suspect largely due to much longer delays in receiving medical treatment, in addition to differences in fleet mix (more trucks) and collision/driving types. I would note that the typical Tesla driver is much more likely to be an urban driver, for socioeconomic reasons as well as the basic fact that they are driving a range-limited vehicle.

Further, Teslas are sold globally and I assume autopilot is available elsewhere. The US is generally on the high end of road death rates - Norway, reputedly Tesla's #2 market, has a fatality rate about 62% of the US rate, and other European markets with presumably high Tesla usage (based on supercharger rates) such as Sweden, Denmark, Austria and Switzerland are all closer to Norway's fatality rate than the US.

So it would seem to me that Teslas on autopilot are not safer than humans, and indeed, based on current evidence, are likely to be more dangerous.
posted by Homeboy Trouble at 3:15 PM on July 1, 2016 [10 favorites]


I don't think you can possibly talk about the "rate" of safety from a single data point. There's nowhere near enough data on autopilot to compare to the gigantic piles of data on human piloting.

I also don't see any evidence (or really any way to prove) that the accident would not have occurred with the human driving. If he had been looking at his cellphone at that moment, it seems like the accident would still have happened, right? And I see people texting or looking at their cellphones at freeway speed all day every day here.

That's not to say it couldn't be some specific issue with Autopilot, but generalizing from one accident with an incomplete investigation to talk about the rate of fatalities is not valid.

All that said, I'll still be happier when we can move to fully-automated vehicles. It is true that partial automation has unique dangers, in that it will make people more likely to not pay attention and thus be unable to react if the car suddenly says "hey, you need to handle this NOW!".
posted by thefoxgod at 3:28 PM on July 1, 2016 [2 favorites]


Ok the first person dying does not mean that everybody has been dying so much more than the other people that died the other way.
posted by oceanjesse at 3:37 PM on July 1, 2016 [3 favorites]


I own a Tesla with autopilot. Not all of Tesla's vehicles have the feature-- only those equipped with the tech package, which is an option when you purchase the car.

The autopilot features have been rolled out over the course of the past year, and have been getting steadily better over time. That said, I have been astonished by various reports of people falling asleep, watching TV, or assorted other activities while the car drives itself. Autopilot is not at all ready to be used in that fashion, and the company has gone out of its way to make that clear to owners of the car. The dash tells you to keep your hands on the wheel at all times, and it will chime at a regular interval if you leave your hands off. Aside from that, use of the feature for more than about 30 seconds is enough to teach any reasonable person that it is not at all a replacement for an attentive driver. At best, it's a really fancy cruise control that can follow a well-marked lane on the highway. I have no doubt that it will eventually become something much more powerful, but Tesla has already indicated that current cars equipped with autopilot likely don't have the necessary hardware for fully-autonomous driving. If you compare the tech in Google's cars to Tesla, you find that Tesla has a serious disadvantage in terms of the sensor suite and the data.

The loss of life is tragic. I'm not a huge fan of the way that Tesla represents itself in public, either in terms of its PR work or in the form of Musk's adolescent posturing. I do believe in autonomous vehicles, though, and the likelihood that they are going to dramatically reduce the number of vehicular fatalities and injuries over the next ten years. I used autopilot twice today to drive about 20 miles. Even in its current nascent form, the technology is remarkable. Just like operating any other motor vehicle, though, it's incredibly important to understand the limitations and operate within the safety guidelines. Accidents happen, and some are unavoidable, but most are caused by human error. Not paying attention to your 5000lb projectile as it hurtles down the highway at 70mph is certainly an error.
posted by drklahn at 3:40 PM on July 1, 2016 [11 favorites]


while I agree with schmod, I also have a prejudice that any car safety feature should prioritize the safety of *everyone else* first: other cars, cyclists, pedestrians. the cost of safety automation must sit on those who profit from it and those who adopt it, fully.
posted by j_curiouser at 3:45 PM on July 1, 2016 [8 favorites]


I'm kind of with atrios - I don't believe these things will work for any real definition of 'work' in my lifetime. that said, if you care about traffic deaths & injuries, we can assign a profit motive to discouraging distracted driving.

I'm betting it's a lot easier to build in-cab instrumentation to detect (then penalize) driver-distraction than it is to consistently hit the autopilot edge-case (e.g. cyclist in low sun).

usaf human factors engineers know how to measure pilot distraction. driver distraction seems analogous on first blush.

spend resources on "making cool shit" or "reducing distraction and death"?
posted by j_curiouser at 3:57 PM on July 1, 2016 [5 favorites]


thefoxgod: "I don't think you can possibly talk about the "rate" of safety from a single data point. There's nowhere near enough data on autopilot to compare to the gigantic piles of data on human piloting. "

True. Tesla should have avoided that number with a ten foot pole instead of trying to use it to make their tech look good.
posted by Mitheral at 4:03 PM on July 1, 2016 [4 favorites]


Maybe I don't understand something about this technology. If, to use this safely, you have to be monitoring the car with the same attentiveness as a normal driver, hands on the wheel the whole time, you are driving. Driving is like, 90% maintaining awareness of everything around you and making judgement calls accordingly. (Who's swerving weird? Shift lanes so the semi doesn't have to? Do I need to keep an eye on that person who keeps passing people unsafely?)

What does this buy you? It sounds like at best, it's making you less attentive than if you had been just driving normally. I dunno. I feel like partial automation is a dangerous half-measure, and I dream of the day when we have good, fully-autonomous vehicles. Mostly because I'm not a fan of driving. (Incidentally, I feel the same way about cruise control. I still have to pay attention to the gas and brake and be ready to use them exactly as I would without it, so.. why?)
posted by mrgoat at 4:07 PM on July 1, 2016 [21 favorites]


Tesla's obstacle detection hardware is not sufficient, as explained by Jalopnik.

The forward-looking RADAR sensor is mounted low and could see a clear path forward, which unfortunately was UNDER the sideways tractor-trailer. The rearview-mirror-height forward facing camera, as previously described, failed to differentiate the side of the trailer from the sky.
At the very least the Tesla needs another RADAR unit at roof level.

Google's self-driving cars have a much more robust (and expensive) system based on roof-mounted LIDAR and/or RADAR hardware.
posted by w0mbat at 4:08 PM on July 1, 2016 [6 favorites]


Does that mean that the car tracks and reports your movements too?

No, and it certainly doesn't weigh you to find out when you've had a movement. That would be intrusive, and The Autopilot brought you to Wendy's because you need to eat - this is your normal lunchtime, isn't it?
posted by sneebler at 4:28 PM on July 1, 2016 [6 favorites]


The part that I find disturbing is that the driver caught his own near-crash with a dash cam a few months ago, and in his own description stated plainly that it was a complete surprise when his Tesla swerved to avoid an accident, because he wasn't paying attention to the driving.

I don't like to trash a victim in these cases (especially someone that died as a result) but this guy sounds fairly overconfident with level 2 autonomous driving and it seems like it was only a matter of time before an odd situation presented itself and got someone into trouble.

I can't wait for real autonomous driving to arrive, I think it will be an amazing boon to auto safety but the next decade or so with these first versions is going to be rocky.
posted by mathowie at 4:54 PM on July 1, 2016 [12 favorites]


Driving is like, 90% maintaining awareness of everything around you and making judgement calls accordingly. (Who's swerving weird? Shift lanes so the semi doesn't have to? Do I need to keep an eye on that person who keeps passing people unsafely?)

What does this buy you? It sounds like at best, it's making you less attentive than if you had been just driving normally.


The way I see it, these assist features reduce the cognitive load of the driver and free up their attention to direct to the bigger picture of what's going on around them. For example, on simple cruise control in traffic, you have to pay close attention to the car in front of you and constantly adjust your speed to maintain distance without exceeding the speed limit. Or in rain, you must constantly adjust your wiper speed. Or on a dark country road, you must constantly be aware of whether your brights are on and switch them off for infrequent oncoming traffic. All of these things distract your attention from what's going on around you because you have to think about it each time you manipulate one of these controls and you have to be constantly monitoring stuff that is secondary to piloting the vehicle. It is exhausting.

Would you also say that designing cars to be statically stable (e.g., the steering wheel tends to return to center) results in less attentive drivers? If a car darts around unbidden whenever the road surface undulates, is that safer? It will certainly get your attention.
posted by indubitable at 5:01 PM on July 1, 2016 [4 favorites]


and constantly adjust your speed to maintain distance without exceeding the speed limit.

Hahaha ahahaha hahahah hahh....

oh you're serious.

Not in L.A. you don't.
posted by Justinian at 5:13 PM on July 1, 2016 [1 favorite]


these assist features reduce the cognitive load of the driver and free up their attention to direct to the bigger video picture

So we've already had incidents where people were watching movies on laptops or their phones while driving vehicles that didn't have an autopilot. I don't mean to be insensitive, but these people are stupid. Why would we encourage this kind of stupidity?
posted by sneebler at 5:16 PM on July 1, 2016


I can't recommend this video enough if anyone is interested in digging deeper into the technical challenges. You might want to skip over roughly minutes 10-20 which are a blow-by-blow of the DARPA Urban Challenge, but don't miss the discussion and set of videos that start at about the 20 minute mark, where he goes through all of the unsolved challenges in autonomous navigation (especially the way Tesla is trying to do it, without LIDAR). This exact scenario (with roles reversed) is called out at the 35 minute mark.

I'm convinced that because of the issues of mode-switching, inattention blindness, and other realities of our cognition, we need to stick to "autopilots" that require constant user interaction until we're ready to sell fully-autonomous, robot-always-drives, Level 4 systems. It may mean we never reach level 4 (unless an actor like Google is able to drop bottomless supplies of cash on the problem) because otherwise most companies would need to sell their prototypes (at level 3) to fund continuing work (that's the financing model for, eg, the iPhone, which had a very successful pre-alpha prototype called an "iPod.")
posted by range at 5:16 PM on July 1, 2016 [12 favorites]


So it would seem to me that Teslas on autopilot are not safer than humans, and indeed, based on current evidence, are likely to be more dangerous.

First, thanks for that table of data, it's super useful.

However, I read it and came to a different conclusion.

I see Tesla's 134 million miles vs. the table's 212 for "other freeway/expressway" (which I think is a better comparison here) and think "wow, that's about the same".

But whether it's 77 vs 134 or 212 vs 134 isn't that important because Tesla's system is in beta.

One could imagine Tesla's system making a 10x improvement, at which point it will easily be way better than the average human driver.
posted by soylent00FF00 at 5:59 PM on July 1, 2016


I really don't think you can call a system being sold to real live consumers "beta". It's on the market now. They're advertising it as safe, today. As long as they're putting these on real roads with live drivers, they're going to be subject to regulation and liability as the current system performs now, not 10 years from now.
posted by T.D. Strange at 6:06 PM on July 1, 2016 [9 favorites]


CBC As It Happens today had a brief interview with Robotics expert Missy Cummings about Tesla's new tech, with some nuanced comments about the future of autonomous driving...
posted by ovvl at 6:10 PM on July 1, 2016


I agree with your point (that perhaps beta-testing autopilot software on real drivers is a little scary) but Tesla does call it beta:

From Tesla's Website

"Autosteer (Beta)"


Speaking of live beta tests on the public, is Gmail still in Beta?
posted by soylent00FF00 at 6:11 PM on July 1, 2016 [1 favorite]


Honestly, I think the only way automated driving is going to work is if it's managed the way a network is, with a car having to log on to the roads in order to operate on them. That way, every vehicle is aware of the movements of every other vehicle within its influence circle - even if the vehicle is manually operated.

It's not like every vehicle after model year 2010 isn't phoning home anyways.
posted by Mooski at 6:14 PM on July 1, 2016 [2 favorites]


The way I see it, these assist features reduce the cognitive load of the driver and free up their attention to direct to the bigger picture of what's going on around them.

Fair enough answer. The problem, as I see it though, is that it doesn't. You have to pay attention to all the same things if the autopilot isn't to be trusted (which, according to Tesla, it is not), and you have to be ready to react just as quickly as if you were driving. It seems more like an auto-wiper that can't be trusted to wipe consistently, or automatic headlights that sometimes don't adjust, or go full bright because the sensor got dirt on them, so you have to be just as aware of what they do in order to be able to take over at a moment's notice. Same as cruise control: you can't just take your foot away from the gas and brake and pay attention to steering alone - at best, you have to hover it there at the pedals, ready to go. You're not driving safely otherwise.

It doesn't seem to me to be in the same category as having a stable car, which increases the consistency of how the car reacts to your input. I'd put automatic transmission in that category.

I guess it just comes down to whether the computer system handling the car functions are better, or worse than the human. If they're worse, then the human has to be in charge, and the system doesn't serve a purpose. This is Tesla's admission. If the computer is better, you don't need a driver - what Google is doing. But you can't have both at the same time.
posted by mrgoat at 6:27 PM on July 1, 2016 [3 favorites]


Before this accident, the result of a fatal crash was that the intelligence driving the vehicle was destroyed and the remaining intelligences operating other vehicles were unlikely to change behavior. The result of this crash will be an update to every Tesla, and potentially to other self-driving systems. This is a huge milestone in the rise of AI -- it fills me with a mix of awe and fear.
posted by humanfont at 6:39 PM on July 1, 2016 [8 favorites]


The bloody car should be constantly detecting your hands on the wheel. People die all the time texting and doing other stupid shit in regular cars, of course they are going to completely space out in a car like this if you let them. Ugh.
posted by weretable and the undead chairs at 6:42 PM on July 1, 2016 [7 favorites]


If they're worse, then the human has to be in charge, and the system doesn't serve a purpose. This is Tesla's admission. If the computer is better, you don't need a driver - what Google is doing. But you can't have both at the same time.

I disagree with this - take the example of cruise control - we have extremely strict automated speed cameras on many roads here - you can be fined A$200 for traveling 4 km/h over the speed limit (there is a 3 km/h grace window you are allowed to exceed). Cruise control is not better than me at driving, but it frees me from having to stare at the speedometer and lets me look out for actual hazards like schoolchildren (schools are a 40 km/h zone, which ironically means I spend more time looking at my speedometer in school zones than outside them).

Automation is going to leverage the best parts of both human and machine with hybrid synergy. There's already so much automation - ABS, ESC, torque vectoring, engine management systems, lane keeping assist, auto city stop, emergency brake assist, automatic transmissions - which work so seamlessly with human driving to help turn intention into action without requiring all the tiny steps in between. Whenever new technology is introduced there are always people saying it is bad (eg ABS is worse in some situations, automatics are "dangerous" because you don't have control over the car...)
posted by xdvesper at 7:03 PM on July 1, 2016 [2 favorites]


I'm no UX designer, but... if Tesla acknowledges that it's dangerous to leave your hands off the wheel during autopilot, and Tesla's cars can detect when your hands are not on the wheel, maybe instead of just chiming politely every thirty seconds, the autopilot could -- just spitballing here -- bring the fucking car to a fucking halt and stop endangering the fucking lives of everyone unfortunate enough to be in its motherfucking proximity, not to mention the fucking moron with his or her hands off the goddamn motherfucking wheel?
posted by No-sword at 7:22 PM on July 1, 2016 [29 favorites]


According to The Guardian, he was watching Harry Potter on a portable DVD player and went full speed under the truck. He also apparently had a bunch of speeding tickets.

At least in that article, Tesla says the car's sensors didn't detect the white trailer against the bright sky--not the driver.
posted by Anonymous at 7:23 PM on July 1, 2016


That actually sounds more like a problem with speed zone enforcement than one with safely operating a vehicle. Sounds more like the automated cameras were creating a hazard by forcing you to watch the speedometer so closely.

I agree with you on most of the rest of those systems though, because they make the vehicle easier to handle, rather than trying to handle the vehicle in lieu of you.
posted by mrgoat at 7:23 PM on July 1, 2016


the autopilot could -- just spitballing here -- bring the fucking car to a fucking halt and stop endangering the fucking lives of everyone unfortunate enough to be in its motherfucking proximity

I thought it did if you left your hands off.
posted by Anonymous at 7:24 PM on July 1, 2016


It's probably a good point that if you have to sit like you're driving but you're not actually DOING anything, you WILL space out from sheer boredom. Hell, that happens in traffic jams.
posted by jenfullmoon at 7:33 PM on July 1, 2016 [2 favorites]


RIP, you pioneer of advanced technology. I feel like that's being overlooked here, probably because this guy died using such an out-there technology that we all know is about to change the way we get around -- while we can't not talk about the tech, let's not forget to pour one out for its user.
posted by resurrexit at 7:43 PM on July 1, 2016 [6 favorites]


This one is creepy for me on a personal level. My name is Joshua D. Brown, I turn 40 in November, and I live in Cleveland, Ohio. And my brother worked for Tesla for a while. Weird.
posted by starvingartist at 7:46 PM on July 1, 2016 [9 favorites]


AFAIK, very few (no?) cars check to see if your hands are off the wheel at all when texting, fiddling with the radio, eating, etc. Yet somehow, people in this thread are apoplectic that the Tesla doesn't check often enough.
posted by paulcole at 7:46 PM on July 1, 2016 [3 favorites]


I'll take my chances on a road filled with "autoautos" every time rather than on a road where ten percent of the drivers are busy talking, texting and day dreaming.

You left out "drunk."
posted by 3urypteris at 8:00 PM on July 1, 2016 [5 favorites]


the autopilot could -- just spitballing here -- bring the fucking car to a fucking halt and stop endangering the fucking lives of everyone unfortunate enough to be in its motherfucking proximity

It does, eventually. But it reportedly looks for wheel torque, not "fingers detected", so if you have a light touch on the wheel it might still give you the warning. Which is probably why they chose "warn when beginning to get confused" instead of "warn if user has not turned wheel in N seconds."
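
(A tiny sketch of the distinction being described: hands-on inferred from steering torque rather than from detecting fingers. The threshold is made up, purely for illustration; this is not Tesla's actual logic.)

```python
TORQUE_THRESHOLD_NM = 0.3   # assumed minimum torque that counts as "hands on"

def hands_on_from_torque(measured_torque_nm: float) -> bool:
    # A light, relaxed grip applies almost no torque, so it reads as
    # hands-off and still triggers the nag, even with fingers on the wheel.
    return abs(measured_torque_nm) >= TORQUE_THRESHOLD_NM

print(hands_on_from_torque(0.05))  # light touch   -> False (warning fires)
print(hands_on_from_torque(0.50))  # deliberate tug -> True
```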
posted by RobotVoodooPower at 8:03 PM on July 1, 2016 [2 favorites]


There's a fantastic TED talk given by the head of Google's self-driving vehicle program in which the moderator briefly sums up why Google chose to go with true self-driving vehicles as opposed to Tesla's driver assisted approach (tl;dr: human psychology):

On this debate between driver-assisted and fully driverless -- I mean, there's a real debate going on out there right now. So some of the companies, for example, Tesla, are going the driver-assisted route. What you're saying is that that's kind of going to be a dead end because you can't just keep improving that route and get to fully driverless at some point, and then a driver is going to say, "This feels safe," and climb into the back, and something ugly will happen.
posted by longdaysjourney at 8:26 PM on July 1, 2016 [11 favorites]


Incremental automation seems like a bad idea - just like how cruise control always freaks me out til I turn it off. If your car can suddenly do some small thing you used to have to do for yourself, what are you going to do with that newly freed attention/brainpower? Pay closer attention to the rest of driving? Or space out, check your texts, etc? Seriously. It's all or nothing.
posted by gottabefunky at 9:06 PM on July 1, 2016 [4 favorites]


You know, I just realized I have been seeing some Model S's driving around my neighborhood. So, in addition to everything else to be aware of on the road, it might be good to identify cars that have this driver-assist feature and be a little more careful around them. Yes, this technology is still improving, but just like cell phones and any other new tech, it apparently faces both a growing user base and a new-user learning curve.

And I own a car whose color is literally sky blue.
posted by FJT at 10:09 PM on July 1, 2016 [1 favorite]


Cruise control is a godsend on rural interstates. That aside, I've never found much use for it.
posted by Automocar at 10:54 PM on July 1, 2016 [2 favorites]


Mooski: "Honestly, I think the only way automated driving is going to work is if it's managed the way a network is, with a car having to log on to the roads in order to operate on them. That way, every vehicle is aware of the movements of every other vehicle within its influence circle - even if the vehicle is manually operated."

Then it'll never happen. Pedestrians, cyclists, moose, and the existing fleet aren't going to log into the system, and if that is the only way to make it safe then it'll be a non-starter.
posted by Mitheral at 12:00 AM on July 2, 2016 [7 favorites]


paulcole: "AFAIK, very few (no?) cars check to see if your hands are off the wheel at all when texting, fiddling with the radio, eating, etc. Yet somehow, people in this thread are apoplectic that the Tesla doesn't check often enough."

Because the instructions when you engage this feature tell you to keep your hands on the wheel and Tesla has the ability to monitor that.
posted by Mitheral at 12:03 AM on July 2, 2016


Using cruise control can become a motor skill, at which point it is much safer. I use cruise control all the time (literally all the time -- it's only off when I'm accelerating from a stop or braking), even though the majority of my driving is urban. The reason I use cruise control is to ensure that I drive at a safe speed. Usually I let the speed limit dictate what that is, except that I've learned that in Richmond, BC, where the speed limits are 50 km/h but almost everyone drives 65+, I have to drive 53 km/h -- for some reason the drivers who go completely bonkers if you are driving the speed limit are much less belligerent if you are going just a little over it. Also, I drive less than the speed limit on freeways when it is raining and it's the first rain in a while.

Once you become skilled with it, it is much, much safer than using the speedometer to maintain speed. And it becomes impossible to fool yourself that you drive at a moderate speed when really the "exceptions" -- the times you are driving in excess of what you would consciously consider a moderate speed -- constitute the majority of your driving, which is what I've noticed happens to pretty much everyone (and certainly to me, before I developed the cruise control discipline).

For me, the only way for it to become a motor skill was to use it literally all the time.
posted by lastobelus at 12:17 AM on July 2, 2016 [6 favorites]


Does Florida, or any other US state, require hi-visibility markings on heavy goods vehicles in the same way that EU countries do?

It is easier to make a machine vision system recognise a specific pattern than it is to get it to extract speed and distance data from a series of images, so I can imagine that there might be the opportunity to have agreement between vehicle licensing authorities and manufacturers of autonomous driving systems about marking schemes for large vehicles that are readily identifiable by such systems.

Also, over here long trailers have been required for years to have anti-underrun bars - are those common in the USA? Mind you, I doubt they would have done much good at 70 mph.

(Sadly, a good friend of mine was killed 24 years ago in a similar accident, which took place at night in a poorly-lit area when a trailer driver tried to make an illegal u-turn across a road. This was before the requirement for hi-viz reflective markings, side lights and underrun bars came in.)
posted by Major Clanger at 12:54 AM on July 2, 2016 [1 favorite]


I have a Tesla and have used autopilot daily since the feature was activated. Some of the information in this thread is inaccurate. The most significant error is the idea that it only functions on divided highways. There are three categories of roads as far as AutoPilot is concerned: divided highways, where AutoPilot allows you to set the car to operate without restriction (subject to a capped top speed); non-divided highways, where the car limits your speed to 5 mph over the speed limit; and roads where AutoPilot is not capable of operating at all (generally roads without painted lane markings). It operates fairly happily on two-lane rural roads and pretty much every road outside of 25 mph neighborhood lanes, where the probability of a fatal accident is probably near zero. So I'd imagine that it is pretty fair to compare fatal accident rates per mile driven.

I'm also quite sure that the problem was not that the radar is mounted low and so cannot see the truck. The radar distinguishes between cars and trucks and displays them on the dash appropriately, so it clearly sees objects that tall. I believe the problem is a pattern matching one -- a truck sideways across the lane of travel looks pretty similar to a bridge on radar. When radar can see a clear path under a stationary object, it doesn't register it as another vehicle. Based on a presentation I saw from a Tesla AutoPilot engineer about how the system works a year ago, I'd wager that the deadly combination was the clear path under the truck and the fact that it wasn't moving.
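
A rough sketch of that hypothesized failure mode: a return that isn't moving along the travel axis and has a clear path underneath gets binned with bridges and overhead signs. This is the inference above, not Tesla's or Mobileye's documented logic, and every threshold is invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    closing_speed_mps: float   # rate at which range to the object is shrinking
    bottom_clearance_m: float  # estimated gap between the object and the road

def classify(ret: RadarReturn, host_speed_mps: float) -> str:
    # A trailer crossing the lane has ~zero speed along the travel axis,
    # exactly like a bridge or an overhead sign.
    object_speed = host_speed_mps - ret.closing_speed_mps
    looks_stationary = abs(object_speed) < 1.0
    clear_path_under = ret.bottom_clearance_m > 1.0    # room to pass beneath it
    if looks_stationary and clear_path_under:
        # Hypothesized fatal case: the high trailer side is dropped from the
        # braking target list along with genuine overhead structures.
        return "overhead structure (ignore)"
    return "vehicle or obstacle (track and brake)"

# At ~29 m/s (65 mph), a trailer side with ~1.2 m of clearance underneath:
print(classify(RadarReturn(closing_speed_mps=29.0, bottom_clearance_m=1.2), 29.0))
```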

These kinds of features are going to lead us to difficult decisions that we aren't used to making. We'll have to determine what the acceptable level of failure is for these functions. Obviously we'd rather not have any fatal accidents caused by failures of the technology, but the right metric is probably some level of performance that exceeds what unassisted drivers achieve. When I think of all the horribly dangerous behavior I've witnessed from humans driving their cars, I'm still much more comfortable with the decisions AutoPilot makes than with what I see every day from other cars.

Having said that, I'm convinced that Tesla will end up being much more restrictive about the degree to which you are required to keep your hand on the wheel. It won't eliminate the problem, however. You can rest your hand on the wheel such that the sensors detect you while texting, watching TV or even sleeping. Technology can't entirely prevent you from exercising poor judgment.
posted by Lame_username at 4:23 AM on July 2, 2016 [17 favorites]


Bridges do not appear and disappear overnight. Wouldn't the auto-driving software have a map of road features to check against?
posted by rdr at 8:56 AM on July 2, 2016


" maybe instead of just chiming politely every thirty seconds, the autopilot could -- just spitballing here -- bring the fucking car to a fucking halt and stop endangering the fucking lives of everyone unfortunate enough to be in its motherfucking proximity, not to mention the fucking moron with his or her hands off the goddamn motherfucking wheel?"

Slamming on the brakes on a crowded highway does not sound like the safest course of action.
posted by I-baLL at 9:41 AM on July 2, 2016 [11 favorites]


I believe the problem is a pattern matching one -- a truck sideways across the lane of travel looks pretty similar to a bridge on radar. When radar can see a clear path under a stationary object, it doesn't register it as another vehicle. Based on a presentation I saw from a Tesla AutoPilot engineer about how the system works a year ago, I'd wager that the deadly combination was the clear path under the truck and the fact that it wasn't moving.

Where are you getting the idea that the truck was stationary?
posted by indubitable at 10:38 AM on July 2, 2016 [1 favorite]


It was incredibly disingenuous of Tesla to compare their fatality rate against that of the entire corpus of automobile miles logged in the US; the two datasets are not remotely on the same scale or of the same composition. Tesla is measuring an extremely small subset of older, more affluent early-adopters - many of whom are either personally invested in using the tech correctly or have not yet developed a level of familiarity with it to use it distractedly - engaged in one of the safer categories of driving. Tesla would have to measure billions more miles before the data is remotely comparable; as of right now it is of very little predictive value without smuggling in a whole host of demographic and technological assumptions.
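A quick back-of-the-envelope makes the point. The only inputs are the single fatality and the ~130 million Autopilot miles from Tesla's own statement; the rest is a standard exact Poisson interval (sketched in Python, scipy assumed available):

from scipy.stats import chi2

autopilot_miles = 130e6
fatalities = 1

# Exact 95% confidence interval for a Poisson count of 1
lower = chi2.ppf(0.025, 2 * fatalities) / 2        # ~0.025
upper = chi2.ppf(0.975, 2 * (fatalities + 1)) / 2  # ~5.57

# Translate the count interval into "miles per fatality"
print(f"95% CI: one fatality per {autopilot_miles/upper:,.0f} "
      f"to {autopilot_miles/lower:,.0f} miles")
# Roughly 23 million to 5 billion miles per fatality -- an interval so wide
# it easily contains the 94-million-mile US average, before you even touch
# the demographic and road-type differences described above.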
posted by Svejk at 10:48 AM on July 2, 2016 [5 favorites]


Where are you getting the idea that the truck was stationary?
I should have been more clear. It was stationary with regard to the direction of travel of the Tesla. The technology is designed to track objects moving closer or further based on the relative speeds of the two vehicles. It does not seem to attempt to track objects moving across the field of view as part of the adaptive cruise control functionality (which governs the speed of the car). Similarly, it does not effectively detect animals coming onto the roadway and struggles when you approach parked traffic at high speed. It will normally stop just short of colliding with parked cars on the roadway from 70mph, but it does so in a pretty terrifying fashion and I would assume every driver who sees the stopped traffic overrides the adaptive cruise control and brakes manually.

FWIW, this is not unique to Tesla. Mercedes, Infiniti, Volvo and other makes all offer this same adaptive cruise control software. I'm familiar with the Mercedes and Infiniti versions and they would have run into the truck crossing the road as well. The problem with Tesla is that it is actually better than the competing products. The auto-steering, adaptive cruise control combination in the Mercedes is so poor at handling so many situations that no one with any sense at all would consider reading or watching a video while it drove. No one should do that in a Tesla either, but it appears that is what happened here.

FWIW, Tesla issued another statement more or less confirming my guess as to why it failed:
Since January 2016, Autopilot activates automatic emergency braking in response to any interruption of the ground plane in the path of the vehicle that cross-checks against a consistent radar signature. In the case of this accident, the high, white side of the box truck, combined with a radar signature that would have looked very similar to an overhead sign, caused automatic braking not to fire.
posted by Lame_username at 11:26 AM on July 2, 2016 [7 favorites]


There is a big hole I see in the discussion of autopilot cars: the real world and money. I'm not sure why this never comes up, but it seems to me that if completely self-driving cars became reality -- which I doubt, for various reasons involving the real world and money -- and were absolutely proven to be safer, things like this could happen:

Insurance rates would go down for SD cars and up for everyone else to the point where you'd be punished for not having an SD car, perhaps to the point of poor people not being able to drive. How would this not happen?

As they got older, the same pressure would apply to buy a new SD car because they're "even safer."

Eventually, since you have to enter information about your trip anyway, your trip could be rated on how dangerous it was and you could be charged for it. "Warning; July 4th weekend is a high traffic period. Surcharges apply."

"Do you really need to go to the bar tonight?" "Is this trip necessary?"

"You're coverage has been exceeded for the month"

Ads, of course. Your insurance may even be paid for by ads, so opting out would be hard.

I just don't see SD cars as some great future of freedom; I see them as public transport that you won't really be able to control but will have to pay for and maintain yourself. In that case, more and better public transport would be better. Eventually the whole idea of owning them would seem silly, so it really might just come down to calling an automatic cab for middle-class people, and only rich people could actually drive a car. Poor people would walk or take public transport with no option to have even a shitty car.
posted by bongo_x at 11:28 AM on July 2, 2016 [6 favorites]


I should add that I do see SD cars becoming the norm eventually, just not soon. Because why wouldn't car companies, Google, insurance companies, law enforcement, etc. want to know where you're going at all times and even have control over it? Why wouldn't Kroger want to give you a discount for going there instead of Safeway?

The temporary financial incentives will be what makes it happen for the average person, not need or convenience.
posted by bongo_x at 11:56 AM on July 2, 2016 [1 favorite]



bongo_x: "Insurance rates would go down for SD cars and up for everyone else to the point where you'd be punished for not having a SD car, perhaps to the point of poor people not being able to drive. How would this not happen?"

Insurance doesn't work that way. Companies charge mostly based on payouts; a decline in payouts in one area doesn't mean they are paying out more in other areas. If anything, self-driving cars that have few accidents will reduce premiums across the board, because they will have fewer accidents with non-self-driving cars.
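A toy sketch of that pricing logic, with completely made-up numbers just to illustrate the point:

def premium(expected_claims_per_year, loading=0.25):
    # Price = expected payouts plus a loading for expenses and profit.
    return expected_claims_per_year * (1 + loading)

human_driven = premium(0.05 * 16000)  # 5% chance/yr of a $16,000 claim (made up)
self_driving = premium(0.02 * 16000)  # 2% chance/yr, if SD cars crash less (made up)

print(human_driven, self_driving)
# The human-driven premium doesn't go up because SD cars got cheaper to insure;
# each segment is priced off its own expected losses.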

Time-based, special-purpose, and miles-driven insurance already exists. It's great for specialty vehicles but doesn't have much take-up among daily drivers.
posted by Mitheral at 12:05 PM on July 2, 2016 [4 favorites]


Insurance doesn't work that way because most people have the same risk. If laws are changed to establish liability for SD cars, the whole landscape will change. You can already get a discount if you put a tracking device in your car; if that's not optional but part of the makeup of the car, it seems the trend will expand.

Why wouldn't there be discounts for cars that are said to be unlikely to crash regardless of the actions of the driver?
posted by bongo_x at 12:13 PM on July 2, 2016


If the reports of the driver watching a DVD are accurate, it raises some interesting liability questions for Tesla's Autopilot versus the driver's obvious distraction. And I suppose it's fortunate that there was only one fatality in this crash.
posted by Existential Dread at 12:19 PM on July 2, 2016 [1 favorite]


bongo_x: "Why wouldn't there be discounts for cars that are said to be unlikely to crash regardless of the actions of the driver?"

There probably would be, but only proportional to the claims savings. They aren't going to offer discounts exceeding that and then make up the difference by charging other people more. Insurance is really competitive; a company that overcharges one segment like that is going to lose that segment.
posted by Mitheral at 1:17 PM on July 2, 2016 [2 favorites]


.
posted by limeonaire at 2:30 PM on July 2, 2016 [2 favorites]


I think it is telling that we are discussing the safety of a car that makes news whenever it crashes.
posted by psycho-alchemy at 6:48 PM on July 2, 2016 [5 favorites]


I haven't seen reports of the precise timing, but it seems like even a mildly distracted human driver might have hit the truck. It was turning across oncoming traffic, and the driver probably would only have had a few seconds to react. Braking and going under the trailer at 25 mph would still have been a bad day.
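Some rough numbers to back that up -- the 0.8 g braking and 1.5 second reaction time below are assumed round figures for illustration, not anything from the investigation:

MPH_TO_MPS = 0.44704
g = 9.81

v0 = 70 * MPH_TO_MPS   # initial speed, ~31.3 m/s
v1 = 25 * MPH_TO_MPS   # speed at impact in the hypothetical, ~11.2 m/s
a = 0.8 * g            # hard braking on dry pavement (assumed)
reaction_time = 1.5    # typical perception-reaction time, seconds (assumed)

reaction_distance = v0 * reaction_time
braking_distance = (v0**2 - v1**2) / (2 * a)
braking_time = (v0 - v1) / a

print(f"distance covered before braking starts: {reaction_distance:.0f} m")
print(f"distance to slow from 70 to 25 mph: {braking_distance:.0f} m "
      f"over {braking_time:.1f} s")
# ~47 m of reaction distance plus ~54 m of braking: roughly 100 m and about
# 4 seconds total, so even a mildly distracted driver has very little margin
# when a trailer turns across the lane.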
posted by RobotVoodooPower at 8:05 PM on July 2, 2016 [1 favorite]


I think most of the discussions about the specifics of the crash miss the point. Ultimately I just can't get away from the idea that:
1. Tesla is beta testing autonomous driving on me and every other driver on the road.
2. When there is a failure, they send out a press release telling me that they are doing a beta test that could kill me, or my kids, or my neighbor, and there isn't any sort of pushback.

Do we just accept that it's OK for a technology company to risk our lives while they figure out how to get things right?
posted by herda05 at 10:50 PM on July 2, 2016 [8 favorites]


It depends on whether the chances of the beta test killing you are greater or less than the chances of someone not in the beta test killing you.
posted by Justinian at 11:30 PM on July 2, 2016 [4 favorites]


But my fear is delegating that decision to a corporation that is not making the decision based on statistical analysis, but rather on a cost-benefit analysis with regard to their bottom line.

I think we're taking on faith that Tesla has society's best interests in mind, which I'm personally skeptical about.
posted by herda05 at 11:49 PM on July 2, 2016 [4 favorites]


The "Beta" is to give an out for the lawyers. While that might work in the software world, the automotive industry has a long tradition of really expensive wrongful death settlements.
posted by sideshow at 1:17 AM on July 3, 2016


I see them as public transport that you won't really be able to control and have to pay for and maintain yourself.
Somewhere in Silicon Valley a sign labelled DISRUPTION OPPORTUNITY just lit up.
posted by fullerine at 5:37 AM on July 3, 2016 [2 favorites]


interest in self driving cars seems a little misplaced... what is the goal here? to reduce traffic/cars on the road? to reduce total emissions for transporting a given # of people? to reduce accidents? or to make someone in silicon valley even richer?

ridesharing tech and a cultural shift in hitchhiking attitudes (and something like hitchhiking "bus-stops" on highway on-ramps?) seem to me much better opportunities for improvement (safety, gas use, congestion) than self driving cars, which seem to continue to encourage the 1 person/car pattern that is quite strong all over the usa.

outside of the yelling kids and drinking situations, i would think more people in cars leads to safer/more alert driving (does such a research study exist??).
more people in each car certainly leads to less congestion and less gas/person.
posted by danjo at 6:19 AM on July 3, 2016 [3 favorites]


I just don't see SD cars as some great future of freedom; I see them as public transport that you won't really be able to control but will have to pay for and maintain yourself. In that case, more and better public transport would be better. Eventually the whole idea of owning them would seem silly, so it really might just come down to calling an automatic cab for middle-class people, and only rich people could actually drive a car. Poor people would walk or take public transport with no option to have even a shitty car.

I agree with bongo_x and danjo; while I can see improved car safety through assistive systems as inevitable, I don't think they are the sole or best future of personal transportation, and public money is better spent elsewhere.

The train/subway/LRT/streetcar is the safest and most efficient mode of surface mass transit, with buses a reasonable second. Once adequate mass transit is in place, the remaining requirements for urban personal transport can be met with smaller electric or hybrid vehicles, or bikes, or walking. And taxis or ridesharing, as described by danjo.

As this thread has identified, these automated or assisted systems mainly work for freeway-type driving and do next to nothing for urban congestion... in fact I'd say that if car automation causes the overall number of car trips to increase, it will make urban congestion worse. The only answer to urban traffic congestion is either massive investment in urban road infrastructure (surrendering to the car, basically)... or fewer vehicle trips, period.

I went looking for some affirmation of my point of view from Europe; instead I found that they are going full steam ahead with testing as well, so maybe I'm off base. However, it seems that they are more focussed on something that integrates with their existing and extensive rail system ... the last mile problem of getting people to and from the train stations. There also seems to be more emphasis on shared self-driving "people movers" as opposed to a self-driving car in every driveway.
posted by Artful Codger at 8:09 AM on July 3, 2016 [3 favorites]


The endgame of self-driving cars is no one owns cars anymore. You simply summon a robotic car to pick you up where you are and drop you off elsewhere. Like Uber / Lyft, only without the drivers' jobs.
posted by Nelson at 8:49 AM on July 3, 2016 [1 favorite]


Well, someone owns the cars, except now it's the rentiers rather than you.
posted by indubitable at 9:27 AM on July 3, 2016 [5 favorites]


Sure, although one (unlikely) endgame is it's the city that owns the cars.
posted by Nelson at 9:31 AM on July 3, 2016


Do we just accept that it's OK for a technology company to risk our lives while they figure out how to get things right?

Somehow we're OK with this in other areas like medicine, e.g. new drug trials.
posted by RobotVoodooPower at 9:35 AM on July 3, 2016


There are rather stringent safety and ethics reviews before new medications can be tested on people, and most people on clinical trials aren't jeopardizing non-participants' health in the process (except in the abstract "This person is taking a spot in the trial which means someone else cannot have that spot"). I'm getting the impression the car industry is not regulated quite so tightly.
posted by lazuli at 9:44 AM on July 3, 2016 [4 favorites]


It operates fairly happily on both two lane rural roads and pretty much every road outside of 25mph neighborhood lanes, where the probability of a fatal accident is probably near zero.
Stuff like this is just ... man, I don't even. This is such a reflection of a car-centric mindset that I don't even know where to start. Around 17% of all traffic fatalities in the US aren't in cars, and the vast majority of the fatalities from < 35 mph zones (the lowest NHTSA category) aren't in cars.

Fatal accident != driver getting killed. Not by a long shot.
posted by introp at 10:12 AM on July 3, 2016 [9 favorites]


RobotVoodooPower: "Somehow we're OK with this in other areas like medicine, e.g. new drug trials."

If my neighbour starts taking a new treatment for male pattern baldness I don't experience any side effects.
posted by Mitheral at 10:49 AM on July 3, 2016 [3 favorites]


The endgame of self-driving cars is no one owns cars anymore. You simply summon a robotic car to pick you up where you are and drop you off elsewhere. Like Uber / Lyft, only without the drivers' jobs.

Sure, although one (unlikely) endgame is it's the city that owns the cars.


Yeah, the future and technology rarely turn out the way we expect. As I get older, it seems the unintended consequences are usually the more powerful ones.

I'm not sure it makes sense in the long run to own a self-driving car, and if you did, it would be so controlled by someone else that it would be debatable whether you owned it at all. A lot of it could be subsidized by ads and such, and by letting other people use it (like driverless Uber), so maybe you wouldn't really even have to pay for it. Which means it's really a public car that's stored at your house. How liable are you for it?

If the city or even a company owned the cars and they're not coming from your house, then it seems like there would be even more traffic and waste from picking people up and dropping them off. You might as well pick up other people along the way so as not to be wasteful. Which means you have a bus.

Do I believe in the future of self driving buses? Yes. Maybe fleets of many van sized buses, like a road full of airport shuttles.

One thing I can't quite get: traffic in cities is caused in large part by the ripple effect of people stopping and going. Isn't this going to be worse if you have a large number of SD cars that err on the side of caution? If pedestrians and others are confident about the cars not hitting them how would you even get through a city area? (not blaming pedestrians at all).

I really don't see how individual SD cars wouldn't make traffic worse in a lot of ways.
posted by bongo_x at 10:54 AM on July 3, 2016 [1 favorite]


I guess I see SD cars as a Segway thing. It's going to be really useful in some situations like public transport, but not the "everyone's going to have one" situation that's predicted.

People get really worked up about tech.
posted by bongo_x at 1:09 PM on July 3, 2016 [1 favorite]


If pedestrians and others are confident about the cars not hitting them how would you even get through a city area?

This seems like a feature, not a bug.
posted by Automocar at 2:27 PM on July 3, 2016 [4 favorites]


what is the goal here? to reduce traffic/cars on the road? to reduce total emissions for transporting a given # of people? to reduce accidents?

The last one only, but that's still super important. Driving is the single most dangerous thing that most Americans do regularly, both with respect to themselves and to others. And yet people still persist in driving, even when tired/drunk/distracted, because mostly -- especially in America outside of maybe two or three major city centers nationwide -- they don't have other good options. Fully automated cars should solve this problem even in areas where better public transit isn't feasible (politically or logistically).

You're right that by itself, it will not have significant environmental impacts and may actually worsen congestion in cities. Autonomous cars don't make public transit obsolete. It drives me nuts when people argue that. They're never going to move people as efficiently as trains, or even buses if they have their own ROW. But they could still save a lot of lives in the long run if they are even somewhat better than human drivers.
posted by en forme de poire at 3:53 PM on July 3, 2016


It's also weirdly specific to call out the "white side of the tractor trailer against a brightly lit sky", as if that were a well known blind spot in human vision systems. It most certainly isn't.


I have on many occasions had trouble seeing a white or grey car when the sun is low enough to backlight the object and it is even slightly hazy. CCDs (and CMOS sensors) have less dynamic range than the human eye, so it isn't surprising that a car that only has a camera for long range sensing missed something like this. That is but one reason why Tesla explicitly states that Autopilot is only an assistance feature and not capable of fully autonomous operation. This is made clear in marketing materials and the actual instructions.

It does seem that even with the sensor limitation it's at least as good as human drivers on average (not that that says much), though. If it had radar and/or an IR camera, I'd be willing to trust it with my life under its stated operational limits after a bit more testing. With nothing but a video camera, though? I'd be watching that thing like a hawk.

Autopilot and similar active safety systems are good for backing up the driver, but they should not be used the other way around. You need a Google-style system to have enough confidence in the sensors to have the human backing up the car rather than the other way around.
posted by wierdo at 6:18 PM on July 3, 2016


Stuff like this is just ... man, I don't even. This is such a reflection of a car-centric mindset that I don't even know where to start. Around 17% of all traffic fatalities in the US aren't in cars, and the vast majority of the fatalities from < 35 mph zones (the lowest NHTSA category) aren't in cars.
Well, I was trying to address the differences between driving with AutoPilot on or off. Hitting pedestrians or bicyclists is prevented by a different system. In fact, the safety features designed to protect pedestrians and bicyclists are never off, so they aren't relevant to the AutoPilot discussion. Autonomous Emergency Braking is supposed to detect and prevent collisions with pedestrians at slower speeds. Also, in the unfortunate event that a Tesla does collide with a pedestrian, in suitably equipped cars the hood automatically elevates to provide a relatively softer surface to cushion the impact. It's a pretty cool technology, essentially an airbag for those outside the car.

So, I believe that the risk of fatalities to both occupants and non-occupants at slower speeds is greatly reduced, although perhaps "near zero" was exaggerating the case a bit. I'd love to see a chart that compared traffic fatalities at <35mph for non-occupants between cars equipped with AEB and those without. I'm assuming that the difference is significant, but I've not seen hard data. I did find a pretty cool video showing the technology at work.
posted by Lame_username at 10:06 PM on July 3, 2016 [1 favorite]


It does seem that even with the sensor limitation it's at least as good as human drivers on average (not that that says much), though. If it had radar and/or an IR camera, I'd be willing to trust it with my life under its stated operational limits after a bit more testing. With nothing but a video camera, though? I'd be watching that thing like a hawk.
FWIW, it has the forward-facing camera and radar as well as 360-degree ultrasonic "sonar" sensors.
posted by Lame_username at 10:10 PM on July 3, 2016 [1 favorite]


If pedestrians and others are confident about the cars not hitting them how would you even get through a city area?

I am an, err, assertive pedestrian. I can tell you that I haven't been hit by a car. I've been hit by two drivers not inside their cars in the last few months, though. I wasn't blocking the path where either driver wanted to drive, nor were they driving on through routes, yet they still assaulted me*. I think people who want to drive through a city area will be OK.

*Technically, they physically assaulted me after they assaulted me by driving their cars at me, I guess. And I assaulted them by letting my hand touch their car, too, I guess.
posted by ambrosen at 2:17 PM on July 4, 2016 [1 favorite]


If pedestrians and others are confident about the cars not hitting them how would you even get through a city area?

I am an, err, assertive pedestrian. I can tell you that I haven't been hit by a car.


I think I phrased that wrong. My concern was not that pedestrians would run amok, but that if you have a car that errs on the side of stopping for everything in front of it, as it should, then isn't it going to just be moving through downtown areas a foot at a time? That's nearly the case now in some places; I can't see how it would even be doable in a dense area.
posted by bongo_x at 2:41 PM on July 4, 2016


I think that people will be keeping out of the way of automatic cars in built up areas because if they don't, the driver will get out of the car and punch them. So they'll get their way through fine, and drivers will continue running amok and pedestrians will continue to be cowed.
posted by ambrosen at 3:30 PM on July 4, 2016 [1 favorite]


It disturbed me to hear that the autopilot actually carried on driving after the crash, as it steered around some trees, although it hit a few fences and eventually a utility pole. I'm wondering what would have happened if it hadn't hit the pole? Would it have got back on the road and driven home, complete with headless driver? You would imagine that it would be programmed to detect that an accident had happened and just stop; apparently that's not the case.
posted by w0mbat at 11:40 AM on July 5, 2016 [1 favorite]


(Unless it had so much momentum it was just unable to stop at all before it hit the pole)
posted by w0mbat at 11:46 AM on July 5, 2016


(Unless it had so much momentum it was just unable to stop at all before it hit the pole)

That would be my guess. 70 mph in a multi-ton vehicle is a lot of momentum, and if the wheelbase were undamaged and just the top sheared off I could see the lower portion of the vehicle continuing on a ways.

Your hypothetical does make for a chilling short story premise, or a bad CSI episode.
posted by Existential Dread at 4:25 PM on July 5, 2016 [1 favorite]


It disturbed me to hear that the autopilot actually carried on driving after the crash, as it steered around some trees, although it hit a few fences and eventually a utility pole. I'm wondering what would have happened if it hadn't hit the pole?
The camera is mounted right at the roofline above the rear-view mirror. When the accident demolished the camera, autopilot would have shut down immediately and the car would just coast. You can prove this by covering the camera with a Post-it note (or your hand) through the sunroof; autopilot will immediately disengage. Unfortunately, the crash detection is tied to the airbag deployment, so the car didn't know it was in an accident until it hit the tree.

If autopilot was still on and operational, it would have continued driving down the road, staying between the lines. Even when on, it would disengage as soon as it couldn't detect lane lines, so it wouldn't steer around trees.
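Roughly, the behavior I'm describing works like this -- the function and variable names are just my own illustration, not Tesla's software:

def autopilot_should_stay_engaged(camera_ok, lane_lines_visible, airbags_deployed):
    if airbags_deployed:
        return False  # the only signal the system treats as "crash detected"
    # Losing the camera or losing the lane lines both drop it out.
    return camera_ok and lane_lines_visible

# Roof-mounted camera destroyed, airbags not (yet) fired: autopilot drops
# out and the car simply coasts rather than actively steering anywhere.
print(autopilot_should_stay_engaged(camera_ok=False, lane_lines_visible=False,
                                    airbags_deployed=False))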
posted by Lame_username at 6:30 PM on July 5, 2016 [4 favorites]


From EETimes - Tesla's Fatal Crash: 6 Unanswered Questions (printable view, single page)

Regarding the cameras:
Direct sunlight flashing in front of the camera could have caused the CMOS image sensor not to see the truck clearly, said one industry analyst. “Of course, it all depends on the CMOS image sensor’s sensitivity and contrast.”

Phil Magney, founder & principal advisor of Vision Systems Intelligence, LLC., told us, “It’s odd that the camera could not identify the truck.” At 4:30PM on May 7 in Florida, when Tesla crashed, the effect of the sun on the front camera is unknown. But Magney added, “My guess is that the vision sensor [EyeQ3] simply didn’t know how to classify” whatever the camera saw.

“The cameras are programmed to see certain things (such as lanes) but ignore others that they cannot identify.”

It appears that the camera ignored what it couldn’t classify.
Regarding the radar:
Demler doesn’t believe radar was at fault.

He explained that radar is “most commonly used for adaptive cruise control. It prevents rear-end collisions by detecting the distance to a vehicle up ahead. For autonomous vehicles, radar can provide longer distance object detection than cameras and Lidar.”

“Radar doesn’t provide detail for object identification,” he acknowledged. But radar also provides better sensing at night and through fog and precipitation, he added.

Demler pointed to a statement Tesla issued to the media last Friday, which explained that its autopilot system “activates automatic emergency braking in response to any interruption of the ground plane in the path of the vehicle that cross-checks against a consistent radar signature.”

In that case, Demler said, “I suspect the radar missed the truck because it is focused too close to the ground and only saw the opening.”
posted by Existential Dread at 10:13 AM on July 7, 2016


The linked article has a drawing of the scene which I found helpful.
posted by Mitheral at 11:38 AM on July 7, 2016 [2 favorites]


Here's the intersection on street view.
posted by Mitheral at 12:18 PM on July 7, 2016 [1 favorite]




Silicon Valley-Driven Hype for Self-Driving Cars, a remarkably pessimistic op-ed in the NYTimes.
posted by Nelson at 9:19 AM on July 10, 2016 [1 favorite]



