normal operation
June 11, 2018 12:39 PM

 
TL;DR: Self-driving cars are not ready for prime time.
posted by grumpybear69 at 1:10 PM on June 11, 2018 [7 favorites]


Perfect time for Tesla to go all-in on self-driving technology.
posted by tobascodagama at 1:17 PM on June 11, 2018 [1 favorite]


My position that we’re not capable of doing this right as imperfect human beings living under an inhumane capitalist system only gets stronger.
posted by bleep at 1:27 PM on June 11, 2018 [8 favorites]


Having, quite literally, spent 20 minutes in group meeting this afternoon talking about driving as a problem of information acquisition, this fits right in. Yes, if you build an autonomous system out of disparate parts and then you have humans treat it as though it's an integrated system (e.g., lane departure warning and advanced cruise control, which are separate systems), you're going to have a bad time. You're also going to have a bad time if your more autonomous system (e.g., Super Cruise) doesn't understand that you've still got a human in the driver's seat. They're trying to engineer around the driver, which is cute (e.g., driver monitoring), but the road mapping approach is a real non-starter in a lot of ways.

It's interesting that they've mapped that much roadway down to 5 cm, and that their system is geolocked to mapped roadways only, but that makes it a pretty limited system in a lot of ways, and a very brittle one. How many roads really stay the same over time at the kind of scale they're describing? Overall stability, sure, but there are a lot of things that can change here, and a lot of potential for this system to fail. It's not a bad approach (and it's what I'd expect out of GM, in a lot of ways), but I'm not sure if it's a good approach long-term, particularly in terms of its scalability.

From where I sit, Super Cruise feels like a very reactive approach to mid-level automation in terms of design: we know drivers will want to look away from the road, so let's eyetrack them and figure out when they do it. Let's not trust that our autonomous driving system will be all that good at figuring out the environment, so let's map it beforehand. Admittedly, I'd rather have GM's approach, even if I think it's flawed, than Tesla's (which, originally, was "you can try to use this anywhere"), because it is more safety-conscious, but there's a large gap here, and it's the problem of the human. It's not enough for the vehicle to know what's in its environment; the human needs to know as well, because the vehicle will hand off control, and you'd like to do that safely (that, and the system may be brittle, and you really, really want to catch that).
posted by Making You Bored For Science at 1:38 PM on June 11, 2018 [8 favorites]


I have adaptive cruise control and the low-speed auto-braking, and while I had no illusions that my car would brake to avoid a stationary object at 70mph, it's interesting to read exactly why the low-speed auto-braking is low-speed-only.
posted by EndsOfInvention at 1:40 PM on June 11, 2018


I mean dude usually I would just suggest spinning off a nonsentient submind to watch for crashes but you do you
posted by GCU Sweet and Full of Grace at 1:54 PM on June 11, 2018 [25 favorites]


It's cool I'm backed up
posted by EndsOfInvention at 1:57 PM on June 11, 2018 [11 favorites]


These are scary systems. I can't tell if I am inappropriately pessimistic about both systems and humans, or if the folks building these are inappropriately optimistic.

Personally, I use cruise control for less than a minute at a time just to shift my seating position and wiggle my ankles around so I don't stiffen up while driving long distances. I'd kind of like a "give me a minute, I want to take my sweatshirt off over my head" button, though.
posted by rmd1023 at 1:58 PM on June 11, 2018 [1 favorite]


What will end Tesla first, its poor finances or the upcoming lawsuits?
posted by Foci for Analysis at 2:05 PM on June 11, 2018


I keep thinking about integrated flight decks and the training required to use them every time I read stories like this. During training, you spend some time learning what the system can do and a lot of time learning what the system can't do. You also learn the expected behavior for a number of scenarios, which doesn't seem to be the case for these driver assist technologies.

One area of commonality for most aircraft systems, though, is just what is mentioned in the article - give the operator the opportunity to take control. Sure, the plane can resolve a traffic alert on its own but it's going to be screaming at the flight crew for a good long while before it decides to take action. Having a car just... go off and do its own thing without providing any indication of what it's trying to do is a little frightening.
posted by backseatpilot at 2:34 PM on June 11, 2018 [2 favorites]


This was a really interesting read, speaking as someone who is pretty bullish on self-driving technologies (mostly because I am very, very skeptical of humans' ability to drive themselves safely, as borne out by decades of evidence).

I can see why those challenges in recognizing and safely addressing stationary objects at high speed are tough, and damn, that's frustrating - I have no insight on potential solutions, but I hope that engineers and ever-improving tech will elucidate new approaches before too long.
posted by mosst at 2:42 PM on June 11, 2018 [2 favorites]


Personally, I use cruise control for less than a minute at a time...

I can’t recall the last time I was on a highway that wasn’t too crowded to effectively use cruise control.
posted by Thorzdad at 2:42 PM on June 11, 2018 [2 favorites]


mostly because I am very, very skeptical of humans' ability to drive themselves safely, as borne out by decades of evidence

Based on the current state of all of our society & technology, I feel a lot better about letting people make their own decisions. Our attempts at letting technology make decisions for us aren't going too well.
posted by bleep at 2:47 PM on June 11, 2018


Cruise control is for when you're on the freeway and the highway patrol car/bike is right there next to you, just waiting for someone not using cruise control to leap out in front in road rage that everyone in front of them is driving at the speed limit and not 1 mile per hour more.
posted by linux at 2:48 PM on June 11, 2018 [4 favorites]


It seems like current automated driving systems are almost purely reactive. Until they involve multiple branches of predicted scenarios - which are part and parcel of being a good human driver - as well as a holistic environmental analysis that can, for example, differentiate between a stationary object that is not a hazard and a stationary object that is in the oncoming path of the vehicle, they are not going to be able to avoid these sorts of accidents. And an autonomous car depending on GPS map data is likely to be about as good as a human, which is to say some cars will end up in lakes.

Highways (and other cars) should be outfitted with transponders to help alleviate the need for lidar etc. to do basic object detection. It seems unlikely that any computerized driving system is going to be able to replace the human brain any time soon when it comes to information processing and decision making.
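
A minimal sketch of that transponder idea, with everything invented (message fields, ranges, thresholds; this is not any real V2X standard): a roadside or vehicle beacon broadcasts position and speed, and the receiving car checks whether the object sits in its path and how soon it would reach it.

```python
# Hypothetical beacon message and time-to-reach check.
from dataclasses import dataclass

@dataclass
class Beacon:
    x: float            # metres ahead along the road (relative)
    lane_offset: float  # metres from our lane centre
    speed: float        # m/s along the road; 0.0 for a stopped object

def time_to_reach(own_speed: float, beacon: Beacon) -> float | None:
    """Seconds until we close the gap, or None if we never do."""
    closing = own_speed - beacon.speed
    if closing <= 0:
        return None
    return beacon.x / closing

# A stalled car 150 m ahead in our lane while we travel at 30 m/s (~67 mph):
stalled = Beacon(x=150.0, lane_offset=0.2, speed=0.0)
ttc = time_to_reach(30.0, stalled)
if ttc is not None and abs(stalled.lane_offset) < 1.5 and ttc < 6.0:
    print(f"brake: closing in {ttc:.1f} s")  # -> brake: closing in 5.0 s
```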
posted by grumpybear69 at 2:51 PM on June 11, 2018


"challenges in recognizing and safely addressing stationary objects at high speed are tough"

No, this is a solved problem. Range gating, pulse gating, and ISAR are old technologies that are well known and proven. Same with LIDAR and sensor fusion of the various subsystems. There are no new hardware technologies here and there are none forthcoming.

The manufacturers have declined to use them due to cost and development time of the software. Much more profitable to sell what they have now and then claim that things are hard. Oopsie-Woopsie!
posted by pdoege at 2:51 PM on June 11, 2018 [5 favorites]


Riffing off backseatpilot's comment; aviation is (usually) the obvious place to go to look at operator/automation interactions, situational awareness and all of the associated bits and pieces, and it's where the driving research world has looked in the past. However, while there are superficial similarities between the two, they're very different worlds, and so the analogy only goes so far. Where does it really break down? Complexity, and proximity.

There are an amazing number of things that can go wrong on the road, and while we can train you for some of them, it's impractical to train you for all of them (ask me how I know this... the answer comes out of building a stimulus set of "things going wrong, but not going that wrong," because I don't want to show subjects the really ugly stuff without a reason). The other problem is that emerging problems on the road are fast and close; it's not "oh, that other aircraft is here and maybe that's a little closer than my radar wants"; it's "holy shit, that's a moose and he wants to say hi."

We usually term this "situational awareness," a concept that dates back to WWI; in driving, it dates back to the late 1980s (and the seminal paper is from 1995). But the driving literature here isn't great at looking at it mechanistically. We all agree that you need situational awareness to be a safe driver, but no one really studies how you get it, much less studies how you get it and tries to look for answers that wouldn't result in a bad case of dead. The largest problem here, broadly speaking, is that driving (compared to aviation) is close, which means your perception/action loop needs to be fast, which means that theories and models which assume the driver will fart around looking for what you think they need to know before they deliberate and act are probably wrong. If the world were going to wait patiently for the driver to figure things out, that might be OK. Except that the world doesn't wait for our brains to catch up.
posted by Making You Bored For Science at 2:53 PM on June 11, 2018 [4 favorites]


Perfect time for Tesla to go all-in on self-driving technology.

And updating nag-alert.
posted by Thorzdad at 2:58 PM on June 11, 2018 [1 favorite]


Cruise control is for when you're on the freeway and the highway patrol car/bike is right there next to you, just waiting for someone not using cruise control to leap out in front in road rage that everyone in front of them is driving at the speed limit and not 1 mile per hour more.

I set it about 7-8 mph above the speed limit, and I have never gotten a ticket that way
posted by thelonius at 2:59 PM on June 11, 2018 [2 favorites]


Based on the current state of all of our society & technology, I feel a lot better about letting people make their own decisions.

Indeed. Driving kills more people globally than all forms of cancer except lung (and other respiratory system) cancers. It's 29% of deaths due to injury.

This doesn't feel like something where we can say that the system's working. It deserves proper epidemiological research, and proper regulatory enforcement. Because machines can do it better, and we should try to work out how to make that happen.
posted by ambrosen at 3:00 PM on June 11, 2018 [6 favorites]


I just think that the idea that machines can do this particular thing better hasn't been proven yet. I'm happy to be wrong, but if we can't do it without losing human lives in the process, I don't think it's worth it. There was no reason for the death mentioned in the article to occur, and especially no reason for this to cause innocent deaths. If we could get this done without killing anyone I'd be all for it.
posted by bleep at 3:03 PM on June 11, 2018 [3 favorites]


Designers assumed it would still be the job of the human driver to pay attention to the road and intervene if there was an obstacle directly in the roadway.

Hahahaha never assume anything positive about humans.

It's cool I'm backed up

Wait till they figure out how to crash you and convince you to destroy your stack before the backup so you think you were murdered by the car.

During training, you spend some time learning what the system can do and a lot of time learning what the system can't do.

This is one thing that I think should be required for buying cars with these systems. There should be at least a simulator where you show the dealership you can use these systems and understand their limitations.

I was excited to have a car with adaptive cruise and lane deviation alerts, and it was a great novelty until I realized that despite all my years of driving, the temptation to let them drive for me was very high and very risky. Once I realized that, I had to do real-time tests of what it can do and what it can't. The one thing I can't and won't test is how my automatic braking system responds to stationary things at high speeds; it's probably not feasible to test, and I would never want to rely on it anyway.

But having used the system in my car extensively, I was more confident to test out other cars' systems. One thing I've noticed is that while mine (a Subaru) will actually slow down to a full stop when it encounters heavy traffic, others (like Toyota) will disengage cruise control if the car slows down to less than 25 mph, and then you yourself have to slow it down to a complete stop; there's a toy version of that difference sketched below. So that illustrates the point that even when you have training with one system, a similar one might need some additional training. I would rate myself a confident driver with decent situational awareness (borne out of many past mistakes), and I don't want to imagine anyone with less experience even attempting to fully rely on these systems. That would be disastrous.
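
The behavioural difference sketched as a toy dispatch, using the 25 mph Toyota cutoff described above; everything else is invented, so check the actual manual rather than this sketch.

```python
# Two adaptive-cruise styles: one tracks the lead car to a full stop,
# the other silently hands the last bit of braking back to the driver.
def acc_update(style: str, lead_speed_mph: float) -> str:
    if style == "disengage_low_speed" and lead_speed_mph < 25:
        return "ACC OFF: driver must brake"            # Toyota-style cutoff
    return f"follow lead at {lead_speed_mph:.0f} mph"  # stop-and-go style

print(acc_update("stop_and_go", 5))          # follow lead at 5 mph
print(acc_update("disengage_low_speed", 5))  # ACC OFF: driver must brake
```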
posted by numaner at 3:04 PM on June 11, 2018 [4 favorites]


it's "pretty much universal" that "vehicles are programmed to ignore stationary objects at higher speeds."

I'd not heard of this design rationale before; it's, umm, interesting.

Maybe the Tesla software thought it was overtaking on the left, and did not recognize that it was actually heading towards a lane divider for an exit? Although presumably this exit layout should have been on the GPS.
posted by carter at 3:21 PM on June 11, 2018


Why not just put the cars on rails?
posted by littlejohnnyjewel at 3:31 PM on June 11, 2018 [9 favorites]


Hey everyone: it seems like cars are bad.
posted by latkes at 3:48 PM on June 11, 2018 [9 favorites]


Here's my hubristic proclamation: we'll have safe and effective self-driving cars the moment that we have cars that fly, and not a moment sooner.
posted by tobascodagama at 3:57 PM on June 11, 2018 [3 favorites]


It’d be more accurate to say that their hardware forces them to program the vehicle to ignore stationary objects, since that’s the only way to make it work reliably at all. (As the article points out, occasionally performing an emergency stop on a false positive in highway traffic would not be preferable.) That’ll improve, but radar and cameras just aren’t good enough for self-driving, pending a huge breakthrough in computer vision.

Lidar-based systems (like Waymo) are going to be the exception there.
posted by emmalemma at 4:00 PM on June 11, 2018


I just think that the idea that machines can do this particular thing better hasn't been proven yet.

Waymo (Google) has 7 million road miles without a serious accident.

The bus around here is a couple bucks, Uber a little over double that. When you can get a ride for less than the bus quicker than a taxi, which will you choose?

The main thing that's holding back the revolution is tooling/cost of sensors. That will change soon. In 20 years they'll be removing most stop lights; no more speeding tickets; many social conventions will be adjusted (whatever happened to payphones?). What happens to the trucking industry when the new guy charges half the cost for twice as fast delivery? Google's Early Rider Program has been running without safety drivers for months. Not Uber, maybe not Tesla, but SDCs are here.
posted by sammyo at 4:04 PM on June 11, 2018 [3 favorites]


The Tesla autodrive is at its best in a crowded, slow-moving freeway traffic jam. No more "brake, drive, brake, drive" etc. I dislike cruise control, but love the autodrive.
posted by Windopaene at 4:15 PM on June 11, 2018


I'm trying to popularize the conspiracy theory that self-driving car technology has been perfected but our tech overlords in Silicon Valley will only allow it to trickle out as it suits their needs. Like how people believe that Detroit invented cars that get 150 mpg but the oil companies won't allow them to be sold, I'm pitching the idea that our tech overlords rely on driving to remain dangerous because they have other priorities. That the real synergy of surveillance capitalism and self-driving cars is to identify suitable organ transplant donors to selectively harvest so people like Peter Thiel can achieve immortality. Why else would the hospital named after Zuckerberg host one of the world's premier organ transplant centers?

It seems like an easier way to get people to think twice about the wisdom of putting my life at risk because they think their Tesla autopilot is smart enough to allow them to read a book while driving. And because almost nobody wants to have this discussion about the limits of the self-driving technology stack and its interaction with human drivers plus the lack of a robust legal and regulatory framework.
posted by peeedro at 4:22 PM on June 11, 2018 [3 favorites]


The Tesla system seems to be actually attracted to those road divider end points you get when a freeway branches. I was wondering if the lines of the up pointing chevrons on the barrier end point get scored mistakenly as lane markings converging in the distance. The lane keeping system is programmed to look for a line on each side, converging in the distance, and to aim for the convergence point.
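
A minimal sketch of that failure mode, with invented numbers: a lane keeper that fits one line per side and aims at their intersection has no way to tell real lane edges from the painted chevron edges on a gore-point barrier, which also converge ahead of the car.

```python
# Intersect two detected lines, each given as (slope, intercept) in
# image coordinates, to find the convergence point the lane keeper
# steers toward.
def vanishing_point(left, right):
    (m1, b1), (m2, b2) = left, right
    x = (b2 - b1) / (m1 - m2)  # x where the two lines cross
    return x, m1 * x + b1

# Real lane edges converging at the horizon:
print(vanishing_point((0.5, 0.0), (-0.5, 100.0)))    # (100.0, 50.0)

# Chevron stripes on a divider produce the same geometry, so a
# line-pair model happily "finds a lane" centred on the barrier:
print(vanishing_point((0.45, 5.0), (-0.55, 105.0)))  # (100.0, 50.0)
```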
posted by w0mbat at 4:28 PM on June 11, 2018 [4 favorites]


In 20 years they'll be removing most stop lights,

Forty years maybe. Traffic lights will have to be around until there's nearly no human drivers left and I don't see that happening in just twenty years. There will be people that keep driving because they like to drive or don't trust the computer. There are people that won't be able to afford to upgrade to a driver-less car for a long time.
posted by drezdn at 5:53 PM on June 11, 2018 [6 favorites]


I like the driver aids on my current vehicle, which seem more focused on education / training / reminding me to be a better driver, rather than replacing me as a driver.

For example, reminding me what a "safe" following distance is at highway speeds (green / yellow / red depending on how close I am following; sketched at the end of this comment)

Reminding me to stay in lane when I start to stray.

Warning me when I try to execute an unsafe lane change and there's a vehicle in my blind spot.

Even showing me precisely how to get a perfect parallel park done, 2-3 inches from the curb. Sure, I learned from other people and by trial and error, but watching a machine take control of the wheel and precisely slot the car into the space in one go was pretty educational...

It's the same technology, but with a change of perspective: instead of doing our job for us, it's there to educate and train us to achieve a higher level of driving proficiency, and to step in in emergencies.
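
The following-distance aid from the top of this comment, sketched as a simple time-headway check; the cutoffs are assumptions, since real systems vary by manufacturer.

```python
# Colour-code the gap to the car ahead by time headway (gap / speed).
def headway_color(gap_m: float, speed_ms: float) -> str:
    headway_s = gap_m / speed_ms if speed_ms > 0 else float("inf")
    if headway_s >= 2.0:
        return "green"   # comfortable margin
    if headway_s >= 1.0:
        return "yellow"  # closing in
    return "red"         # too close to react at this speed

for gap in (70, 40, 20):  # metres of gap at 30 m/s (~67 mph)
    print(gap, headway_color(gap, 30.0))  # 70 green, 40 yellow, 20 red
```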
posted by xdvesper at 6:01 PM on June 11, 2018 [8 favorites]


If auto-braking makes things safe, and auto-braking cannot be enabled at speed for $REASONS, then change the problem: reduce the speeds.

Fewer deaths, fewer GHGs, much easier problem for automated vehicles to handle, less energy required overall so electric vehicles perform better. Do that and maybe advanced human civilization survives. (previously, and same).
posted by ecco at 6:12 PM on June 11, 2018


In 20 years they'll be removing most stop lights...

If the northern suburbs of Indianapolis are any indication, it might be sooner. They’re all going insane for roundabouts up there. Every week it seems yet another intersection is being rebuilt as a roundabout. Now, I see the utility in them, but it gets to be ridiculous when you have to navigate four or five roundabouts within a half-mile stretch of road. But, hey, I guess you save on not having to run stoplights.
posted by Thorzdad at 6:39 PM on June 11, 2018


I still believe the right way to make "autonomous" vehicles part of the automotive ecosystem is to have them be a backup system for the driver, rather than the other way around. We have had traction control of various stripes for quite a long time, and when someone *still* manages to overdrive it and crashes, they have no one to blame but themselves...yet so many accidents are prevented by traction control, and we don't talk about it, because a lack of an accident is a non-event.

Or put another way: I don't want a car to drive me home, where I have to step in if something goes wrong. I want a car that I drive home, but if it senses something's wrong it throws some signals to see if I'm paying attention, and if I don't respond within a few seconds, uses its "autonomy" to slowly pull me over to the side of the road and stop. If that were the default reaction to hands-off-the-wheel, looking-at-my-phone behavior, we wouldn't have these unrealistic expectations, and we'd be grateful when the car intervened on our behalf instead of outraged when it didn't.
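
That fallback flow, as a bare state machine with invented timings:

```python
# Escalate from normal driving to alerts, and only pull over when the
# driver stays unresponsive past a grace period.
import enum

class State(enum.Enum):
    DRIVER_IN_CONTROL = 1
    ALERTING = 2
    PULLING_OVER = 3

def step(state: State, anomaly: bool, driver_responded: bool,
         seconds_alerting: float) -> State:
    if state is State.DRIVER_IN_CONTROL and anomaly:
        return State.ALERTING               # chime, flash, vibrate
    if state is State.ALERTING:
        if driver_responded:
            return State.DRIVER_IN_CONTROL  # human takes back over
        if seconds_alerting > 5.0:          # assumed grace period
            return State.PULLING_OVER       # autonomy as the backup
    return state

s = State.DRIVER_IN_CONTROL
s = step(s, anomaly=True, driver_responded=False, seconds_alerting=0.0)
s = step(s, anomaly=True, driver_responded=False, seconds_alerting=6.0)
print(s)  # State.PULLING_OVER
```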
posted by davejay at 7:13 PM on June 11, 2018 [4 favorites]


During Elon Musk's recent twitter weirdness someone tweeted back to him about how he builds "Self-Crashing Cars" and now I can't really think of them any other way.
posted by the duck by the oboe at 8:57 PM on June 11, 2018 [7 favorites]


Oh hey, it's my job.

So stationary objects in radar are a pain, mainly because of how long it takes to figure out if they're something you'll actually run into. So if you're moving at 70 mph (approximately 30 m/s) and a standard range for these radars is about 180 m, you've got at most 3-4 seconds to figure out what kind of object it is and what to do (as you normally want to alert the driver about 2 seconds in advance). Thing is, most detections that far away are weaker, and you don't really get a sense of whether they're actual obstacles until about 100 m, if you're lucky (and as mentioned above, this is mostly because companies want cheap sensors that are *good enough* rather than the absolute best). Overhead signs look the same as a car until you're close to them, and sharp turns make anything on the side of the road look like it's in your path (there's a huge problem with exit dividers, since the radar doesn't know where you're going and that guardrail sure looks like something you can run into). Because these were built as driver assistance and not for autonomous driving, there is a ton of pressure to avoid any false positive reactions, as you never want to cause an accident.
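
Working through those numbers as a rough time budget (the 180 m and 100 m ranges and the 2-second warning lead are all from the paragraph above):

```python
MPH_TO_MS = 0.44704

speed = 70 * MPH_TO_MS   # ~31.3 m/s
max_range = 180.0        # m, typical radar detection range
reliable_range = 100.0   # m, where detections firm up
warn_lead = 2.0          # s, desired driver warning lead time

print(f"first detection to impact: {max_range / speed:.1f} s")              # ~5.8 s
print(f"minus the warning lead:    {max_range / speed - warn_lead:.1f} s")  # ~3.8 s, the '3-4 seconds'
print(f"reliable track to impact:  {reliable_range / speed:.1f} s")         # ~3.2 s
```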

People are starting to look into doing more and more sensor fusion, though, bringing all these disparate systems together, but since they were developed independently of each other, it's extra work that would have been avoided if the system had been designed as a single self-driving solution rather than smaller packages to sell as driver assistance functions.

It is terrifying to me that companies market these technologies as something they are not, because when you don't know enough, they look self-driving in certain scenarios. When customers don't know the actual capabilities of the tech and are led to believe it can do more than what it's designed for, it can absolutely lead to people being killed; you know, the thing we actively want to prevent.
posted by tealNoise at 9:43 PM on June 11, 2018 [13 favorites]


The Tesla autodrive is at its best in a crowded, slow-moving freeway traffic jam. No more "brake, drive, brake, drive" etc.

That's actually also my favorite thing about the Subaru adaptive cruise control. I can set it to 55 mph when I know I'm hitting the daily stop-and-go traffic, because I can just have an easy drive and keep the car in the lane (the lane deviation sensor for mine is just an alert; it doesn't actually steer the wheel as well) while I'm checking emails or something. It can come to a complete stop and get going again if the stoppage is less than 2 seconds. I've very rarely had to reengage the cruise control.

As I said above, unless Toyota fixed it, do not do that with their adaptive cruise control, since it will just disengage when you drop down below 25 mph.
posted by numaner at 10:55 PM on June 11, 2018


@tealNoise: what kind of progress has there been with video processing, in terms of taking a stereoscopic video feed and identifying objects and their various positions and trajectories? Is that even a thing?
posted by grumpybear69 at 9:36 AM on June 12, 2018


what kind of progress has there been with video processing, in terms of taking a stereoscopic video feed and identifying objects and their various positions and trajectories? Is that even a thing?

So I don't have much involvement with camera work, but I'll try to answer this. Right now, everyone is pretty much just using a mono camera, as they're cheaper and the advantages of stereo aren't great enough for there to be a large push to adopt them (or rather, most of the industry is using whatever Mobileye is selling, because no other supplier can really match their performance right now).
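
For the stereo half of the question, the standard pinhole relation is depth = focal length × baseline / disparity; the numbers below are made up but typical of automotive cameras, and they show why stereo gets shaky at range, where a single pixel of disparity error can swamp the estimate.

```python
# Depth from stereo disparity: Z = f * B / d, with focal length in
# pixels, baseline in metres, disparity in pixels.
def depth_from_disparity(f_px: float, baseline_m: float, disparity_px: float) -> float:
    return f_px * baseline_m / disparity_px

f_px, baseline = 1000.0, 0.3  # assumed: 1000 px focal length, 30 cm baseline
for d in (30.0, 3.0, 2.0):
    print(f"disparity {d:4.1f} px -> depth {depth_from_disparity(f_px, baseline, d):6.1f} m")
# 30 px -> 10 m; 3 px -> 100 m; 2 px -> 150 m
# (near 100 m, a 1 px disparity error moves the estimate by tens of metres)
```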

You can do camera tracking, although it's sensitive to the calibration of the camera, and if you want to figure out trajectories and positions, radar and lidar are way more accurate (and can work in bad weather and different lighting conditions), so the big thing everyone is going after is sensor fusion. You can get a rough estimate of where an object is from the camera and compare it to what the other sensors are seeing to doubly confirm that there's something there and that the kinematics are right (most emergency braking functions actually require confirmation of an obstacle in front of you from both radar and camera before doing anything).
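
A toy version of that dual-confirmation gate, with invented track structures and thresholds: emergency braking fires only when radar and camera both report a confident obstacle in roughly the same place.

```python
from dataclasses import dataclass

@dataclass
class Track:
    range_m: float
    bearing_deg: float
    confidence: float  # 0..1 per-sensor detection confidence

def aeb_confirmed(radar: Track | None, camera: Track | None,
                  max_gap_m: float = 5.0, max_bearing_deg: float = 2.0) -> bool:
    """True only if both sensors agree on a confident, co-located obstacle."""
    if radar is None or camera is None:
        return False
    if radar.confidence < 0.7 or camera.confidence < 0.7:
        return False
    return (abs(radar.range_m - camera.range_m) <= max_gap_m
            and abs(radar.bearing_deg - camera.bearing_deg) <= max_bearing_deg)

# A strong radar return but an unsure camera (say, an overhead sign):
# no braking, by design.
print(aeb_confirmed(Track(80.0, 0.1, 0.9), Track(82.0, 0.3, 0.4)))  # False
print(aeb_confirmed(Track(80.0, 0.1, 0.9), Track(82.0, 0.3, 0.9)))  # True
```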

I do know that they definitely keep more of this kind of stuff under wraps, even internally, so there might be more efforts that I'm not seeing (the stuff I see often is lane detection, sign recognition, and some pedestrian detection). I do know that initially Elon just wanted to use pure camera for everything in his cars, but it's just not technically feasible yet, which is why he has extra sensors (although I'm sure they're actively working on realizing that goal, seeing how they ended their deal with Mobileye and are developing a bunch of camera stuff in-house).
posted by tealNoise at 4:24 PM on June 12, 2018 [2 favorites]


As an involuntary pedestrian/cyclist beta-tester in Tesla's self-driving car experiments, I think I am entitled to financial compensation.
posted by littlejohnnyjewel at 6:58 PM on June 12, 2018 [2 favorites]


$deity can you imagine the headline when one of these not ready for prime time systems encounters a stopped school bus on a rural highway? And I really want to see success stories from this technology operating some place with serious winter.

Thorzdad: "I can’t recall the last time I was on a highway that wasn’t too crowded to effectively use cruise control."

On the other hand, on my last 3-hour commute I encountered maybe a dozen cars. But also six deer (one with fawns) and maybe the same number of rocks large enough to require action. My new-to-me car doesn't have cruise control and I really miss it.

sammyo: "Waymo (Google) has 7 million road miles without a serious accident."

Most of those miles at 25mph or less.

drezdn: "Forty years maybe. Traffic lights will have to be around until there's nearly no human drivers left and I don't see that happening in just twenty years. There will be people that keep driving because they like to drive or don't trust the computer. There are people that won't be able to afford to upgrade to a driver-less car for a long time."

Also, we are unlikely to see self-driving bikes or pedestrians anytime soon.
posted by Mitheral at 8:53 AM on June 13, 2018 [3 favorites]


Not until 2011 at least.
posted by sammyo at 7:33 AM on June 15, 2018


I wonder what an automated system would have done in my recent accident. I was driving at 70 mph in the right lane on a freeway. A semi was on the shoulder and pulled in front of me. I saw my choices as 1) brake, but won't have enough stopping room, so hit the rear of the semi; or 2) pull into the other lane and cross my fingers. I chose #2 after quickly glancing in the side mirror; unfortunately there was a car in my blind spot that I swiped. No injuries and only superficial damage, but from what the article says, I guess an advanced braking system wouldn't kick in, so I'd be severely injured or dead.
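
For scale, a back-of-the-envelope stopping-distance estimate under assumed values (0.8 g of braking on dry pavement, 1.5 s of perception-reaction time) backs up the "won't have enough stopping room" judgment:

```python
MPH_TO_MS = 0.44704
v = 70 * MPH_TO_MS       # ~31.3 m/s
reaction_s = 1.5         # assumed perception-reaction time
decel = 0.8 * 9.81       # assumed braking deceleration, m/s^2

reaction_dist = v * reaction_s     # ~47 m travelled before the brakes bite
braking_dist = v**2 / (2 * decel)  # ~62 m of hard braking
print(f"total stopping distance: {reaction_dist + braking_dist:.0f} m")  # ~109 m
```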
posted by AFABulous at 2:24 PM on June 17, 2018


Fatalities vs False Positives
In one bad week in March, two people were indirectly killed by automated driving systems. A Tesla vehicle drove into a barrier, killing its driver, and an Uber vehicle hit and killed a pedestrian crossing the street. The National Transportation Safety Board’s preliminary reports on both accidents came out recently, and these bring us as close as we’re going to get to a definitive view of what actually happened. What can we learn from these two crashes?

There is one outstanding factor that makes these two crashes look different on the surface: Tesla’s algorithm misidentified a lane split and actively accelerated into the barrier, while the Uber system eventually correctly identified the cyclist crossing the street and probably had time to stop, but it was disabled. You might say that if the Tesla driver died from trusting the system too much, the Uber fatality arose from trusting the system too little.

But you’d be wrong. The forward-facing radar in the Tesla should have prevented the accident by seeing the barrier and slamming on the brakes, but the Tesla algorithm places more weight on the cameras than the radar. Why? For exactly the same reason that the Uber emergency-braking system was turned off: there are “too many” false positives and the result is that far too often the cars brake needlessly under normal driving circumstances.

The crux of self-driving at the moment is precisely figuring out when to slam on the brakes and when not to. Brake too often, and the passengers are annoyed or the car gets rear-ended. Brake too infrequently, and the consequences can be worse. Indeed, this is the central problem of autonomous vehicle safety, and neither Tesla nor Uber has it figured out yet.
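
That tradeoff, as a bare expected-cost threshold with made-up costs: brake when the probability of a real obstacle is high enough that braking beats ignoring. Asymmetric costs move the trigger point, which is the whole false-positive fight.

```python
def should_brake(p_obstacle: float,
                 cost_false_brake: float = 1.0,      # annoyance, rear-end risk
                 cost_missed_obstacle: float = 500.0) -> bool:
    expected_if_brake = (1 - p_obstacle) * cost_false_brake
    expected_if_ignore = p_obstacle * cost_missed_obstacle
    return expected_if_brake < expected_if_ignore

# With these costs the break-even probability is ~0.2% (1/501):
print(should_brake(0.001))  # False: ignore, and hope it's an overpass
print(should_brake(0.005))  # True: brake
```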
posted by the man of twists and turns at 8:00 AM on June 19, 2018 [3 favorites]



