internet of things that screw you over
October 4, 2017 9:53 AM

Cory Doctorow writes for Locus: The Demon-Haunted World
Dieselgate was a precursor to a new kind of cheating: cheating the independent investigator, rather than the government. Imagine that the next Dieselgate doesn’t attempt to trick the almighty pollution regulator (who has the power to visit billions in fines upon the cheater): instead, it tries to trick the reviewers, attempting to determine if it’s landed on a Car and Driver test-lot, and then switching into a high-pollution, high-fuel-efficiency mode. The rest of the time, it switches back to its default state: polluting less, burning more diesel. This is already happening.

Bruce Schneier: Volkswagen and Cheating Software
Most of them don't affect normal operations, which is why your software generally works just fine. Some of them do, which is why your software occasionally fails, and needs constant updates. By making cheating software appear to be a programming mistake, the cheating looks like an accident. And, unfortunately, this type of deniable cheating is easier than people think.

Computer-security experts believe that intelligence agencies have been doing this sort of thing for years, both with the consent of the software developers and surreptitiously.

This problem won't be solved through computer security as we normally think of it. Conventional computer security is designed to prevent outside hackers from breaking into your computers and networks. The car analogue would be security software that prevented an owner from tweaking his own engine to run faster but in the process emit more pollutants. What we need to contend with is a very different threat: malfeasance programmed in at the design stage.
posted by the man of twists and turns (21 comments total) 24 users marked this as a favorite
 
Ethics and morals in software design.
posted by ZeusHumms at 9:56 AM on October 4 [4 favorites]


...voting software...
posted by amtho at 10:05 AM on October 4 [3 favorites]


And a regular component of the Underhanded C Contest
posted by k5.user at 10:27 AM on October 4 [3 favorites]


This is both incredibly interesting and incredibly depressing.
posted by corb at 10:43 AM on October 4 [1 favorite]


Yeah, this type of thing has been on the radar of the security community for a long time. Back in 2003, someone tried to introduce a backdoor into the Linux kernel, masquerading as a typo. It was caught and reverted very quickly, mainly because it was snuck into the codebase by breaking into the source code repository.

In that case, the method by which the bug was introduced made it impossible to maintain plausible deniability. Also, the code change itself was pretty suspicious (it was triggered by a combination of flags that would normally never happen). But if a subtler bug had been made by a real developer — perhaps through bribery or blackmail — it might have gone undetected for a lot longer. And after enough time had passed, it could be explained away as an unfortunate mistake.

Obviously, this kind of thing is much easier to get away with (a) in a closed-source development model, and (b) when the project's management is incentivized toward complicity.
posted by teraflop at 11:11 AM on October 4 [4 favorites]


Side note about software design, morals and ethics:

I returned a leased vehicle that had a companion app for locating the car, honking the horn, unlocking the doors and turning on the climate control remotely. I tried to "disconnect" my email address from the car, but was unable to find a mechanism to do so.

Many weeks later, I received a notice that "my" car's battery was low. Out of curiosity, I used the app to locate it (several miles away, in a holding lot), drove over and tried to honk the horn via app. Sure enough, it worked (oh, hey, there's "my" car.) I have no doubt I could have used the other features, including door unlocking.

The car was eventually sold to someone in Arizona (guess how I know this), and I began to receive email updates about the car's status. Out of curiosity, I told the app to honk the horn from my home in California, and sure enough the app registered success. I've uninstalled the app, but I could install it again at any point...and I'm still getting email notices to this day.

The point being that in this case there's no reason to think this was an intentional or immoral choice by the software developers; it was probably a simple oversight or a de-prioritized item on a list. Nevertheless, an accidental but still very real security/privacy risk now exists for the second-hand owners of these cars, which underscores that oversight has to cover not just intentional malfeasance but also inadequate engineering diligence.
posted by davejay at 11:16 AM on October 4 [32 favorites]


Reflections on Trusting Trust, Ken Thompson, 1984.
posted by ardgedee at 11:40 AM on October 4 [2 favorites]


I really appreciate that the Doctorow article gave me a much clearer understanding of stuff that had previously read as very technical and difficult to understand. It also provided specifics on what I should want, regulatory-wise.
posted by latkes at 11:47 AM on October 4 [2 favorites]


On a somewhat parallel topic, I really wish there was some way of forcing manufacturers to stop adding shovelware and forcing software updates on limited capacity items such as tablets.

(Especially on things like my tablet, where I get nagged to update to the latest version of the shovelware, which does NOT run on the current version of the OS available for said tablet.)
posted by Samizdata at 12:46 PM on October 4 [3 favorites]


I've wondered about connecting my iPhone to a rental car via Bluetooth to play music - is that opening up a vulnerability too?
posted by gottabefunky at 1:00 PM on October 4


is that opening up a vulnerability too?

Assume everything is a vulnerability. Every account has been compromised, every email stolen, everything you put on the internet kept in a file somewhere with your name on it.

It's safer than putting confidence in the bullshit house of cards that is our modern economy and the systems that underpin it.

Why, Yahoo just announced yesterday that rather than the billion-odd accounts they'd previously announced were compromised back in 20fucking13, it is in fact all of them. All 3 billion. Every single one.

Use these systems, but don't trust them. Expect them to disappoint you, and understand that there is nothing you can do to prevent it and still live a "normal" life.

This is how we live now.
posted by turntraitor at 1:08 PM on October 4 [17 favorites]


I've wondered about connecting my iPhone to a rental car via Bluetooth to play music - is that opening up a vulnerability too?


I wouldn't do it. I don't rent cars very often, but I've seen way too many rentals with names and phone numbers loaded into the car from the last person's phone.
posted by Nonsteroidal Anti-Inflammatory Drug at 1:20 PM on October 4 [7 favorites]


I was peripherally involved in developing benchmark software for PCs and various subsystems in the 90s, about the time that video cards became graphics accelerators, on their way to becoming GPUs. The benchmarks became one of the standard stick-the-score-on-the-box metrics for cards, and shortly afterwards we found the first drivers that had 'optimisations' for the benchmarks.

Anomalous results stick out, especially to people who are doing this for a living, and there was quite a skirmish for a while - the card makers got told to send in new drivers and if they did that again they'd be written about, and in the end the benchmarks were close enough to (ie, used enough actual examples of) real application/gaming code that if the makers wanted to optimise for it, then it was probably going to benefit users IRL. There was a time, though, when it was a bit of an arms race.

So I think the answer to at least some of the cheating by design problems is to test what you fly, fly what you test - simulate reality as closely as you can during evaluation. These days, I'd probably be thinking along the lines of encouraging third-party instrumentation to gather actual real-world data from everyone and report back - this brings in its own design and security issues, of course, but you run the risk/reward equations and decide what it is you're most scared of, you apply proper design methodologies, and you go for it.
posted by Devonian at 1:31 PM on October 4 [2 favorites]


Ethics training for engineers is terrible. Not only is it basically an afterthought in most programs, it's extremely tightly focussed on preventing accidents.

Which is an important thing for safety, of course. Nobody wants their space shuttle to explode or their bridge to fall down. Obviously.

And then you spend a little time on whistleblowing, because if your space shuttle might explode or your bridge might fall down, then people should know about it even if your employer doesn't want them to. Obviously.

But nobody talks about things that aren't directly life-threatening. At least not in any of the ethics courses I had to take.

Maybe the VW thing comes up in the classes now, I dunno. I hope so. But are we teaching young engineers that it was wrong because it was a bad thing for society? Or are we teaching them it was wrong because it was illegal?

And more than that, we didn't talk about the ethics of collecting data on our users. We didn't talk about the ethics of "disrupting" housing markets by making it easy for people to turn single-family homes and condos into hotels. We didn't talk about the ethics of making a bump-fire stock that lets people fire a semi-automatic gun almost as fast as a fully-automatic gun.

There's just too fucking much we don't talk about.
posted by tobascodagama at 2:10 PM on October 4 [5 favorites]


That's not surprising but it is disturbing. My daughter goes to a fairly prestigious STEM high school and they offer little to no ethics curriculum either.
posted by latkes at 3:17 PM on October 4


And freaks me out (well, a little anyways) every time my phone apps ask for update... You work already, what you doin? Spying on me better? Fuckin yourself up, removing some functionality?
posted by Meatbomb at 7:32 PM on October 4 [4 favorites]


And freaks me out (well, a little anyways) every time my phone apps ask for update... You work already, what you doin? Spying on me better? Fuckin yourself up, removing some functionality?

Or, you know, bug fixes?
posted by Samizdata at 8:49 PM on October 4 [1 favorite]


I make sure I can delete the previous person's info from the rental car (and in one case, the apple tv provided by the hotel) before I put my own info in. And it goes on the checklist for "before turn-in (/checkout)" Also I assume anything I do is logged for whatever purpose they want. I totally expect, "Oh hi, mr MF. How was your night? [glances at computer screen] Enjoy your movie?"
posted by ctmf at 10:07 PM on October 4


(same when I use the hotel's router without using a vpn)
(and maybe even then)
posted by ctmf at 10:08 PM on October 4 [1 favorite]


Or, you know, bug fixes?

Yeah that's what they want you to think!
posted by Meatbomb at 10:19 PM on October 4 [3 favorites]


I seem to remember Grace Murray Hopper telling a story about a Navy program for validating COBOL compilers and a COBOL compiler that was only able to compile the program the Navy used for validation.
posted by Obscure Reference at 6:07 AM on October 5 [4 favorites]




This thread has been archived and is closed to new comments