Everything is broken
May 21, 2014 9:52 AM

Everything is broken: "Next time you think your grandma is uncool, give her credit for her time helping dangerous Russian criminals extort money from offshore casinos with DDoS attacks."
Quinn Norton [previously] breaks down the reasons why computers are so hackable by exploring the realities of how software is made and used.
posted by dobie (63 comments total) 30 users marked this as a favorite
 
Of course it's all broken. Have you SEEN the people* they let write this stuff?

* Disclaimer: I write this stuff
posted by blue_beetle at 9:59 AM on May 21, 2014 [24 favorites]


Was discussing this in another place...

If everything is broken, how come it works? By the article's measure, our own biochemistry is a mishmash of evolutionary detritus, all mixed up and only kinda working, and vulnerable to all sorts of internal and external factors that can bring it down. Yet, here we are... we could be wiped out tomorrow by a Horrible Plague, and by heavens we've had a few, yet here we are, the inheritors of billions of years of slapdash haphazard shit that wasn't even put together with any sort of a plan.

So, yes. There is bad software. There's a lot of bad software. It is dangerous. Nevertheless, my experience of IT, and yours, and Quinn's, has improved and improved and improved, even as we ladle in all this unplanned networking and code written by amphetamine-crazed anteaters and the Great Mass of Criminality intent on draining our lifeblood out through our cable modems.

Moreover, the new spirit of automating the testing of software components, instrumenting code and, y'know, applying some engineering and industrial nous, is rather heartening.

Things are not that bleak.
posted by Devonian at 10:17 AM on May 21, 2014 [6 favorites]


I'm a functional software tester.

There is no better career for making the decision to pare down your digital life to near-ascetic proportions. You get to see all the craziness. Business side, dev side, support side, marketing side...

But by all means, more articles like this, please. My career and salary will thank you.
posted by fraula at 10:18 AM on May 21, 2014 [13 favorites]


"helping dangerous Russian criminals extort money from offshore casinos..."

from the purely hypothetical perspective, how would one go about getting a piece of this action without having to do any work?
posted by bruce at 10:24 AM on May 21, 2014 [4 favorites]


The author's bio gave me a laugh.
A journalist of Hackers, Bodies, Technologies and Internets. "Useless in terms of… tactical details" -Stratfor
posted by ChurchHatesTucker at 10:25 AM on May 21, 2014 [1 favorite]


If everything is broken, how come it works?

I think the title is kind of shorthand for "Everything is easily breakable if you attempt to do even mildly off-nominal things, accidentally or intentionally".
posted by kiltedtaco at 10:26 AM on May 21, 2014 [2 favorites]


how would one go about getting a piece of this action without having to do any work?

How do you feel about getting onto an offshore casino boat with condoms full of malware code hidden in your intestines?
posted by Bunny Ultramod at 10:27 AM on May 21, 2014 [2 favorites]


There are meh, not-so-terrible 0days, there are very bad 0days, and there are catastrophic 0days that hand the keys to the house to whomever strolls by.

Odays
posted by XMLicious at 10:31 AM on May 21, 2014 [2 favorites]


0days.
posted by Bunny Ultramod at 10:34 AM on May 21, 2014 [3 favorites]


It doesn't have to be this way. Computers have been made secure before, and ours can be too... but we have to be aware of it, seek it out, and actually use systems that are secure.

I estimate I'll have a secure computer in 5 years. I estimate 10-15 more years until this happens on a wide scale.
posted by MikeWarot at 10:35 AM on May 21, 2014


If everything is broken, how come it works?
My answer used to be, "because most people with the skills to break it find using the pieces for a spam botnet more profitable than lulz". Today I suppose you can replace "spam botnet" with "NSA trawling network", but the same principle applies. Unless you're a special target, anyone who 0wns your computer probably wants to use it for something and so doesn't want you to think it's broken enough to unplug/reformat/reinstall.
posted by roystgnr at 10:36 AM on May 21, 2014 [5 favorites]


> Look at it this way--every time you get a security update (seems almost daily on my Linux box), whatever is getting
> updated has been broken, lying there vulnerable, for who-knows-how-long. Sometimes days, sometimes years.

Made a similar point on another site a few days ago:

Think about it this way: XP was released to the public in October of 2001 and Microsoft was still releasing security patches up until a couple days ago. That has to mean one of two things: either those recently plugged holes have been in the system since 2001, lying around undiscovered and unexploited (heh, as far as we know); or that the holes were NOT in XP as originally released, so they must have been introduced, oops! as an unintended side effect of later patches. If that is the case, it means that neither MS nor you nor I nor anyone can be certain that the latest Patch Tuesday will make XP security better or worse overall.
posted by jfuller at 10:40 AM on May 21, 2014 [2 favorites]


Part of the reason computer systems are so bad is that building any digital artifact of consequence is enormously complex. It's simply a difficult problem.

Another part of the reason, though, is that commercial development projects never have the time and resources they need to thoroughly test, or to rewrite parts that have recognized problems. And the project requirements are often changing right up until launch. So developers are constantly striving to build this enormously complex thing, from a blueprint that changes every time they look at it, without enough time or resources to do it right.

It's far from clear how to solve this problem. It'd be great if we could all take as long as we needed to produce a beautiful product, but time is money, and we all need to eat.

I sometimes think that we need to prune away some of the complexity—to eliminate some of the scores of layers, and build more things closer to the bare metal. I don't think you can really put the genie back in the bottle, though.
posted by escape from the potato planet at 10:44 AM on May 21, 2014 [8 favorites]


It's because taking the time to do things right isn't planned for or hired for. Pushing out hacky spaghetti messes with donkey-shit UX is faster, and faster is profitable.
posted by bleep at 10:46 AM on May 21, 2014


blue_beetle: While I would imagine that your comment was said somewhat jokingly, it strikes me as being "ha ha only serious". There was one comment that Norton made that struck me as the central point of why her ultimate analysis goes off track:

NASA had a huge staff of geniuses to understand and care for their software. Your phone has you.

Well, no, the key to why software that goes to space is well built isn't that NASA hires geniuses - it's that NASA treats software engineering as, well, engineering. The rest of the industry doesn't, which is where all the problems stem from.

In short, our computers are broken because our computer culture is broken. Moreover, it's broken in ways some see as features, not bugs.
posted by NoxAeternum at 10:47 AM on May 21, 2014 [12 favorites]


bleep: Yeah, that's pretty much what I meant. Software companies aren't in business to produce a 100% product; they're in business to make money—and the calculus for that often favors a 90% (or worse) product.
posted by escape from the potato planet at 10:47 AM on May 21, 2014 [1 favorite]


That was it, that was the answer: be rich enough to buy your own computer, or literally drop dead. I told people that wasn’t good enough, got vilified in a few inconsequential Twitter fights, and moved on.

So the answer is Capitalism. And people.
posted by GenjiandProust at 10:50 AM on May 21, 2014


Are there any OS programmers in the audience, by the way? The first person who makes it easy for a non-root program to say, "drop all but a small subset of my user privileges", will deserve the world's gratitude. My PDF reader doesn't need to have permission to write to my other programs' config files or log my keystrokes from other windows, for example, yet invariably it has those permissions.

The root user can create a new userid and an inescapable sandbox easily enough; you'd think that the same features which let the system avoid granting too much trust to each user would be easily extended to let the user avoid granting too much trust to each program.
posted by roystgnr at 10:51 AM on May 21, 2014 [1 favorite]
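
For what it's worth, Linux has shipped a blunt version of exactly this for years: seccomp "strict mode," where a process voluntarily gives up every system call except read, write, _exit and sigreturn. A minimal sketch of a viewer self-sandboxing before it touches untrusted input (filenames are made up; this is the bare mechanism, not a complete sandbox):

    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <sys/prctl.h>
    #include <sys/syscall.h>
    #include <linux/seccomp.h>

    int main(void)
    {
        /* Acquire everything we'll need *before* locking ourselves down. */
        int in  = open("untrusted-input.pdf", O_RDONLY);   /* made-up names */
        int out = open("rendered-output.dat", O_WRONLY | O_CREAT | O_TRUNC, 0600);
        if (in < 0 || out < 0) { perror("open"); return 1; }

        /* From here on, only read(), write(), _exit() and sigreturn() are
           permitted; any other syscall gets the process killed with SIGKILL. */
        if (prctl(PR_SET_SECCOMP, SECCOMP_MODE_STRICT) != 0) {
            perror("prctl");    /* lockdown failed, so returning is still safe */
            return 1;
        }

        /* The risky parsing work goes here, limited to the two descriptors. */
        char buf[4096];
        ssize_t n;
        while ((n = read(in, buf, sizeof buf)) > 0)
            write(out, buf, (size_t)n);

        /* glibc's exit paths use exit_group(), which strict mode forbids,
           so leave via the raw _exit syscall instead. */
        syscall(SYS_exit, 0);
    }

The catch, and a big part of why so little software bothers, is that everything has to be opened up front and the process can never open another file or grow its heap again; the later, finer-grained seccomp-bpf filters relax that, at the cost of much more setup.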


When you come right down to it, if I had taken the time to do everything in my life exactly right I would still be working on learning to walk.
posted by jfuller at 10:53 AM on May 21, 2014 [10 favorites]


fraula, if you don't mind my asking, would you say more about the "digital life" choices you've made? Are there things you don't touch/never do because of what you know? Stuff you're willing to put up with despite having seen the craziness? I'm curious (given Quinn's piece here) what you're paring back on, and what you trust (or at least trust more).
posted by deathmarch to epistemic closure at 11:01 AM on May 21, 2014 [2 favorites]


Most of this problem is the same basic set of reasons why some of the buildings built in the 21st century will fall down or have leaky foundations or develop doors that don't close properly or will be improperly insulated for the climate they're in, even though ways to avoid those issues have been well-understood for thousands of years. And most buildings today can be broken into by a burglar far more easily than a pharaoh's tomb built four millennia ago could've been.

The difference is that today in computer networks the burglars can be sorcerer's-apprentice automatons sent out by the bazillions by their creators. Given the ubiquity of drones, we'll probably see similar problems in the physical world shortly, though it won't really be directly comparable until a drone can copy itself.
posted by XMLicious at 11:02 AM on May 21, 2014


I estimate I'll have a secure computer in 5 years.

You buy the beer, I'll bring the concrete truck. We can have it done by the weekend.
posted by RobotVoodooPower at 11:07 AM on May 21, 2014 [6 favorites]


The first person who makes it easy for a non-root program to say, "drop all but a small subset of my user privileges", will deserve the world's gratitude.

If anybody used it.
posted by ChurchHatesTucker at 11:09 AM on May 21, 2014 [3 favorites]


I'm only aware of a very, very small number of projects actually working on this goal, such as Bunnie's Laptop From Scratch.

He didn't build the CPU, so there's no telling what malware is in that silicon.

That's why it's so hard. If you want security, you have to have trust. Given recent revelations about NSA activity, there is no way you can trust that the hardware -- or the components that hardware is made of -- do not have a remote compromise baked into them.

In short, our computers are broken because our computer culture is broken.

Exactly. Security falls under "YAGNI" in modern development.

There's a simple fix. How did we make cars safer? Liability. Until software vendors *pay* for the damage they allow, they simply have little to no motivation to fix it.

Oh, and if you've ever said "we can deal with that in a patch", it's just as much your fault as anyone.
posted by eriko at 11:12 AM on May 21, 2014 [2 favorites]


I estimate I'll have a secure computer in 5 years. I estimate 10-15 more years until this happens on a wide scale.

I would expect the opposite to be the case. The more inexpensive and pervasive surveillance becomes, the less any security measure will matter because it's more likely whatever you're trying to secure or encrypt or hide can simply be captured before any of that happens. Surveillance capabilities proliferating on a wide scale will prevent security becoming common on a wide scale.
posted by XMLicious at 11:12 AM on May 21, 2014 [3 favorites]


"C is good for two things: being beautiful and creating catastrophic 0days in memory management."

I agree with half of this. I push code around HTML pages and even I know that this shit doesn't even start to get better until we can write things like OpenSSL in memory-managed languages.

The main advantages of C are performance and portability. I am not aware of a memory-managed language that can compete with C in those areas. Until we have one — and it must be open-source, because nobody is going to write security software in a language backed by Oracle or Microsoft — we're fucked. (Once this does happen, we're still fucked, just in a more interesting way.)
posted by savetheclocktower at 11:13 AM on May 21, 2014 [5 favorites]
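
To make "catastrophic 0days in memory management" concrete, the Heartbleed class of bug has roughly the following shape. This is not OpenSSL's actual code, just a hypothetical echo handler showing the pattern that a bounds-checked, memory-managed language would turn into an exception rather than a silent read of adjacent heap memory:

    #include <stdint.h>
    #include <stdlib.h>
    #include <string.h>

    /* Hypothetical echo handler: request is [1-byte type][2-byte length][payload],
       and the reply echoes the payload back.  The shape of the bug, not OpenSSL's code. */
    unsigned char *echo_reply(const unsigned char *req, size_t req_len, size_t *reply_len)
    {
        if (req_len < 3)
            return NULL;

        /* Length the peer *claims* to have sent -- entirely attacker-controlled. */
        uint16_t claimed = (uint16_t)((req[1] << 8) | req[2]);

        /* BUG: the missing check.  A memory-safe language enforces the
           equivalent automatically; in C you must remember to write it:
           if (claimed > req_len - 3) return NULL;                        */

        unsigned char *reply = malloc(claimed);
        if (reply == NULL)
            return NULL;

        /* Copies up to ~64 KB starting at req + 3, sailing past the end of
           the actual request and into whatever else lives on the heap. */
        memcpy(reply, req + 3, claimed);

        *reply_len = claimed;
        return reply;
    }

The whole vulnerability lives in one missing comparison between a length the attacker claims and a length you actually received, which is exactly the kind of bookkeeping C leaves to the programmer.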


Until we have one — and it must be open-source, because nobody is going to write security software in a language backed by Oracle or Microsoft — we're fucked.

You mean like OpenSSL?

Seriously. Open Source is in no way a panacea -- doubly so when you're relying on libraries you didn't write for security.
posted by eriko at 11:15 AM on May 21, 2014 [2 favorites]


There's a simple fix. How did we make cars safer? Liability. Until software vendors *pay* for the damage they allow, they simply have little to no motivation to fix it.

Yep.

If you call up an electrician and say, "I need a new outlet in this room, but my budget is only $20 and I need it by four o'clock, and I don't want to have to hire a drywall guy, so can you maybe just run the wire along the wall and kind of duct-tape the outlet on there," they'll laugh in your face. In software it's, "Yes, boss, sure thing, boss."

But the regulations that establish a quality floor for electrical work are totally absent in software, and so anyone who tries to make robust software will be undercut by the quick-and-dirty brigade and find themselves out of business in short order.
posted by enn at 11:20 AM on May 21, 2014 [13 favorites]


> Seriously. Open Source is in no way a panacea -- doubly so when you're relying on libraries you didn't write for security.

That's not what I mean. I mean that the language that replaces C can't be something like C# or Java, because it won't get the adoption it needs.

And someone will write OpenSSL's replacement in that language, and it won't necessarily have fewer bugs than OpenSSL. But hopefully its bugs will be less catastrophic because of the sandbox it's playing in.
posted by savetheclocktower at 11:22 AM on May 21, 2014 [2 favorites]


Computers don’t serve the needs of both privacy and coordination, not because it’s somehow mathematically impossible. There are plenty of schemes that could federate or safely encrypt our data, plenty of ways we could regain privacy and make our computers work better by default. It isn’t happening now because we haven’t demanded that it should, not because no one is clever enough to make that happen.

Who's this "we," Kemo Sabe? And a general revolt in the United States would probably not be in the political direction she would like. If everybody agreed with my politics, or anyone else's, we'd get a lot of things done. But the world is not that simple, and we all have different notions of what would "unfuck" things.
posted by zabuni at 11:27 AM on May 21, 2014 [2 favorites]




Haskell can easily compete with C. But then you have to write in Haskell.

F# in my world (which is image processing, and I routinely need to work on 100MB datasets) runs, on average, at 1.5x equivalent C and in some cases better. That's competitive. How do I know this? I ported my company's code base from C/C++-backed .NET to completely managed .NET code in a mix of F# and C#.

But then you have to deal with the security of either .NET or one of its support libraries (or with Mono).

Most of my work, especially on the PDF front, is writing self-sustaining internal engines that reduce complexity and resist/reduce errors. I just worked on a case from a customer who supplied a fucked-up PDF that they insisted should work. I looked it over and found the source of the fucked-uppedness; my repair infrastructure was doing the best that it could, but this particular PDF was also using encryption with an empty password. Seriously. You can do that if you want in PDF, and this chunk of repair code only needed two lines to fix it. Even though I had written all the infrastructure behind the security provider mechanism and the encryption codecs, it still wowed me that it worked first shot and the unit test went green.

Mostly, I just looked at the infrastructure and thought, "I guess I was having a really good week when I did that."

And that touches on another human aspect of this business. I'm really good at what I do, but one aspect of that is that I write a shitload of code, because I'm also really productive. A common conversation with my peers at work might start, "Plinth, you wrote a chunk of code to handle xyz." "I did? I don't remember - I write a lot of code." I have about a 3-month window for code. If I'm still actively using/developing it in that window, I'm in great shape; otherwise my brain jettisons it. Fortunately I'm also quick on the uptake, so I can pick it up again very quickly.

Interestingly enough, speaking of security, I just hit another aspect of the problem of security. When I need to set up any online transaction that requires an email, I conjure one up for that transaction which includes the company's domain name, so that if I ever get email to that address about anything but that transaction, I know that the company's data has been compromised and I can contact them to let them know. I just got a call from one place (clearly a one-person operation) who was convinced that I was trying to steal the business name by having it in the email address I gave them to contact me. I had to explain to the owner twice that it was a security measure for both of us and that I didn't actually care what the address was, but that it was unique and traceable back to them so they could be informed. I got sworn at, accused again, and told that they had been in business 15 years and didn't understand why people need to do this, so I played the "happy to take my business elsewhere if you don't want to do business with me, or you can change it to whatever you want" card. I figured it would also be lost to say, "and I've run my domain for 19 years and this has served me well." When I got the confirmation email, the origination was aol. Speaks volumes, really.
posted by plinth at 11:53 AM on May 21, 2014 [7 favorites]


Until software vendors *pay* for the damage they allow, they simply have little to no motivation to fix it.
Exactly. The real problem is that the consequences of bugs in software are considered an externalized cost. Internalize that cost and these problems go away. Congratulations, your new super secure OS will cost $10,000 per core per year and if you have to ask how much the word processor costs, you can't afford it. Oh, and BTW, FOSS is now dead since developers won't be able to afford the mandatory insurance they need if bugs are found in their code.
posted by Poldo at 11:54 AM on May 21, 2014 [6 favorites]


If you call up an electrician and say, "I need a new outlet in this room, but my budget is only $20 and I need it by four o'clock, and I don't want to have to hire a drywall guy, so can you maybe just run the wire along the wall and kind of duct-tape the outlet on there," they'll laugh in your face.

I think you may have an unrealistic assessment of the creative ability of the average electrician.
posted by RobotVoodooPower at 12:03 PM on May 21, 2014 [2 favorites]


The first person who makes it easy for a non-root program to say, "drop all but a small subset of my user privileges", will deserve the world's gratitude.
Modern versions of Windows have a version of this in the form of Mandatory Integrity Control, which is one of the mechanisms used to sandbox web page scripts in Chrome and Internet Explorer.
posted by mbrubeck at 12:12 PM on May 21, 2014 [1 favorite]


People have tried to do crypto in non-C, and they almost all run into a fairly substantial set of problems. Compilers are very good at optimizing, and optimization can reveal timing information which can reveal state, which is bad news.

Haskell, and the rest of the functional class of languages are a mixed bag when it comes to security. On the one hand, they provide provable security for certain attacks against their applications. On the other, their performance characteristics (and memory components) are very hard to reason about due to lazy eval, which can lead to a different class of attacks. Now, these attacks may be better attacks than the ones C exposes, but we don't have a lot of experience with them. Ultimately I think this is the way forward, but I think there's a lot of work to be done still.

I think the liability types are also onto something. Look at Apple's security record on iOS. They have a strong economic motivator to not allow their devices to be jailbroken (the app store) and as a result their devices are some of the most secure on the market (in addition to other design decisions, such as very very minimal IPC).
posted by yeahwhatever at 12:16 PM on May 21, 2014 [4 favorites]
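
The timing point is easiest to see in the classic tag-comparison example. A small sketch of the two idioms, with the caveat that whether a given optimizing compiler leaves the second one alone is precisely the worry raised above:

    #include <stddef.h>

    /* Timing leak: a memcmp()-style comparison returns as soon as a byte
       differs, so the running time reveals how many leading bytes of an
       attacker's guess matched the secret MAC/tag. */
    int leaky_equal(const unsigned char *a, const unsigned char *b, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            if (a[i] != b[i])
                return 0;      /* early exit: time depends on the data */
        return 1;
    }

    /* The usual constant-time idiom: always touch every byte, fold the
       differences into an accumulator, decide only at the end. */
    int constant_time_equal(const unsigned char *a, const unsigned char *b, size_t n)
    {
        unsigned char diff = 0;
        for (size_t i = 0; i < n; i++)
            diff |= (unsigned char)(a[i] ^ b[i]);
        return diff == 0;
    }

Real crypto libraries go further (assembly, compiler barriers, inspecting the emitted object code), which is part of why so much of this still gets written in C despite everything else in this thread.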


There are ways to make the typical software stack less vulnerable, and better. GNAT doesn't have the most friendly licence (GPL with no linking exception), but Ada is fast, portable and safe. OCaml can be too, as could stuff like Go or Rust. C programs could be made relatively safe by writing them in a very disciplined manner.

One big issue is indeed a cultural one, because data is code is data is code: the content of a JPEG file can be seen as a program that, when it's executed by a JPEG decoder, produces an image. The problem is that many of our decoders (PDF is a big one, but compression libraries, image libraries, browsers, etc.) aren't really built around that fact, and so they can be programmed by malicious users to produce unintended results, like giving the attacker root, or even just crashing or getting into an infinite loop.

It's possible to defend against these attacks, but right now that type of defense is a complete joke in most software.
posted by Monday, stony Monday at 12:20 PM on May 21, 2014 [2 favorites]
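
One mundane way a crafted file "programs" its decoder is nothing more exotic than arithmetic on header fields. A toy sketch (not from any real library) of the allocation-size overflow that shows up again and again in image and archive parsers, plus the kind of check the defense amounts to:

    #include <stdint.h>
    #include <stdlib.h>

    /* Toy decoder deciding how much memory an image needs from two fields
       it just read out of the file.  Not any real library's code. */
    unsigned char *alloc_pixels(uint32_t width, uint32_t height)
    {
        /* BUG: the multiplication happens in 32 bits, so a crafted header of
           65536 x 65536 wraps the product to zero.  The tiny buffer that
           malloc returns is then overflowed by the rest of the file. */
        size_t bytes = (size_t)(width * height * 4);
        return malloc(bytes);
    }

    unsigned char *alloc_pixels_checked(uint32_t width, uint32_t height)
    {
        /* Treat header fields as hostile: reject zero sizes and anything whose
           pixel buffer can't be represented (or exceeds a sane decoder limit). */
        if (width == 0 || height == 0)
            return NULL;
        if (width > SIZE_MAX / 4 / height)
            return NULL;
        return malloc((size_t)width * height * 4);
    }

The fix is mechanical once you treat every field in the file as hostile; the hard part is that a real decoder has hundreds of such fields, and forgetting one is enough.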


Scanned this quickly, but did not notice any mention of the fact that much of the "Internet" was designed for openness, ease of use and survivability. SMTP, the core email protocol, is the Simple Mail Transfer Protocol. All the security was an add-on, bolted on only after the physicists and nerds who were friendly and collaborative on the net were invaded by money and the folks that want that money.

But security has been done well. If you know your history of computers, no one ever broke into a well-configured DEC VAX. There has been network software with great security built in. Breaking into SIPRNet from the outside would be futile. But turning that on in the general world would be prohibitively expensive.
posted by sammyo at 12:25 PM on May 21, 2014 [3 favorites]


I don't think it would be overly expensive. But it would definitely be a paradigm shift.

Also, if open source can't survive without externalizing its costs, then maybe it's time for a re-evaluation. Heartbleed was the ultimate open source fail state: an open-source library became widely used, yet was underfunded (because nobody had to fund it) and thus undermaintained. The result was a massive security hole that is going to cost serious money both now and down the road.
posted by NoxAeternum at 12:37 PM on May 21, 2014 [1 favorite]


Look at Apple's security record on iOS. They have a strong economic motivator to not allow their devices to be jailbroken (the app store) and as a result their devices are some of the most secure on the market (in addition to other design decisions, such as very very minimal IPC).

Yet despite this incentive, every version of iOS has been jailbroken. Android, OTOH, makes its more open architecture a feature, catering to users who want to live dangerously and install CyanogenMod or whatever.

I've given up being a reliability absolutist. I'm really just happy if my computers don't literally catch on fire nowadays. As long as criminals and intelligence agencies exist, boxes will be popped. If a piece of software can actually kill someone, it should be dealt with specifically e.g. avionics and medical software.

Otherwise, we have civil courts. Maybe there's a cottage industry in writing software engineering contracts that have actually enforceable reliability metrics -- I've never seen one.
posted by RobotVoodooPower at 12:46 PM on May 21, 2014 [4 favorites]


OpenSSL was well-known to be a Lovecraftian horror before heartbleed, just not among the general population. Yet people at Serious Businesses, who should have done their due diligence, kept using it. In terms of liability, I see it more like a case of construction materials: your supplier gave you (for free, in this case) a poorly designed fastener that wouldn't hold up. You're an expert, you could easily have seen that the fastener was bad, yet you kept using it. Sure, your supplier shouldn't have sold (given for free) those fasteners, but you should have known better, and thus share the responsibility.
posted by Monday, stony Monday at 1:25 PM on May 21, 2014 [1 favorite]


Well, no, the key to why software that goes to space is well built isn't that NASA hires geniuses - it's that NASA treats software engineering as, well, engineering. The rest of the industry doesn't, which is where all the problems stem from.

Which is why NASA space probes have been lost due to confusion about imperial and metric measurements.
posted by MartinWisse at 1:34 PM on May 21, 2014 [1 favorite]


Which is why NASA space probes have been lost due to confusion about imperial and metric measurements.

I blame that on Reagan. Besides, it's not like the physical side didn’t do that as well - why do you think we had to put glasses on Hubble?
posted by NoxAeternum at 1:39 PM on May 21, 2014


My dad is paranoid about security, because he's nearly 80, but he uses the computer more than is normal for his age group (when his age group do use computers, it's usually online banking and shopping, but he does neither, just email and flickr and news). Plus he can code and stuff a bit (from his work). So he does nothing risky at all and has nothing secret on his computer, but in security terms it's like Fort Knox, because of his age. For a start, Firefox with all the security bells and whistles, and he only goes online on Linux (Lubuntu at the moment, but he kills his system, reformats the hard disc and re-installs something new every few weeks). He has Windows, but that's on a separate hard drive and never allowed online. Etc. It's so funny. NB: just finished Hari Kunzru's Transmission (zero-day plot), and it's great fun.
posted by maiamaia at 1:48 PM on May 21, 2014 [2 favorites]


The first person who makes it easy for a non-root program to say, "drop all but a small subset of my user privileges", will deserve the world's gratitude.
Actually... that won't help. It is far, FAR better to say something like "run this program, and give it read access to this, and write access here" and have the OS enforce those rules, defaulting to NO ACCESS to everything else.

Systems to do that were perfected in the 1970s... you just haven't heard much about them, because they weren't needed for normal use back then. The reason we still haven't all heard of them is inertia...social inertia, meme inertia, and code inertia.

Because everyone has some other party to blame, the computer security problem has been considered a social problem until recently. Admins blame users, users blame hackers, hackers blame admins, and almost everyone blames application programmers and OS developers.

Fortunately, we're starting to break through the social inertia... people are waking up to the fact that nobody can fix the existing pile of stuff we all count on for survival as a species.

Most technical people don't really grok the concept of "the principle of least privilege", which is one of the approaches that allowed the US military, in the early 1970s, to get systems which were capable of Multi-Level Security: nothing a lower-level user can do can see or write to anything above its level.

Fortunately, the Genode project will eventually allow people to run systems which can run Linux apps in such a manner... and there will be other projects (GNU Hurd is one, but I don't count on it ever shipping).

Once people start to see that it is possible to have secure computing, there will be a stampede to fix things, and life will slowly get better for all of us... because of code inertia that will take a long time.

I figure 10-15 years until this situation starts to get fixed... and 20-25 until it is sufficiently fixed to call it good.
posted by MikeWarot at 2:05 PM on May 21, 2014 [3 favorites]
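
Mainstream Unix can't express "read access to this, write access to that, nothing else" as cleanly as the capability systems described above, but the traditional root-launched privilege-separation dance (the pattern OpenSSH and various daemons use) gets partway there. A sketch, with placeholder path and uid:

    #include <grp.h>
    #include <unistd.h>

    /* Classic privilege separation: a root launcher opens or receives what the
       worker may touch, then confines the worker to an empty directory and an
       unprivileged uid before it parses anything untrusted. */
    int confine_and_run(int untrusted_input_fd)
    {
        if (chdir("/var/empty") != 0)  return -1;   /* an empty, root-owned directory */
        if (chroot("/var/empty") != 0) return -1;   /* no filesystem left to wander */

        /* Drop root for good; order matters: supplementary groups, gid, then uid. */
        if (setgroups(0, NULL) != 0)   return -1;
        if (setgid(65534) != 0)        return -1;   /* "nobody" on many systems */
        if (setuid(65534) != 0)        return -1;

        /* The worker can now read its one descriptor and not much else. */
        char buf[4096];
        ssize_t n = read(untrusted_input_fd, buf, sizeof buf);
        return n < 0 ? -1 : 0;
    }

It's still default-allow underneath (the worker keeps whatever the kernel grants any process), which is the gap the capability-style, default-deny model described above is aimed at.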


One of the "reimplement crypto in Rust" projects is now talking about whether we need to nuke TLS from orbit and start something with a sane design from scratch. And yes, they know how insanely hard that would be.
posted by mbrubeck at 2:40 PM on May 21, 2014 [2 favorites]


He didn't build the CPU, so there's no telling what malware is in that silicon.

That's why it's so hard. If you want security, you have to have trust. Given recent revelations about NSA activity, there is no way you can trust that the hardware -- or the components that hardware is made of -- do not have a remote compromise baked into them.


Couldn't you perform some audits of network traffic from machines using a given hardware profile, and see if there's anything odd?

Also, has anyone even demonstrated proof of concept for a HW exploit like this?
posted by weston at 2:54 PM on May 21, 2014


For the longest time I have had this thesis that it's all duct tape. All of society is held together by hacks and tweaks and cut corners, and we're just lucky that cumulatively it rarely fails. More and more demands are placed on ever fewer resources (time, material, energy), so one has to continue to push crappier and crappier product (not just software, not just physical product, but also service). Competition forces us into the drive for reducing costs of labor who are forced into taking shortcuts to meet the demands placed on them... Look at your place of employ (if you are so fortunate as to be employed, if of employment age), and tell me that it doesn't do a shit ton of weird fucking shit just to stay afloat, and that it doesn't have its own quirky way of doing things.

In some ways you can embrace the quirk, but in others, god help us, it's only by fortune that our whole system doesn't come crumbling down.
posted by symbioid at 3:42 PM on May 21, 2014 [5 favorites]


OpenSSL was well-known to be a Lovecraftian horror before heartbleed, just not among the general population.

I think you would have a hard time convincing a jury (or me) that someone should have seen this coming and immediately ported their software to, say, NSS or SChannel without making a whole lot of subjective statements.

Remember that all of these modules are FIPS-certified, which is something the government embraces as a Good Thing, even though it did jack-all to address Heartbleed. So I'm skeptical that legislation is going to address errant keystrokes by programmers, whether they come from some German guy (i.e. Heartbleed) or a Microsoft employee (i.e. Dual_EC backdoor).
posted by RobotVoodooPower at 3:46 PM on May 21, 2014 [1 favorite]


Competition forces us into the drive for reducing costs of labor who are forced into taking shortcuts to meet the demands placed on them...

that alone explains a lot of suckage
posted by thelonius at 3:53 PM on May 21, 2014 [1 favorite]


The article's title, I believe, is a reference to the Dylan song of the same name.
posted by mosk at 4:11 PM on May 21, 2014


> And yes, they know how insanely hard that would be.

Maybe not quite as hard as re-implementing people in germanium or something else in Group IV (silicon, such a cliche; tin, lead, non-starters; flerovium, not found yet, just predicted) but definitely--if you include getting it widely adopted--same order of magnitude hard.

> Couldn't you perform some audits of network traffic from machines using a given hardware profile, and see if
> there's anything odd?

I would be embarrassed to tell you how often I look at my NIC's link light just to see if there's suddenly a lot of activity and I don't know why. If ever somebody writes malware that controls link light blinking to make it look ho-hum normal (or if they already have and I'm not aware of it) I am so skrewed.
posted by jfuller at 4:37 PM on May 21, 2014 [2 favorites]


Couldn't you perform some audits of network traffic from machines using a given hardware profile, and see if there's anything odd?

Which would work unless the modification in the silicon is completely passive; i.e., doing nothing until it finds itself faced with a memory buffer (i.e., an incoming packet) matching a certain cryptographic signature, and then executing an instruction (allocate a block of memory/cache, add contents of buffer to this block, execute the block in privileged mode). Presumably the NSA/GCHQ/PLA/Mafiya/MJ-12 won't be spamming the net with these packets, but saving them for use against high-value targets, and making sure to destroy the evidence where possible, at least while easier exploits are available for use against common or garden troublemakers.

Given the amount of machine-generated silicon on a modern CPU, putting in the logic to do something complicated like that would be extremely plausible, and finding it would be very difficult. One could pretty much hide an entire 486 PC in the rounding error of a modern CPU, so a nice virtual machine with some useful control/exfiltration functions and a cryptography engine is more than plausible.
posted by acb at 5:04 PM on May 21, 2014 [2 favorites]


I think one thing people need to keep in mind is that the existence of vulnerabilities can also be a good thing when they allow end users to utilize their own computers in perfectly legitimate ways corporations or governments (and the software developers who work for them) may not approve of. A good example of this need would be Apple's ongoing collaboration with the Chinese government to censor "subversive" applications and books. While Apple may have to adhere to Chinese law by limiting their users' freedom in order to be profitable there, end users can take advantage of weaknesses in their security to bypass those restrictions. Eliminating open source software and implementing absolutely perfect, unbreakable, top-down security in what's left would very quickly be used against activists around the world.
posted by Poldo at 5:57 PM on May 21, 2014 [2 favorites]


How do you feel about getting onto an offshore casino boat with condoms full of malware code hidden in your intestines?

--------------------

Competition forces us into the drive for reducing costs of labor who are forced into taking shortcuts to meet the demands placed on them...

that alone explains a lot of suckage


Let's apply this model society-wide and have a techno-utopia
posted by Ray Walston, Luck Dragon at 6:53 PM on May 21, 2014


He didn't build the CPU, so there's no telling what malware in in that silicon.

Also, has anyone even demonstrated proof of concept for a HW exploit like this?


Last time this came up here I got to thinking about what a CPU-backdoor could look like. The CPU is pretty limited in what it can do; it can't store a whole lot of data internally, and it can't reasonably access the disk or get to the network without the help of the OS, and if your OS is complicit, why bother hacking the CPU? And it's not easy to tell programmatically what memory addresses are passwords or whatever - remember you have billions of accesses to look at every second. So aside from compromising crypto-specific CPU features, the most plausible risk I can think of is that some magic sequence of undocumented instructions could circumvent the CPU's own memory protection systems, which could result in a local backdoor-aware user process gaining root access. Even if it's not a deliberate backdoor, it wouldn't be too surprising to learn that a bug in the CPU has this effect.

But we already know that running untrusted code is a terrible idea, because historically unprivileged programs have often been able to get the same result just by exploiting accidental bugs in the OS.

Well, no, the key to why software that goes to space is well built isn't that NASA hires geniuses - it's that NASA treats software engineering as, well, engineering. The rest of the industry doesn't, which is where all the problems stem from.

I don't agree with this at all. Because engineering is all about making compromises. You don't make every bridge the absolute strongest you possibly could make it. You balance the cost against the requirements, and put a sign saying trucks over 4 tons should go a different route. The reason NASA has high reliability software is that reliability is a huge priority for them, so they spend a shitload more to write their software using meticulous practices. And they dramatically limit both the features and the ways it is going to be used. And I'm sure it still has some bugs!

Until software vendors *pay* for the damage they allow, they simply have little to no motivation to fix it.

If you don't like the current software, and want people to write better software, saying "If you write software, you could lose all your money!" is not exactly a good way to convince anyone to do it. I guess free software might still be able to operate by releasing software anonymously or something? Don't get caught though!
posted by aubilenon at 11:13 PM on May 21, 2014 [1 favorite]


Oh hah. What I actually came in here for was to comment on the article, and say that it sounds like a call for action ("It isn’t happening now because we haven’t demanded that it should") but it's not really clear what that action is - how we can demand secure software. Honestly, I don't even know how we can even tell when software is secure, other than deploying it widely and then waiting several years to see if people manage to break it.

It took 10 years for people to find the problems with SHA-1, and that's the abstract mathematical algorithm, not the code that implements it!
posted by aubilenon at 12:14 AM on May 22, 2014 [2 favorites]


Crypto issues are very different from implementation issues, and are fairly well defined. We expect that, over time, people will punch holes in the algorithms, and that we're going to have to renew them from time to time. But the algorithms are really well studied, and finding those holes is quite non-trivial.

But outside of the use of outdated algorithms, crypto problems aren't what's being exploited. Rather, it's simple coding and design errors that have been known for years, but that remain in the code because it mostly works.

It's a bit like earthquake resistance: you can have a house that stands up just fine now, but if it wasn't built correctly, it's going to fail in an earthquake. The algorithms would be like those expensive Strong Tie connectors. There may be some problems with them, sometimes, and you do need them (or something like them) to build a wood-framed house with good earthquake resistance. But if the basic design or the workmanship of the house are substandard, all the high-grade connectors in the world won't do you much good.
posted by Monday, stony Monday at 9:34 AM on May 22, 2014


I don't agree with this at all. Because engineering is all about making compromises. You don't make every bridge the absolute strongest you possibly could make it.

How did you get "NASA doesn't make compromises when it comes to software" out of "NASA approaches software development as engineering, and most places don't"?
posted by escape from the potato planet at 9:54 AM on May 22, 2014


I would be embarrassed to tell you how often I look at my NIC's link light just to see if there's suddenly a lot of activity and I don't know why. If ever somebody writes malware that controls link light blinking to make it look ho-hum normal (or if they already have and I'm not aware of it) I am so skrewed.

A quick google search yielded this: Rootkit In a Network Card Demonstrated. If someone can modify your network card's firmware, they can control those lights.

As referenced in the article, your "computer" is actually a federation of several interconnected computers*, all with software. When was the last time you updated the firmware on your USB or Ethernet controller?

* I have one that literally has a network-accessible Linux machine with an HTTP server embedded in the video controller.
posted by cosmic.osmo at 1:35 PM on May 22, 2014 [2 favorites]


And just one more pragmatic point. While there are some insanely sophisticated attacks, a major portion of compromised systems are ones where things like the default password were never changed. Just trivial stupid stuff that is ignored or forgotten. Use good passwords, folks.
posted by sammyo at 7:13 AM on May 23, 2014


tl;dr - Security, especially rigorous security, is a trade-off against usability. Users aren't willing to trade any usability for better security, so vendors aren't taking the hard steps required to deliver it.

Around 25 years ago, I worked on developing systems that were very, very secure. Certified TCSEC B-level, mandatory access control (MAC), pervasive, immutable system call level logging, fine grained ACLs and privilege separation, multilevel security labels, secure window manager. All the bells and whistles. All with every change meticulously documented from inception. These systems were used exclusively by the military and intelligence community.

So, we know how to make such things (assuming you think TCSEC describes how to build secure systems, which is a different, and much more bitter, argument). Why doesn't the world of commercial/personal computing adopt it? Easy. These systems absolutely SUCKED to use. Most of these features are difficult to set up, usually can't be automatically configured, almost always are user visible, interact in bizarre ways and are maddeningly fragile. And it gets exponentially worse with every single app and user you add. Can't cut from this window and paste into this one, because somewhere along the line someone decided that its security label was different (one is Confidential ALPHA and the other is Confidential WIZARD and there's no trust relationship between the labels... where's your ISSO when you need him). Can't save that file? Must be because no one set your ACL to allow VendorAdobeAllowRWToLocalDisk perms. Are the folks who write your apps (gawd forbid your device drivers) correctly integrating into those security infrastructures? And that security and logging overhead will consume up to 50% of your computing resources before you do anything.

And the reality is, many of these features exist in the products we use every day. Consumers just don't want to use them, because they're hard to get right and break easily.

Fine-grained ACLs? Windows has had them since NT and Linux has had them since kernel 2.2, I think. Most people don't even know they're there. MAC? See: SELinux. Then see the millions of posts about how big a pain in the ass SELinux is to get right and how everyone just turns it off. Windows recently added MAC (I haven't used it), but has had a reasonable DAC implementation forever that, while not a panacea, is useful against a whole bunch of attacks. If anyone really used it.

The only reason the B-level systems are useful at all is that 1) they're a well-defined, limited, fixed hardware platform, 2) they run a well-defined, validated application suite, and 3) they're used by people who are trained, and who aren't allowed to stray outside the box. That doesn't match the average user who just wants to post selfies to Facebook, have Flash run for totally-not-porn videos from totally-not-skeevy web sites, and be able to load whatever crap their friends think is the Coolest App Evah without the computer telling them "no, that's really stupid." Even if it is. Security and usability are just as much a trade-off against one another today as they were 25 years ago.

P.S. - Someone up-thread mentioned Ada, which is almost 30 years old now. You want to kill essentially all of these security issues born of programming errors? Ada is what C isn't. Ada doesn't trust shit. Ada is anal-retentive. Ada requires a *lot* of up-front design work for even trivial programs. You can (obviously) get a running C program that's rife with lethal type conversion errors, memory leaks, and off-by-one errors. Ada compilers won't even compile the program until you've addressed that. Sure, Ada can still have those issues (and others), but you have to work hard to force the compiler to look the other way. So Ada (and, more generally, the whole family of strongly typed, procedural languages), just like TCSEC security, stayed in niches where the risk of screwing up justified the pain in the ass of use. Strong typing seems to be making a comeback of sorts, so maybe there's hope.
posted by kjs3 at 8:42 AM on May 23, 2014 [2 favorites]
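
A tiny illustration of that last point, from the C side of the fence. Both mistakes below are hypothetical toy code, and both are of the "compiles today, becomes a CVE later" variety that a strongly typed, range-checked language either refuses to compile or traps immediately at run time:

    #include <stdio.h>

    /* Hypothetical toy code: all of it is legal C and compiles, largely
       without complaint unless extra warnings are switched on, yet each
       construct is something an Ada-style compiler rejects outright or
       turns into an immediate, loud runtime failure. */
    int main(void)
    {
        char buf[10];

        /* Off-by-one: the final iteration writes buf[10], one element past
           the end.  C emits the store; a range-checked language won't. */
        for (int i = 0; i <= 10; i++)
            buf[i] = 'x';

        /* Sign confusion: the programmer expects "5 < -1" to be false, but
           count - 1 wraps to UINT_MAX, so the branch is always taken. */
        unsigned int count = 0;
        int limit = 5;
        if (limit < count - 1)
            printf("limit is smaller\n");

        return 0;
    }

GNAT, mentioned upthread, won't accept the equivalents of either without explicit, checked conversions and correct array bounds, which is precisely the up-front pain, and payoff, being described.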


It's true that a lot of strong security measures are a pain in the ass. But surely your image decompression library shouldn't hand everyone the keys to the kingdom.
posted by Monday, stony Monday at 2:49 PM on May 23, 2014




This thread has been archived and is closed to new comments