Cybersecurity is an Exaggerated Risk
April 28, 2011 12:43 PM

Jerry Brito and Tate Watkins of George Mason University published a new paper "Loving the Cyber Bomb? The Dangers of Threat Inflation in Cybersecurity Policy" examining the parallels with the US military's other recent exaggerations. "Cybersecurity is an important policy issue, but the alarmist rhetoric coming out of Washington that focuses on worst-case scenarios is unhelpful and dangerous. Aspects of current cyber policy discourse parallel the run-up to the Iraq War and pose the same dangers. Pre-war threat inflation and conflation of threats led us into war on shaky evidence. By focusing on doomsday scenarios and conflating cyber threats, government officials threaten to legislate, regulate, or spend in the name of cybersecurity based largely on fear, misplaced rhetoric, conflated threats, and credulous reporting. The public should have access to classified evidence of cyber threats, and further examination of the risks posed by those threats, before sound policies can be proposed, let alone enacted. ... No one wants a “cyber Katrina” or a “digital Pearl Harbor.” But honestly assessing cyber threats and appropriate responses does not mean that we have to learn to stop worrying and love the cyber bomb."
posted by RSaunders (17 comments total) 6 users marked this as a favorite
 
I'm pretty sure the crazy rhetoric is the result of computer and software companies wondering how to get a piece of that sweet military contractor money.

But it's been way over the top and totally bizarre in how over the top it is. I mean, a "digital Pearl Harbor" (why not just "Perl Harbor")? Really? How are hackers going to kill 2,000 people? Maybe we shouldn't have systems set up that are capable of that kind of destruction?

The interesting thing is that the worst "cyberattack" ever, Stuxnet, was A) Done by us, and B) didn't use the internet. Oh and, minor point: C) didn't kill anyone.
posted by delmoi at 12:51 PM on April 28, 2011 [4 favorites]


Ars Technica summary/coverage.
posted by Inspector.Gadget at 12:58 PM on April 28, 2011 [1 favorite]


"Loving the Cyber Bomb? The Dangers of Threat Inflation in Cybersecurity Cyberterrorism Policy"

FTFY.

Sincerely,
- Everyone who stands to profit.
posted by Mister Fabulous at 1:00 PM on April 28, 2011


The interesting thing is that the worst "cyberattack" ever, Stuxnet, was A) Done by us, and B) didn't use the internet. Oh and, minor point: C) didn't kill anyone.

Calling Stuxnet the worst is a bit of a stretch. Best designed? Most clever? Yeah, I'd probably give it consideration for those. Not worst, though.
posted by bfranklin at 1:00 PM on April 28, 2011


I'm pretty sure the crazy rhetoric is the result of computer and software companies wondering how to get a piece of that sweet military contractor money.

This - a recent Government Executive article identified cybersecurity as “the most active - and potentially lucrative - sector for contractors in FY 2011.”
posted by ryanshepard at 1:01 PM on April 28, 2011


Yeah, what's the worst case scenario here? A few companies lose lots of money? Internet access goes down for a while? ID theft?
posted by oddman at 1:02 PM on April 28, 2011


On the one hand, I agree with a lot of what the report is saying, regarding alarmist rhetoric, worst case scenarios, conflated threats, and just general fear, uncertainty and doubt. On the other hand, here is a list of cyber attacks that have happened in the past six months, and these are only the ones that made it to the front page of newspapers:
  • HBGary Federal had all of their emails made available on BitTorrent
  • Computers used by the Australian Prime Minister and other ministers were hacked, giving the intruders access to several thousand sensitive emails.
  • A similar attack happened to computers maintained by the Canadian government, giving the attackers access to classified federal information.
  • Comodo Group had their systems breached, with several fake browser certificates created along the way.
  • Databases used to maintain RSA SecurID tokens were breached using a combination of a spear-phishing attack and a zero-day Flash exploit.
  • The Epsilon mailing list service, which maintained mailing lists for many large corporations, had their databases hacked, quite possibly through a phishing attack.
  • The PlayStation Network was hacked, with over 65 million accounts compromised, including names, street addresses, email addresses, and purchase histories.
The authors are absolutely correct that many of the current cybersecurity proposals severely intrude on privacy and civil liberties. However, the bar for computer security is embarrassingly low right now, and there really does need to be a lot more done to improve the current state of affairs.

On preview: oddman, you have a good question, but it really is a lot more serious than money and ID theft. Source code has already been stolen in the Operation Aurora attacks, and that's the lifeblood of a lot of companies. Stuxnet showed that malware could seriously damage industrial systems. Researchers at the University of Washington and UC San Diego showed how hackers could remotely get access to your car and affect other in-car systems. Computer systems are pervasive in every industry and every aspect of our lives. It's not hard to imagine the damage that can be done, especially with the sad state of computer security today. The challenge, of course, is figuring out how to realistically assess the risks and developing reasonable solutions that can legitimately address the problems.
posted by jasonhong at 1:17 PM on April 28, 2011 [3 favorites]


Is anyone else having problems with the PDF at the link? I can't seem to open it and even just downloading it is making my system go nuts (Win 7, Acrobat Reader 9.4).
posted by tommasz at 1:19 PM on April 28, 2011


Worst case scenario, imo, is Wikileaks-type disclosure of sensitive military information -- troop movements, classified weapon/vehicle characteristics, nuclear weaponry specs, etc. Exfiltrating this data could quickly erode any military dominance that the U.S. currently has. Some may question the importance of military dominance, but I think history shows that avoiding conflict is damn near impossible, and I want to be on the side of the guy with the bigger gun when conflict starts.

Another area of concern is SCADA control systems. There were lots of rumors in '03 that the East Coast blackout was triggered by malware infection of SCADA control systems (hell, maybe this is where the idea for Stuxnet was conceived). SCADA systems are special-purpose computers and networks designed for real-time control of important industrial systems. Taking these down could cripple infrastructure or very easily result in loss of life for the operators of these systems. In theory, an attack on an improperly secured nuclear control SCADA system could result in a Fukushima-style issue, especially if the system wasn't brought down but was instead configured to report false information. I don't know enough about nuclear infrastructure to know the checks and balances that could affect this, but you can easily see how, e.g., a less regulated chemical plant could be a good target.
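To make that concrete, take Modbus/TCP as one common example of the kind of protocol these systems speak: it has no authentication at all, so a "write single register" command is a handful of bytes to port 502. A purely hypothetical sketch, with invented host and register values:

    # Purely hypothetical sketch: Modbus/TCP carries no authentication, so a
    # "write single register" request is just 12 raw bytes sent to TCP port 502.
    # The host, register, and value below are invented for the illustration.
    import socket
    import struct

    def write_register(host, register, value, unit_id=1, port=502):
        # MBAP header: transaction id, protocol id (always 0), remaining byte
        # count, unit id -- then the PDU: function code 0x06 (write single
        # register), register address, register value.
        frame = struct.pack(">HHHBBHH", 1, 0, 6, unit_id, 6, register, value)
        with socket.create_connection((host, port), timeout=5) as s:
            s.sendall(frame)
            return s.recv(1024)  # a conforming device simply echoes the request

    # write_register("10.0.0.50", 1234, 9999)  # illustrative values only

Anything that can reach the control network can send that; whether the result is a nuisance or a disaster depends entirely on what's wired to the other end.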

The first area of concern is actually significantly more likely than the second -- the retribution that would be unleashed for either would be immense, and the reward relative to the risk is much better for the first.

I'm sure to a lot of people this all sounds like the bogeyman, but I worked for 3 years at a security contracting firm. NDA prevents me from discussing specific clients, but you would not believe the ridiculousness that security professionals see on the networks of companies that should certainly know better. The stuff that happened to HBGary Federal isn't an example of someone getting caught with their pants down. It's an example of how every single organization has gaping holes in its network, and it's not a question of if you'll be hacked; it's a question of when you'll be hacked and how long it will take to detect it.

On preview: jasonhong has a lot of good examples also. I agree that the report is FUD, but the issue is real and needs to be discussed.
posted by bfranklin at 1:24 PM on April 28, 2011


Ignore my previous comment; this was a Chrome issue. Viewing and downloading in Firefox worked just fine.
posted by tommasz at 1:34 PM on April 28, 2011


HBGary Federal was not an example of a "cyber attack", they were the poster child of a useless government contractor trying to get that sweet military contractor money by hyping up a threat ("Anonymous!"). And while everyone has holes, if your admins create custom backdoors for the CEO via email and provide them with a username and password via email with no secondary authentication, you probably shouldn't be working in security.
posted by benzenedream at 1:45 PM on April 28, 2011 [2 favorites]


I am not convinced that the issue of cyber security ought to be compared to Iraq. In the Iraq invasion, the govt was either lying or at least lied to, and the invasion took place with Congress sitting about and finally going along with what had already taken place. Cyber security might well be overblown to make money, but how does that compare to Iraq, other than that something gets overblown and lots of money gets involved? We do not invade a nation. We do not hunt down its leader. We do not lose men killed and wounded, etc.
posted by Postroad at 1:48 PM on April 28, 2011


Postroad: "but how does that compare to Iraq other than something overblown and lots of money gets involved. We do not invade a nation. We do not hunt down its leader. We do not lose men killed and wounded etc."

It compares in that it can be used to lessen our civil liberties even more? (Of course, that is more 9/11 than Iraq, but the two are still bound together.)
posted by charred husk at 1:50 PM on April 28, 2011 [1 favorite]


Yeah, what's the worst case scenario here? A few companies lose lots of money? Internet access goes down for a while? ID theft?

The worst case scenario in my mind is based on two thoughts: damn near every business has connected every device it can think of to the internet, and computer security is only as good as the most ignorant employee.

An old example (mid-1990s) that appeared in an issue of 2600 magazine was about Best Buy and their in-store systems. For reasons I still fail to understand, Best Buy had the heat, A/C, lights, doors, etc. all controlled by a computer system that was directly accessible via a modem. Call the modem (the phone number was in plain sight in the store, usually near the computer station by the music section), connect with a terminal, and you'd be prompted for a login. The failure was that the login was the store number and the password was a 4-digit number, so brute force worked easily. Often the password was 1111, 1234, etc., and there was no brute-force prevention (limited attempts, lockouts, etc.). Once in, you could set the heat to 90 on a toasty July day, all while disabling the doors and killing the lights in the middle of the afternoon if you damn well pleased. I'm pretty sure Best Buy has shored this up by now (probably connected to the internet instead of a modem).

Two thoughts rolled through my head when I read the article: "who uses store number and 1234 as login information?" and "why the hell do the temperature and lights need to be controlled by someone from outside the store via a dial-up?"
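To put that 4-digit password in perspective: the whole keyspace is 10,000 values. A toy sketch (purely hypothetical, not the actual Best Buy terminal) of how fast that falls when there's no attempt limit:

    # Toy illustration of why a 4-digit PIN with no lockout isn't a password.
    # try_pin() stands in for the dial-up login prompt described above; the
    # "correct" PIN here is invented for the example.

    def try_pin(store_number, pin):
        return pin == "1234"  # pretend the remote system accepts this value

    def brute_force(store_number):
        for candidate in range(10000):          # 0000 through 9999
            pin = "%04d" % candidate
            if try_pin(store_number, pin):      # no attempt limit, no lockout
                return pin
        return None

    print(brute_force("0123"))  # finds '1234' after at most 10,000 guesses

Even rate-limited by a dial-up modem, exhausting that takes hours at worst, and as described above the real answer was usually in the first few guesses anyway.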
posted by Mister Fabulous at 2:31 PM on April 28, 2011


HBGary Federal was not an example of a "cyber attack", they were the poster child of a useless government contractor trying to get that sweet military contractor money by hyping up a threat ("Anonymous!").

Did they swat at the beehive? Yes. That doesn't change the fact that they were systematically attacked by an adversary. I don't see how you can say that this isn't a cyber attack. Exactly how do you define cyber attack?

while everyone has holes, if your admins create custom backdoors for the CEO via email and provide them with a username and password via email with no secondary authentication, you probably shouldn't be working in security.

The backdoor and the username/password sent via email were how Anonymous owned Hoglund's rootkit.com site. HBGary itself was owned through SQL injection, local privilege escalation, and unencrypted and reused passwords.
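For anyone who hasn't seen it before, the SQL injection part is depressingly mundane. A generic sketch of the vulnerable pattern next to the boring fix (this is not HBGary's actual code, just the textbook shape of the bug):

    # Generic SQL injection sketch; sqlite3 is used only to keep it self-contained.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, pwhash TEXT)")
    conn.execute("INSERT INTO users VALUES ('admin', 'deadbeef')")

    def lookup_unsafe(name):
        # Vulnerable: user input is pasted straight into the query string.
        query = "SELECT * FROM users WHERE name = '%s'" % name
        return conn.execute(query).fetchall()

    def lookup_safe(name):
        # Parameterized query: the driver treats the input as data, not SQL.
        return conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()

    print(lookup_unsafe("' OR '1'='1"))  # dumps every row in the table
    print(lookup_safe("' OR '1'='1"))    # returns nothing

Chain something like that with reused passwords and a local privilege escalation, as described above, and you have the rest of the HBGary story.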

Most organizations don't have an accurate accounting of what devices are on their networks, let alone the patch level of those devices. Operational groups are worried about maintaining project billable hours and not security. Business needs consistently override security needs except in the case of (weak) regulations. You are working against an adversary that invariably has superior knowledge and firepower, even if you are a talented security practitioner.

Let me repeat that. A talented attacker has significant tactical advantage over a defender. The defender must defend against all attackers. The attacker need only identify (or, in the case of 0-day vulnerabilities, discover) a single vulnerability that the defender missed. When you consider the variety of different software systems on a network and the fact that a significant number of users will trade their password for a Snickers, your typical defender is really trying to defend a city block armed with a bow and arrow.
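A back-of-envelope way to see that asymmetry, with completely made-up numbers: if each exposed service independently has even a small chance of carrying one hole the defender missed, the odds that an attacker finds at least one somewhere climb very fast.

    # Made-up numbers, purely to illustrate the defender's math:
    # N exposed services, each with independent probability p of having one
    # vulnerability the defender missed.
    N = 200     # services/hosts the defender has to cover
    p = 0.02    # chance any single one has a missed hole
    p_attacker_finds_one = 1 - (1 - p) ** N
    print("%.1f%%" % (100 * p_attacker_finds_one))  # ~98.2% with these assumptions

The defender has to push p toward zero on every one of the N; the attacker only needs the overall product to stay interesting.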

All that said, I agree that the admin who created the rootkit.com backdoor really screwed up. However, if you're in the business of security you'd better get very comfortable with deciding where you're willing to let your defenses fail, and know how you're going to handle it when that happens. This includes when your operational folks fail to think like a security professional. The failure wasn't really the admin creating the backdoor. The failure was not having an enforced policy to prevent this sort of thing from being considered, and not having a standardized DR plan for recovering rootkit.com in the wake of the attack.
posted by bfranklin at 3:05 PM on April 28, 2011 [1 favorite]


Mister Fabulous makes a key point here. The current cybersecurity fashion was once called "the M&M strategy": routers and firewalls make a hard crust on the outside of the network, leaving a soft, unprotected (and tasty) inside. Since the Maginot Line, military experts have known this is simply a bad strategy. When any crack results in total failure, your security is "only as good as the most ignorant employee".

Why does this persist? It is cheaper to use insecure solutions than secure ones. CIOs and other decision makers are fired for exceeding budgets and consoled for security breaches. As more and more breaches occur, it's more acceptable to have a problem under the "it happens to everybody" excuse.

This problem can be solved with the same tool we used to solve reliability issues in cars. Software licenses contain a waiver of consequential damages. That means that if the software has a bug, you can't sue the manufacturer and recoup your losses. You might get back the $92 you paid for Windows 7, but you can't sue Microsoft for your loss of revenue if a Windows vulnerability causes you harm. The US should apply the principle of implied merchantability to software. This would make those consequential-damages waivers void and allow software companies to be sued for insecure (= defective) products. It would cause a huge uproar, so the law would need some gradual phase-in. It might mean that we can't buy a 50-million-line operating system from Microsoft for $100. That might be a good thing.
posted by RSaunders at 6:40 AM on April 29, 2011


The US should apply the principle of implied merchantability to software.

I don't think this would work. Cars ship with defects; this is why we have recalls and recall notifications. These recalls reduce or eliminate the liability of manufacturers for defects that are not repaired or addressed by the product owner. Implied merchantability also only protects the consumer when they are operating the product in a reasonable fashion.

Every intrusion I have handled has been because someone either a) failed to install a patch, or b) misconfigured a product in a really stupid way. Implied merchantability does nothing for either case.
posted by bfranklin at 7:16 AM on April 29, 2011




This thread has been archived and is closed to new comments