Apple's canary is missing
September 18, 2014 9:06 AM

The canary in the gold mine.
posted by 2bucksplus at 9:11 AM on September 18, 2014 [5 favorites]


Apple on Privacy:
Government information requests are a consequence of doing business in the digital age. We believe in being as transparent as the law allows about what information is requested from us. In addition, Apple has never worked with any government agency from any country to create a “back door” in any of our products or services. We have also never allowed any government access to our servers. And we never will...

...A tiny percentage of our millions of accounts is affected by national security-related requests. In the first six months of 2014, we received 250 or fewer of these requests. Though we would like to be more specific, by law this is the most precise information we are currently allowed to disclose.
I'd say what I think of the NSA and the PATRIOT Act, but, y'know...
posted by entropicamericana at 9:17 AM on September 18, 2014 [2 favorites]


.
posted by Buttons Bellbottom at 9:19 AM on September 18, 2014 [1 favorite]


Think'n different!
posted by markkraft at 9:20 AM on September 18, 2014


Now, Apple’s warrant canary has disappeared. A review of the company’s last two Transparency Reports, covering the second half of 2013 and the first six months of 2014, shows that the “canary” language is no longer there.

Does that mean no one has bothered to look at these reports since 2013?
posted by Monochrome at 9:21 AM on September 18, 2014 [1 favorite]


If a multinational like Apple can change their technical operations so that they can no longer comply with the spirit of the law, what does this say about the nation-state?
posted by infinitewindow at 9:26 AM on September 18, 2014 [2 favorites]


That it's broken? I thought we already knew that.
posted by Steely-eyed Missile Man at 9:28 AM on September 18, 2014 [1 favorite]


Ok, I love the idea of canaries, but here's the thing I don't get about them. The intelligence community has shown that it can pretty much do whatever the fuck it wants, so what's to stop an agency from telling, say, a librarian, "keep the canary up or Consequences"?
posted by threeants at 9:30 AM on September 18, 2014 [18 favorites]


Apple expands data encryption under iOS 8, making handover to cops moot. Allegedly, Apple is giving up the ability to decrypt devices with iOS 8. Maybe this decision came in response to receiving new NSA demands.

Of course, there's no real proof that they *can't* still decrypt an iPhone, or haven't turned that ability over to the NSA directly. And Apple wouldn't be allowed to reveal that fact anyway. So the canary's death may or may not mean anything.

Upgrade to iOS 8 and your nudies and drug deals are probably slightly safer from a garden-variety warrantless search during a traffic stop by your friendly local Ferguson, MO police officer. But if you're trying to hide from the NSA, yeah, good luck with that; keep using your one-time pads.
posted by T.D. Strange at 9:32 AM on September 18, 2014 [4 favorites]


Cook's starting to get religion when it comes to privacy - he even dropped blue_beetle's “When something online is free, you’re not the customer, you’re the product” bomb when touting new privacy and security features (such as that the new iPhones cannot be remotely unlocked by Apple, even with a court order). This may have been a prod in this direction - hand over business data? So government contractors can go over them with a fine-tooth comb and "leak*" them to business partners? I would be beyond ripshit and a sudden believer in privacy rights, too, especially if my business model isn't based around violating them (coughGooglecough).

I've been an Android user since the original Motorola Droid, mostly because I disliked the closed nature of the iThing ecosystem. Apple's been increasing transparency, security and interoperability within its ecosystem and without - whereas Google's been headed in the other direction, plus its new UI is like Metro as envisioned by chimpanzee-children. I may be making a move - I'm already on Cyanogen, but I need to be rid of the ecosystem.

(*Bad apples in our organization, we've improved security and training, and ha! Who cares? It's all top secret, and we're so deep in bed with the bureaucrats who hired us, we'll never be held accountable!)
posted by Slap*Happy at 9:36 AM on September 18, 2014 [2 favorites]


Hypothetically speaking, if a person was an NSA mole assigned to work at Apple, would they collect both their NSA salary and Apple salary?

Traditionally double agents collect two salaries until one side or the other executes them.
posted by localroger at 9:39 AM on September 18, 2014 [32 favorites]


Traditionally double agents collect two salaries until one side or the other executes them.

"Looks like he was garrotted with a Lightning cable. What a way to go, poor bastard. Bag 'im up, Jimmy. Nothing for us to do here - it's Cupertino."
posted by griphus at 9:43 AM on September 18, 2014 [44 favorites]


"Serious question: has anyone suggested Android's built-in full encryption has backdoors built in?"

As far as I'm aware, the biggest problem is that it only encrypts itself when the phone is turned off. When the phone is locked it is not encrypted, so as long as the phone stays on and Android is running, it remains in a decrypted state.
posted by I-baLL at 9:47 AM on September 18, 2014


If a multinational like Apple can change their technical operations so that they can no longer comply with the spirit of the law, what does this say about the nation-state?

ROTFL-ing about "spirit of the law" applying to ex-parte gag orders interpreted by secret courts.
posted by RobotVoodooPower at 9:48 AM on September 18, 2014 [38 favorites]


Now I can only imagine how stylish and yet austere an Apple-sponsored execution would be.
posted by 2bucksplus at 9:48 AM on September 18, 2014 [13 favorites]


Serious question: has anyone suggested Android's built-in full encryption has backdoors built in?

Yup.
posted by Slap*Happy at 9:48 AM on September 18, 2014 [1 favorite]


Of course, there's no real proof that they *can't* still decrypt an iPhone, or haven't turned that ability over to the NSA directly. And Apple wouldn't be allowed to reveal that fact anyway. So the canary's death may or may not mean anything.

Yeah, no, the canary is down. I don't know what I'd actually change in my day to day habits now. Maybe buy a Faraday bag or lead box to keep the thing in when I'm not using it. It's not as if I get more than a phone call a week anyway.
posted by Slackermagee at 9:51 AM on September 18, 2014


Ok, I love the idea of canaries, but here's the thing I don't get about them. The intelligence community has shown that it can pretty much do whatever the fuck it wants, so what's to stop an agency from telling, say, a librarian, "keep the canary up or Consequences"?

I'm actually sort of surprised that they work at all, and wonder if there won't be some sort of smack-down at some point for using them intentionally. It's not like propositional content isn't being communicated; it's just that a particular proposition about government involvement is being communicated using a different sort of symbol. That is, the lack of a canary says something as clearly as an overt warning about whatever the concern is.
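
Mechanically, the whole "message" is just whether a known sentence is still present in the latest report. A rough sketch (the canary wording below is paraphrased, not Apple's exact text):

    # Toy sketch: the "signal" is nothing more than a string's presence or absence.
    # The canary sentence is paraphrased, not Apple's exact wording.
    CANARY = "has never received an order under Section 215 of the USA Patriot Act"

    def canary_present(report_text):
        return CANARY.lower() in report_text.lower()

    def first_report_without_canary(reports):
        # reports: list of (period_label, report_text) pairs, oldest first
        for period, text in reports:
            if not canary_present(text):
                return period  # the canary went silent in (or before) this period
        return None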

Is the idea that there is a plausible alternate explanation that shields those who use them, if pressed? Like, we just forgot to put the statement back in the Transparency Report?
posted by SpacemanStix at 9:52 AM on September 18, 2014 [3 favorites]


The idea of Apple, a closed source vendor, standing up against the NSA & co is ludicrous.
We have no way to verify the fact, so this is nothing more than hot air, interpreted for and delivered by Cook from the lips & ears of the PR department.
posted by xcasex at 9:54 AM on September 18, 2014 [6 favorites]


Weren't there supposed to be hearings and prosecutions about now relating to the "discovery" that the Intelligence Community is basically an illegal blackmail ring aimed at the people it supposedly works for?
posted by Navelgazer at 9:54 AM on September 18, 2014 [1 favorite]


only encrypts itself when the phone is turned off. When the phone is locked it is not encrypted, so as long as the phone stays on and Android is running, it remains in a decrypted state.
The filesystem is always encrypted, but when the phone is on the encryption/decryption keys are in RAM (this is necessary if you want programs to be able to actually do anything with that filesystem). It's not easy to get at the contents of RAM when you don't know the phone password, but it turns out that there is a cool way to do so.

This might not be a solvable problem. It's basically the same problem as DRM: if you hand someone encrypted data, you can't also hand them the encryption keys and expect that encryption to remain secure.
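
To make that concrete, here's a toy illustration in Python (using the third-party cryptography package); this is not Android's actual FDE code, just the shape of the problem:

    # pip install cryptography -- a toy illustration, not Android's implementation
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # on a phone, derived when the user unlocks the device
    cipher = Fernet(key)          # the key now lives in this process's RAM

    token = cipher.encrypt(b"contacts, photos, messages")
    print(cipher.decrypt(token))  # works only while `key` is resident in memory

    # Power the device off (or otherwise drop `key`) and only ciphertext remains;
    # keep it running, and anyone who can read RAM can recover `key` and decrypt.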
posted by roystgnr at 9:57 AM on September 18, 2014 [1 favorite]


~The idea of Apple, a closed source vendor, standing up against the NSA & co is ludicrous

What does "closed source vendor" have to do with anything in this matter?

~so this is nothing more than hot air, interpreted for and delivered by Cook from the lips & ears of the PR department.

Care to cite some evidence? And how does this discovery (the missing canary) benefit Cook or Apple at all? Or, are you saying the canary was an Apple PR lie in the first place?
posted by Thorzdad at 9:58 AM on September 18, 2014 [1 favorite]


What does "closed source vendor" have to do with anything in this matter?
When everyone can read your source code diffs you have to do some clever obfuscation to put hostile changes into them.

When all they can see are your binary diffs you get the obfuscation for free.
posted by roystgnr at 10:00 AM on September 18, 2014 [4 favorites]


Slap*Happy: The article you linked to references another article, which references another article:

http://www.cnet.com/news/how-apple-and-google-help-police-bypass-iphone-android-lock-screens/

which says that Google can just reset the lockscreen password. Doesn't really mention the encryption though.
posted by I-baLL at 10:03 AM on September 18, 2014


Or, are you saying the canary was an Apple PR lie in the first place?

How could it be a lie? They stated that there had been no Patriot Act searches; then they stopped saying so. We are interpreting this as a warrant canary, but Apple never said explicitly that it was.

So.

1) It was an explicit canary, and someone did in fact make those searches, so the canary was removed.

2) It was not; it was just a statement of fact that was removed when it was no longer true. It was never intended to be a warrant canary, but it functioned as one.

3) It was not, the statement was made in one report and removed later for no real reason.

Which one's true? Got me.
posted by eriko at 10:06 AM on September 18, 2014 [3 favorites]


The idea of Apple, a closed source vendor, standing up against the NSA & co is ludicrous.

If some /b/tard can hack iCloud and get all those celebs' pics out, imagine what the NSA can do... I think 'standing up against the NSA' would be pretty pointless anyway
posted by CitoyenK at 10:06 AM on September 18, 2014


Open sourcing the OS is of limited benefit to the end user when the hardware and implementation remain proprietary.
posted by ardgedee at 10:06 AM on September 18, 2014 [4 favorites]


When everyone can read your source code diffs you have to do some clever obfuscation to put hostile changes into them

And Heartbleed, the Debian SSL flaw, etc. have shown that you don't really need to do anything.

"Many eyes makes bugs shallow" is a myth.
posted by NoxAeternum at 10:09 AM on September 18, 2014 [17 favorites]


> If some /b/tard can hack iCloud and get all those celebs' pics out...

Nobody had hacked iCloud. The accounts were compromised due to weak passwords and social engineering.
posted by ardgedee at 10:10 AM on September 18, 2014 [19 favorites]


Apple expands data encryption under iOS 8, making handover to cops moot. Allegedly, Apple is giving up the ability to decrypt devices with iOS 8. Maybe this decision came in response to receiving new NSA demands.

This is a nice gesture, but it's been solidly established (twice now) that the US government merely demands "Redesign your infrastructure so you can do All The Snooping for us, Or Else." and the company has no way out - either hand us over to the snoops or go out of business (and possibly still face jail).

Yahoo was given a quarter-million-dollars-PER-DAY fine until it complied with the snoops, so Yahoo complied.
Lavabit went out of business rather than comply, exposing its owner to contempt of court charges.
No doubt there are more victims that haven't come to light.

But still, kudos to Apple for at least trying to give the finger.
(And a slap on the wrist to Apple marketing for suggesting to people that they are protected by this)
posted by anonymisc at 10:10 AM on September 18, 2014 [4 favorites]


Clearly the NSA wanted to know just how the fuck a U2 album got onto their systems.
posted by delfin at 10:11 AM on September 18, 2014 [51 favorites]


NoxAeternum: not sure what your point was. With closed source software, if a bug or a vulnerability is discovered, you have to rely on the maker of the software to fix the problem. This may never happen.

With open-source software, the problem can be fixed by others, not just the software vendor.

"Many eyes makes bugs shallow" is a myth.

It's not a myth at all. It's just that there aren't many eyes available to look at every piece of open source software.
posted by I-baLL at 10:14 AM on September 18, 2014 [3 favorites]


Nobody had hacked iCloud. The accounts were compromised due to weak passwords and social engineering.

A little from column A, a little from column B. Internet jerks were able to run a dictionary attack on the Find My iPhone login page because Apple forgot about security, but only after they got the relevant e-mail addresses (not access to said e-mails, just knowledge of what the addresses are) from non-hackery methods.
posted by Holy Zarquon's Singing Fish at 10:14 AM on September 18, 2014 [5 favorites]


Coincidentally, Apple announced a new privacy information website yesterday.
posted by ardgedee at 10:16 AM on September 18, 2014


Good news for Americans (Apple is now finally at the same privacy protection level as Google and Facebook), but Chinese Apple product users are still fucked. But hey at least Apple is all set to make $billions!
posted by Poldo at 10:16 AM on September 18, 2014


Allegedly, Apple is giving up the ability to decrypt devices with iOS 8.

Apple isn't the first company to try this. Ask Blackberry how well that worked out for them in the world markets. They have had "or else" demands from a number of governments in the past few years.
posted by bonehead at 10:24 AM on September 18, 2014


Couldn't the NSA coerce Apple into reengineering their systems so as to meet their requirements, and into not revealing that they have done so? As such, Apple's claim of no back doors and no ways to decrypt users' email may be irrelevant.
posted by acb at 10:27 AM on September 18, 2014 [1 favorite]


NoxAeternum: not sure what your point was. With closed source software, if a bug or a vulnerability is discovered, you have to rely on the maker of the software to fix the problem. This may never happen.

With open-source software, the problem can be fixed by others, not just the software vendor.


My point is that the latter may never happen either, and so we shouldn't be kidding ourselves about how open source is somehow structurally better with these sorts of issues, because there have been enough high profile incidents that have proven otherwise.
posted by NoxAeternum at 10:29 AM on September 18, 2014 [3 favorites]


Weren't there supposed to be hearings and prosecutions about now relating to the "discovery" that the Intelligence Community is basically an illegal blackmail ring aimed at the people it supposedly works for?

Elaborate?
posted by Fuka at 10:29 AM on September 18, 2014


"we shouldn't be kidding ourselves about how open source is somehow structurally better with these sorts of issues, because there have been enough high profile incidents that have proven otherwise."

It is better. Why? Because the problems are able to get fixed. With closed source the same problems are there but fixing them is much harder.
posted by I-baLL at 10:32 AM on September 18, 2014 [3 favorites]


As important as it is to push companies to protect privacy and data security, it's also a testament to how much we've given up if our starting principle is that the NSA is so hopelessly invasive and intractable that our democratic government is unable to enact any meaningful change to collar it.
posted by Apocryphon at 10:34 AM on September 18, 2014 [10 favorites]


"we shouldn't be kidding ourselves about how open source is somehow structurally better with these sorts of issues, because there have been enough high profile incidents that have proven otherwise."

Let me tell you a little story about Sun's Grid Engine project. Back when I was still a senior infrastructure architect and cared, I was working on a gap analysis of what to do with Sun's GE project. First, we'd had meetings for the better part of six months where we were promised that the project wasn't going anywhere.
Then came the sunset: Oracle bought Sun.
A month into it, we learnt that they were sunsetting the entire project (incidentally, they fired everyone on the GE team, killed the open-source project site, storage, links for the OSS version, etc.).
A few weeks after that we were told by Oracle reps that we could still get updates, if we paid xyz.

We went with a contender that was available under an open-source license instead.
This is why open source is important: not because it is gratis on principle, but because it enables us to modify it ourselves even after the vendor has closed the doors.


The NSA has proven to be very invasi^H^H^H Persuasive when it comes to getting their mittens on raw data, and a gag is a gag. There is no way to know anything about their (AAPL) newfound supposed transparency unless they're ready to be well and truly honest about it -- as in access to said NSLs -- but that would mean the company gets sanctioned by the government -- or should I say a part of the government.

(And re: backdoors, blah blah: it's far, far easier to grep/Ctrl-F and look for strangeness than to audit binary blobs.)
posted by xcasex at 10:45 AM on September 18, 2014 [5 favorites]


When you have a massively invasive system, resistance isn't useless; it just evolves like multiply-drug-resistant bacteria to survive under extreme pressure and hide in the gaps. The downside of that is that, even when things ease off, you have cultures of extreme paranoia and dog-eat-dog predation. For example, the USSR gave us the Russian Mafiya and the impressively sophisticated and utterly malignant cybercrime ecosystem. The NSA umbrella may well end up giving us everything from unregulable virtual economies to paedoterrorist death cults, all of them impossible to eradicate.
posted by acb at 10:45 AM on September 18, 2014 [1 favorite]


Marcy Wheeler: About Apple’s Dead Warrant Canary
posted by homunculus at 10:45 AM on September 18, 2014 [2 favorites]


Did you even bother to read the link?

Yes, it is strongly implied that Google has a way to do an end-run around disk encryption. By way of background, Android only encrypts the "data" directory and conveniently leaves unencrypted and available most of the system files. It's a feature for those concerned with cybercrime, and not intended to secure your person and papers from any LEO who comes sniffing after it. They will reset your password/pin, and hand it over to the PoPo.
posted by Slap*Happy at 10:46 AM on September 18, 2014


It is better. Why? Because the problems are able to get fixed. With closed source the same problems are there but fixing them is much harder.

If this is the case, then why didn't Microsoft's SSL implementation have the same issues as OpenSSL?

Your argument is getting into "open source cannot fail, only be failed" ground. The fact that "oh, well, we can fix Heartbleed once it's discovered" doesn't change that it was a massive fail state for open source, nor that it proved that open source is not immune to the tragedy of the commons.
posted by NoxAeternum at 10:54 AM on September 18, 2014 [1 favorite]


As if Android is meaningfully open source these days. Enough critical behavior now lives in apps, which was an intentional decision to help with version fragmentation, that it's not really true that you could download the code somewhere and perform a comprehensive audit. It wouldn't be a bad thing if iOS were truly open, but it's not like they are unique in not being so.
posted by feloniousmonk at 10:59 AM on September 18, 2014 [2 favorites]


The idea of Apple, a closed source vendor, standing up against the NSA & co is ludicrous.
We have no way to verify the fact, so this is nothing more than hot air, interpreted for and delivered by Cook from the lips & ears of the PR department.


Actually, that's not true. As a publicly traded company Apple is subject to lawsuits if its executives lie or mislead the public on a subject that could affect the stock price. CEOs can't just outright lie about what their companies are doing.

Of course, you can disagree by saying "the NSA can do whatever it wants and make it legal for anyone to break any law". If that's what you believe then, yeah, you'll just need to only use software that you've written yourself. Tim Cook could actually be an NSA sock puppet masquerading as a CEO.
posted by alms at 11:01 AM on September 18, 2014 [2 favorites]


> The idea of Apple, a closed source vendor, standing up against the NSA & co is ludicrous.
We have no way to verify the fact, so this is nothing more than hot air, interpreted for and delivered by Cook from the lips & ears of the PR department.


Isn't this about the data going to and from the servers? In which case the device's operating system doesn't matter.
posted by Monochrome at 11:01 AM on September 18, 2014


Elaborate?

In the past year the NSA was caught wiretapping the Senators on its oversight committee in order to blackmail them for funding, including Sen. Feinstein, who up until that point had been the Intelligence Community's fiercest bannerman.

In case that feels biased or improperly stated, see previously.
posted by Navelgazer at 11:04 AM on September 18, 2014 [7 favorites]


Allegedly, Apple is giving up the ability to decrypt devices with iOS 8.

Kind of disingenuous: Apple says they no longer have the keys. What they don't say is that it would be trivial for them to get the keys.
posted by blue_beetle at 11:06 AM on September 18, 2014 [2 favorites]


CEOs can't just outright lie about what their companies are doing.

As experiences with AT&T and other telcos have shown, that ain't necessarily so, when "national security" is invoked.
posted by bonehead at 11:07 AM on September 18, 2014 [5 favorites]


Actually, that's not true. As a publicly traded company Apple is subject to lawsuits if its executives lie or mislead the public on a subject that could affect the stock price. CEOs can't just outright lie about what their companies are doing.

AT&T, Enron, BP, etc., etc.
They can, they have. It's the reason there's this whole discussion about the sociopathic behaviour of the higher-ups.
posted by xcasex at 11:14 AM on September 18, 2014 [1 favorite]


Hypothetically speaking, if a person was an NSA mole assigned to work at Apple, would they collect both their NSA salary and Apple salary?

and

Traditionally double agents collect two salaries until one side or the other executes them.


You get only one pension plan. The other disavows knowledge of your existence.

BTW, when you said "hypothetically," were you trying to be cute?

Try to stay in the middle of the herd.
posted by mule98J at 11:18 AM on September 18, 2014 [1 favorite]


"If this is the case, then why didn't Microsoft's SSL implementation have the same issues as OpenSSL?

Your argument is getting into "open source cannot fail, only be failed" ground. The fact that "oh, well, we can fix Heartbleed once it's discovered" doesn't change that it was a massive fail state for open source, nor that it proved that open source is not immune to the tragedy of the commons.
"

Wait, are you actually arguing that closed source is somehow more secure than open source?

Microsoft's SSL implementation is only used by Microsoft. OpenSSL was used by tons of organizations. Microsoft SSL has had security holes and they get regularly patched by Microsoft. Except that if you have an older Windows version, you might not be getting patches.

With OpenSSL, once Heartbleed was discovered, an effort began to audit the code.

With a closed SSL stack, if a security hole is discovered... you wait for the hole to be closed by the vendor and wait for the next hole to be discovered by either malicious parties or, hopefully, security researchers who are working with a black box.
posted by I-baLL at 11:19 AM on September 18, 2014 [3 favorites]


invasi^H^H^H Persuasive

inv Persuasive?

Sorry. Pet peeve.
posted by ChurchHatesTucker at 11:26 AM on September 18, 2014 [4 favorites]


Of course, you can disagree by saying "the NSA can do whatever it wants and make it legal for anyone to break any law". If that's what you believe then
you've been paying attention.
posted by roystgnr at 11:30 AM on September 18, 2014 [6 favorites]


I-baLL: Wait, are you actually arguing that closed source is somehow more secure than open source?

I'll let Nox speak for him/herself, but I've interpreted his/her words on this topic in this and other threads as simply a warning that the "all bugs are shallow" mantra can provide a false sense of security that can in some cases counteract the benefits of the code having more exposure, not as any kind of statement that closed source is inherently more secure. The fact is that there's plenty of open source code out there that nobody's really looking at too closely, and there are also plenty of bugs that are complex enough that they don't really show up in a visual inspection or even when you run static analysis tools on them.

Sometimes people go overboard when they laud the virtues of the open source movement, as if simply posting the code to Github gets you a free code review, and this is not the case at all. Ceteris paribus, I'd rather use a well-established open source product over something commercial, but Heartbleed was a clear case of the amount of code review and testing resources not being commensurate with the immense amount of trust people put in the software.
posted by tonycpsu at 11:34 AM on September 18, 2014 [5 favorites]


inv Persuasive?

Read it with the voice of Chef in your head, it'll all make sense then :p
posted by xcasex at 11:34 AM on September 18, 2014


Read it with the voice of Chef in your head, it'll all make sense then :p

But that's true of everything.
posted by ChurchHatesTucker at 11:47 AM on September 18, 2014


Open Source closes barn doors better than anyone.
posted by blue_beetle at 12:03 PM on September 18, 2014 [4 favorites]


Those people thinking that Apple can't stand against nation-state actors, have a look at how many nation-states Apple is richer than. Even the security apparatus of the USA has only so much money, so much time, and so much imagination. Not saying Apple really would break a lot of security even if they could, even at a low level, but they're pretty trans- meta- whatever now. Plus: profits. Plus that whole post- meta- thing. Probably they should be paying Gibson royalties.

That said, I would be very surprised if DARPA doesn't have tech 20+ years advanced to what even Apple has now.
posted by digitalprimate at 12:04 PM on September 18, 2014


tonycpsu: That's a good deal of the point - open source has its own unique fail states, and not nearly enough consideration is given to that. Yes, it's all well and good that OpenSSL is getting a needed audit, but if we're being honest, it's also very much a case of closing the barn door when the horse is on sale on a Russian black market. Part of the failure state was the code, but part of the failure state was how the code was or wasn't managed. The question isn't just about OpenSSL being secure now, but how it will be kept secure in the future, especially once Heartbleed is a memory.

For all of Microsoft's numerous faults, they have gotten religion about security, and do actively maintain their current products. Meanwhile, we've seen the tragedy of the commons play out over and over in open source. So honestly, I tend to see the assertion of open source being more secure as a declaration that the community's crap doesn't stink these days.
posted by NoxAeternum at 12:10 PM on September 18, 2014


Yahoo was given a quarter-million-dollars-PER-DAY fine until it complied with the snoops, so Yahoo complied.

Worse than that, the fine was set to double every week: Within 12 weeks, Yahoo would have racked up about $7.2 billion in fines — an amount equal to its entire revenue in 2008. This pretty much eliminates the possibility of mounting a worthwhile defense, since the company would be bankrupt soon after the first motions were filed.
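
Back-of-the-envelope check (assuming the reported $250,000-per-day starting fine and weekly doubling):

    daily = 250_000                        # starting fine, dollars per day
    total = 0
    for week in range(12):                 # 12 weeks
        total += daily * 7
        daily *= 2                         # doubles each week
    print(f"~${total / 1e9:.1f} billion")  # ~$7.2 billion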
posted by RobotVoodooPower at 12:16 PM on September 18, 2014 [8 favorites]


", it's also very much a case of closing the barn door when the horse is on sale on a Russian black market."

But why are you treating this as an open source problem? Microsoft's had many security holes that they end up having to patch for all versions of Windows because the security hole's been in their code from something like day 1.
posted by I-baLL at 12:19 PM on September 18, 2014


blue_beetle, the keys in question are written onto the CPU during manufacturing, and Apple has stated it immediately destroys and does not retain any copies. They're not accessible to Apple or anybody, really.
posted by polyhedron at 12:22 PM on September 18, 2014 [2 favorites]


the CPU during manufacturing, and Apple has stated it immediately destroys and does not retain any copies. They're not accessible to Apple or anybody, really.

I was about to say, if Apple were able to get the keys, given the security described in the white paper, that would be an incredibly impressive feat. Tying the passcode authentication to the UID key in the processor was absolutely fucking genius.
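
A rough sketch of the general idea in Python (hypothetical names and iteration count, not Apple's actual derivation): because the derived key depends on a secret fused into the silicon, passcode guessing has to happen on the device itself, where escalating delays can be enforced.

    import hashlib, os

    DEVICE_UID = os.urandom(32)  # stands in for the per-device key fused into the CPU;
                                 # on real hardware it is never readable by software

    def derive_unlock_key(passcode, uid):
        # Tangle the short passcode with the device-unique secret.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(), uid, 100_000)

    key = derive_unlock_key("1234", DEVICE_UID)
    # An attacker with a copy of the encrypted filesystem but without DEVICE_UID
    # can't even begin brute-forcing the four-digit passcode off-device.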
posted by Talez at 12:29 PM on September 18, 2014 [1 favorite]


But why are you treating this as an open source problem? Microsoft's had many security holes that they end up having to patch for all versions of Windows because the security hole's been in their code from something like day 1.

Because Microsoft actually admits they have a problem, and built a process for dealing with it. Which is a key step. What's the gameplan for maintaining OpenSSL over the long run? More importantly, how are you going to fund that maintenance?
posted by NoxAeternum at 12:29 PM on September 18, 2014


What's the gameplan for maintaining OpenSSL over the long run? More importantly, how are you going to fund that maintenance?
With a fraction of the millions of dollars from a new industry consortium (including from Microsoft), apparently.

What's the gameplan for maintenance of closed source software over the long run? More importantly, how are you going to compete with funding for subverting that maintenance? The Heartbleed bug was uncovered after a year because there were a couple people with the intelligence and luck to read the right source code and discover a bug. The BSafe bug was uncovered after half a decade because there was one person with the intelligence and luck to swipe thousands of top secret NSA documents and get away with it.

Both processes suck but I think the first is still more effective. "Anyone in the world could review this code, but they don't get paid enough to do it properly" is a lousy incentive structure, but not as bad as "Anyone in the right corporate division could review this code, but they might be paid not to, and they might be jailed if they do it properly".
posted by roystgnr at 1:36 PM on September 18, 2014 [7 favorites]


NoxAeternum: “"Many eyes makes bugs shallow’ is a myth.”

True – opening the source doesn't automatically make bugs disappear immediately.

The trouble is that closing the source is even worse.
posted by koeselitz at 1:48 PM on September 18, 2014 [5 favorites]


Kind of disingenuous: Apple says they no longer have the keys. What they don't say is that it would be trivial for them to get the keys.

Right, and they don't say (and wouldn't be allowed to say) whether the NSA has the keys now. For all we know there's a realtime NSA feed to every iPhone in existence.
posted by T.D. Strange at 2:00 PM on September 18, 2014


Yes, I remember reading that the industry, properly chastened, is now going to fund the maintenance of OpenSSL.

Question is, are they going to be doing so in 5 years? How about a decade? Or are we going to see the money slowly taper off as this fades from memory? It's not like that ever happens. Not to mention that it is, again, a bit of barn door closing as well.

I also find it telling that the response to "the open source model has some serious organizational flaws" continues to be "but the other guy is worse!" Regardless of whether that is the case (and I'm not sold on it at all), all you're doing is deflecting attention from the point that your model is lousy, and needs some serious work. The other guy doesn't matter - it's your house that burnt down.
posted by NoxAeternum at 2:16 PM on September 18, 2014 [3 favorites]


Part of me would like to see Yahoo or Google or Microsoft go all the way and find out if the government is really willing to let them go bankrupt over this.
posted by jjwiseman at 3:13 PM on September 18, 2014 [1 favorite]


I also find it telling that the response to "the open source model has some serious organizational flaws" continues to be "but the other guy is worse!"
That's the trouble with reality; we only get to choose the best among actual options, regardless of how much better an ideal option could be in our imaginations.
posted by roystgnr at 3:34 PM on September 18, 2014 [4 favorites]


Or to put it another way; do you find it telling that the response to "the closed source model has some serious organizational flaws" is "but the other guy is almost as bad!"?
posted by roystgnr at 3:35 PM on September 18, 2014


NoxAeternum: “I also find it telling that the response to ‘the open source model has some serious organizational flaws’ continues to be ‘but the other guy is worse!’ Regardless of whether that is the case (and I'm not sold on it at all), all you're doing is deflecting attention from the point that your model is lousy, and needs some serious work. The other guy doesn't matter - it's your house that burnt down.”

That metaphor doesn't work at all – we're not talking about different houses, we're talking about a single aspect of software development and its repercussions.

There have been open-source security tools developed by billion-dollar companies, and there have been closed-source security tools developed by single developers. It makes absolutely no sense to say "oh well, open-source doesn't work because it'll never get funded." We now know objectively that isn't true.

More to the point, my response was specific to your complaint. Your complaint was that open source isn't perfect. I said: yeah, sure – but it is better than the alternative. And it is, in general, in a world where plenty of closed-source security gets exploited every day.
posted by koeselitz at 3:48 PM on September 18, 2014 [1 favorite]


That's the trouble with reality; we only get to choose the best among actual options, regardless of how much better an ideal option could be in our imaginations.

No, it's a dodge from the actual issue, which is that open source has organizational issues that have led to problems, which have had deleterious effects for millions of people.

Just because you think you are the best choice does not mean that you can't be better.
posted by NoxAeternum at 3:49 PM on September 18, 2014


NoxAeternum: “The other guy doesn't matter - it's your house that burnt down.”

You're standing in the middle of an entire city that burned down and saying: "well, clearly brown houses are worse than red houses, because look at this brown house – it burned down!" Well, fine, but that isn't really the key point at the moment. The key is that every one of these houses was flammable.
posted by koeselitz at 3:50 PM on September 18, 2014


Your complaint was that open source isn't perfect. I said: yeah, sure – but it is better than the alternative.

It is? Because from where I'm standing, it looks like the opposite happened: the closed source version was properly maintained, while the open source version fell victim to the tragedy of the commons.
posted by NoxAeternum at 3:59 PM on September 18, 2014


NoxAeternum: “It is? Because from where I'm standing, it looks like the opposite happened: the closed source version was properly maintained, while the open source version fell victim to the tragedy of the commons.”

Both OpenSSL and Microsoft SSL have been exploited. Microsoft SSL has been exploited dozens, if not hundreds, of times. I'm not sure where you're seeing one of these fail and the other succeed.
posted by koeselitz at 4:13 PM on September 18, 2014 [2 favorites]


As if Android is meaningfully open source these days. Enough critical behavior now lives in apps, which was an intentional decision to help with version fragmentation, that it's not really true that you could download the code somewhere and perform a comprehensive audit. It wouldn't be a bad thing if iOS were truly open, but it's not like they are unique in not being so.

Another point is how little is actually in AOSP these days, and how much is tied into APIs and other bits that you can only get if you have the Google apps package (which I forget what they're calling now; it's not Google Experience, but you know what I mean: Maps, etc.) with all the extra bits most apps expect to be there.

AOSP is open source; the actual Google package-of-things you get as an official OEM playing by their rules or whatever is not.

It's getting close, at this point, to being as if Linux was open source but X11 wasn't, or something.

There's a multi-pronged effort to make Android a have-your-cake-and-eat-it-too thing, where Google gets to make lots of nice noises about being open, while the open product isn't any more open than the bits of OS X that are. And they still get all the tugjobs and fanfare for being "open".
posted by emptythought at 4:32 PM on September 18, 2014 [2 favorites]


I'm not sure where you're seeing one of these fail and the other succeed.

Agreed. They both failed.

Open Source does not guarantee security, nor does Proprietary closed source. The culture that grows around the software and is instilled in the developers by leadership and tradition is far more important for meeting ancillary goals - security being one.

In this regard, we've got to go by track record. Apple has an exceptionally good one, considering its installed base, being very swift to respond to problems as they arise on a fundamental, ongoing level. Google has some good things - two factor for its online services - but Android overall is a wild-west shitshow. Device security isn't a priority for Google.
posted by Slap*Happy at 4:50 PM on September 18, 2014 [2 favorites]


the actual Google package-of-things you get as an official OEM

You mean Google Play Services, also known as "The No Bezos Club."
posted by RobotVoodooPower at 4:51 PM on September 18, 2014 [1 favorite]


Is the idea that there is a plausible alternate explanation that shields those who use [canaries], if pressed?

I've chatted with a few (non-national-security-type) lawyers about this practice and nobody has thought it would withstand a legal test. On the other hand for a provider with a lot of customers the damage done is negligible so it's really not worth the bother to prosecute.

Mostly a canary seems to serve the purpose of letting customers know that you share their concern about privacy issues.

(a situation where the canary could plausibly alert an individual that the request was about them is a different matter).
posted by Tell Me No Lies at 4:56 PM on September 18, 2014 [1 favorite]


You mean Google Play Services, also known as "The No Bezos Club."

Yep. I swear they've changed the name of this a couple of times. And adding in the "Google Experience" phones only muddied the waters.

Looking here, I think I understand my confusion. I've definitely heard it referred to as "Google Mobile Services" before, which is a lot less euphemistic.

It's basically the "if you want what people consider Android, you probably need to have this" package of bits.
posted by emptythought at 5:01 PM on September 18, 2014


Ars Technica: No, Apple probably didn’t get new secret gov’t orders to hand over data.
posted by mbrubeck at 11:51 AM on September 19


how dare you inject a reasonable and rational point into this admittedly reasonable panic that the NSA may have overstepped reasonable boundaries again
posted by DoctorFedora at 12:28 AM on September 19, 2014


ArsTechnica:
The warrant canary language was also missing from the company's December 31, 2013 transparency report, which covered the second half of the year. However, since then, as part of ongoing lawsuits at the Foreign Intelligence Surveillance Court, the government has imposed new guidelines that essentially make warrant canaries much more difficult to issue.
... which addresses the question of whether canaries can even work anymore. The answer seems to be 'no.'
posted by lodurr at 7:24 AM on September 19, 2014


Here's the text of the linked press release on the new guidelines:

Department of Justice
Office of Public Affairs
FOR IMMEDIATE RELEASE
Monday, January 27, 2014
Joint Statement by Attorney General Eric Holder and Director of National Intelligence James Clapper on New Reporting Methods for National Security Orders
Attorney General Eric Holder and Director of National Intelligence James Clapper released the following joint statement Monday:

“As indicated in the Justice Department’s filing with the Foreign Intelligence Surveillance Court, the administration is acting to allow more detailed disclosures about the number of national security orders and requests issued to communications providers, and the number of customer accounts targeted under those orders and requests including the underlying legal authorities. Through these new reporting methods, communications providers will be permitted to disclose more information than ever before to their customers.

“This action was directed by the President earlier this month in his speech on intelligence reforms. While this aggregate data was properly classified until today, the office of the Director of National Intelligence, in consultation with other departments and agencies, has determined that the public interest in disclosing this information now outweighs the national security concerns that required its classification.

“Permitting disclosure of this aggregate data resolves an important area of concern to communications providers and the public. In the weeks ahead, additional steps must be taken in order to fully implement the reforms directed by the President.

“The declassification reflects the Executive Branch’s continuing commitment to making information about the Government’s intelligence activities publicly available where appropriate and is consistent with ensuring the protection of the national security of the United States.”
Unless I'm reading this wrong, the plain meaning of the PR seems to be saying that 'information providers' will be able to say more about whether they've had requests; the way it's linked from ArsTechnica seems to imply that this is bullshit handwaving. I have no problem believing that's true, but does anyone here actually know? Has anyone read the new guidelines?
posted by lodurr at 7:29 AM on September 19, 2014


"Information providers" can say with more detail how many requests they've received, but since the low end of the scale that DOJ provides is "0-249" they can no longer say "we have not received any requests."
posted by Holy Zarquon's Singing Fish at 8:04 AM on September 19, 2014


"Many eyes makes bugs shallow" is a myth.

Maybe "Many eyes makes bugs shallow[er]."
posted by SpacemanStix at 8:23 AM on September 19, 2014


DoctorFedora: “how dare you inject a reasonable and rational point into this admittedly reasonable panic that the NSA may have overstepped reasonable boundaries again”

Have you actually read that article? It's... difficult to parse, to say the least. There is plenty of stuff in that article that seems to contradict the headline. Like, um:

“Because Apple has almost certainly received an order to turn over content under Section 702—as seen in a now-infamous slide provided by National Security Agency whistleblower Edward Snowden—that could explain why the Section 215 language warrant canary has been removed from Apple's transparency reports.”

... which, yeah, seems to directly contradict the headline. Maybe the headline meant to say "No, Apple probably didn’t get new secret gov’t orders to hand over data," as in they've already been getting them since 2012? That would make some sense, but it's hard not to read the headline as misleading, to say the least.

It basically seems to be saying that the "canary" disappearing isn't evidence that the NSA is issuing information demands to Apple – because we are pretty sure that they've been issuing those demands for at least two years already.
posted by koeselitz at 9:07 AM on September 19, 2014 [2 favorites]




This thread has been archived and is closed to new comments