They have asked us to build a backdoor to the iPhone
February 17, 2016 3:00 AM

Investigations into the San Bernardino attack by the FBI have been potentially impeded by information locked in an iPhone 5c found on one of the perpetrators. A federal court judge has ordered Apple to assist the FBI in defeating any and all security measures built into the device. In a turn similar to Ladar Levison's letter to Lavabit users (previously), Apple has written a letter to end users about the civil rights at stake.
posted by a lungful of dragon (532 comments total) 48 users marked this as a favorite
 
A Worldwide Survey of Encryption Products

The study for Harvard University's Berkman Center for Internet and Society, conducted by cryptography expert Bruce Schneier and colleagues Kathleen Seidel and Saranya Vijayakumar, surveyed the availability of encryption products worldwide; its findings make clear that U.S. laws to weaken domestic encryption wouldn't stop malicious users from obtaining foreign encryption, but would put U.S. firms at a competitive disadvantage...

It further concluded, "anyone who wants to evade an encryption backdoor in US or UK encryption products has a wide variety of foreign products they can use instead: to encrypt their hard drives, voice conversations, chat sessions, VPN links, and everything else.

"Any mandatory backdoor will be ineffective simply because the marketplace is so international. Yes, it will catch criminals who are too stupid to realize that their security products have been backdoored or too lazy to switch to an alternative, but those criminals are likely to make all sorts of other mistakes in their security and be catchable anyway.

"The smart criminals that any mandatory backdoors are supposed to catch—terrorists, organized crime, and so on—will easily be able to evade those backdoors."

posted by a lungful of dragon at 3:11 AM on February 17, 2016 [8 favorites]


A friend linked to this and asked for a layperson's explanation. Here's mine, translating Apple's letter into lock-picking terms:

- We make an impregnable safe with an unbreakable lock.
- We specifically designed it so that not only do we not have a master key for our locks, we couldn't make one if we wanted to.
- The US Government wants to get into one of our safes but accepts that we don't have a master key and can't make one.
- It therefore wants us to craft a special tool that would allow the lock to be removed altogether.
- We don't want to do that because once we craft such a tool none of our safes will be secure any more.

posted by Major Clanger at 3:19 AM on February 17, 2016 [118 favorites]


It should be clarified that the FBI doesn't want simple access to the phone but an easier way to hack it: better technical access for submitting passcode guesses, plus disabling the security feature that bricks the phone and deletes all data after 10 failed attempts. This is not the same as "defeating any and all security measures". They don't want a backdoor, just better access to the keyhole of the front door.

Although this clarification doesn't make Tim Cook's statements less true.
posted by KMB at 3:28 AM on February 17, 2016 [5 favorites]


Additionally the FBI would like Apple to provide a constant time method for finding prime factors of large numbers, a proof that P=NP, an integer solution to x^n+y^n=z^n for n > 2, and a live unicorn.
posted by autopilot at 3:31 AM on February 17, 2016 [119 favorites]


An insight from Ars Technica:
Tim Cook published the open letter at midnight Pacific time, when most Americans were already asleep. Europe, however, was just waking up—and Europeans tend to get quite upset by egregious breaches of privacy. If Apple was compelled to introduce such a backdoor for the FBI, European governments would have access to it as well.
posted by Rhaomi at 3:38 AM on February 17, 2016 [18 favorites]


It should be obvious, but it bears noting that EVEN IF you trust the FBI to do what they claim they are going to do with such a backdoor, AND you support them doing so, it is STILL a terrible idea, because you must ALSO TRUST that their own security is 100% impregnable to being hacked and that they will never have a disgruntled (former?) employee with access, etc., because that is the only way that shit is not getting into a bad actor's hands.
posted by juv3nal at 3:44 AM on February 17, 2016 [24 favorites]


Alternatively, the FBI could just suck it up and admit that this is just another workplace shooting like all the other workplace shootings that happen in the US every year and not evidence of some secret Muslim sleeper cell. But then how would they justify the huge budgets for them and "Homeland Security" and whipping everyone into a frenzy and derailing discussions of the economy and helping people and actual meaningful gun control by shouting TERRORISM because this one time the shooters were brown people instead of another angry white guy with a gun?
posted by hydropsyche at 4:03 AM on February 17, 2016 [119 favorites]


In other words, this is something the FBI has wanted as long as there has been an iPhone, and now they think they've finally found a case that will get it for them. Because TERRORISM!
posted by hydropsyche at 4:04 AM on February 17, 2016 [57 favorites]


I'm a little confused by exactly what the FBI wants - it doesn't look like a cryptographic back door, but instead a high bandwidth pathway for a brute force attack. Which I would have thought would be reverse-engineerable, unlike a change in crypto that would reveal the data on the particular phone they're holding.

So Apple's cooperation, while desirable to the FBI, isn't necessary. If that cooperation is legally enforceable, then it would seem that any company that provides a cryptographic product would be legally biddable to undertake any sort of weakening the FBI would like to see, which is extremely undesirable - certainly as bad as any sort of previous attempt to impose key escrow or similar.
posted by Devonian at 4:06 AM on February 17, 2016 [5 favorites]


juv3nal, I agree. And further, it's this "Baby, just one time, please!" attitude by the FBI, as if they won't come right back and ask for it again and again, that galls me.

It's Chekhov's pistol, of course they will be back!
posted by wenestvedt at 4:08 AM on February 17, 2016 [4 favorites]


A back door for the "good guys" is a back door for the "bad guys" also.
posted by LoveHam at 4:10 AM on February 17, 2016 [16 favorites]


autopilot: Fermat's theorem has been proven already?
posted by flippant at 4:11 AM on February 17, 2016 [1 favorite]


The FBI would like two things from Apple. First, a way to brute force guesses against the secure module so they can just try the whole number space in a reasonable time frame.

Second, they would like Apple to remove the block that will wipe the phone after 10 incorrect attempts. That's the real problem here. With the wipeout code removed they could just have agents sit at a table manually typing in all 10,000 guesses for the next few months, but if the wipeout code is in place they're hosed even if they have a way to automate the process.
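For a sense of scale, here's a minimal sketch (in Python) of what "a way to automate the process" could mean. try_passcode is a hypothetical stand-in; a programmatic guess-submission interface is exactly the thing that doesn't exist yet and that the FBI wants built:

# Hypothetical sketch: try_passcode stands in for whatever mechanism
# submits a guess to the device; building that mechanism is the ask.
def brute_force(try_passcode):
    for n in range(10000):
        guess = "%04d" % n          # "0000" through "9999"
        if try_passcode(guess):
            return guess
    return None

Even at a leisurely one guess per second, the whole space falls in under three hours.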
posted by zrail at 4:11 AM on February 17, 2016 [4 favorites]


Use of the All Writs Act of 1789 is perhaps one of the most worrying bits here.
posted by memebake at 4:11 AM on February 17, 2016 [10 favorites]


Devonian, except that you can't currently brute force the passcode. The iPhone bricks after 10 failed attempts.
posted by MythMaker at 4:12 AM on February 17, 2016


But then how would they justify...

Even sincere, committed, thoughtful people who want to do good in the world have terrible ideas sometimes; and even terrible ideas start to sound convincing if they're repeated long enough and with enough different phrasings.

That is to say, I think it matters less whether they're a bunch of whiny babies or a bunch of unsung heroes, than whether this is in fact a bad idea.

(I'm not really taking issue with your comment, hydrop, I'm just trying to get this thought in early in the thread in case it helps.)
posted by amtho at 4:13 AM on February 17, 2016 [9 favorites]


Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

Unsaid is that once they have such a tool, they would find or create reasons to use it.
posted by Old'n'Busted at 4:15 AM on February 17, 2016 [13 favorites]


No lover of the Apple, but I must say: good on them for bringing it out in the open. MS never did.
posted by eclectist at 4:18 AM on February 17, 2016 [22 favorites]


Wait a minute, if Apple can install a new OS on a locked iPhone, then all their encryption and security is fundamentally broken, right? A locked phone should not allow a new OS to be installed.

If Apple hadn't left in the ability to install a new OS on this particular locked phone, then the FBI wouldn't be able to compel them to craft a hacked OS to get the data. It sounds like Apple left themselves open to this kind of coercion by being bad at security.

(Another possibility is that the news article is imprecise and this isn't quite what's going on.)
posted by ryanrs at 4:19 AM on February 17, 2016 [10 favorites]


autopilot: Fermat's theorem has been proven already?

It has. Unfortunately, the comment box is too small...
posted by thelonius at 4:20 AM on February 17, 2016 [50 favorites]


Can't the FBI just get the information like they always have - by taking a large dose of magic mushrooms and line dancing?
posted by robocop is bleeding at 4:23 AM on February 17, 2016 [49 favorites]


Couldn't they just drain the iPhone's battery, plug it in for long enough to try nine passwords, then defeat the 10th-attempt iPhone memory wipe by simply unplugging it and starting over?

/probably already on a list anyway.
posted by emelenjr at 4:23 AM on February 17, 2016 [2 favorites]


Or maybe the FBI could entice some disaffected muslim youth into sending incriminating messages from a different iPhone, then send that guy to jail instead. It will probably save some time since their agents are already trained in this.
posted by ryanrs at 4:27 AM on February 17, 2016 [17 favorites]


They should have foreseen this situation when they were in development. Unbreakable encryption, law enforcement, criminal behavior.... not a hard connection to make.
I do agree, unlock one and you unlock them all. That will lead to inevitable abuse.
Road to hell.... best intentions....
posted by a3matrix at 4:33 AM on February 17, 2016


Arrghghghghghghghghghghg. So much misinformation floating around here.

First, calling what the FBI is requesting a "backdoor" is simply incorrect. That term is normally reserved for a mechanism to access a system immediately and surreptitiously - think "type the magic password and you get logged in straight away." But what the FBI is requesting is decidedly not that.

Second, the request is technologically possible, contrary to what autopilot implied.

Third, assuming Apple provided everything the FBI requested, the FBI still wouldn't have access to any data. They still have work to do at that point: namely, brute forcing the passcode to then decrypt the contents of the filesystem.

Fourth, while Tim Cook makes many good, correct and insightful points, he is simply incorrect when he says that creating this thing once will let governments break iphone security all over the place. Wrong wrong wrong.

Fifth, and perhaps most importantly, if users have a secure passcode, and if the FBI gets what they request, the FBI still might not be able to access the encrypted data. (Assuming Apple was smart - and I think they were - and used a proper key derivation function on the user's passcode before encryption.)

To pick apart two of the points in Major Clanger's lock analogy:

- It therefore wants us to craft a special tool that would allow the lock to be removed altogether.
- We don't want to do that because once we craft such a tool none of our safes will be secure any more.

No. Regarding the first, the reality is more akin to allowing the FBI as many attempts as they like to pick the lock, not removing it from the doorframe. Regarding the second, and as I've already mentioned, that simply isn't true. Picking one lock does not give you access to all other locked doors.

Finally, I'm concerned the FBI has to use a law over 200 years old here. That's suspicious AF. However, and bearing in mind I'm a strong critic of the FBI (and will use less polite language in person), this is actually a reasonable request, and if Apple doesn't go along here I'm worried the FBI will resort to doing something really stupid.
posted by iffthen at 4:46 AM on February 17, 2016 [15 favorites]


ryanrs: this is where the hardware matters – phones which have the newer Secure Enclave coprocessor enforce all of this in hardware on the enclave. That's the iPhone 5S and newer (i.e. A7), but not this phone, which is a 5C using the older A6 hardware. It's not clear whether this has been confirmed publicly, but the security people I follow believe the older design has only a fixed delay in hardware, without back-off or auto-wipe.

That author wrote a blog post summarizing why he thinks Apple can comply with this order but not in the case of newer phones.
posted by adamsc at 4:50 AM on February 17, 2016 [6 favorites]


if users have a secure passcode

This is the four digit code to unlock your phone, right?
posted by ryanrs at 4:53 AM on February 17, 2016


Note that there is another case involving Apple and the All Writs Act that is ongoing. In this one, the defendant has already pled guilty and is awaiting sentencing.
posted by RobotVoodooPower at 4:53 AM on February 17, 2016 [1 favorite]


ryanrs: I believe (but haven't checked on recent iOSs) that you can use an arbitrary-length passphrase composed with a proper keyboard. (Pretty sure that was possible at one point, doubt it's been removed.) If that's the case the number of possible passcodes isn't 10,000, it's a number greater than the number of seconds since the universe began.

Of course, if there are only 10,000 keys the FBI needs to check, none of my previous comment applies. But I'm fairly certain that's not the case.
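Back-of-envelope, in Python, assuming the ~94 printable ASCII characters:

pin_space = 10 ** 4                  # 4-digit PIN: 10,000 codes
passphrase_space = 94 ** 16          # 16 printable-ASCII characters
print(pin_space)                     # 10000
print("%.1e" % passphrase_space)     # ~3.7e31, versus roughly 4.3e17
                                     # seconds since the Big Bang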
posted by iffthen at 4:56 AM on February 17, 2016


Anyone who has ever played Fallout knows the secret to this. When you're trying to hack a terminal, you make three guesses, and if none of them are correct, you quit out of it and come back. You never make a fourth guess. But quit and come back, and you get infinite tries.
posted by jbickers at 4:57 AM on February 17, 2016 [13 favorites]


Regarding the first, the reality is more akin to allowing the FBI, and the Chinese People's Liberation Army, and the Iranian secret police, and the Russian Mafia, and random griefers as many attempts as they like to pick the lock, not removing it from the doorframe.

FTFY. There's no such thing as a technology only red-blooded American patriots can use.
posted by acb at 4:57 AM on February 17, 2016 [20 favorites]


Yeah, but nobody is going to use a proper 16 character password 50 times a day to unlock their phone.

I suppose this was the impetus for introducing the fingerprint reader.
posted by ryanrs at 4:59 AM on February 17, 2016 [4 favorites]


acb: the solution is still trivial. Pick a secure passphrase, and nobody can access your data.
posted by iffthen at 4:59 AM on February 17, 2016


Am I the only one who thinks both the FBI and Apple are posturing here? As iffthen says above, what Apple could supply isn't that much of a help in the first place, but by asking for it, the FBI can make it look like they are doing all they can but are either blocked by Apple, or helped by Apple, each with its own spin. Apple, on the other hand, can look like they are trying to protect their users by refusing and, if forced to give in, like they tried their best to fight the good fight.

Or maybe I'm just too cynical.
posted by Obscure Reference at 5:00 AM on February 17, 2016 [11 favorites]


Yeah, but nobody is going to use a proper 16 character password 50 times a day to unlock their phone.

Depends on how valuable the information on it is.
posted by iffthen at 5:00 AM on February 17, 2016 [3 favorites]


Ha ha, the new iphone forces you to unlock using a password if the touch reader hasn't been used in 48 hours, or the phone has been rebooted. That is definitely designed to frustrate the FBI.
posted by ryanrs at 5:03 AM on February 17, 2016 [8 favorites]


Use of the All Writs Act of 1789 is perhaps one of the most worrying bits here.

Well, at the point where people can claim that the 2nd Amendment was obviously intended to cover everything from automatic weapons to wave-motion guns, and not get laughed out of the courtroom, it's not all that surprising that the FBI is willing to resort to 227-year-old statutes of questionable applicability to browbeat the courts into doing their bidding.
posted by Mayor West at 5:03 AM on February 17, 2016 [15 favorites]


First, calling what the FBI is requesting a "backdoor" is simply incorrect.
Technically correct, but the public uses the term more broadly than the techie community does.
Regarding the first, the reality is more akin to allowing the FBI as many attempts as they like to pick the lock, not removing it from the doorframe.
Perhaps a proper way to say it is that you're replacing a steel-reinforced door that has 3 high-security Assa Abloy deadbolts and an integrated alarm system that sets off thermite over your office desk... with a cheap plywood door containing only a single standard Kwikset lock without security pins (so as to be easy to bump) and rigging the alarm to always show the door as secured.
Fifth, and perhaps most importantly, if users have a secure passcode, and if the FBI gets what they request, the FBI still might not be able to access the encrypted data. (Assuming Apple was smart - and I think they were - and used a proper key derivation function on the user's passcode before encryption.)
You realize that if you disable the repeated-attempt lockout, disable any rate-limiting, and enable connection-port submission of the code, then the KDF doesn't matter, right? It would still be only 10,000 possible guesses if the user left their phone in 4-digit code mode (which most do) instead of using a more secure passphrase, because the guess is being submitted directly to the KDF, not the crypto'd contents. A KDF prevents the brute-forcing of the crypto-text alone, not the brute-forcing of the KDF when the input is from a restricted range and the KDF's temporal stretch of the code check is quick enough. The phone is clearly capable of checking a single code in a fraction of a second, but if we assumed 1 second per guess then 10,000 rounds could still be done in less than 3 hours.
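To make that arithmetic concrete, here's a sketch using PBKDF2 as a stand-in KDF. Apple's real derivation is different (and entangled with a per-device hardware key, which is why the guessing has to happen on the phone itself); the salt and "unknown" passcode below are made up:

import hashlib

SALT = b"per-device-salt"        # hypothetical placeholder
ITERS = 100000                   # tuned so one derivation takes ~0.1 s

def derive(code):
    return hashlib.pbkdf2_hmac("sha256", code.encode(), SALT, ITERS)

target = derive("7391")          # the "unknown" 4-digit passcode

for n in range(10000):
    if derive("%04d" % n) == target:
        print("recovered: %04d" % n)
        break

At ~0.1 s per derivation, all 10,000 codes take under 20 minutes. A KDF's stretching only buys security when the input space is large.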
posted by mystyk at 5:05 AM on February 17, 2016 [19 favorites]


Wait a minute, if Apple can install a new OS on a locked iPhone, then all their encryption and security is fundamentally broken, right?

Answering my own question, the phone will load the new software if it is signed by Apple. So yeah, Apple totally left themselves open to coercion by governments.
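Conceptually the boot-time check looks something like this sketch. The real chain uses asymmetric signatures with Apple's public key burned into the boot ROM; HMAC here is just a symmetric stand-in to show the logic, and all the names are invented:

import hashlib, hmac, os

apple_key = os.urandom(32)   # stand-in for Apple's signing key

def sign(firmware):
    return hmac.new(apple_key, firmware, hashlib.sha256).digest()

def boot(firmware, signature):
    if not hmac.compare_digest(sign(firmware), signature):
        raise RuntimeError("refusing to boot: bad signature")
    print("booting %d bytes" % len(firmware))

fw = b"hypothetical pin-limit-free image"
boot(fw, sign(fw))           # loads fine: the only lock is Apple's key

The phone checks who signed the image, not what the image does.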
posted by ryanrs at 5:07 AM on February 17, 2016 [8 favorites]


Even if it's technically possible, there are lots of factors that could make it unreasonable. For one: If any of these builds leak, they will be immediately reverse-engineered to find the diffs between good and evil firmware images, and that could give attackers a head start writing future exploits.

But hey, at least this is out in the open, unlike the old secret 'hotwatch' orders, supposedly also justified by the All Writs Act.
posted by RobotVoodooPower at 5:10 AM on February 17, 2016 [2 favorites]


Answering my own question, the phone will load the new software if it is signed by Apple. So yeah, Apple totally left themselves open to coercion by governments.

Yes and no — they fixed this on the newer Touch ID devices, because the software will install but you still can't decrypt anything on the device without the Touch ID unit allowing it, and you can't install new software on that without wiping.

Regarding the first, the reality is more akin to allowing the FBI, and the Chinese People's Liberation Army, and the Iranian secret police, and the Russian Mafia, and random griefers as many attempts as they like to pick the lock, not removing it from the doorframe.

FTFY. There's no such thing as a technology only red-blooded American patriots can use.


This really makes no difference, because as ryanrs says the "backdoor" already exists. The fact that Apple is able to install software on this phone that will make it possible to brute-force it is the vulnerability. (And note that it's not even possible for them to do this on Touch ID-enabled devices).

So if the CPLA or Iranian Secret Police are able/unable to compel Apple to install their new software on a given phone, then they're just as able/unable to compel Apple to create it, surely.

What Tim says he fears has already happened, in effect.
posted by bonaldi at 5:11 AM on February 17, 2016 [7 favorites]


The thing about the 'ten goes and you're out' feature, as with the retry throttling, is that it's not part of the crypto. I don't know what levels of protection the Apple platform has against the FBI having physical access to the hardware, but the data they want will be held in flash, protected by encryption; if they can get that encrypted data out (and that's not a mathematical problem; you pull the flash chips and clone them - unless there's additional hardware protection there, but again that's not going to be mathematically intractable) they can brute-force it as much as they like outside the additional restrictions of the iPhone. They won't need to go through iOS at all.

So what is it they are trying to get out of Apple? It looks like precedent, a general proof of legal concept that could then be extended without restraint.
posted by Devonian at 5:12 AM on February 17, 2016 [20 favorites]


Use of the All Writs Act of 1789 is perhaps one of the most worrying bits here.

The act says this:

(a) The Supreme Court and all courts established by Act of Congress may issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.

(b) An alternative writ or rule nisi may be issued by a justice or judge of a court which has jurisdiction.


So courts can order people to do things. That's not especially controversial, is it?

I feel like people are putting scare quotes around "1789" to suggest that it's an ancient law inapplicable to modern technology. But really the reason it's old is that it codifies a basic power of the courts.
posted by ryanrs at 5:16 AM on February 17, 2016 [12 favorites]


Yeah, but nobody is going to use a proper 16 character password 50 times a day to unlock their phone.

(later)
Ha ha, the new iphone forces you to unlock using a password if the touch reader hasn't been used in 48 hours, or the phone has been rebooted. That is definitely designed to frustrate the FBI.
I use a passphrase longer than 16 characters on mine (and also don't use ANY cloud storage whatsoever), but that's because I'm involved in crypto research already. I was very pleasantly surprised (ok, not surprised, but still pleased) to find that the FP reader does the lockout after 48 hours (or a reboot).

I have no problem entering the code every now and then under those circumstances, and I guarantee you they're going to be frustrated trying to figure out what I used as a print (despite what you may be thinking, it's actually SFW). Plus, as a bonus, the FP processes (including its lockout) are handled by the secure enclave, so software bypass won't get you very far.

Actually, this is ironic. It's this very secure enclave aspect that has made the "error 53" issue come up: the biggest attack on the enclave was hardware-based, and bricking the phone when unapproved hardware changes (specifically, changes to the reader) are detected closes that avenue. I have sympathy for people who bricked their phones, but it does fit into a larger strategy of Apple protecting its customers from nation-state level actors and their resources.
posted by mystyk at 5:18 AM on February 17, 2016 [22 favorites]


if they can get that encrypted data out (and that's not a mathematical problem; you pull the flash chips and clone them - unless there's additional hardware protection there, but again that's not going to be mathematically intractable) they can brute-force it as much as they like outside the additional restrictions of the iPhone. They won't need to go through iOS at all.

So what is it they are trying to get out of Apple? It looks like precedent, a general proof of legal concept that could then be extended without restraint.


The difficulty, as I understand it, is that the data on the chips is protected by a strong key. The FBI are unable to crack this key even if it isn't in situ. Standard crypto talk about "more combinations than atoms in the universe" or whatever applies.

However access to this key through the phone hardware/software is controlled by a simple pass code which is (probably) a 4 digit PIN. The FBI can certainly crack a 4 digit PIN, but for the fact that the phone locks out permanently if you get it wrong ten times. They are asking for a removal of this restriction.
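A toy model of that layering, with XOR standing in for real key-wrapping and HMAC for the actual derivation function; this is the shape of the design, not Apple's construction:

import hashlib, hmac, os

device_uid = os.urandom(32)   # stands in for the key fused into the silicon
file_key = os.urandom(32)     # the strong random key protecting the data

def kek(pin):
    # key-encrypting key: worthless without the device-unique secret
    return hmac.new(device_uid, pin, hashlib.sha256).digest()

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

wrapped = xor(file_key, kek(b"1234"))           # what actually sits in flash

assert xor(wrapped, kek(b"1234")) == file_key   # right PIN, on this device

Clone the flash and you get `wrapped`, but without device_uid you can't even test PIN guesses offline. Hence the pressure on Apple rather than on the chips.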
posted by Dext at 5:23 AM on February 17, 2016 [3 favorites]


However access to this key through the phone hardware/software is controlled by a simple pass code which is (probably) a 4 digit PIN. The FBI can certainly crack a 4 digit PIN, but for the fact that the phone locks out permanently if you get it wrong ten times. They are asking for a removal of this restriction.

This doesn't worry me too much. I guess it depends on what scenario you envisage for abuse of the FBI getting this capability. Anyone doing anything Really Naughty won't be using a 4 digit PIN. I agree, however, that ordinary citizens in a country with a secret police force *would* use 4-digit PINs, and that is a concern.
posted by iffthen at 5:30 AM on February 17, 2016


Hardware isn't magic either though. What's to prevent governments from requiring Apple to provide a hardware workaround for the secure enclave? Or from making one themselves?
posted by bonehead at 5:32 AM on February 17, 2016


So the upshot is, even if Apple is compelled to make this backdoor/firmware for the iPhone 5c used in San Bernardino, Apple can't do the same for an iPhone 5s or any later model.
posted by ryanrs at 5:39 AM on February 17, 2016 [2 favorites]


Actually, this is ironic. It's this very secure enclave aspect that has made the "error 53" issue come up: the biggest attack on the enclave was hardware-based, and bricking the phone when unapproved hardware changes (specifically, changes to the reader) are detected closes that avenue. I have sympathy for people who bricked their phones, but it does fit into a larger strategy of Apple protecting its customers from nation-state level actors and their resources.

The reaction people had there is a common theme with Apple though that I don't understand. They're the only major tech company that does anything meaningful about trying to protect their users—ditching Google Maps, suing Radio Shack, implementing end-to-end in iMessage, even taking on their own government—but if people find out that their iPhone is caching location data locally or bricking if someone tries to replace a major security feature with an unapproved device, people immediately assume nefarious intent. And then download Facebook Messenger.

It's exhausting to watch. It's the tech equivalent of people ignoring the super-rich playing grab-ass with the economy and instead blaming the "welfare queens" for economic problems.
posted by middleclasstool at 5:40 AM on February 17, 2016 [55 favorites]


I like how the government is so fucked up that the NSA won't share with the FBI their code that exploits the flaws built into the encryption algorithms themselves. It blows my mind that the only reason we're as free as we are is bureaucratic infighting.
posted by mikelieman at 5:41 AM on February 17, 2016 [10 favorites]


I say give the iPhone to Hillary and when she breaks the encryption she gets to be President.
posted by cjorgensen at 5:42 AM on February 17, 2016 [2 favorites]


Bonehead, if the Secure Enclave is running in its own self-contained bit of silicon, it is very hard to make it do things against its programmed wishes. Back in the old days you could futz around with electron microscopes and focused ion beam workstations, but I don't know if even those techniques work on modern 14nm chips like the latest iPhone processors. Apple probably doesn't even have a lab to do this (since they aren't a semiconductor manufacturer). You'd probably have to get the NSA to do it, and it would cost millions.
posted by ryanrs at 5:44 AM on February 17, 2016 [2 favorites]


No. Regarding the first, the reality is more akin to allowing the FBI as many attempts as they like to pick the lock, not removing it from the doorframe. Regarding the second, and as I've already mentioned, that simply isn't true. Picking one lock does not give you access to all other locked doors.

When you're talking about a passcode with only 10,000 possible combinations, removing the 10-chances-or-it'll-brick restriction (and allowing attempts to be made programmatically, rather than manually via the keypad) does effectively render the lock ineffective. Okay, yeah—the FBI still has to figure out the correct four-digit code, but that's trivial. They would just write a script to brute-force the passcode. The difference between "lock that can be automatically picked in minutes by any desktop computer" and "nonexistent lock" is academic.

And, yes, each "lock" still needs to be picked individually—but again, if that picking can be done by just plugging the phone into a computer and running a trivial script, then yes—in practical terms, all other doors have been thrown open. Sure, you can keep the lock on your door if you like—but it's purely security theater.

Unless I'm totally misunderstanding what the FBI is asking for here.

I don't really understand why the FBI needs Apple to do this, though. Couldn't they just make a backup image of the entire iPhone, and then run it in an emulator? At that point, they could easily script a brute-force attack, or even modify the OS. (It'd be harder than working from source code, but it seems like it'd definitely be possible to make these simple modifications to machine code—always return false from the "is the user out of passcode attempts?" function; reduce any delays that would slow down the brute-force attempt; etc.).

The four-digit passcode has never provided meaningful protection against someone who is in physical possession of your iPhone (or its disk image), who has the resources and technical background to mount a coherent attack. If you have anything that sensitive, you need to encrypt it with a real password.
posted by escape from the potato planet at 5:45 AM on February 17, 2016 [9 favorites]


I donated cash to the Lavabit legal defense fund (we all have to have hobbies). I didn't even use his product, but I saw the value in it, and I for sure saw the value in a company actually being able to be transparent about its legal struggles with the government.
posted by cjorgensen at 5:46 AM on February 17, 2016 [4 favorites]


I wonder if the ultimate outcome will be affected by the court's decision to order Apple to "make a tool that unlocks phones" rather than "unlock this phone and give me the data".
posted by RobotVoodooPower at 5:49 AM on February 17, 2016 [3 favorites]


What's sad is all the posturing, lawyering and money spent when you figure all they'd find is some guns-akimbo selfies and receipts from tactical-vests.com.
posted by valkane at 5:51 AM on February 17, 2016 [10 favorites]


i don't see any reason why the fbi cannot do this themselves (except for money and lack of expertise). it's just working round software restrictions.
posted by andrewcooke at 5:53 AM on February 17, 2016 [1 favorite]


Mikelieman -- the FBI not having access to certain NSA technology is a feature, not a bug.

The FBI is a domestic law enforcement agency. It is required to honor Constitutional and statutory protections of suspects and targets in the ways and means of its investigations, and is limited in the "dragnet" approaches it is permitted to use to identify suspects and targets in the first place.

The NSA is for the most part a foreign intelligence agency. In that capacity, its targets have no Constitutional rights and only very limited other rights, most arising under treaty and international law, and non-state actors like terrorist organizations don't have any of those, either. It can dragnet to its heart's content, and send in the Hellfire missiles and special forces door-kickers as the data warrant.
posted by MattD at 5:56 AM on February 17, 2016 [4 favorites]


They would just write a script to brute-force the passcode

$ seq --format="%04.0f" 0 9999
posted by mikelieman at 5:57 AM on February 17, 2016 [6 favorites]


> A back door for the "good guys" is a back door for the "bad guys" also.

Locks are only meant to keep out the honest people.

Once there is a back door the only people who will use it are those who shouldn't.

> With the wipeout code removed they could just have agents sit at a table manually typing in all 10,000 guesses for the next few months, but if the wipeout code is in place they're hosed even if they have a way to automate the process.

There are devices that will even enter the codes for you. Which is why everyone should have a 6 character code at minimum (more complex is even better). You should also be using touchID so you can use a code you only need to enter in on reboot or if the device sits fallow for days.

I got one of them new iPad Pros. I set up touchID, made a new unique long complex password like #P00pSchtick!, and after a week of use I went back to my old iPad for a couple days. I went to use the new one again and it wouldn't take my fingerprint. It wanted the password, which I had set up weeks before. Long story short I was my own nightmare. I reset my device in the end, but since it was new I didn't much care about the data.
posted by cjorgensen at 5:57 AM on February 17, 2016


I have some sympathy for Apple over the Error 53 issue, but not very much- whatever the underlying security logic, the company exhibited exactly the lack of communication with its customers which has given it the perception of being high-handed and careless of their interests. They could have rolled out exactly the same level of enhanced security without leaving anyone with an unexpectedly bricked phone, or with the impression that it was some sort of attack on independents.

I don't know the details of how Apple implements the pathway between the four digit pin input and the full data decryption, but it will involve cryptographically signed software components that won't run without the right hardware in place, and that won't be something you can duplicate on an emulator (or just state-replicate on another iPhone.) If anything changes, then it has to be reauthorised by Apple, and if that wasn't the case then it wouldn't be much cop. And this is where the FBI is pulling the whole "I'm gonna need you to step out of the car, sir".
posted by Devonian at 6:00 AM on February 17, 2016


The NSA is for the most part a foreign intelligence agency. In that capacity, its targets have no Constitutional rights and only very limited other rights

Oh boy. This is a huge derail, but my reading of the Constitution says the limits on Federal power aren't limited by borders, and the only context in which Citizenship is relevant is Privileges conferred, like voting.

I *do* understand that several courts having jurisdiction might have ruled in ways that conflict with my belief, but I do note that the USSC has said that even those held extrajudicially in GITMO get Habeas Corpus hearings...
posted by mikelieman at 6:00 AM on February 17, 2016 [6 favorites]


It sounds like on new-ish iPhones, even a 4 digit code is ok, since the secure enclave enforces the 10-try limit. Four digit codes with robust rate limiting work well for ATMs...
posted by ryanrs at 6:01 AM on February 17, 2016


Apple should send a truckload of phones with the following instructions:
1. Remove original memory card and replace with duplicate of desired data.
2. Install duplicate cards in the included 1000 iphones.
3. Assign 1,000 agents* to attempt the following codes: (1) 0000-0009, (2) 0010-0019 . . . (1000) 9990-9999
4. Please return the 999 useless phones upon completion.

*Surveillance state must supply own agents
posted by cmfletcher at 6:05 AM on February 17, 2016 [5 favorites]


5. Please remit $650,000,000

But your steps would also fail, since you can have all kinds of characters in there, presumably even emojis.
posted by cjorgensen at 6:10 AM on February 17, 2016


*Surveillance state must supply own agents

** Or maybe just a room full of monkeys
posted by Strange Interlude at 6:10 AM on February 17, 2016


FWIW I wiped the settings on my iPhone 6 a few weeks ago and discovered that iOS now defaults to a 6-digit passcode.
posted by Andrhia at 6:16 AM on February 17, 2016 [6 favorites]


Answering my own question, the phone will load the new software if it is signed by Apple. So yeah, Apple totally left themselves open to coercion by governments.

This is the crux of the problem, as far as I can tell. The backdoor is already there, the government just needs Apple's digital signature as the key. Don't put backdoors in your products.
posted by Holy Zarquon's Singing Fish at 6:18 AM on February 17, 2016 [2 favorites]


Remove original memory card and replace with duplicate of desired data.

That wouldn't work either - the encryption is a combination of a hardware key and the passcode. They have to unlock that particular phone, with flash in situ. The 5c has pretty good security that combines software and hardware, but which can be broken. The Touch ID devices designed even those vulnerabilities out.
posted by bonaldi at 6:20 AM on February 17, 2016 [1 favorite]


The specific phone in question, an iPhone 5c, does not have a fingerprint reader (wikipedia), so it's highly unlikely that the password used is more complex than a 4-digit PIN.

Why the FBI needs Apple's help in this instance is questionable, since a pin-guessing robot (that prevents the software from wiping after 10 incorrect guesses) has already been built.

The attitude that Apple can do no right is just as odious as the attitude that Apple can do no wrong. The fact that earlier iPhones had an issue where it was possible for Apple to sign firmware for the FBI and use that to extract data doesn't change the fact that Apple has fixed that problem in later iPhones.

Of course, the Apple-centric solution of throwing out your old one and buying the latest one is generally crap, but in this case, it's not because they "invented" a stupid new cable, but because the hardware is actually more secure by having a remarkably complex piece of hardware buried inside the home button.

It rings a bit hollow for the FBI to be playing politics like this, but the FBI's current director was appointed by President Obama, and the President's stance on privacy isn't great.
posted by fragmede at 6:21 AM on February 17, 2016 [6 favorites]


Finally changed my pin to a seven digit one since the fingerprint reader means that I only have to enter that when I reboot or after a pretty long timeout period.
posted by octothorpe at 6:22 AM on February 17, 2016 [1 favorite]


on a positive note, does this allow apple to unbrick your phone?
posted by Nanukthedog at 6:24 AM on February 17, 2016


FBI Backdoor password: 324-367-7325 (FBI-FOR-REAL)
posted by Nanukthedog at 6:27 AM on February 17, 2016 [1 favorite]


I'm surprised the FBI isn't just having someone desolder the flash memory from the phone and go around all the OS....
posted by Xyanthilous P. Harrierstick at 6:31 AM on February 17, 2016 [2 favorites]


Why lift a finger when you can compel Apple to provide what you want on a silver platter?
posted by cmfletcher at 6:35 AM on February 17, 2016 [2 favorites]


Ha ha, the new iphone forces you to unlock using a password if the touch reader hasn't been used in 48 hours, or the phone has been rebooted. That is definitely designed to frustrate the FBI.

Does the fingerprint reader work if the finger is not attached to a live person? They should have unlocked it with the corpse's finger right away.
posted by 445supermag at 6:38 AM on February 17, 2016 [1 favorite]


1. Can one install / upgrade without unlocking the screen on an iPhone?
2. Can one not just make infinite disk images of the locked phone and try to unlock them?
posted by asra at 6:38 AM on February 17, 2016 [1 favorite]


Why the FBI needs Apple's help in this instance is questionable, since a pin-guessing robot (that prevents the software from wiping after 10 incorrect guesses) has already been built.

They fixed that vulnerability in 8.1.1. iOS flushes the guess counter to flash before confirming the attempt's success or failure.
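The shape of that fix, sketched in Python; this is a reconstruction of the general pattern, not Apple's code, and the file name and helpers are invented:

import json, os

STATE = "attempts.json"   # stands in for the phone's effaceable storage

def save(state):
    with open(STATE, "w") as f:
        json.dump(state, f)
        f.flush()
        os.fsync(f.fileno())      # durable *before* any outcome is revealed

def load():
    return json.load(open(STATE)) if os.path.exists(STATE) else {"attempts": 0}

def attempt_unlock(guess, real_code="1234"):
    state = load()
    state["attempts"] += 1
    save(state)                   # counted even if power is cut mid-attempt
    if state["attempts"] > 10:
        raise RuntimeError("wipe: attempt limit exceeded")
    if guess == real_code:
        save({"attempts": 0})
        return True
    return False

Cutting power after the check but before the counter write was exactly the race the IP-BOX exploited.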
posted by Talez at 6:39 AM on February 17, 2016 [9 favorites]


My take? For what it is worth I'm fairly sure this is all theater. The FBI already has the ability to get access to this data and surely has done so already. This is being done so that:

* They can get access to data in the future, i.e. set precedent before it is too late: before Apple locks things up for real.
* Convince the people of the world that Apple phones are secure, which they are not, so that they don't take extra steps in using better encryption, which is available for use and really, really works.
* My pet theory: the three letter agencies really want the "bad guys" to stop using electronics and the internet altogether. They would like the world to work like it did in the 80s again: real people, meeting face to face, talking. The intelligence practices of that simpler time were much more effective: spend an enormous amount of money that the other players don't have and win. If the FBI/CIA/NSA can create a crisis of confidence in electronic communication they will be able to play the game the way they would like to.

But, in summary, there is absolutely no reason why anyone who wants to run an organization of trusted individuals can't do so in such a way that their communications are never, ever snooped on. There is nothing stopping this technologically and there has not been for some time. If any competent technologist worth their salt were hired (and trusted) to set up a system for comms they could do it in days. That's why none of this adds up and is surely indicative of political dog wagging or false flagging.

But: trust. Tech is easy. Trust is tricky as hell.
posted by n9 at 6:44 AM on February 17, 2016 [3 favorites]


It sounds like Apple left themselves open to this kind of coercion by being bad at security.

Last time I checked, ttvgg.cxors.wohof.cxqcg.pwexx was deemed too weak to be an Apple ID password, while Apple123 was just fine. You be the judge.
posted by flabdablet at 6:45 AM on February 17, 2016 [2 favorites]


It's unfair to characterise Apple's ability to upload new security software as a back door. It has to be able to do this, in case - for example - a flaw becomes apparent in shipped software that is itself a security vulnerability. I don't think the state of the art in software verification is good enough to say the ability to update software is a greater risk than no such ability.

No system is completely secure. In this case, the weakest point in the system is Apple's own corporate security over its signature management. If that is compromised by a hostile agent, which could be surreptitiously through criminal means or could be by duly-authorised government agents, then the necessary mechanism for code updating can be subverted.

I don't know anything about Apple corporate security. The exploit by the legal system is the one under discussion here.
posted by Devonian at 6:46 AM on February 17, 2016 [7 favorites]


1. Can one install / upgrade without unlocking the screen on an iPhone?

You can if you use DFU mode; however, you can only install a version of the OS that will bypass the PIN lock restrictions on an A6 or earlier (i.e. this particular model the FBI wants Apple to unlock). Post-A7 processors have the PIN enforced in hardware by the secure enclave and the OS can't fuck with it.

2. Can one not just make infinite disk images of the locked phone and try to unlock them?

The encryption is tied to a UID key laid down in the silicon during fabrication (held by the secure enclave on A7 and later, by the processor itself before that). You need to perform unlocking on the particular device or break a 256-bit AES key.

If Apple hadn't left in the ability to install a new OS on this particular locked phone, then the FBI wouldn't be able to compel them to craft a hacked OS to get the data. It sounds like Apple left themselves open to this kind of coercion by being bad at security.

That's why they figured out the hole and moved the PIN authentication stuff into the secure enclave on the A7 and all future models.
posted by Talez at 6:46 AM on February 17, 2016 [3 favorites]


What's sad is all the posturing, lawyering and money spent when you figure all they'd find is some guns-akimbo selfies and receipts from tactical-vests.com.

Yeah, the really actionable info, that is, who these people made phone calls to and thus possibly collaborated with, is already available due to surveillance metadata.
posted by CheeseDigestsAll at 6:47 AM on February 17, 2016 [1 favorite]


oh, i think i finally get it. they need apple because the software is signed and validated by firmware? they don't really need apple to write code - they would be equally happy with their private key.
posted by andrewcooke at 6:48 AM on February 17, 2016 [2 favorites]


> They fixed that vulnerability in 8.1.1. iOS flushes the guess counter to flash before confirming the attempt's success or failure.

I take that back then.

The A6 used in the iPhone 5c doesn't have the "Secure Enclave" security features used in later devices, so Apple-signed FBI firmware could extract the keys.

> I'm surprised the FBI isn't just having someone desolder the flash memory from the phone and go around all the OS....

At least as of iOS 9, that won't work. The data is encrypted "at rest", so all you'll get off the memory chips themselves is AES-encrypted blobs, and on A7 chips the key is locked away and not directly accessible.
posted by fragmede at 6:49 AM on February 17, 2016 [1 favorite]


FBI using Terror, Drugs or Prostitution to force the world to use insecure methods? The 1990s called, they want Clipper back.
posted by eriko at 6:52 AM on February 17, 2016 [1 favorite]


Maybe I'm dense, but couldn't Apple make this actually for-real one time only by doing as the FBI asks and then pushing an iOS update so that locked phones can't receive iOS updates?
posted by ROU_Xenophobe at 6:53 AM on February 17, 2016 [3 favorites]


The stored image is AES-256 IIRC, which is effectively unbreakable using brute force (even biclique attacks only reduce the complexity by about 4x; still heat death of the universe, etc, etc).
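The arithmetic behind "heat death of the universe", even granting bicliques their reduction to roughly 2^254 work:

keys = 2 ** 254                  # effective keyspace after biclique attacks
rate = 1e12 * 1e9                # a trillion guesses/sec on a billion machines
years = keys / rate / (3600 * 24 * 365)
print("%.1e years" % years)      # ~9.2e+47; the universe is ~1.4e+10 years old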

There are side channel methods of breaking AES-256, but they generally rely on being able to run code on the same platform (via malware or other code). My guess is that the FBI either wants to bypass the ten-tries-then-brick option or, even more importantly, wants a way of installing a side channel toolset into iOS, which would tend to make decryption of AES-256 feasible.

If Apple can effectively force an iOS upgrade remotely then yeah their entire security schema is basically suspect.
posted by vuron at 6:56 AM on February 17, 2016 [2 favorites]


fragmede: The data is encrypted "at rest", so all you'll get off the memory chips themselves is AES encrypted blobs and on A7 chips ...

Yes, but this one is a 5C with an A6 processor. It should be possible (difficult) on this particular device to pop the case open, desolder the flash, and perform destructive surgery on the A6 CPU (xray, drilling, grinding, acid) to obtain access to the hardware key data. This would allow them to attack the pin-code offline.

That kind of attack would be moot on a newer A7 or better device with a Secure Enclave (and yet, has anyone really performed hardware attacks against the SE? we don't know. I bet *someone* has one under a microscope somewhere, though.)

While we're at it, I do believe that this is an acceptable minimum level of security for my data. If you want it, you're gonna have to take it to a reverse engineering lab in China to get it.
posted by Xyanthilous P. Harrierstick at 7:02 AM on February 17, 2016 [1 favorite]


...or brute-force your pissweak AppleID password and pull it all out of iCloud.
posted by flabdablet at 7:09 AM on February 17, 2016


I'm surprised the FBI isn't threatening to out Tim Cook as gay. The rest of their playbook seems as outdated.
posted by cjorgensen at 7:10 AM on February 17, 2016 [11 favorites]


I'm guessing these phones are running something newer than iOS 8.1.1, or the FBI could just use an IP-BOX to brute force the phone without bricking it (iOS before 8.1.1 had a flaw where the failed-passcode limit was not properly enforced; it has since been patched).
posted by filthy light thief at 7:13 AM on February 17, 2016


Honestly I figure the preferred method for accessing user data long term is going to be through malware downloaded via the App Store. Yeah, Apple is pretty good at reviewing most of the apps, but it seems the optimal strategy for persistent threats like most nation states and organized crime outfits is going to be hiding malware in otherwise innocuous applications.

I think it's safe to say the FBI probably has a reasonable suspicion of what's on the phone simply based upon the likely interception of data between the suspect's phone and other persons of interest on the carrier network, but they still want or need access to the phone for other reasons.

Beyond this it seems likely that Apple will be forced to comply to some degree (even if Apple wins in court, the Republicans in Congress would likely force the issue). It's unclear what impact this will have on newer phones with the secure enclave, but it seems quite likely that a legal strategy will be employed to neuter the SE at some point, which would force people needing crypto secure from nation-state actors to rely on third-party tools.
posted by vuron at 7:15 AM on February 17, 2016 [1 favorite]


It should be possible (difficult) on this particular device to pop the case open, desolder the flash, and perform destructive surgery on the A6 CPU (xray, drilling, grinding, acid) to obtain access to the hardware key data. This would allow them to attack the pin-code offline.
Back in 2008, just before the "agent.btz" worm badly infected military computer networks and caused the almost complete abandonment of thumb drive use in the military, we (several elements within ARCENT, of which I was a part) had only recently switched to "secure" thumb drives.

In addition to requiring a decryption key to access the media (which was prompted for right after plugging it in) and having the media be stored encrypted "at rest," it also had all the free space inside the drive filled with some kind of epoxy. That was done specifically to make it so that almost any attempt to get at the Flash chips or the crypto chip would destroy them beyond the ability to attempt any direct data extraction or chip bypass.

Now, obviously, this kind of additional protection doesn't apply to the particular case of this phone, but it shouldn't be presumed that there aren't ways to block the kind of exploration you refer to.
posted by mystyk at 7:17 AM on February 17, 2016 [3 favorites]


This whole thread is kind of like an iPhone commercial, if you care about security (something I had mostly given up on, assuming the feds could get into any device they wanted to). If the killers had Android phones, would the FBI have had no trouble waltzing in and getting whatever data they wanted?

Also, yes, I am thinking, these were not criminal masterminds, the chance of finding anything significant from their phones that isn't already known about them seems somewhat small. This feels more like an excuse for a precedent.
posted by emjaybee at 7:20 AM on February 17, 2016 [5 favorites]


It's unfair to characterise Apple's ability to upload new security software as a back door.

That's true but that's not where the back door lies. The back door is that they can upload new security software *without the user's permission*. Since that is now fixed, I think it's reasonable to call the previous situation a security vulnerability.
posted by bonaldi at 7:21 AM on February 17, 2016 [3 favorites]


The argument used by the FBI is that the phone might contain info about contacts in the world of terror that would be useful to them. However, since a lot is already known about the two now-dead terrorists, it seems more likely that, as noted above, the case is the best "evidence" available to get the courts to side with granting access to every iPhone there is. How do you like them apples? I side with Apple on this.
posted by Postroad at 7:22 AM on February 17, 2016 [7 favorites]


my question is: does error 53 not appear if you upgrade your iPhone with TouchID turned off? If that's the case then error 53 is a baaad bug, since all the iPhone should be doing when it discovers a swapped fingerprint reader is disabling the fingerprint reader, and that's it.
posted by I-baLL at 7:23 AM on February 17, 2016


Also, yes, I am thinking, these were not criminal masterminds, the chance of finding anything significant from their phones that isn't already known about them seems somewhat small. This feels more like an excuse for a precedent.
Actually, the official reason has been laid out fairly clearly: They want access to his past communications to see who he was talking to and when. They're establishing the timeline for his actions leading up to the attack, but also building the case for larger terrorist connections.

In that light, the "precedent" angle, or just the ability to use the attack on other phones (since they likely want the custom OS version turned over to the Gov as well) is just a bonus in their view.
posted by mystyk at 7:24 AM on February 17, 2016


"Alternatively, the FBI could just suck it up and admit that this is just another workplace shooting like all the other workplace shootings that happen in the US every year and not evidence of some secret Muslim sleeper cell. But then how would they justify the huge budgets for them and "Homeland Security" and whipping everyone into a frenzy and derailing discussions of the economy and helping people and actual meaningful gun control by shouting TERRORISM because this one time the shooters were brown people instead of another angry white guy with a gun?"

Maybe you should call the FBI and let them know that you have evidence that contradicts everything known about the shooting so far. Like the fact that the attack seemed to be planned at least a year in advance and that the perpetrators had more weapons and bombs at their home indicating a planned second attack.
posted by I-baLL at 7:26 AM on February 17, 2016 [1 favorite]


"The argument used by the FBI is that the phone might contain infor about contacts in the world of terror that would be useful to them."

Yeah, it strikes me that you could just look at their phone records and text messaging records to see whom they've contacted. I mean, this is the theoretical basis for the NSA's metadata collecting program. I'm wondering how important the phone really is to the FBI. I mean, it would always be good for the investigators to know more, but this does seem like a precedent type thing. Also, I wonder if the perpetrators had their phones set to sync with iCloud.
posted by I-baLL at 7:29 AM on February 17, 2016


Begun the encryption wars have.
posted by blue_beetle at 7:33 AM on February 17, 2016 [6 favorites]


my question is: does error 53 not appear if you upgrade your iPhone with TouchID turned off? If that's the case then error 53 is a baaad bug, since all the iPhone should be doing when it discovers a swapped fingerprint reader is disabling the fingerprint reader, and that's it.
That's what a lot of the complaints were; even people who had never used the FP reader were getting locked out because the FP reader had either been (1) damaged or (2) replaced by a non-Apple retailer.

iOS 9 enforces the authenticity of the reader and the enclave to use the enclave *at all*, and no enclave means bricked phone. Prior to that, the reader provided its authentication to the enclave, but it wasn't checked against the known value for the reader that came with the phone.

If you get your phone serviced by Apple, they have the means to set the enclave to store and trust the new reader in place of the old one; third-party retailers swapping out your screen can't do that.
posted by mystyk at 7:35 AM on February 17, 2016 [1 favorite]


my question is: does error 53 not appear if you upgrade your iPhone with TouchID turned off? If that's the case then error 53 is a baaad bug, since all the iPhone should be doing when it discovers a swapped fingerprint reader is to disable the fingerprint reader, and that's it.

So says you. Me? I want the device bricked if it fails the hardware coupling check.
posted by cjorgensen at 7:37 AM on February 17, 2016 [7 favorites]


Yeah, it strikes me that you could just look at their phone records and text messaging records to see whom they've contacted. I mean, this is the theoretical basis for the NSA's metadata collecting program.
That works well for calls and SMS.

For iMessage: date/time [edit: and contacted person] could theoretically be collected from Apple via the courts (assuming they stored that data), but would be difficult for even the NSA to get without the court's involvement unless they had a backdoor into Apple's user database (possible, but I imagine Apple is taking at least some steps to limit the usefulness of that).

iMessage contents could only be collected from either the sender or receiver's devices, though, as Apple uses end-to-end encryption in iMessage without key escrow.
posted by mystyk at 7:41 AM on February 17, 2016 [3 favorites]


Yeah, it strikes me that you could just look at their phone records and text messaging records to see whom they've contacted.

A lot of the apps on the iPhone use end to end encryption, so if they were talking to other people with iPhones over the internet there would be no record of FaceTime sessions or Messages sessions outside of the devices themselves. And then there are dozens of apps they could have been using for coordination. Without getting into the phone you wouldn't even have an idea of where to start. What if they were using Snapchat or WhatsApp or whatever the cool terrorists are using these days?
posted by cjorgensen at 7:42 AM on February 17, 2016


"Why the FBI needs Apple's help in this instance is questionable, since a pin-guessing robot (that prevents the software from wiping after 10 incorrect guesses) has already been built."

From the link:
"Trying all 10,000 possible combinations at that rate would take around 4 and a half days"
Pretty sure all this legal wrangling has gone on a lot longer than 4 and a half days.
posted by howling fantods at 7:43 AM on February 17, 2016 [1 favorite]


"So says you. Me? I want the device bricked it it fails the hardware coupling check."

Exactly. It should be an option. If I want my phone to be ultra secure then I'll use that. Though I wouldn't use fingerprint authentication if I wanted my phone to be secure since it's more of a convenience device so you don't have to type in your password or passcode.
posted by I-baLL at 7:45 AM on February 17, 2016


It should be an option. If I want my phone to be ultra secure then I'll use that.
And what if you decided you wanted extra security later, and turned it on (assuming you even remembered to)? The prosecutor is already using the fact that the shooter in question turned off iCloud storage/backup six weeks prior as proof of nefarious intent. In his case, that's probably a reasonable assumption (though admittedly unproven), but what happens when they knock on *your* door?
posted by mystyk at 7:49 AM on February 17, 2016 [2 favorites]


Honestly, I figure the preferred method for accessing user data long term is going to be malware downloaded via the App Store. Yeah, Apple is pretty good at reviewing most of the apps, but it seems the optimal strategy for persistent threats like most nation-states and organized crime outfits is going to be hiding malware in otherwise innocuous applications.

Even if you were able to install malware, you'd have to break through the app sandbox to get what you're after, and so far that's been non-trivial. Most malware, when it does make it onto the App Store, is just straight-up social engineering for password collection.

If the killers had an Android phones, would the FBI have had no trouble waltzing in and getting whatever data they wanted?

Probably. Google have wavered for the last few years on making full disk encryption mandatory on Android. The reason it goes back and forth is because Google insist on using the ARM v8 crypto extensions to accomplish FDE despite ARM telling them, no, don't fucking do it, it's not fast enough for FDE. Then the OEMs push back because the NAND performance is garbage and they get killed in the benchmarks. Not only that but because of the fragmented Android update scene only 35% of handsets actually run an Android version that has vaguely usable FDE.
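
(If you want to sanity-check the FDE performance claim for yourself, here's a crude benchmark I use; it needs the third-party cryptography package, and Python overhead means a real kernel crypto path would only be faster. Entirely my own illustration, not Google's methodology.)

    import os, time
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    # Encrypt 64 MiB of random "disk blocks" with AES-256-CTR and time it.
    enc = Cipher(algorithms.AES(os.urandom(32)), modes.CTR(os.urandom(16))).encryptor()
    buf = os.urandom(64 * 1024 * 1024)

    start = time.perf_counter()
    enc.update(buf)
    elapsed = time.perf_counter() - start

    # Compare against the hundreds of MB/s a phone's NAND can stream.
    print(f"AES-CTR throughput: {len(buf) / elapsed / 1e6:.0f} MB/s")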
posted by Talez at 7:51 AM on February 17, 2016 [5 favorites]


The thing is, the shooters destroyed their own personal phones and their hard drive. This is just a work phone, and the fact that they didn't bother to destroy it, despite being thorough enough with the others that the FBI hasn't been able to find them or recover any data, suggests that this was only for work, not any of their planning. So yeah, this is likely just theater.
posted by tavella at 7:53 AM on February 17, 2016 [6 favorites]


"A lot of the apps on the iPhone use end to end encryption, so if they were talking to other people with iPhones over the internet there would be no record of FaceTime sessions or Messages sessions outside of the devices themselves."

Encryption prevents the contents of the conversations from being known. It does not do anything to hide who the sender or recipient is. iMessages are sent through Apple's infrastructure, and thus the sender and recipient could be tracked, since it's not a direct p2p communication protocol.
posted by I-baLL at 7:53 AM on February 17, 2016 [1 favorite]


Rhaomi: "Tim Cook published the open letter at midnight Pacific time, when most Americans were already asleep. Europe, however, was just waking up—and Europeans tend to get quite upset by egregious breaches of privacy. "

Oh, FFS. Tech journalists need to stop trying to read the tea leaves about everything that Apple does. Just. Stop.

Apple's PR set the press release to go out over the wires at midnight in their local time zone. It is the most unremarkable way imaginable for a corporation to release a public statement.
posted by schmod at 7:54 AM on February 17, 2016 [3 favorites]


Bricking on security fail is a reasonable action. The problem with Error 53 was that it wasn't coincident with the hardware failure; it was an unexpected side effect of a software update that picked up a pre-existing, silent fault. That update could have refused to install on discovering the problem, giving the user enough information to make a sensible decision about what they wanted to do about it: disable the phone, revert to an earlier version, set a passcode.

Would have saved a lot of hassle, and wouldn't have left anyone involuntarily exposed.
posted by Devonian at 7:55 AM on February 17, 2016 [2 favorites]


My work e-mail gets press releases all day long; you know when I never get any from US-based corporations? Midnight. If I got one at midnight about something that had happened many hours earlier I would definitely remark on that.
posted by Holy Zarquon's Singing Fish at 7:57 AM on February 17, 2016 [2 favorites]


Maybe you should call the FBI and let them know that you have evidence that contradicts everything known about the shooting so far. Like the fact that the attack seemed to be planned at least a year in advance and that the perpetrators had more weapons and bombs at their home indicating a planned second attack.

You know who else had been planning for months and built pipe bombs: the Columbine killers. Were they a sleeper cell?

Here is a very long list of US right wing terrorists, many of whom also had long term plans, stockpiled weapons, and built bombs. Nothing new here, except the religion of the perpetrators, which is very conveniently being used to justify breaking encryption on iPhones while we all watch because Muslim terrorists are oh so much scarier than Christian terrorists.
posted by hydropsyche at 7:58 AM on February 17, 2016 [4 favorites]


There is no doubt in my mind that the government has the means to unlock this phone.
posted by humanfont at 8:04 AM on February 17, 2016


My understanding is that FDE (well, user data encryption, not system OS encryption) support will be mandatory for Android 6. There are obviously multiple ways of doing it, but Talez is generally correct in that the performance hit for encryption on the Nexus platforms has been less than stellar. Some OEMs use an additional subprocessor (Qualcomm, etc.) rather than doing it on the ARM processor, and some just do it in software on the CPU.

Samsung is the 800 pound gorilla though and they've been less than supportive of FDE (breaks some of their wearables and is incompatible with Samsung Pay) so it's kinda unclear what the long term FDE future is for Android in a general user sense.

That being said there are some OEMs with a super secure version of android out there (Silent Circle's Blackphones for instance) but they typically aren't marketed to the average consumer (and honestly I suspect just buying a Blackphone 2 probably gets you added to some sort of watchlist)
posted by vuron at 8:06 AM on February 17, 2016 [2 favorites]


My house has an alarm system, but I have not rigged the house to collapse if the alarm goes off.

Similarly, I want to at least have the option of repairing my phone if it breaks.

Given that it's already possible to unlock a phone without using the fingerprint reader, I don't understand why it wouldn't be acceptable to simply disable it (and display an intrusive warning alerting the user that their hardware has been compromised).

Sidenote: I criticize Apple a lot. They are unquestionably doing the right thing here, even though the security on the 5C was less-than-ideal, thanks to their backdoor. I wish that Apple would be more forthcoming about issuing a "mea culpa" regarding that backdoor, because the 5C was probably still the most secure mass-market phone available up until that point.

Sidenote #2: I'm massively uneasy running closed-source software on an encrypted device that I do not have permission to fully decrypt, audit, or load my own software onto. This is admittedly a nerd gripe, but also sets a series of scary precedents. Most people (myself included) will never do this, but I firmly believe that we are all better off as long as that option exists.
posted by schmod at 8:18 AM on February 17, 2016 [1 favorite]


Counterpoint: OpenSSL
posted by humanfont at 8:23 AM on February 17, 2016 [1 favorite]


"You know who else had been planning for months and built pipe bombs: the Columbine killers. Were they a sleeper cell?"

They used their pipe bombs in their one attack. The San Bernardino shooters had bombs at home which they were going to use later.

"Here is a very long list of US right wing terrorists, many of whom also had long term plans, stockpiled weapons, and built bombs. Nothing new here, except the religion of the perpetrators, which is very conveniently being used to justify breaking encryption on iPhones while we all watch because Muslim terrorists are oh so much scarier than Christian terrorists."

I'm not sure what you're arguing, since my objection was to you calling the San Bernardino shooting "just another workplace shooting" as opposed to a terrorist attack.
posted by I-baLL at 8:30 AM on February 17, 2016


Given that it's already possible to unlock a phone without using the fingerprint reader, I don't understand why it wouldn't be acceptable to simply disable it (and display an intrusive warning alerting the user that their hardware has been compromised).

I imagine that with specialized hardware, you could circumvent the secure enclave hardware in the phone. The only way to ensure that Apple can't be tasked with unlocking newer phones in the same way they are being tasked in this case is to immediately brick the phone if the secure enclave hardware has been breached.

It gives Apple plausible deniability: they can't create firmware to circumvent PIN unlocking, because the secure enclave hardware isn't affected by the firmware. And they can't change the secure enclave hardware, because as soon as they do, it bricks the phone.
posted by zabuni at 8:35 AM on February 17, 2016 [1 favorite]


I don't care if it was a terrorist attack, that still shouldn't be used as a reason to allow the FBI sweeping bullshit accommodations. We need to stop acting like "terrorism" is a magic word that should kill all civil liberties.
posted by corb at 8:37 AM on February 17, 2016 [19 favorites]


> It should be an option. If I want my phone to be ultra secure then I'll use that.

It is an option. There are plenty of other phones out there. No one says you have to buy one with security. No one says you have to turn on security if you buy an iPhone. But if you have security you want it to be the best available. Otherwise, Apple probably isn't your pick.

> It is the most unremarkable way imaginable for a corporation to release a public statement.

It's not how well the bear dances that is remarkable. It's that the bear dances at all. Show me google's statement on this, or Microsoft's, or Amazon's.

Apple Blows Up The Concept Of A Privacy Policy

Why Apple really cares about your privacy
posted by cjorgensen at 8:43 AM on February 17, 2016 [2 favorites]


humanfont: "Counterpoint: OpenSSL"

Heartbleed [probably] wasn't a deliberately-placed backdoor.

zabuni: "I imagine that with specialized hardware, you could circumvent the secure enclave hardware in the phone. The only way to insure that Apple can't be tasked to unlock newer phones in the same way they are being tasked in this case is to immediately brick the phone in the case that the secure enclave hardware has been breached."

Breaching the secure enclave is not the same thing as replacing the fingerprint reader. If the secure enclave reports that it cannot authenticate the fingerprint reader, there's no reason to necessarily suspect that the enclave itself has been compromised.
posted by schmod at 8:43 AM on February 17, 2016 [1 favorite]


schmod: Apple's PR set the press release to go out over the wires at midnight in their local time zone. It it the most unremarkable way imaginable for a corporation to release a public statement.

Holy Zarquon's Singing Fish: My work e-mail gets press releases all day long; you know when I never get any from US-based corporations? Midnight. If I got one at midnight about something that had happened many hours earlier I would definitely remark on that.

Publicists typically time when we auto-send out press releases pretty carefully, based on a variety of criteria. Including: Is the news urgent? Time-sensitive? Should this news be released during US/European business hours? What kind of response are we expecting? Are the press likely to require a rapid response to related queries? In the case of publicly-held companies: What stock markets are open, and how is this news likely to affect them? Etc.

A press release sent between midnight and 4am on a weekday will be one of the first things journalists see in the morning when they wake up, or when they arrive to the office. With luck, it will not be buried in the rest of the overnight news. With more luck, journalists will have time to think about the wider ramifications of what you are releasing before addressing the story in print or electronic media.

This press release addresses an issue Apple is having with a government agency, which means this story will undoubtedly get a great deal of attention and put a negative focus on the FBI. Releasing it at midnight means the news could theoretically have been out there for between 7 and 9 hours before that agency's PIOs sit down at their desks in the morning to formulate a response, or get one approved.
posted by zarq at 8:44 AM on February 17, 2016 [7 favorites]


Show me google's statement on this, or Microsoft's, or Amazon's

How about AT&T's? Here's AT&T's CEO Randall Stephenson on the issue.
posted by Xyanthilous P. Harrierstick at 8:48 AM on February 17, 2016 [2 favorites]


schmod: Breaching the secure enclave is not the same thing as replacing the fingerprint reader. If the secure enclave reports that it cannot authenticate the fingerprint reader, there's no reason to necessarily suspect that the enclave itself has been compromised.

It's my understanding (which may be wrong!) that the fingerprint reader just sends a pass/fail to the SE to unlock the code; which is why they're so hell-bent on authenticating that the reader (and the WIRE TO THE READER?! how?) has not been tampered with.
posted by Xyanthilous P. Harrierstick at 8:51 AM on February 17, 2016


Google, microsoft, amazon, at&t....

Jeez, you can almost smell the narcs in here.
posted by valkane at 8:52 AM on February 17, 2016 [1 favorite]


I'm confused: I thought agencies like the FBI could only get a court order to force a company to turn over information they actually have --- like getting AT&T or Verizon or whoever to give them records of what numbers I called or that called me, that sort of thing. This though.... it sounds like the FBI wants a court to order Apple to actually create a new computer decryption system, not just turn over data Apple already holds?!?

How can that even be legal? How can you legally force a civilian corporation to build something for free like that?
posted by easily confused at 8:55 AM on February 17, 2016 [4 favorites]


Its my understanding (which may be wrong!) that the fingerprint reader just sends a pass/fail to the SE to unlock the code; which is why they're so hell-bent on authenticating that the reader (and the WIRE TO THE READER?! how?) has not been tampered with.

That is wrong.

At boot, the reader sends its cryptographic ID to the secure enclave. If this ID matches the one that was paired with the enclave at the time the phone was manufactured, the two are paired successfully and will continue to talk to each other. If the IDs mismatch -- likely because the home button was replaced with a third-party part -- the secure enclave will refuse further communication, effectively disabling TouchID. This is separate from the Error 53 bricking, which only happens during an iOS update.

Assuming that handshake completes successfully, then when you scan your fingerprint for TouchID the reader sends an encoded image of the print to the secure enclave, where it will be compared against the registered print. The registered print is stored in the enclave, not in the reader -- a compromised scanner cannot simply send a generic "pass" signal.
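
For the code-minded, here's a toy model of that boot-time handshake (all names invented by me; the real pairing presumably involves actual cryptography rather than a bare ID compare):

    from dataclasses import dataclass

    @dataclass
    class SecureEnclave:
        paired_reader_id: str        # recorded when the phone was manufactured
        reader_trusted: bool = False

        def pair(self, reader_id: str) -> None:
            # Boot-time handshake: only trust a reader whose ID matches
            # the one paired at the factory.
            self.reader_trusted = (reader_id == self.paired_reader_id)

        def check_fingerprint(self, scan: bytes, registered: bytes) -> bool:
            # The registered print lives in the enclave, so a swapped-in
            # reader can't simply answer "pass"; the enclave does the match.
            if not self.reader_trusted:
                return False         # TouchID disabled, nothing unlocks
            return scan == registered

    enclave = SecureEnclave(paired_reader_id="reader-A")
    enclave.pair("reader-B")         # third-party home button swapped in
    print(enclave.check_fingerprint(b"scan", b"scan"))   # False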
posted by Holy Zarquon's Singing Fish at 8:56 AM on February 17, 2016 [8 favorites]


Donald Trump on Apple: "Who Do They Think They Are?"
posted by valkane at 8:58 AM on February 17, 2016


"Show me google's statement on this, or Microsoft's, or Amazon's."

On what? On the San Bernardino shooters' phone thing? Because why would they comment on that? But if you mean on government surveillance and the importance of encryption, then:

https://www.reformgovernmentsurveillance.com/

And here's a letter from Google about the exact concept at hand here:

https://www.google.com/takeaction/issue/encryption/
posted by I-baLL at 9:01 AM on February 17, 2016 [1 favorite]


Donald Trump on Apple: "Who Do They Think They Are?"

He's giving his new catch phrase a workout.
posted by zarq at 9:03 AM on February 17, 2016 [1 favorite]


"How about AT&T's? Here's AT&T's CEO Randall Stephenson on the issue."

Wow, thank you for posting this. I started laughing at this part:
The AT&T chief said his own company has been unfairly singled out in the debate over access to data. "It is silliness to say there's some kind of conspiracy between the U.S. government and AT&T," he said, adding that the company turns over information only when accompanied by a warrant or court order.
I guess he must have a selective memory, or maybe he hadn't had his coffee and glossed over Room 641A.
posted by I-baLL at 9:06 AM on February 17, 2016 [4 favorites]


Mr. Trump: less common, more sense please.
posted by mazola at 9:06 AM on February 17, 2016 [1 favorite]


And here's a letter from Google about the exact concept at hand here.

Cool. They released that statement in October. I'll check them off my mental checklist of companies that need such statements.
posted by cjorgensen at 9:08 AM on February 17, 2016


schmod:
Similarly, I want to at least have the option of repairing my phone if it breaks.

This is an option. Error 53 doesn't destroy the phone. A phone bricked by Error 53 can be repaired by Apple using an authorized replacement TouchID. Of course, you'd have to prove ownership of the iPhone first.
posted by LoveHam at 9:13 AM on February 17, 2016 [1 favorite]


This reminds me of the plot of the 1996 Michael Bay Classic, the Rock. The Federal Government attempts to compel someone with special knowledge of Alcatraz to become its involuntary agent in a mission to infiltrate Alcatraz. Doublecrosses follow.

The articles give the impression that Apple is being ordered to make something that did not previously exist, against its will and against its economic interest. This is different from being compelled to provide evidence (testimonial or documentary) of facts that already exist. It seems more like being compelled by the executive branch and the courts to become a de facto agent of the government.

If that is accurate, I think it runs afoul of the 13th amendment: "Neither slavery nor involuntary servitude, except as a punishment for crime whereof the party shall have been duly convicted, shall exist within the United States, or any place subject to their jurisdiction."
posted by pablocham at 9:14 AM on February 17, 2016 [4 favorites]


Anyone who uses the argument "teh terrorists!" to advance their agenda is either a craven coward or an opportunistic weasel and should be publicly mocked and pilloried at every opportunity.
posted by entropicamericana at 9:15 AM on February 17, 2016 [9 favorites]


Come and knock on our door... (Come and knock on our door)
We've been waiting for you.... (We've been waiting for you)
Where the data are yours and Apple's and the FBI's,
And The Company's too.
posted by Kabanos at 9:31 AM on February 17, 2016 [7 favorites]


It seems more like being compelled by the executive branch and the courts to become a de facto agent of the government.

Which is why the FBI is invoking the All Writs Act, which they have used with Apple before, to get them to unlock the iPhone.

Here's a YouTube video about the act by Jonathan Mayer.
posted by zabuni at 9:41 AM on February 17, 2016


This reminds me of the plot of the 1996 Michael Bay Classic, the Rock. The Federal Government attempts to compel someone with special knowledge of Alcatraz to become its involuntary agent in a mission to infiltrate Alcatraz. Doublecrosses follow.

"'Patriotism is a virtue of the vicious,' according to Oscar Wilde."
posted by zarq at 9:46 AM on February 17, 2016 [3 favorites]


All I have to add to this is that I'm not really surprised the only gay tech CEO is the one defending users' privacy.
posted by SansPoint at 9:52 AM on February 17, 2016 [12 favorites]



I'm a little confused by exactly what the FBI wants - it doesn't look like a cryptographic back door, but instead a high bandwidth pathway for a brute force attack. Which I would have thought would be reverse-engineerable, unlike a change in crypto that would reveal the data on the particular phone they're holding.


Right now, the phone is holding part of the decryption key in volatile memory. So it would only take 10^4 attempts to brute force the rest of it.

The FBI's choices are:

1. Dissect the phone, copy out the image of the flash device, and brute force it, at which point the key space is something like 10^20 times larger and, even at full speed, would take a damn long time to decrypt (rough numbers below)

2. Get Apple to write a specialized software tool that installs itself into the phone over the USB cable, and speeds up the brute forcing of the PIN so those 10^4 attempts take under 30 minutes.

3. Give up.
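
Back-of-envelope math for options 1 and 2 (mine, not the FBI's; the ~80 ms figure is the per-guess key-derivation delay Apple has described for on-device attempts, so treat it as an assumption):

    PER_GUESS = 0.08                   # seconds per attempt, assumed

    # Option 2: brute-force the 4-digit PIN space on the device itself.
    print(10**4 * PER_GUESS / 60)      # ~13 minutes, well under 30

    # Option 1: brute-force a 256-bit AES key copied off the flash.
    SECONDS_PER_YEAR = 3600 * 24 * 365.25
    print(2**256 * PER_GUESS / SECONDS_PER_YEAR)   # ~3e68 years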

There's more than enough proof to indicate that 3 is justified. Even if the phone contains messages from some snaggle-toothed preacher telling the suspects to do what they did, it would not lead to a viable prosecution.

The problems with #2 are as follows:

1. Apple is being ordered to produce software that does not exist at the moment. We're supposed to not like forced labor in this country.

2. The moment this software tool exists, Apple can start receiving FISA court orders to make more extensive use of it, under conditions where their software signing key can be compromised. Best not to let the tool exist.
posted by ocschwar at 9:59 AM on February 17, 2016 [8 favorites]


This is not a new issue between Apple and the govt. It has been going on for some years, with Apple arguing their case in this way, and the FBI arguing another way, all recorded HERE
posted by Postroad at 10:00 AM on February 17, 2016 [2 favorites]


"2. The moment this software tool exists, Apple can start receiving FISA court orders to make more extensive use of it, under conditions where their software signing key can be compromised. Best not to let the tool exist."

That's why it's a double edged sword for the government to push this since foreign governments will then also force Apple to make use of it for them.
posted by I-baLL at 10:00 AM on February 17, 2016 [3 favorites]


This is such a terrible case for Apple to make their stand on that one suspects they chose it deliberately so as to lose and then not have to deal with this question again. Really the optics are terrible. The phone does not belong to the terrorists, it is the property of the government agency that employed one of the terrorists and that agency has consented to the search of its own device.

You could not pick a worse situation legally or on the equities to fight this battle.
posted by Ironmouth at 10:01 AM on February 17, 2016 [4 favorites]


tavella: So yeah, this is likely just theater.

Not so much theater, I suspect, as getting a precedent on the books and tools written. The "terrorism" bogeyman is fairly powerful, especially on US soil, so the relative pushback is lower. In this instance, even with that, you are seeing significant resistance in the court of public opinion.
posted by MrGuilt at 10:02 AM on February 17, 2016 [1 favorite]


The court should force Apple to write some tools to break into Android phones while they're at it. I mean, two birds? Right?
posted by valkane at 10:05 AM on February 17, 2016 [4 favorites]


I'm not sure just what universe I'm in, but Ironmouth is totally correct. This is such a terrible hill to die on that it simply has to be deliberate. It's almost fictionally bad; if you put this into a movie, people wouldn't believe it.
posted by Sphinx at 10:08 AM on February 17, 2016 [1 favorite]


It should be clarified that the FBI doesn't want simple access to the phone but a simpler way to hack the phone by giving them better technical access to the phone and by disabling the security feature, which will brick the phone and delete all data after 10 attempts. This is not the same as "defeating any and all security measures". They don't want a backdoor, just better access to the keyhole for the front door.

The problem with Apple turning the data over in this case is that there is a chain-of-custody problem. If Apple itself unlocks the phone, anyone being prosecuted based on information from the phone will argue that the evidence was tainted, because it left the evidence room and was in the hands of a third party. Which would be true.

Again, I think Apple is trying to take a knee on this one while appearing to resist. Judges are affected by the equities, and you will have a tough time finding any judge, district or appellate, who is going to rule in Apple's favor, given that the device was used in a known, infamous terror attack and the information cannot be obtained by any other means. Apple's argument that it will be harmed by doing so is irrelevant to the analysis, because that is not something search-and-seizure law covers.

My guess is Apple loses this one.
posted by Ironmouth at 10:08 AM on February 17, 2016 [2 favorites]


Donald Trump on Apple: "Who Do They Think They Are?"

As if I needed to be further convinced of the Rightness of Apple on this point.
posted by MrGuilt at 10:09 AM on February 17, 2016 [3 favorites]


Of course it's a terrible case on optics - because that's what the government uses to expand its powers. Like how email searches started with child pornographers, because, you know, who the fuck wants to defend them? They very carefully choose unpopular defendants to push their power, so people can go "this is no big deal," ignoring that the power they authorize will eventually be used on them.
posted by corb at 10:17 AM on February 17, 2016 [23 favorites]


Sorry if this has already been mentioned, but just to clarify exactly how the four-digit passcode works from an encryption standpoint:

iOS has a feature called "Data Protection" which works on iPhone 3GS and later (so, almost all iPhones in current usage, I would imagine). It appears to be enabled by default, and as far as I can tell, it can't be disabled by the user.

This feature basically applies an extra layer of encryption over the existing hardware-based encryption, using the passcode as the encryption key. It's a flimsy layer, yes, but it means that you really do need that passcode in order to read the contents of the flash storage (or a copy of it).
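
A sketch of why even a copy of the flash doesn't get you far (my illustration, with invented names; the derived key is also tangled with a per-device hardware UID that never leaves the chip, so the guessing has to happen on the phone itself):

    import hashlib

    DEVICE_UID = b"\x13" * 32    # hypothetical stand-in; the real UID is unreadable

    def unlock_key(passcode: str) -> bytes:
        # Iterated KDF: each guess costs real time, and the result depends
        # on both the passcode and this particular phone's UID.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                                   DEVICE_UID, 100_000)

    print(unlock_key("1234").hex())   # can't be recomputed from a flash image alone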
posted by escape from the potato planet at 10:23 AM on February 17, 2016


Also, Apple isn't focusing on the shooters' privacy rights for a reason. If I buy an unpickable safe and then forget the combination, the fact that it's my property doesn't entitle me to force the manufacturer at gunpoint to make it, and all other safes of that model, pickable.
posted by Holy Zarquon's Singing Fish at 10:23 AM on February 17, 2016 [6 favorites]



The problem with Apple turning the data over in this case is that there is a chain of custody problem. If Apple itself unlocks the phone the argument anyone being prosecuted based on information in the phone will argue that the fact that the phone was turned over to Apple and Apple provided the data is that the evidence was tainted because it left the evidence room and was in the hands of a third party. Which would be true.


It's not like there will be a prosecution here.

We're talking about a lone-wolf attack. Terrorist organizations have written material advocating for lone-wolf attacks in principle. They've done it for decades. The neo-Nazi David Lane wrote "Leaderless Resistance" in the '80s. You can find the essay online. It's part of a long line of writings advocating for lone-wolf attacks in the abstract. Nobody's been prosecuted for publishing such material before. To get prosecuted, you need to tell a specific person to go and commit an attack.

The San Bernardino murderers may have been motivated by such material. (ISIS and AQ have both published such writings.) But if they got a specific instruction to go and do it, they got it from someone nowhere near the reach of the FBI, who will never be prosecuted. And if they got the idea from material posted and addressed "to whom it may concern," the author could be out in the open in the US and he will not be touched.

This is simply an attempt to establish a precedent whereby a software maker is compelled to write software that does not exist. It's a camel's-nose-in-the-tent scenario, and kudos to Apple's CEO for slapping the camel away.
posted by ocschwar at 10:26 AM on February 17, 2016 [21 favorites]


Thank you for the very first comment in this thread. It sheds light on what I'm seeing all around me.
posted by infini at 10:36 AM on February 17, 2016


So I take it the password isn't "I am SHERlocked"
posted by Ber at 10:38 AM on February 17, 2016 [6 favorites]


Also, Apple isn't focusing on the shooters' privacy rights for a reason. If I buy an unpickable safe and then forget the combination, the fact that it's my property doesn't entitle me to force the manufacturer at gunpoint to make it

The terrorists did not own the phone. The employer does and consented. Apple built a lock that it refuses to open for the owner of the phone. This is why I think they are trying to lose, the only argument they have is that they will be harmed. Not a winner under search-and-seizure law.

As for this not linking to any other suspects, that is baseless speculation. Maybe there is nothing of value on the phone, or it is wiped, or it leads to terror cells. Nobody knows.
posted by Ironmouth at 10:40 AM on February 17, 2016


The terrorists did not own the phone.

Yes, that would be why in my analogy it is the owner of the safe who wants access to its contents.
posted by Holy Zarquon's Singing Fish at 10:41 AM on February 17, 2016 [1 favorite]


Kinda a funny coincidence that the error 53 thing, despite existing for months beforehand, became a story last week huh?
posted by yeahwhatever at 10:43 AM on February 17, 2016 [4 favorites]


"This is such a terrible case for Apple to make their stand on that one suspects they chose it deliberately so as to lose and then not have to deal with this question again"

This didn't start with this case and it's not just Apple.
posted by I-baLL at 10:49 AM on February 17, 2016 [2 favorites]


Oh, also -- "they will be harmed" might not be "a winner" under search and seizure law, but this is not a search and seizure order. It's an application of the All Writs Act, and past rulings specifically bar writs that impose an "undue burden" on the subject. Like, say, crippling their own security practices.
posted by Holy Zarquon's Singing Fish at 10:49 AM on February 17, 2016 [6 favorites]


Oh, also -- "they will be harmed" might not be "a winner" under search and seizure law, but this is not a search and seizure order. It's an application of the All Writs Act, and past rulings specifically bar writs that impose an "undue burden" on the subject. Like, say, crippling their own security practices.

Good point. Apple can't argue 4th Amendment because their stuff is not being searched.
posted by Ironmouth at 10:53 AM on February 17, 2016 [1 favorite]


(Someone asked me to do some punditry on this one earlier today; I declined, as I didn't think I quite got the issues. Glad I did that, as reading up about it has taken me down American legal lanes I didn't know existed.)

As I now understand it, the technicalities don't matter except in that the FBI can't do this, and Apple can. The AWA can apply and compel someone/thing to carry out a writ if there's no other way to get it done, if there's no other applicable judicial instrument that covers the case of the writ, if the third party has some connection with the case, and if it doesn't put an undue burden on the third party.

I think most of that is true - only Apple has the ability to make the changes in the iPhone to reveal the encrypted data. There's no precedent for this exact circumstance - the combination of the current state of technology, the configuration of the phone and the ownership issues - and there's no better law to use. Apple has a connection with the case, as it's an Apple product and Apple's security systems/policies... and on one level, it's clearly not going to affect the company too much to comply, in that I imagine producing the firmware will involve making tiny changes to a handful of lines of code. I doubt the FBI will require the full software QA process that normally accompanies an update...

Which doesn't mean that Apple can't appeal to higher courts, until one refuses to hear it, or that it will find a good argument for declining to help. But I can't easily see one, from my not-particularly-well-informed non-US layperson's perspective. It could be that the undue-burden test will be more broadly applicable than just the work required for this one ask. Who knows; the AWA is vague, which is why it gets used in novel cases, by the looks of things. But as mentioned upthread, it doesn't look like a strong 'un for Apple.

I agree that it's an odd battle to fight. I'd have complied quickly and quietly, and if called on it, said that it was the phone's owner who requested access, and that in any case more recent iPhones are beyond even Apple's reach to open. There's plenty of case law that limits expectations of privacy on work devices.

I do wonder if this is to establish the unusual nature of this particular case, and so limit the precedent.
posted by Devonian at 10:57 AM on February 17, 2016 [1 favorite]


That's why it's a double edged sword for the government to push this since foreign governments will then also force Apple to make use of it for them.

Something tells me the FBI is looking forward to seeing Britain's GCHQ make far more cavalier and unaccountable use of this tool on their behalf.
posted by ocschwar at 10:58 AM on February 17, 2016 [5 favorites]


They are not refusing to unlock the phone. They can't (taking them at their word) unlock the phone with the tools at hand. They are being compelled to create a new tool. A tool which they do not believe should exist.

Although the order says that the tool should be keyed to this one device, once it exists, it becomes a much smaller burden on Apple to create versions for other devices (of that generation or older), or even to create a version that is not locked down. At that point, it becomes much harder for Apple to resist further requests to unlock these older phones.
posted by notbuddha at 11:00 AM on February 17, 2016 [4 favorites]


anybody have any cites to the relevant case law and balancing test for the All Writs Act? I just reviewed the order and there is no reference to it because the case has not been litigated yet.
posted by Ironmouth at 11:01 AM on February 17, 2016


It's a completely horrible case on optics, and keep in mind the US electorate has sided again and again with the idea that security concerns trump privacy concerns. The Republicans (and even some Democrats) will line up one after another to attack Apple's behavior, because attacking Silicon Valley companies with billions in cash reserves plays right into the current populist rage.

Keeping in mind that the FBI already has all the iCloud data prior to October (which was the last backup), it's pretty clear that Apple is convinced that they have more to lose by crippling their security than they do by fighting this. I suspect the current Supreme Court vacancy might also be playing into their decision, because they will likely get to have this battle in the Ninth Circuit, which is probably the most friendly to privacy advocacy.

Whether the lockpick applies exclusively to the older pre-Secure Enclave iPhones, or also includes some methodology for bypassing the hardware encryption of the A7, would also be interesting to know, because in theory Apple could let the older phones be compromised without losing their holy grail on the newer phones.
posted by vuron at 11:02 AM on February 17, 2016 [1 favorite]


The NSA is for the most part a foreign intelligence agency. In that capacity, its targets have no Constitutional rights and only very limited other rights, most arising under treaty and international law, and non-state actors like terrorist organizations don't have any of those, either. It can dragnet to its heart's content, and send in the Hellfire missiles and special forces door-kickers as the data warrant.

Well, NSA is nominally a foreign intelligence agency. But we now know that their domestic surveillance activities are extensive, that for many purposes they make no or only pro forma distinctions between foreign and domestic intelligence, and that they share extensively with foreign intelligence services in order to circumvent each side's prohibitions on domestic intelligence gathering.

NSA has a relationship with DEA, exact extent unknown, where they provide intelligence to prosecute drug cases (including domestic cases involving only US persons), but their involvement is then covered up by "parallel constructing" the intelligence they (unconstitutionally) supply by apparently legal means. This relationship seems to be based on sharing information from the bulk collection programs, and we don't know if NSA also shares the expertise of its offensive "Tailored Access Operations" teams -- perhaps not.

Even if the FBI could find resources in the government to break this phone themselves, though, they might prefer a court order requiring Apple to do it, because it allows them to politically establish more rights to violate people's privacy, and diminishes any reliance they may have on other agencies.

I agree with your broader point that we should see any barriers between NSA and law enforcement as "features" not "bugs", though. (Worth remembering here, though, that, after 9/11, lack of cooperation between law enforcement and intelligence was conveniently fingered as one of the reasons the Homeland was not Secure.)
posted by grobstein at 11:05 AM on February 17, 2016 [5 favorites]


Could Apple assert at least partial ownership of the phone based upon the EULA? In effect, the idea that while the hardware is owned by the State of California (or whomever), Apple has ownership rights concerning iOS.

The Open Source security community will obviously point out that any security design based upon closed source code is completely suspect anyway because you can't discover vulnerabilities (or even deliberate backdoor capabilities placed into the code). Yes you get the occasional vulnerability that will get patched (OpenSSL for instance) but that's still better than security through obscurity.

It's likely that Apple is being truthful in regards to not having a backdoor because their exposure if a backdoor is ever detected would be massive but the potential still exists.

Man this is a messy case and the fact that Apple is trying to play hardball helps illuminate the likely reasons why they really don't want this (mainly that it would basically forever compromise their Apple Pay initiative).
posted by vuron at 11:12 AM on February 17, 2016 [4 favorites]


The big SCOTUS case is U.S. v. New York Telephone. I'm not up to date on recent interpretations of the test, but I wouldn't be surprised to see Apple throw at least a passing citation to in re: In the Matter Of the Application, a 2003 decision by the 9th Circuit (which will also hear any appeal that Apple files). That was a wiretap case, but it quoted part of New York Telephone that said the All Writs Act order at issue there was acceptable because it required "minimal effort on the part of the Company and no disruption to its operations" (Emphasis added by the 9th Circuit). Apple is making a credible case that creating an iPhone skeleton key would disrupt its business operations.

The in re: court also said "The obligation of private citizens to assist law enforcement, even if they are compensated for the immediate costs of doing so, has not extended to circumstances in which there is a complete disruption of a service they offer to a customer as part of their business," which doesn't sound like something the FBI wants to hear in this case.
posted by Holy Zarquon's Singing Fish at 11:13 AM on February 17, 2016 [9 favorites]


HZSF,

Thanks. Digesting these two now. Your analysis is very good.

One question, a hypo--sort of: Apple's business model could be argued to provide a service that appeals to persons involved in criminal conspiracies to defeat government law-enforcement efforts. Assuming, arguendo, that internal company docs show they hoped to market to persons wishing to avoid US law enforcement, could the business purpose be declared improper in the first place? Wouldn't every company start using these types of systems if Apple is successful, making law enforcement effectively blind?
posted by Ironmouth at 11:27 AM on February 17, 2016



I agree that it's an odd battle to fight. I'dve complied quickly and quietly, and if called on it say that it was the phone's owners who requested access, and in any case more recent iPhones are beyond even Apple's reach to open.


The more recent models are DESIGNED to be beyond the reach of this method or anything like it.

But are they? They're designed by human beings, after all.

What happens if there's a flaw? NSA figures it out. Hands a plan of action to the FBI. FBI turns it into a court order. And we're back to where we started.
posted by ocschwar at 11:30 AM on February 17, 2016


Weird question, but is it pronounced San BernaDINO? Looks like it is San BerNARDINO, but I haven't heard that pronunciation.
posted by Ironmouth at 11:36 AM on February 17, 2016


> So the upshot is, even if Apple is compelled to make this backdoor/firmware for the iPhone 5c used in San Bernardino, Apple can't do the same for an iPhone 5s or any later model.

Note that the blog post adamsc's comment linked to has been updated to say that the author is uncertain of whether changing the secure enclave's firmware actually causes it to wipe all the keys. It seems like this would call into question a few of the other assertions about the secure enclave but only the one sentence is struck out.
posted by XMLicious at 11:38 AM on February 17, 2016 [2 favorites]


Techdirt mentions the potential First Amendment problem here. Can the government compel code?
posted by ChurchHatesTucker at 11:42 AM on February 17, 2016 [2 favorites]


iffthen:
No. Regarding the first, the reality is more akin to allowing the FBI as many attempts as they like to pick the lock, not removing it from the doorframe. Regarding the second, and as I've already mentioned, that simply isn't true. Picking one lock does not give you access to all other locked doors.
This is not just wrong, it is dangerously wrong. The order is not to pick a single lock; the order is to create new software that (effectively) enables lock-picking on all equivalent devices. It has been well demonstrated that there is no such thing as broken encryption that only works for good guys and not bad guys. The minute the tool exists, there are two risks: (1) that it will be stolen by unknown bad actors (cf. that Juniper backdoor, placed by the NSA, but affecting our government's own hardware, oopsie, and pretty much assumed to have been exploited by, say, China); (2) that the legal precedent is such that known bad actors can still compel Apple to provide the same tool to them.

Even if it were technically possible to constrain this one-time circumvention, a bad actor needs merely to run a diff to find out what parts to change, then can change them at leisure.
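
To make the diff point concrete, here's a toy comparison with made-up config fragments standing in for the two signed images:

    import difflib

    stock  = ["retry_delay = escalating", "wipe_after_failures = 10"]
    custom = ["retry_delay = none",       "wipe_after_failures = never"]

    # The diff pinpoints exactly which protections were relaxed.
    for line in difflib.unified_diff(stock, custom,
                                     fromfile="ios_stock", tofile="fbios",
                                     lineterm=""):
        print(line)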

And make no mistake, the attempt at using a court writ is a huge overreach, because it compels new work. This isn't just "you're compelled to let us look through your filing cabinets"; this is "you need to take people off their actual jobs to do this."
posted by fedward at 11:47 AM on February 17, 2016 [3 favorites]


Apple built a lock that it refuses to open for the owner of the phone.
This is false.

They implemented an encryption scheme. Encryption is fundamentally different from a lock.

When you lock something in a safe, the original thing still exists.

When you encrypt something and destroy the plaintext, the plaintext no longer exists. It can be *recreated* if you have the key, but until then the thing that you're looking for literally does not exist!

This is also why Apple can't just "bypass the encryption" like you'd see in a movie. There is nothing to bypass, because encryption is not a lock. It doesn't work that way. Instead, Apple is being asked to build a tool that will make brute-force attacks easier. Now in this case, brute force is actually a viable strategy since it sounds like the numeric PIN is used to decrypt a secondary key which is used to access the device's storage (I think that's an oversimplification, but still logically correct). So they don't need to brute-force the key itself, just the PIN that's expanded out into the key that encrypts the key that encrypts the storage. Hence if they can bypass the lockout (or just dump the storage and the encrypted secondary key) they can brute force the PIN at their convenience.
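
Here's a toy version of that layering, with everything invented for the example (XOR stands in for real key-wrapping, and the KDF iteration count is lowered so the demo runs in seconds, where a phone would calibrate for tens of milliseconds per guess):

    import hashlib, os

    def kek(pin: str, salt: bytes) -> bytes:
        # PIN -> key-encrypting key via an iterated KDF.
        return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 1_000)

    def xor(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    salt = os.urandom(16)
    storage_key = os.urandom(32)                     # what actually encrypts the disk
    wrapped = xor(storage_key, kek("7391", salt))    # this is what sits in flash
    verifier = hashlib.sha256(storage_key).digest()  # stand-in for an AEAD tag

    # Attack the 10^4 PIN space, not the 2^256 key space.
    for i in range(10_000):
        guess = f"{i:04d}"
        if hashlib.sha256(xor(wrapped, kek(guess, salt))).digest() == verifier:
            print("PIN recovered:", guess)
            break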

This is also why I suspect this isn't going to be a test case for the "right to encrypt", since that's not at all what is being argued.
posted by -1 at 11:49 AM on February 17, 2016 [5 favorites]


weird question but is it pronounced San BernaDINO? Looks like it is San BerNARDINO, but I haven't heard that pronunciation

More like San Berna-DEAN-o. Although sometimes people say more of a DEAN-no at the end.

And, if you watch old Noir crime movies, people in LA say "San Berdoo" a whole lot. Apparently that came from all the Midwesterners showing up during the Great Depression. Although, I don't know if I've heard someone say that in real life.
posted by sideshow at 11:55 AM on February 17, 2016


Ironmouth: Zappa
posted by dforemsky at 12:00 PM on February 17, 2016


It's a pity that Apple's so well-entrenched in the United States; there are plenty of nations with stronger legislative protections on privacy, that specifically forbid 'the State or its representative agents' from violating them. After all, 96% of the world's population has no need to consider what the FBI, NSA, or indeed any branch of the United States government might want.
posted by The Zeroth Law at 12:05 PM on February 17, 2016




> All I have to add to this is that I'm not really surprised the only gay tech CEO is the one defending users privacy.

I think you mean the only out gay tech CEO.

Also, Apple's commitment to user privacy predates Cook.
posted by cjorgensen at 12:12 PM on February 17, 2016 [4 favorites]


If you need me, I'll be reading William Gibson books and nodding my head along in agreement.

*sighs*
posted by Fizz at 12:12 PM on February 17, 2016 [2 favorites]


If you're my age, you were never promised a flying car.

You were promised an oppressive cyberpunk dystopia.

(Plagiarizing a tweet here.)
posted by ocschwar at 12:15 PM on February 17, 2016 [10 favorites]


> This is such a terrible case for Apple to make their stand on that one suspects they chose it deliberately so as to lose and then not have to deal with this question again. Really the optics are terrible. The phone does not belong to the terrorists, it is the property of the government agency that employed one of the terrorists and that agency has consented to the search of its own device.

You could not pick a worse situation legally or on the equities to fight this battle.


And a few comments later: My guess is Apple loses this one.^

Alternatively, since Apple is being hit with a lot of these cases, and being beaten up over them, it's the perfect case to shut down the death of a thousand cuts. Seriously, if they couldn't/wouldn't unlock an iPhone used by a known terrorist, an iPhone owned by the government, at the behest of the government, then that pretty much puts a stop to all future requests.

I'm guessing Apple loses this one as well, but it'll take a long damn time and go to the SCOTUS, and perhaps, just perhaps they won't.

My take on it is Apple shouldn't lose, but then I value freedom more than I value the government's ability to hack a phone. Talk about bad optics. It looks like Apple is defending freedom against an overreaching government.
posted by cjorgensen at 12:22 PM on February 17, 2016 [5 favorites]


schmod, to clarify my point re: OpenSSL. OpenSSL was open source, but the code was a mess. Every developer who had a device that relied on OpenSSL could have looked at the code and figured that out. Yet mostly people didn't.
posted by humanfont at 12:25 PM on February 17, 2016


As a side note this isn't currently hurting Apple stock.
posted by cjorgensen at 12:40 PM on February 17, 2016 [1 favorite]




Wouldn't every company start using these types of systems if Apple is successful, making law enforcement effectively blind?
Were we only so lucky.

(I'm really not sure why "wishing to avoid US law enforcement" should be seen as improper on its face, but we're not about to come to an agreement there, so I won't belabor the point)
posted by CrystalDave at 1:00 PM on February 17, 2016 [4 favorites]


(I'm really not sure why "wishing to avoid US law enforcement" should be seen as improper on its face, but we're not about to come to an agreement there, so I won't belabor the point)

For purposes of the Court's analysis whether or not Apple's business model would be overly impacted by use of the All Writs Act. Let's provide an example: Company A makes a device that makes it impossible for a phone to be wiretapped and specifically markets it to drug dealers, like cartels or something. Is that a legitimate business purpose that the Company can use to justify that its business will be unduly impacted by being forced to turn off the protection for any particular phone?

No other point implied.
posted by Ironmouth at 1:08 PM on February 17, 2016


Huh--I wanted to see if ISIS was pro-iPhone because of Apple's privacy features; it turns out, no.
Apple has famously conquered one major world market after another, but the technology giant has encountered stiff resistance in one especially contested piece of territory: The Islamic State group, also known as ISIS, has banned all Apple products from its self-declared caliphate, asserting that such devices can be used by American intelligence agents to target its forces with airstrikes.

The militant group issued a directive banning the use of all Apple products in December via a statement issued by its “general supervisory committee” for distribution throughout each province within its caliphate. Written in Arabic, the statement was disseminated via social media and the online media offices that ISIS operates.
Even more interesting:
“There is a way for Apple to track us if they want to,” says Aaron Ross, a technology security expert and founder of Ross Backup, a cloud-based computer backup and file-syncing company. “They say they won’t, and their privacy statement says they won’t. I’m a pretty big believer that if there is a big emergency, they would be able to track you on your phone.”

Apple did not respond to a request for comment, but its privacy policy states that the company “may collect, use, transfer and disclose nonpersonal information for any purpose.” Apple defines nonpersonal information as “zip code, area code, unique device identifier, referrer URL, location and the time zone.”

Apple also maintains the right to “collect, use, and share precise location data, including the real-time geographic location” from its devices. Though its privacy statement is explicit that this data will be collected anonymously and won’t be shared, Apple also reserves the right to disclose both personal and nonpersonal information if it “determines that for purposes of national security, law enforcement or other issues of public importance, disclosure is necessary or appropriate.”
Fascinating. They will turn over your location data to the NSA but not what is on the phone. Kind of a weird dichotomy.
posted by Ironmouth at 1:11 PM on February 17, 2016 [2 favorites]


The Macworld link posted above is a good one, worth reading. ("No legal case applies in a vacuum, and in this case the FBI needs the precedent more than the evidence.")

I agree with comments above that the optics are terrible for Apple in a certain light - dead terrorist, phone owner wants it unlocked to prevent future terrorist attacks - and that's why the FBI is pushing the point. But the optics are also very good for Apple - if they won't give up without a fight even in this terrible case ...

"Make no mistake: This is unprecedented, and the situation was deliberately engineered by the FBI and Department of Justice to force a showdown that could define limits our civil rights for generations to come. This is an issue with far-reaching implications well beyond a single phone, a single case, or even Apple itself."
posted by RedOrGreen at 1:11 PM on February 17, 2016 [2 favorites]


Apple, the FBI, and the San Bernardino iPhone. By CS professor and security expert Dan Wallach. Goes into technical detail.
posted by Nelson at 1:14 PM on February 17, 2016 [3 favorites]


Fascinating. They will turn over your location data to the NSA but not what is on the phone. Kind of a weird dichotomy.

Not really. They have one. They neither have nor want the other.
posted by ChurchHatesTucker at 1:15 PM on February 17, 2016 [1 favorite]


Fascinating. They will turn over your location data to the NSA but not what is on the phone. Kind of a weird dichotomy.

Not really. They have one. They neither have nor want the other.


The third-party doctrine means they don't even need a warrant, just an order based on probable cause.

In most investigations the location data is more critical.
posted by Ironmouth at 1:17 PM on February 17, 2016


If I want my phone to be secure, I'll use a tin-can and string. Pretty sure the FBI can't hack that yet...

... well not without additional hardware, namely more string and a third tin-can.
posted by Nanukthedog at 1:36 PM on February 17, 2016


but seriously
posted by philip-random at 1:42 PM on February 17, 2016 [1 favorite]


I think this is a false-flag op just so the FBI can get into Spitzer's iphone for the lulz.
posted by valkane at 1:47 PM on February 17, 2016 [1 favorite]


I think this is a false-flag op just so the FBI can get into Spitzer's iphone for the lulz.



The way to get the incriminating stuff from Spitzer's iPhone is to wait a week and check Gawker.
posted by ocschwar at 1:52 PM on February 17, 2016 [2 favorites]


Snowden backs Apple in fight over iPhone, calls on Google to speak up: "This is the most important tech case in a decade."
posted by Sir Rinse at 2:01 PM on February 17, 2016 [4 favorites]


I agree that it's an odd battle to fight. I'd've complied quickly and quietly and, if called on it, said that it was the phone's owner who requested access, and in any case more recent iPhones are beyond even Apple's reach to open. There's plenty of case law that limits expectations of privacy on work devices.

Remind me to never tell you any of my secrets. But seriously, just because you can do something with a court of law doesn't make it right.

I'm always suspect when people start using the word "optics" because it leads me to believe there's a strawman afoot. And with Trump as my litmus, I pray to the gods of privacy that Apple does not fold on this issue.
posted by valkane at 2:09 PM on February 17, 2016 [3 favorites]




If only Apple could publish the iPhone as a book.
posted by valkane at 2:18 PM on February 17, 2016 [1 favorite]


The legalese doesn't matter, I think. If the government wants something badly enough, the legalese is just there to lend the exchange some legitimacy.
posted by a lungful of dragon at 2:25 PM on February 17, 2016 [1 favorite]


I mean sure they are, but how the fuck are we okay with that?
posted by corb at 2:38 PM on February 17, 2016 [2 favorites]


The EFF on similar cases back in October.

Support for Apple from the ACLU and WhatsApp CEO Jan Koum.
posted by ChurchHatesTucker at 2:48 PM on February 17, 2016 [2 favorites]


TidBits on Cook's letter.

Dan Guido on the technical details of what Apple can and can't do (but included mainly for coining the term "FBiOS".)
posted by ChurchHatesTucker at 3:15 PM on February 17, 2016 [2 favorites]


The legalese doesn't matter, I think. If the government wants something badly enough, the legalese is just there to lend the exchange some legitimacy.

I think that's been clearly established. Consider that they could strap you to a table and torture you if they wanted to, and even though the USSC *said* you deserve a habeas corpus hearing, getting one can be problematic when you can't speak to counsel...
posted by mikelieman at 3:18 PM on February 17, 2016 [2 favorites]


From Adam Engst's response to Cook's letter:
According to Stanford professor Jonathan Mayer in a lecture linked to by Wikipedia, the All Writs Act is used as a catch-all authority for the courts to supplement warrants, but only when [...] it is justified by extraordinary circumstances. [...] A case involving domestic terrorism certainly qualifies as an extraordinary circumstance.
No. You need a better argument than that. There have been lots of crimes as bad or worse than the San Bernardino attack; its characterisation as "domestic terrorism" is a label applied by the government, not something intrinsic to the crime. Your argument would make "extraordinary circumstances" commonplace, because every prosecutor would seek to define their prosecution as having an "extraordinary" element.
posted by Joe in Australia at 4:38 PM on February 17, 2016 [6 favorites]


The All Writs Act applies to every writ. It is a universal power so that the courts can do their work.

The issue will boil down to the First Amendment, not the Act.

The All Writs Act is basic law, despite one computer journalist's cherry picking of one professor's lecture. How could courts act without the legal power to compel?
posted by Ironmouth at 4:48 PM on February 17, 2016


The All Writs Act applies to every writ.

It has limits though, and at least one judge has found that in a similar case involving Apple. (See the EFF link from October above.)
posted by ChurchHatesTucker at 4:55 PM on February 17, 2016


Also, don't use edit that way.
posted by ChurchHatesTucker at 4:55 PM on February 17, 2016 [3 favorites]


Mod note: Emphasizing in mod font: Ironmouth (and everyone) do not use the edit function to substantively change your comment. Make a second comment instead. Thanks.
posted by restless_nomad (staff) at 5:03 PM on February 17, 2016 [4 favorites]


The All Writs Act applies to every writ. It is a universal power so that the courts can do their work.


I was thinking about that over dinner, and the "undue burden" part of the All Writs Act can really kick in with a vengeance here.

The burden on Apple here is not the wages of the developers tasked with complying with the Act. It's the foregone opportunity to use those people for something that will make Apple their next billion. (Or to use them to avoid being beaten by their competition for the Next Great Thing and losing huge amounts of revenue.)

But that's my L0 rumination here.
posted by ocschwar at 5:38 PM on February 17, 2016 [2 favorites]


The burden on Apple here is not the wages of the developers tasked with complying with the Act

Probably right, given the pen register precedent. It may go a little differently given a defendant that wasn't actively trying to be captured.

I actually agree with Ironmouth (mark your calendars!) in that it's going to be a First Amendment fight in the end.
posted by ChurchHatesTucker at 5:47 PM on February 17, 2016


So software has bugs, right? And the FBI are asking Apple to go muck around with a part of the phone that might render it unreadable if something goes wrong (like guessing wrong ten times). Just give the FBI what they asked for, but have a summer intern implement it and don't test at all. "Well it worked on my machine."

So seriously, the problem is not the actual technology but the precedent it sets. I suppose Apple could allow a software setting in the future to never accept software updates of any kind, regardless of any cryptographic signatures.
posted by rustcrumb at 5:58 PM on February 17, 2016 [2 favorites]


Another undue burden would be the hit to future iPhone sales. Or to the perceived security of Apple Pay.

I find it sad Apple is being forced to defend American citizens against their own government.
posted by cjorgensen at 6:28 PM on February 17, 2016 [4 favorites]


Really the best accounting of the issues involved is in the transcript of a hearing for a similar case in October. It's here that Apple claimed they had unlocked at least 70 devices since 2008. (And where the judge scolds the government lawyers for having too large a portion of their brief dedicated to questioning Apple's patriotism.)
posted by RobotVoodooPower at 7:29 PM on February 17, 2016 [3 favorites]


(the legal issues, that is. the political theater issues exist on a whole 'nother plane)
posted by RobotVoodooPower at 7:38 PM on February 17, 2016


I've been thinking that before too many decades pass we'll be seeing big multinational corporations going for sovereignty. This is the kind of shit that's gonna push that along.
posted by save alive nothing that breatheth at 8:35 PM on February 17, 2016 [1 favorite]


"Really the best accounting of the issues involved are in the transcript of a hearing for a similar case in October. It's here that Apple claimed they had unlocked at least 70 devices since 2008. "

Interesting choice of year, since the original iPhone debuted in June of 2007 and the second iPhone debuted in July of 2008. Then, in 2014, Apple changed the way they implemented encryption to lock even themselves out.
posted by I-baLL at 10:28 PM on February 17, 2016


fedward:

This is not just wrong, it is dangerously wrong. The order is not to pick a single lock, the order is to create new software that (effectively) enables lock picking on all equivalent devices.

Agree. However, I'm not personally worried about this. log2(10,000) ≈ 13 bits of entropy (i.e., a four-digit PIN) is shit security anyway, and the onus is still on the user to use a strong-enough key. Also, while it's possible the FBI could abuse this newly created software, I'm much more concerned about more abusive regimes, which also tend to have cybersecurity budgets large enough to create similar software if they deem it necessary.

It has been well demonstrated that there is no such thing as broken encryption that only works for good guys and not bad guys. The minute the tool exists, there are two risks: (1) that it will be stolen by unknown bad actors (cf. that Juniper backdoor, placed by the NSA, but affecting our government's own hardware, oopsie, and pretty much assumed to have been exploited by, say, China);

I still agree. But I want to make two points. First, the issue here is not broken encryption. It's the removal of layers of defense-in-depth. Second, if totalitarian regimes really want this software, they have the capability to build it themselves, so I'm not worried about it getting stolen from the FBI.

that the legal precedent is such that known bad actors can still compel Apple to provide the same tool to them.

I'm not convinced of this, but I would like to be, so please keep talking while citing sources. (You actually have to, or your motivations are called into question.)

Even if it were technically possible to constrain this one-time circumvention, a bad actor needs merely to run a diff to find out what parts to change, then can change them at leisure.

Run a diff on what? You think the FBI is going to publish this altered firmware online? Provide it to the US's enemies as a token of goodwill? I'll be the first to call the FBI a bunch of incompetent fucks, but I still don't see that happening.

And make no mistake, the attempt at using a court writ is a huge overreach because it compels new work. This isn't just "you're compelled to let us look through your filing cabinets" this is "you need to take resources off their actual jobs to do this."

I'm inclined to agree, but IANAL, and will defer to the others in this thread who are.
posted by iffthen at 2:25 AM on February 18, 2016 [1 favorite]
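
To put numbers on that entropy aside: a uniformly random passcode of length k drawn from an alphabet of n symbols carries k·log2(n) bits, so a 4-digit PIN is only about 13.3 bits. A minimal Python sketch (the passcode classes are just illustrations):

    import math

    def entropy_bits(alphabet_size: int, length: int) -> float:
        # Bits of entropy in a passcode chosen uniformly at random.
        return length * math.log2(alphabet_size)

    print(f"4-digit PIN:          {entropy_bits(10, 4):.1f} bits")   # ~13.3
    print(f"6-digit PIN:          {entropy_bits(10, 6):.1f} bits")   # ~19.9
    print(f"10-char alphanumeric: {entropy_bits(62, 10):.1f} bits")  # ~59.5

Thirteen bits is trivially searchable, which is the point: on most handsets the passcode, not the cipher, is the weak link.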


The code has to be signed, which is a significant hurdle for any regime wanting to roll their own.
posted by Mitheral at 3:52 AM on February 18, 2016 [1 favorite]


Run a diff on what? You think the FBI is going to publish this altered firmware online? Provide it to the US's enemies as a token of goodwill? I'll be the first to call the FBI a bunch of incompetent fucks, but I still don't see that happening.

Even if it only exists on an FBI intranet, our federal government's cybersecurity measures have not been what you'd call inspiring lately.
posted by Holy Zarquon's Singing Fish at 5:03 AM on February 18, 2016 [4 favorites]


OK - so, this is what I think the facts are

1. Apple can provide unlimited PIN tries to the FBI for any model iPhone, including the latest, with a hard limit of 80ms per try (equalling years for a strong 6-character PIN)
2. The FBI can only ask for this on a case-by-case basis under the AWA, and doesn't require actual sight of the code - Apple can comply through remote access to the phone, hosted on its premises.
3. The FBI has used the AWA to require Apple to provide access many times in the past
4. The AWA can be defended against by showing it has unreasonable effects on a company, the other defences being much more tightly defined.
5. Apple will fight this one all the way.

It is at least possible that the FBI expected and wanted that last one; it is also possible that the case, however it lands, will provoke political movement to put some form of encryption bypass/weakening into law. Neither directly affects the course of this case.

So: it will probably boil down to whether this class of FBI request is unreasonably damaging to Apple, which can be parlayed into whether the government's right to demand access overrides the benefits of providing unbreakable access protection to all.

Which has been coming for a while.
posted by Devonian at 5:19 AM on February 18, 2016
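
Back-of-the-envelope arithmetic for point 1 above, assuming the 80ms-per-try floor holds (a sketch with illustrative passcode classes, not Apple's figures):

    SECONDS_PER_TRY = 0.08   # the ~80 ms floor enforced by the key-derivation hardware

    def worst_case_seconds(alphabet_size: int, length: int) -> float:
        # Time to exhaust the whole passcode space at one guess per 80 ms.
        return (alphabet_size ** length) * SECONDS_PER_TRY

    for label, n, k in [("4-digit PIN", 10, 4),
                        ("6-digit PIN", 10, 6),
                        ("6-char alphanumeric", 62, 6)]:
        s = worst_case_seconds(n, k)
        print(f"{label}: {s / 86400:,.2f} days ({s / (86400 * 365.25):,.1f} years)")

Which works out to roughly 13 minutes for a 4-digit PIN, about a day for a 6-digit PIN, and ~144 years for a 6-character alphanumeric passcode - hence "equalling years for a strong 6-character PIN".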


I thought this was a fairly astute observation.

"But Rogers confirmed for the first time that the law was used successfully by the NSA after the San Bernardino terror attack to retrieve the phone records of the two perpetrators, and the agency “didn’t find any direct overseas connections.” Those records provided “metadata” — the time and duration of phone calls — but not the content of emails and text messages that the FBI is seeking by requiring Apple to unlock one of the iPhones."

"Fascinating. If this is their stated goal they have all the information they need, except iMessages. Supposedly the SB killers disabled iCloud backups 6 weeks prior to the attack, closing the iCloud backup iMessages backdoor. In which case, this is actually about iMessage encryption, not just about the FDE on the handset."
The primary goal here may be to legislate key escrow on iMessage.
posted by Xyanthilous P. Harrierstick at 6:36 AM on February 18, 2016 [5 favorites]


And make no mistake, the attempt at using a court writ is a huge overreach because it compels new work.

Yeah, and what I am unclear on is whether the amount of effort that is judged reasonable is related to the importance of the warrant. It's debatable that this one is the absolute highest priority, considering that the perps are dead and there's only a six-week gap of data. What's the minimum threshold of crime severity that requires Apple to flex its custom-build muscle?
posted by RobotVoodooPower at 6:53 AM on February 18, 2016


I'm not convinced of this, but I would like to be, so please keep talking while citing sources. (You actually have to, or your motivations are called into question.)

So … you're equating me to a bad actor with something to hide? As I used to say on USENET way, way back when, I feel sorry for whoever has the job of reading my email (and now that has to be extended to however many web sites and services on which I use this exact same user name). They'll spend a lot of time and not find anything. My motivations are that I actually work in software and live in constant fear that I'm doing something stupid and it'll be my own name on the line if private data is exposed. The idea of somebody being compelled to weaken their own platform frankly gives me the heebie-jeebies.

As for sources, I dunno, wasn't there a thing where China forced key escrow on a vendor already? If we allow our own government to compel this sort of thing, how naive does one have to be to think China won't also compel it? What about the phones of journalists who visit, say, Iran? Russia?
posted by fedward at 7:02 AM on February 18, 2016 [1 favorite]


Google CEO Sundar Pichai has chimed in on Apple's side.
posted by ChurchHatesTucker at 7:31 AM on February 18, 2016


congrats @sundarpichai. in true google fashion, google's ceo saw what apple produced and replied with an uninspired, shittier version of it *
posted by cjorgensen at 7:50 AM on February 18, 2016 [3 favorites]


Forcing companies to enable hacking could compromise users’ privacy.

Could? Could?
posted by RedOrGreen at 8:04 AM on February 18, 2016 [2 favorites]


The old 1984 Apple ad seems more apropos than ever.
posted by davidpriest.ca at 12:01 PM on February 18, 2016 [2 favorites]


The old 1984 Apple ad seems more apropos than ever.

And I fear it will end as badly as the novel.
posted by ChurchHatesTucker at 12:12 PM on February 18, 2016




Yes, The Backdoor That The FBI Is Requesting Can Work On Modern iPhones Too

It's really hard to get clear information on this, but I wonder why the solution isn't as easy as preventing the ability to install OS updates while the phone is locked or turned off. (Obviously this would be for future OS releases and not for the current situation.)

Honestly I would have assumed that this was already the case. I wish we would get more details on the functionality that allows Apple to do an OS patch on a locked phone.
posted by zixyer at 12:29 PM on February 18, 2016


And, if you watch old Noir crime movies, people in LA say "San Berdoo" a whole lot. Apparently that came from all the Midwesterners showing up during the Great Depression. Although, I don't know if I've heard someone say that in real life.

I heard somebody say it in Frisco once.
posted by Chitownfats at 1:10 PM on February 18, 2016


I wonder why the solution isn't as easy as preventing the ability to install OS updates while the phone is locked or turned off.

I'd think so, but I'm not sure. The whole "write this software for us" thing makes real security exceptionally difficult.
posted by ChurchHatesTucker at 1:44 PM on February 18, 2016


Ars Technica has a nice (IMO and as I understand them) explanation of the details.
posted by MikeKD at 3:14 PM on February 18, 2016 [1 favorite]


It's OK, John McAfee is on this.
posted by ChurchHatesTucker at 4:12 PM on February 18, 2016


He's going to use "social engineering" which means he's either going to get a dead man to talk or he's going to go to Apple begging on his hands and knees going "PLEEEEEEEEEEEASE UNLOCK THIS PHONE FOR ME! I'LL BE YOUR BEST FRIEND!".
posted by Talez at 4:20 PM on February 18, 2016 [1 favorite]




The obvious solution would be to use the ImageIO vulnerability (patched by an update after the San Bernardino shooting) and MMS to gain access to the device.

Suppose the government wanted to unlock your front door. They could call a locksmith or knock the door down. Yet in this case they've gone to the lockmaker and demanded a master key that could unlock every door in the country.
posted by humanfont at 5:44 PM on February 18, 2016 [1 favorite]




I can't see why the password-timeout can't be written in silicon and immune to updating. The fact that Apple designed it to be updateable means that they always intended iPhones to have a backdoor.
posted by Joe in Australia at 6:48 PM on February 18, 2016


No it does not. The legal landscape is shifting under their feet.
posted by ChurchHatesTucker at 6:54 PM on February 18, 2016


Apple knows that, if they provide what the FBI is asking for, that special OS will be the last one that Apple ever makes.

Nobody in their right mind will ever believe that their products are protected anymore, and sales all over the globe would plummet.
posted by kadmilos at 6:58 PM on February 18, 2016 [2 favorites]


humanfont: " Yet in this case they've gone to the lock maker and demanded a master key that could unlock every door in the country."

No they haven't. Apple can comply and lock the update to a specific device. Because the update is required to be signed the FBI isn't going to be able to adapt it to any other device.
posted by Mitheral at 7:49 PM on February 18, 2016 [1 favorite]
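
A rough sketch of the device-locking scheme described above, using Python's third-party cryptography package as a stand-in for Apple's actual signing chain (Ed25519, the blob layout, and the device IDs are all assumptions for illustration, not how iOS images really work):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    signing_key = Ed25519PrivateKey.generate()   # stand-in for Apple's private key
    verify_key = signing_key.public_key()        # the half burned into the boot ROM

    def build_signed_image(firmware: bytes, device_id: bytes) -> bytes:
        # Embed the target device's ID, then sign ID + firmware together;
        # editing the ID afterward invalidates the signature.
        blob = device_id + b"|" + firmware
        return signing_key.sign(blob) + blob     # Ed25519 signatures are 64 bytes

    def boot_rom_accepts(image: bytes, my_device_id: bytes) -> bool:
        sig, blob = image[:64], image[64:]
        try:
            verify_key.verify(sig, blob)         # raises on any tampering
        except InvalidSignature:
            return False
        return blob.split(b"|", 1)[0] == my_device_id

    image = build_signed_image(b"firmware with PIN limits disabled", b"SUBJECT-DEVICE")
    assert boot_rom_accepts(image, b"SUBJECT-DEVICE")
    assert not boot_rom_accepts(image, b"ANY-OTHER-PHONE")

On this model the crypto does hold: changing the embedded ID breaks the signature. fedward's rebuttal further down is that, historically, the weak point has been the verifier, not the signature math.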


I refer you to the Apple Letter describing the FBI's request:
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
posted by humanfont at 8:32 PM on February 18, 2016


What's up with the NYT coverage?

ALL of this, on how Apple vs FBI impacts China, vanished from the story w/ no note. By no stretch is this usual.

China is watching the dispute closely. Analysts say the Chinese government does take cues from the United States when it comes to encryption regulations, and that it would most likely demand that multinational companies provide accommodations similar to those in the United States.

Last year, Beijing backed off several proposals that would have mandated that foreign firms provide encryption keys for devices sold in China, after heavy pressure from foreign trade groups. …

“… a push from American law enforcement agencies to unlock iPhones would embolden Beijing to demand the same.”

posted by a lungful of dragon at 8:40 PM on February 18, 2016 [2 favorites]


That doesn't conflict with any particular backdoor update being locked to a particular device. Apple's press release indicates they think that if they cave on this case the FBI and other alphabet soup agencies will keep returning to the well to get versions for different handsets.
posted by Mitheral at 8:43 PM on February 18, 2016


I refer you to the Apple Letter describing the FBI's request:
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.


Only if Apple fucks it up; I refer to the Judge's order, namely, page 2, lines 20-22:
The SIF [Software Image File] will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE.
And the back door already exists, either through Apple's short-sightedness, incompetence, or previous agreements with the government. But I salute their attempt to resurrect the Reality Distortion Field and claim contempt of court is in the users' best interests.
posted by MikeKD at 10:21 PM on February 18, 2016


I should qualify that; potential contempt, depending on how appeals go (which I'm fine with them pursuing).
posted by MikeKD at 10:24 PM on February 18, 2016


Two words, "Fishing Expedition". Apple's lawyers should just point that out and ask, "What *other* evidence supports the hypothesis that there is actionable intelligence in this phone?", because IIRC all the ISIS/ISIL connections never really panned out.
posted by mikelieman at 11:21 PM on February 18, 2016 [5 favorites]


IANAL, and especially not a US lawyer, but I think an argument about "actionable intelligence" is misleading here. Intelligence is part of the government's executive function, not its judicial authority. The All Writs Act is there to allow the judiciary to operate, by (e.g.) compelling the appearance of witnesses and the surrender of documents. Things that would allow a prosecutor to build a presently-undefined case are generally called fishing expeditions, and are not the sort of thing that subpoenas can be properly used for; I don't suppose the All Writs Act is meant to extend courts' power in that regard.

Also, the Act only allows writs that are "agreeable to the usages and principles of law". The government is not trying to get Apple to surrender evidence; it's demanding that Apple work on its behalf. That's not the sort of thing that can be done with traditional writs, and I can't see how it could be "agreeable to the usages and principles of law". It may not even be Constitutional.
posted by Joe in Australia at 12:03 AM on February 19, 2016 [3 favorites]


What exactly IS the case against the decedents?
posted by mikelieman at 3:14 AM on February 19, 2016


"What *other* evidence supports the hypothesis that there is actionable intelligence in this phone?"

* Apple asked the government to file the warrant under seal, but they decided to make it public, so now the IMSI + SN is out there for the world to see (probably no biggie, but perhaps foreign governments with access to carrier data could do something nasty with this info)

* It's his work phone. He destroyed his personal phone.

This is evidence that this is more about political theater than an investigation.
posted by RobotVoodooPower at 4:34 AM on February 19, 2016


Mitheral: No they haven't. Apple can comply and lock the update to a specific device. Because the update is required to be signed the FBI isn't going to be able to adapt it to any other device.

Spoken like somebody who has never used a hex editor to crack into software or jailbroken a phone. This claim (and it's coming from the FBI, not just you) is somewhere between naïve, uninformed, and willfully misleading. There's a lot less locking than the FBI claims (and you seem to think) there is.

The first half of the "lock" is the serial number. In the abstract a device serial number can easily be identified by its length and composition, without even considering that this particular device is evidence in a court case and thus its serial number might be rather easily obtained by someone who wanted to use a known-text attack against the compiled binary (not just looking for something that has the same length and composition as a serial number, but the actual serial number of the device in question).

The second half of the "lock" is the software signature provided by Apple. When you download software, the bulk of it1 isn't encrypted, it's just signed. The difference is that while the signature should be unique, older versions of iOS were vulnerable to a variety of attacks on either the signature itself or the mechanism by which the signature is authenticated. This was, in fact, how a number of jailbreaks worked. Based on the particular software version and the type of attack it was sometimes possible to overwrite specific parts of the application bundle without causing the signature check to fail. Other jailbreaks effected a man-in-the-middle attack on the verification process. This security has been hardened in subsequent releases of iOS, but it is naïve to assume that it's 100% bug-free2.

1. The application bundle does contain an encrypted firmware package and thus it's difficult to edit/replace the device's boot loader, which is delivered inside that encrypted payload. Some jailbreaks on older versions of iOS worked by bypassing that encrypted firmware entirely and using the boot loader from a previous software version.

2. The reason that matters is because bad actors and security researchers tend to know about these bugs before the public does. A vulnerability that becomes known to the public only once it's in the wild, in the hands of a bad actor, is known as a zero-day exploit. Some exploits are potentially so useful people are willing to pay for them.


One might argue that since there's an encrypted payload that installs a new boot loader on the device, you could just put the serial number check inside that encrypted payload, so it can't be easily located with a hex editor. The problem with that is that you've then specified software that installs on any device and just isn't supposed to execute if it's not on a specific one. That's bad. So then there have to be two integrity checks, one in the unencrypted payload (which a bad actor could attack with a hex editor) and one in the encrypted payload contained within, and if you think the first one isn't the first part of a solution to hacking into the second one, I've got a secondhand iPhone to sell you and it's totally not been hacked, wink wink.

And plus by saying "Apple can lock it" you're putting a burden on Apple to create new software with a new type of lock and just expect it to be adequately hardened against attack because … why? They have to create a new thing with its own protections, just this once, and somehow both (a) it will be adequately protected, and (b) it's reasonable to ask them to do what amounts to heroic effort that goes against their own best interests, and that's not an overreach of the All Writs Act? For that matter, this somehow wouldn't open the floodgates?

And don't get me started on how wise it is to expect the FBI to protect the new weaponized software Apple wrote for them (hint: completely unwise).
posted by fedward at 7:04 AM on February 19, 2016 [6 favorites]
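
The known-text attack fedward describes is easy to picture. If the target's serial number sits unencrypted in a dumped image, finding the bytes to patch is a single call; the serials and image bytes below are made up for the sketch:

    KNOWN_SERIAL = b"SUBJECT-SERIAL-01"   # public record once the warrant was unsealed
    NEW_SERIAL   = b"ATTACKER-TARGET-9"   # same length, so no offsets shift

    # stand-in for a dumped, unencrypted firmware image
    firmware = bytearray(b"\x7fHEADER" + KNOWN_SERIAL + b"\x00machine code...")

    offset = firmware.find(KNOWN_SERIAL)
    if offset != -1:
        # the hex-editor step: splice another device's serial into place
        firmware[offset:offset + len(KNOWN_SERIAL)] = NEW_SERIAL

    assert NEW_SERIAL in firmware

At which point everything rides on the signature verification described above - exactly the code that jailbreaks have historically attacked.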


To tie those two last posts together a bit, the fact that the unique identifiers for this phone are now public record means that attackers would know exactly what to look for in the firmware if they want to adapt it to work on other devices, so it's that much more difficult to lock down.
posted by Holy Zarquon's Singing Fish at 7:38 AM on February 19, 2016




I've been wondering what would happen if Apple gets ordered to do this, and Tim Cook brings down his software engineering team, and they take a stance and say, "No." What happens then? I mean, the way I see this is you are asking Apple to create code that they believe goes against their interests. So what if a programmer says "Fuck that, I'm not going to work on that project."?

Additional wonderings:

Let's say Apple agrees, and they give it to their best programmers. Can those guys knock something out in an hour? A day? A month? What if they brick the device while attempting?

What about the next phone? Sure, Apple patches this exploit, so what about the next time the FBI wants in a phone? The "please break into this phone for me" gets to be that much more difficult, so does Apple dedicate a team to cracking their own security features?

Why Apple at all here? Why not force Google or Microsoft to break into the phone? Because Apple's specific knowledge of the device makes them uniquely qualified, sure, but they are qualified to do all kinds of things. So why not then task them to write the next healthcare.gov system? If Apple is suddenly doing the government's bidding, why stop with this one phone?
posted by cjorgensen at 7:39 AM on February 19, 2016 [1 favorite]




fedward: ""Apple can lock it" you're putting a burden on Apple"

Of course, they have their private key and so are uniquely qualified to provide a solution.

fedward: " to create new software with a new type of lock and just expect it to be adequately hardened against attack because … why?"

Because the government thinks it's the law.

fedward: "don't get me started on how wise it is to expect the FBI to protect the new weaponized software Apple wrote for them "

Apple doesn't have to trust the FBI. The FBI is quite willing to give Apple access to the phone at Apple's offices for Apple to install the updated software and run the PIN guesser, as long as they then provide the encrypted data.

I'm not unsympathetic to Apple's view and pretty much agree with what Bruce has written (though he is wrong to state the FBI doesn't have the consent of the owner). I'm less sympathetic to the idea that Apple shouldn't have to follow the law (however stupid or wrong that law may be) because they aren't technically competent enough to do so. Bruce writes that this fight is about the policy of allowing access by the FBI to devices, and I agree. But the way to fight that is by fighting the policy, not by making a bunch of claims that Apple is incompetent to provide the services the policy would require or that complying with the law will be difficult.

I think that fight has already been lost even if Apple prevails in this case. If they lose, the US government will merely write a new anti-terror law making this sort of co-operation explicit rather than trying to apply the more general tool. The public just doesn't care.
posted by Mitheral at 8:12 AM on February 19, 2016 [1 favorite]


> If they lose, the US government will merely write a new anti-terror law making this sort of co-operation explicit rather than trying to apply the more general tool. The public just doesn't care.

I fear that you may be right - on both counts - but that doesn't mean Apple should roll over and play dead in advance.

Let's have this right out in the open: that the Republican House and Senate and the Democratic President are all willing to sign on to a law that compels companies to work against their own interest and violate their privacy commitments to their users if the FBI invokes dead terrorists.

Then let's see what happens the first time the Chinese and French and Saudi and Iranian governments require Apple to decrypt their favorite target phones.

I don't think this will work, although our current Congress may be too short-sighted to understand that.
posted by RedOrGreen at 9:04 AM on February 19, 2016 [1 favorite]


Because the government thinks it's the law.

My point is that you can't just hand wave over a significant technical challenge based on something that's arguably far beyond the intent of the law. And hand waving is exactly what you're doing here.

If we start from the assumption that these four tests are accurately described, Apple's response probably rests on what constitutes an unreasonable burden. What Apple is being asked to do constitutes an unreasonable burden in that (1) it will pull people off doing their actual jobs for some period of time, (2) the mere existence of such a solution creates a new security risk for Apple that wouldn't have existed before the compulsion, and (3) it requires Apple to undermine its own product, potentially causing a loss of trust and therefore a loss of sales. Each of those is significant, the first and second are easily provable, and the third can probably be documented adequately with some sales data under seal.

None of that even requires the slippery slope argument that once the creation of such a tool has been proven to be possible (and, in fact, the work has been done), other governments, perhaps at odds with our own, will compel Apple to hand it over to them in order to do business in those countries. And while it sounds like a slippery slope argument, from a software security standpoint, any sort of exploit/flaw/backdoor is equivalent to inviting somebody else in. With a neon sign. If you have a way to weaken security, you have to assume somebody else can either figure out how it works or just flat out steal it from you (and the more valuable it is, the harder they'll work to steal it).

I think it's actually arguable whether these are even extraordinary circumstances, although what with the War On Terror we've managed to elevate practically everything to "extraordinary" (but if "extraordinary" is the new ordinary, then what's extra-extraordinary?). Plus it was the guy's work phone, and if we know he was smart enough to destroy his home phone, are the circumstances really extraordinary enough to warrant the compulsion on a phone he may not have done anything but his actual job on? This seems like a really poor case for the FBI to waste the opportunity on, because what if they lose? Anyway, on that front I'm glad I don't have to be the one to argue this.
posted by fedward at 9:08 AM on February 19, 2016


Then let's see what happens the first time the Chinese and French and Saudi and Iranian governments require Apple to decrypt their favorite target phones.

The NYT reported today that until last summer, Apple routinely gave this info to the government--in fact over 3,000 times in the first part of 2015 alone.

And none of this slippery slope has happened. And even if Apple wins this, how does that affect the other countries? China could demand this tomorrow as the price of doing business. What is Apple going to do? Refuse and lose access to the biggest market in the world? No. Apple does not have that power. It isn't a charity. It's a money-making operation.
posted by Ironmouth at 9:40 AM on February 19, 2016


Regarding comments that Apple couldn't do this for newer iPhones because of the secure enclave - Apple have commented otherwise. From the Bruce Schneier article above:

An earlier version of this post incorrectly stated that the vulnerability the FBI wants Apple to exploit has been fixed in later models of the iPhone. In fact, according to Apple, that is not the case: There are some differences in the details of the attack, but all of its phones would be vulnerable to having their software updated in this manner.

I think you could fix it for this case (confiscated phone) as others have suggested, by not allowing updates when the phone is locked. (You can still support a factory reset; it just has to render the data inaccessible, e.g. by resetting the encryption keys in the secure enclave.) OTOH that wouldn't stop Apple compromising a device which is still in active use.
posted by sourcejedi at 9:47 AM on February 19, 2016


Key-Based Device Unlocking (Question/Idea re Apple Case): "what *should* the security design be for unlocking personal devices such as smartphones, but also say smart cars, fitness trackers, etc."
posted by kliuless at 10:23 AM on February 19, 2016


What would happen if Apple gets ordered to do this, and Tim Cook brings down his software engineering team, and they take a stance and say, "No." What happens then?

If Mr. Cook is threatened with solitary confinement unless and until Apple complies, would he escape to Moscow as Mr. Snowden did?
posted by Sir Rinse at 11:08 AM on February 19, 2016


I think you could fix it for this case (confiscated phone) as others have suggested, by not allowing updates when the phone is locked. (You can still support a factory reset, it just has to render the data inaccessible. E.g. resetting the encryption keys in the the secure enclave). OTOH that wouldn't stop Apple compromising a device which is still in active use.

The Ars article linked earlier by MikeKD explains the patching in more detail: it's actually DFU mode that's being used, which is a feature that allows recovery of a bricked device by re-imaging the firmware. (Most phones and other embedded devices support such a feature; it's not unusual.)

They kind of handwave the fact that requiring the PIN for a locked device would add complexity to a feature that needs to be as simple as possible, being a last-resort method for recovering a potentially bricked device. But I think there's a very valid security argument to be made that this is necessary additional complexity. The security of millions of phones is solely dependent on a single private key remaining secret - a key whose use can (as in this case) be legally compelled, and which could be stolen by hackers or a malicious Apple employee.
posted by zixyer at 11:37 AM on February 19, 2016


The simplest way to secure the DFU mode would be for it to automatically wipe user data before installing the new firmware, no?
posted by Holy Zarquon's Singing Fish at 11:42 AM on February 19, 2016


At any rate, even if it's too much of a burden for DFU mode to require the PIN, there's no reason that DFU mode needs to preserve the filesystem keys. I don't think anyone would be terribly upset if reimaging an encrypted phone always meant wiping the memory; it's a last-resort feature intended for recovering a broken phone, not something that most users would ever typically use.
posted by zixyer at 11:43 AM on February 19, 2016 [2 favorites]
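
The policy proposed in the last few comments, sketched as runnable pseudocode (the Device and SecureEnclave objects are hypothetical stand-ins, not Apple's actual DFU implementation):

    from dataclasses import dataclass, field

    @dataclass
    class SecureEnclave:
        filesystem_keys: bytes = b"per-file key material"
        def erase_filesystem_keys(self) -> None:
            # Without these keys the encrypted filesystem is permanently
            # unreadable, even to whatever OS gets flashed next.
            self.filesystem_keys = b""

    @dataclass
    class Device:
        secure_enclave: SecureEnclave = field(default_factory=SecureEnclave)
        unlocked_with_passcode: bool = False
        firmware: bytes = b"stock iOS"

    def dfu_restore(device: Device, new_firmware: bytes) -> None:
        # Hypothetical rule: re-imaging always succeeds (it still rescues
        # bricked phones) but never preserves user data unless the owner
        # proved knowledge of the passcode first.
        if not device.unlocked_with_passcode:
            device.secure_enclave.erase_filesystem_keys()
        device.firmware = new_firmware

Under that rule, a forced re-image of a locked phone could only ever produce a blank phone.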


Key-Based Device Unlocking

That's Key Escrow. We fought that in the nineties. Google "Clipper Chip."

If Mr. Cook is threatened with solitary confinement unless and until Apple complies, would he escape to Moscow as Mr. Snowden did?

Snowden didn't "escape to Moscow." He was there to make a connecting flight when his passport was revoked. That's the level of competence of the government that wants to hold the keys to your devices.
posted by ChurchHatesTucker at 12:08 PM on February 19, 2016 [4 favorites]


In their public statements Apple has said that they do not believe that they can implement the feature so narrowly.

Let us assume that Apple is wrong and their brilliant software developers can lock it to one phone. Now the precedent has been set. The NSA, FBI, local law enforcement, and other governments will piggyback on this with their own lists of devices they want unlocked. The end result is the same.
posted by humanfont at 12:11 PM on February 19, 2016 [1 favorite]


That's Key Escrow. We fought that in the nineties. Google "Clipper Chip."

And when somebody pointed that out to him in the comments he replied that this case is different. Which is the problem with security concepts like Key Escrow. You, me, and the teapot know that problems with Key Escrow don't go away because you change some other detail, but this is a difficult concept for laypeople to grasp.

And frankly this is an ongoing problem in discussions like the one we're having here. Security researchers and software developers may understand "this entire class of solutions is broken" but to the average person we're speaking in tongues. "You can lock this to one device." No, not really. "You could escrow keys." That's a hindrance, not a barrier, and it provides a single source for a mass exposure. "You guys are smart, you can figure it out." No, we're smart, which is how we know this is a hard enough problem that trying to solve it is foolish.
posted by fedward at 12:15 PM on February 19, 2016 [4 favorites]






DOJ's brief says they are specifically interested in communications between the attackers and their victims (who were their co-workers), which seems to me to reinforce that they're using this as a wedge issue rather than for any real intelligence-gathering purpose.

Also, turning to the legalese, they want the court to adopt the view that only operating time and expense can qualify as "undue burden" that would invalidate an otherwise valid writ; that the order compels activity (expressive or otherwise) to which the recipient is strenuously opposed is immaterial, or at least not material enough to outweigh the government's interest in searching a phone.
posted by Holy Zarquon's Singing Fish at 12:33 PM on February 19, 2016


Apple, FBI, and the Burden of Forensic Methodology

tl;dr: If Apple is being asked to develop a forensics tool rather than perform a black-box service, and for that tool to be useful in a court case, then that instrument must go through an arduous validation process which virtually guarantees it must be disseminated to a variety of parties. So the DOJ statement that Apple can maintain control of its software is disingenuous.
posted by RobotVoodooPower at 12:34 PM on February 19, 2016 [4 favorites]


I'm sure there are smarter ways to use it, and I've never tried on an iPhone, just Android phones and a few other ARM devices, but in my limited experience, DFU erases *everything*.

honestly, the people who have commented about desoldering or otherwise cloning the actual flash chips have a point - that seems like a safer method of attack, and it's what I would do if it was me attacking the device. you could hammer on that for as long as you want and eventually get through - it seems more a case of US law enforcement being 'duh we too lazy/cheap, you do it for us' - on preview, several mentions of "undue burden" become interesting here.
posted by dorian at 12:36 PM on February 19, 2016 [1 favorite]


DOJ's brief says they are specifically interested in communications between the attackers and their victims (who were their co-workers)

If only they could obtain the other phones...
posted by ChurchHatesTucker at 12:36 PM on February 19, 2016 [6 favorites]


RobotVoodooPower: holy crap. I trust Zdziarski on this subject, so after reading that I have to say that this problem is even worse than I thought. And I already thought it was bad.
posted by fedward at 12:42 PM on February 19, 2016 [1 favorite]


> If Mr. Cook is threatened with solitary confinement unless and until Apple complies, would he escape to Moscow as Mr. Snowden did?

You misread what I wrote. What if Tim Cook gathers the team to do as the government asks and those individuals refuse? Sure, you can put them in jail for contempt (but contempt can't be used punitively), but what then? Apple can't be expected to have to hire people to do this just because the government says so.

> China could demand this tomorrow as the price of doing business. What is Apple going to do? Refuse and lose access to the biggest market in the world? No. Apple does not have that power. It isn't a charity. It's a money-making operation.

And I would suggest that compromising the security of a device will hinder their ability to make money. When people buy Google or other products they expect a cheaper price because they know they are part of the product being sold to advertisers. When people buy Apple products there's a different level of trust there. Apple is not selling their customers.

Add in that Apple is trying to break into the financial sector (unless you think Apple Pay is a one-time foray). If people don't think they are secure they won't trust Apple.

It is a money-making operation, but I think you are not thinking through how Apple is positioning itself for sales.
posted by cjorgensen at 12:45 PM on February 19, 2016 [1 favorite]


"Apple [would have to] invest resources in engineers who are intimately familiar with not only their code, but also why they chose the methodology they did as their best practices. If certain challenges don’t end well, future versions of the instrument may end up needing to incorporate changes at the request of FBI."

oh, hey, great, guys, the government is now driving your development-car. good luck! also make sure to give them access to your source-control and your defect-tracker.
posted by dorian at 12:51 PM on February 19, 2016 [1 favorite]


"[BUG 328011] password cracker fails when username contains any kind of fruit"
posted by dorian at 12:58 PM on February 19, 2016


When Apple is literally selling security, asking them to break the product they are selling isn't reasonable.
Secure by design.

iOS devices are secure right out of the box and deliver a great user experience. This is possible because Apple makes the hardware, software, and services that power all iOS devices — ensuring every element is built with security in mind.
cite
So a question for my fellow SysAdmins: if this device had been enrolled in Mobile Device Management, could the government then just decrypt it themselves? I mean, let's take a scenario where you have a sales rep. You give her a phone, she enters data and creates a bunch of contacts; when she parts with the company, can you access that data on a company-owned phone? Or can you only nuke it?

Are there ways to force backups?

I'm not really finding the answers.

iOS deployment guide.

Like on the PC side of things I've generally worked with PCs that are both backed up and snapshotted. It would take a court order, but I could access the user data if I were forced.

In short, I am asking is there any way, had the government set up this phone as a "work product" phone, that they could have had the data they wanted?
posted by cjorgensen at 1:30 PM on February 19, 2016


Commands
When a device is managed, an MDM server can perform a wide variety of administrative commands, including changing configuration settings automatically without user interaction, locking or wiping a device remotely, or clearing the passcode lock so users can reset forgotten passwords.

posted by Holy Zarquon's Singing Fish at 1:42 PM on February 19, 2016 [1 favorite]
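
For the sysadmin question above: in Apple's MDM protocol that's the ClearPasscode command, which works only because the server escrowed an UnlockToken when the device enrolled. A sketch of the command plist built with Python's plistlib (the token bytes are fake; a real token is an opaque blob captured at enrollment):

    import plistlib
    import uuid

    # Hypothetical server-side construction of the command.
    command = {
        "CommandUUID": str(uuid.uuid4()),
        "Command": {
            "RequestType": "ClearPasscode",
            # Escrowed by the MDM server at enrollment; without it the
            # device refuses the command.
            "UnlockToken": b"fake-opaque-token-from-enrollment",
        },
    }
    print(plistlib.dumps(command).decode())

Reports at the time said the county had bought MDM software but never installed it on this phone, so there was no escrowed token to use.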


When Apple is literally selling security, asking them to break the product they are selling isn't reasonable.
Secure by design.

iOS devices are secure right out of the box and deliver a great user experience. This is possible because Apple makes the hardware, software, and services that power all iOS devices — ensuring every element is built with security in mind.


And I think this is why I'm leaning against Apple's actions--specifically1, the press release and its attempt to misdirect (IMO) away from this specific situation (e.g., the scope of the All Writs Act, etc.) and make this about the encryption war: their system is already insecure if a new iOS update can bypass the PIN (to torture the analogy further, the specs for the master key exist and the FBI "just" wants Apple to send them to a 3-D printer--it is an analogy; there are some issues, hence "just"). It feels to me that Apple is trying to stir up software developers, cryptologists, and civil libertarians to distract from the check their marketing people wrote that their code can't cash.

Fight the use of the AWA in the court and choose your battle carefully to avoid bad cases leading to bad case law. Don't conflate this with something else somewhat related (user privacy, surveillance overreach, etc.) to stir up support for your case, because you might end up with a law stating use/knowledge of strong encryption is prima facie proof of guilt.

1. And I guess, getting people ok with the idea that a corporation can ignore the law when it suits it versus people lobbying for policy changes and good laws.
posted by MikeKD at 2:01 PM on February 19, 2016


the grugq has a good summary of the infosec issues involved.
  • Farook and Malik were amateur terrorists who conducted an attack more reminiscent of American “going postal” workplace violence than an operation directed by a terrorist organisation.
  • Farook destroyed his personal phone. The FBI wants access to his work phone.
  • FBI already has huge amounts of data from the telco and Apple. This is almost certainly enough to rule out clear connection with any other terrorists.
  • FBI is playing politics, very cynically and very adroitly.
  • FBI already has a massive amounts of data, all of which indicates that Farook and Malik were not in contact with a foreign terrorist organisation, nor were they in contact with any other unknown terrorists.
  • Even if, despite all evidence to the contrary, Farook and Malik were somehow in invisible traceless contact with an ISIS handler, that handler would not have revealed information about other cells, because that would violate the most basic tenet of security — need to know.
posted by zamboni at 2:39 PM on February 19, 2016 [4 favorites]


Welp. This is gonna be fun. DOJ Escalates Battle With Apple Over San Bernardino Shooter's Phone

What's interesting is that politicians are dismissing the technical issues as surmountable, when all the knowledge we have suggests they aren't, and they also dismiss the ease with which criminals will make use of the same backdoors.

Either we have unregulated encryption, or we have legalese that adds backdoors that all can use. Current mathematics doesn't permit a "middle" or "third way" approach. Or if it does, it really seems like it should be easy and straightforward for politicians to make this magic technology available to Apple and other tech companies.
posted by a lungful of dragon at 2:40 PM on February 19, 2016 [2 favorites]


In short, I am asking is there any way, had the government set up this phone as a "work product" phone, that they could have had the data they wanted?

Doubtful, since there is news coming out today that someone at the SB Dept of Health reset the iCloud password, which invalidated one method that Apple previously tried in order to gather the data -- hooking it to a known wifi network and letting iCloud do an auto backup. (No word on whether this was volunteer forensics or an FBI-directed fuckup.)
posted by RobotVoodooPower at 3:11 PM on February 19, 2016 [1 favorite]


The New York Times report applies to devices running iOS 7 and earlier. Changes in iOS 8/9, summarized in this article, locked Apple out of these devices.

After the revelations from Snowden and others, why are any of you willing to give the government the benefit of the doubt over Apple? Apple's PR is only in response to the government's own PR. The same government that claimed it was only engaged in limited domestic surveillance but was actually collecting everyone's phone records. Now that government is pushing the story that this is just about unlocking one phone and is totally limited.
posted by humanfont at 4:01 PM on February 19, 2016




Securosis: Do We Have a Right to Security?
posted by ChurchHatesTucker at 4:38 PM on February 19, 2016


Slate also has put together an excellent overview of the FBI's legal strategy here. This isn't the first time they've asked a judge to use the All Writs Act to get Apple to put in a backdoor.
posted by humanfont at 6:41 PM on February 19, 2016


[…] to distract from the check their marketing people wrote that their code can't cash.

Set aside for a second that I don't think this is just a matter of Apple popping down and turning a key: I would suggest if you can't break into the phone without forcing Apple to sideload all new code on the device, well, that's fairly secure. So the marketing seems fairly accurate to me. Now, if I advertise something as unpickable, and you buy this, then I don't think it's really fair for you to come back and demand I pick the unpickable. Seriously, if the government wants insecure phones for their employees they should buy insecure phones. Don't buy the one that claims to be secure, then argue with the makers to tell them they need to undo one of the selling features.

Next, it now sounds like the reason the government can't get into the phone is that the government is fucking incompetent. One of the inept law enforcement officers reset the guy's password almost immediately. Apple tried to help them. At some point you just have to wash your hands of idiots and say, "Look man, you need to know how to do your job before you can ask me to do mine."
posted by cjorgensen at 5:21 AM on February 20, 2016 [5 favorites]




John McAfee is running for President as an independent.
posted by humanfont at 7:11 AM on February 20, 2016


EFF: A Technical Perspective on the Apple iPhone Case
Summary

EFF supports Apple's stand against creating special software to crack their own devices. As the FBI's motion concedes, the All Writs Act requires that the technical assistance requested not be "unduly burdensome," but as outlined above creating this software would indeed be burdensome, risky, and go against modern security engineering practices.
posted by ChurchHatesTucker at 7:28 AM on February 20, 2016


John McAfee is running for President as an independent.

Not sure McAfee has any better understanding of the issue.
posted by ChurchHatesTucker at 7:31 AM on February 20, 2016


I can't wait to see his social engineering technique for getting this PIN when the only people who knew it are dead.
posted by Holy Zarquon's Singing Fish at 8:23 AM on February 20, 2016


John McAfee is running for President as an independent.

If there's any candidate who can bring the Meth Nazi and Deranged Psychokiller demographics together, it's McAfee. But he's still not going to be able to crack an iPhone, short of dropping one on the floor.
posted by a lungful of dragon at 1:50 PM on February 20, 2016


But he can probably eat a shoe. He only promised one or the other.
posted by cjorgensen at 2:34 PM on February 20, 2016 [3 favorites]


I would suggest if you can't break into the phone without forcing Apple to sideload all new code on the device, well, that's fairly secure.

My point is that if it's possible to sideload software to bypass a security feature, it's not "fairly secure"; in fact, it's rather poorly secured. I would argue that a secure system would not allow loading of software that bypasses a security feature (or, if needed, would wipe the user data before installing the software). And it looks like the only thing now keeping this (a malicious version of iOS that can bypass PIN lockout and that isn't tied to a specific device) from happening is the security practices surrounding Apple's signing key (since the source code is already--or soon to be--out), and that doesn't inspire confidence in me.

Now, if I advertise something as unpickable, and you buy this, then I don't think it's really fair for you to come back and demand I pick the unpickable.

You're begging the question; the existence of the ability to sideload a malicious OS means it is not unpickable. And in fact, this is a perfect example of my complaint: conflating the marketing with the reality.
posted by MikeKD at 4:23 PM on February 20, 2016
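
To make the two disputed protections concrete, here is a toy model (Python, purely illustrative, not Apple's implementation, and the delay schedule is approximate) of what the passcode lockout does: delays escalate after repeated failures, and with the optional wipe setting enabled the keys are discarded after ten misses.

    # Toy model of an iOS-style passcode lockout. Illustrative only;
    # not Apple's code, and the delay schedule is approximate.
    ESCALATING_DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600}  # failures -> seconds

    class LockedPhone:
        WIPE_AFTER = 10  # optional setting: discard keys after 10 misses

        def __init__(self, passcode, wipe_enabled=True):
            self._passcode = passcode
            self._wipe_enabled = wipe_enabled
            self._failures = 0
            self.wiped = False
            self.required_wait = 0  # seconds before the next try is accepted

        def try_unlock(self, guess):
            if self.wiped:
                raise RuntimeError("keys destroyed; data unrecoverable")
            if guess == self._passcode:
                self._failures = 0
                return True
            self._failures += 1
            if self._wipe_enabled and self._failures >= self.WIPE_AFTER:
                self.wiped = True
            self.required_wait = ESCALATING_DELAYS.get(self._failures, 0)
            return False

In these terms, the order asks Apple to ship a signed build where wipe_enabled is forced off and required_wait is always zero, which is why the change has to come from whoever holds the signing key and nowhere else.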


San Bernardino is blaming the FBI for the password reset.

And it looks like the only thing now keeping this (a malicious version of iOS that can bypass PIN lockout and that isn't tied to a specific device) from happening is the security practices surrounding Apple's signing key

Yes, the security depends upon security.
posted by ChurchHatesTucker at 4:54 PM on February 20, 2016 [1 favorite]


Yes, the security depends upon security.

Secure systems should rely on more than a single point of failure.
posted by MikeKD at 5:15 PM on February 20, 2016
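
For what it's worth, here is a minimal sketch of the point being argued, assuming a vendor-signed boot chain (Python, using the pyca/cryptography package; the scheme is deliberately simplified, and Apple's real chain of trust is more elaborate). Every device trusts whatever the one key blesses, so custody of that key is the whole game:

    # Minimal sketch: one vendor key gates what every device will boot.
    # Simplified; Apple's real chain of trust is more elaborate.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )

    vendor_key = Ed25519PrivateKey.generate()  # lives in the vendor's vault
    TRUST_ROOT = vendor_key.public_key()       # baked into every device

    def device_will_boot(image: bytes, signature: bytes) -> bool:
        # The entire gate is one signature check against one public key.
        try:
            TRUST_ROOT.verify(signature, image)
            return True
        except InvalidSignature:
            return False

    stock = b"stock OS image"
    hostile = b"stock OS image, minus the lockout and wipe"

    # Whoever holds (or compels the holder of) vendor_key blesses either:
    assert device_will_boot(stock, vendor_key.sign(stock))
    assert device_will_boot(hostile, vendor_key.sign(hostile))

One signature check, one key: that is the single point of failure in question.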


You mean like requiring Apple software engineers to write custom signed code that doesn't even exist in order to break into a phone?

My point on "begging the question" was that it doesn't even matter if the phone is secure or not, it is advertised as such, and if the government bought it, they really aren't in a good position to then say, "Hey we know you said this was secure when we bought it, but now we want you to open it up." Mostly the answer is, if you think it can be done, then do it yourself.
posted by cjorgensen at 5:42 PM on February 20, 2016 [1 favorite]


Speaking of China
posted by ChurchHatesTucker at 5:47 PM on February 20, 2016



My point on "begging the question" was that it doesn't even matter if the phone is secure or not, it is advertised as such, and if the government bought it, they really aren't in a good position to then say, "Hey we know you said this was secure when we bought it, but now we want you to open it up." Mostly the answer is, if you think it can be done, then do it yourself.


The FBI's demands here are so egregious that I think it's part of a plan to be able to retreat to demands that are closer to sane. I would not be surprised if they already have source code for a firmware mod, and will just demand that Apple compile and sign it.
posted by ocschwar at 6:50 PM on February 20, 2016 [2 favorites]




In other news, there's been another shooting: Uber Driver Allegedly Responsible For Kalamazoo Shooting Spree That Killed 6
posted by homunculus at 3:45 PM on February 21, 2016


Doesn't count. That guy was white…not a terrorist.

And yes, that's a shitty way to make a point, but they already declared that one "not an act of terrorism." In my mind it is, but then I am one of those people that think random gun violence is terrorism.
posted by cjorgensen at 4:04 PM on February 21, 2016 [1 favorite]


FWIW, White House Petition
posted by ChurchHatesTucker at 4:08 PM on February 21, 2016


I'm going to say, "Not much." Other than getting the Obamas a dog and getting a statement on the Death Star, has anything come of any of those petitions? The Snowden one took them months and months and months to issue a milquetoast condemnation of his actions. Has even one petition actually worked?

I'd love to be wrong, but honestly, rather than sign a feel good petition there are other masturbatory activities I can engage in to make myself feel better (like actual masturbation for one).
posted by cjorgensen at 4:34 PM on February 21, 2016 [1 favorite]


The only thing I can say is that a petition with few signatures is easier to ignore.
posted by ChurchHatesTucker at 5:24 PM on February 21, 2016


Bear in mind that Loretta Lynch is short-listed for the Supreme Court. The DOJ would not be pursuing this without her OK.
posted by bluffy at 5:47 PM on February 21, 2016


The only thing I can say is that a petition with few signatures is easier to ignore.

And an online petition is easier to ignore than an email, which is easier to ignore than a letter, which is easier to ignore than a phone call, which is easier to ignore than an in-person encounter, which is easier to ignore than fat checks to a reelection fund, which is easier to ignore than fat checks to an opponent's election fund.

So this is literally the least effective way of getting a message across next to a hate tweet. Again, has it ever worked?
posted by cjorgensen at 6:16 PM on February 21, 2016


So this is literally the least effective way of getting a message across next to a hate tweet.

A petition with a lot of petitioners may not get an official response, but it'll certainly get more internal consideration than one without many petitioners.

Unless the Intelligence Cabal already has superior leverage on Obama, in which case it doesn't matter what you do.
posted by ChurchHatesTucker at 6:34 PM on February 21, 2016


This situation is so crucial that civil liberties should be reconsidered!

but not serious enough that gun legislation should be reconsidered
posted by mazola at 10:57 PM on February 21, 2016 [4 favorites]


A petition with a lot of petitioners may not get an official response, but it'll certainly get more internal consideration than one without many petitioners.

Ok, again, give a single example where this has actually changed a policy or law.

I'm suggesting that online petitions are actually counterproductive. They are feel good measures designed to be ignored. If I give you an avenue for your complaints you are less likely to agitate in the streets. If you're filling out petitions rather than voting I don't even have to worry about you at all. If you can feel like you accomplished something by adding your name to the "Kicking puppies is bad!" petition you are happy and I can go about my business.
posted by cjorgensen at 8:26 AM on February 22, 2016




I don't think Feinstein is a DINO. Her reaction here is largely typical of Democrats, and her importance in the Democratic Senate leadership is no coincidence. (Also, she represents California, not, like, North Carolina.)

I'm sad, though, that her own experience of being spied on by the CIA has not altered her views on security and surveillance.
posted by grobstein at 10:20 AM on February 22, 2016




About the only reason I'd want to move to California would be to vote against Feinstein. I have to comfort myself with the idea that I get to do so with Steve King instead.
posted by cjorgensen at 10:48 AM on February 22, 2016


Here's Daring Fireball on the password change thing.

Apple should win this case on the law and on the merits.

Will they? Jump ball.

I think it's black-and-white law, but once you invoke "terrorists" or children, all bets are off when it comes to what Americans are willing to tolerate.

I keep coming back, in my head, to the idea that either the FBI is astoundingly stupid, or they are maliciously evil. Incompetent or immorally corrupt.

There's only two scenarios that make sense:

Bumbling mistake: They had no idea that changing the password would compromise their ability to access the data. So rather than call in an expert, or try to forensically preserve evidence, they decided immediate access was more important.

Machiavellian villains: They knew they could get everything of interest from iCloud, with the benefit that doing so would disable the ability to initiate an over-the-air backup. Now they could make a terroristic test case to force Apple to open up their ecosystem.

Personally, I think they are just idiots, and don't see why they should benefit from their mistakes, but a lot of people are insisting they are the smartest people in the room, so if you want to, go ahead and think of them as evil.

The truth is probably somewhere in the middle.

They for sure are cowardly: misrepresenting their own actions ("The owner changed the password") is disingenuous when the government is the owner and did so at the FBI's behest. This whole case is going to blow up in the FBI's face. Apple will double down on making their next phone and OS even more secure, and their answer when asked for help will be way different next time.
posted by cjorgensen at 11:01 AM on February 22, 2016 [3 favorites]


Her voting history is not stellar.

Iraq: Yea.
USA PATRIOT Act: Co-Sponsor.
Death penalty: For it.
Free Speech: Main Democratic sponsor of Flag Burning Amendment, voted in favor of COICA, wants to extradite and arrest Snowden.
Loves DRM and eternal copyright.
Loves mass surveillance.

But I digress.
posted by entropicamericana at 11:03 AM on February 22, 2016 [4 favorites]




Online Petitions were helpful in pushing the public debate on Net Neutrality. That's one example that comes to mind.
posted by humanfont at 2:19 PM on February 22, 2016


Her voting history is not stellar.

No argument from me. I was merely observing that the fact that she's bad doesn't disqualify her from being a Democrat.
posted by grobstein at 8:08 PM on February 22, 2016




Pew Research: there’s more support for the Justice Department than Apple in the iPhone unlocking brouhaha

Fortunately, we live in a society governed by a rule of law and not public opinion.
posted by cjorgensen at 7:01 AM on February 23, 2016


The Financial Times interviewed Bill Gates, and published a report that he's siding with the FBI against Apple. He was then interviewed by Bloomberg GO (video), and said the FT headline misrepresented his position. Via Engadget:
He said he was "disappointed" by the headline because it "doesn't state my view on this." Gates went on to explain that he supports a discussion to resolve the issue. "I do believe there are sets of safeguards where the government shouldn't have to be completely blind," he said. In other words, he's not siding with the FBI, but he does believe there are cases where law enforcement should have access, with the proper safeguards in place.
Expanded story was picked up by the Today Show this morning and CNN.
posted by zarq at 7:33 AM on February 23, 2016


Fortunately, we live in a society governed by a rule of law

Oh man, somebody just woke up from their cryogenic deep-freeze. Somebody want to fill him in on the last 15 years?
posted by entropicamericana at 7:49 AM on February 23, 2016


From ChurchHatesTucker 's link:

“The iPhone assigned to Farook also lacked a Touch ID feature, meaning the FBI cannot use the dead gunman's thumbprint to unlock it now.” They couldn’t before. It would have been the same issue. Which finger? How many tries before the phone locks itself? Mine’s set at 3 (I think). Perhaps I get to try again after some time passes between incorrect guesses, but if I fail too many times in a row it insists on the actual code, and then we’re back where we started. It also wants it after a day (I believe) passes, or whenever I power cycle the phone. Also, Touch ID: if the reporter had bothered to do a simple Google search, he’d have found that a dead person can’t trigger the phone.

This is why you don’t have to worry about criminals cutting off your thumb like in the movies (OK, you can still worry about this because criminals are dumb, but it still won’t work).

I asked about MDM upthread. This answers that. Seriously, at what point do you just tell the government they need to start doing their jobs and not expect others to do it for them? If it had been in MDM it could have been unlocked. If it hadn't had the password changed it could have been backed up. If if if if!
posted by cjorgensen at 8:11 AM on February 23, 2016


It’s absurd.

The San Bernardino shooters legally purchased weapons that resulted in all those deaths. And the big legal push the US Government has decided to make in response?

It’s decided to seek a precedent that would allow it to force every American company to create a backdoor for the Government to snoop on anyone it so pleases.

The logic is outrageous: “People got shot. So we need a backdoor into your phone.”

posted by a lungful of dragon at 9:54 AM on February 23, 2016 [3 favorites]




NYT: Narrow Focus May Aid F.B.I. in Apple Case

The New York City police commissioner, William J. Bratton, and the Manhattan district attorney, Cyrus R. Vance Jr., criticized Apple after it refused to comply with the court order and said that they currently possessed 175 iPhones that they could not unlock.

Charlie Rose recently interviewed Mr. Vance and asked if he would want access to all phones that were part of a criminal proceeding should the government prevail in the San Bernardino case.

Mr. Vance responded: “Absolutely right.”

posted by RedOrGreen at 12:14 PM on February 23, 2016


Fortunately, we live in a society governed by a rule of law and not public opinion.

So how is an order to unlock a phone to search different from a warrant to search property? This is one distinction I keep stumbling over.

Why is it ok for the police to search your underthings drawer with a court's say-so, but completely not ok for search warrants applied to your phone? Why are phones different from encrypted USB keys or computer memory/hard drives?
posted by bonehead at 12:44 PM on February 23, 2016 [1 favorite]


Because an order to search property doesn't allow the police executing that warrant to bring in a third party who doesn't own or reside at the property in question and force them to create new search technology over their strenuous objections.
posted by Holy Zarquon's Singing Fish at 12:46 PM on February 23, 2016


So how is an order to unlock a phone to search different from a warrant to search property?

The order is not for Apple to unlock the phone. It is for Apple to create a new version of iOS which contains tools that the FBI can use to unlock the phone. The order specifies that this version of the iOS should be keyed to this one phone, but if the FBI prevails, Apple will almost certainly be ordered to do this over and over again.
posted by notbuddha at 12:47 PM on February 23, 2016


It's just one… a dozen… hundreds of phones
posted by cjorgensen at 12:48 PM on February 23, 2016


Why is it ok for the police to search your underthings drawer with a court's say-so, but completely not ok for search warrants applied to your phone?

The problem isn't warrants, per se. It's whether you're allowed to use truly secure methods to maintain your privacy. The equivalent would be the government forbidding the use of an uncrackable safe, or requiring the company to make it crackable upon demand. That may sound reasonable, but it renders everyone vulnerable to all sorts of other actors.
posted by ChurchHatesTucker at 12:51 PM on February 23, 2016


Or to put it another way, the government considers *you* a bigger threat than China or ISIS.
posted by ChurchHatesTucker at 12:54 PM on February 23, 2016


The order is not for Apple to unlock the phone. It is for Apple to create a new version of iOS which contains tools that the FBI can use to unlock the phone.

Apple is being required to make a tool to facilitate access to software locks that it created, and maintains sole ownership of. The owner of the phone does not own the OS, Apple does. Thus the requirement falls to them.

The problem [is] whether you're allowed to use truly secure methods to maintain your privacy.

Police can require owners to open safes. I am still unclear why a phone should be more inviolable than a safety-deposit box, for instance.
posted by bonehead at 12:55 PM on February 23, 2016


The order is not for Apple to unlock the phone. It is for Apple to create a new version of iOS which contains tools that the FBI can use to unlock the phone.

Which I am not 100% convinced Apple can even do.

The order has three parts:
  1. Make an OS that removes the guess limit
  2. Remove the time between guesses
  3. Create a feature that allows input over the air or with a connected piece of hardware.
Each item has its own drawbacks. Apple has not said they couldn't do any of these, but there are pretty strong legal or technical reasons why they can't be compelled to do each step.

1. Would require Apple to write code. For better or worse, code is speech, and companies can't be compelled to engage in speech.

2. Only helps if the passcode is a simple one and not something like D3ath2@meriKKK!

3. Have you ever heard of a new hardware feature working without a reboot? What if Apple is wrong and their solution fails? Even if not, Apple is not in the business of trying to compromise their own security. This is an asinine ask. If the FBI wants to do this they should get a developer's license and hire away some Apple talent.
posted by cjorgensen at 12:57 PM on February 23, 2016
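
On point 2, the arithmetic is worth spelling out. Apple's security documentation has described the passcode key derivation as taking roughly 80 ms per attempt on the device itself; treating that figure as a rough assumption, a back-of-the-envelope sketch:

    # Back-of-the-envelope brute-force times at the widely reported
    # ~80 ms per on-device attempt (a rough assumption, for scale only).
    PER_TRY = 0.080  # seconds

    def worst_case_days(keyspace):
        return keyspace * PER_TRY / 86400

    print(worst_case_days(10 ** 4))   # 4-digit PIN:   ~0.009 days (~13 min)
    print(worst_case_days(10 ** 6))   # 6-digit PIN:   ~0.9 days
    print(worst_case_days(62 ** 10))  # 10-char mixed: ~7.8e11 days

Strip out the delays and the wipe and a 4-digit PIN falls in minutes and a 6-digit one in about a day, while a long alphanumeric passcode like the example above stays out of reach. That is the whole of point 2.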


requiring the company to make [a safe] crackable upon demand.

Police routinely do this with bank-owned safety deposit boxes.
posted by bonehead at 12:57 PM on February 23, 2016


I am still unclear why a phone should be more inviolable than a safety-deposit box, for instance.

Because it's more than that. It's your medical history, nude pictures of your spouse, it's where you've been, your financial records, the books you've bought and read, your emails to your lover, your journal, your health data, your google searches, etc.
posted by cjorgensen at 1:00 PM on February 23, 2016 [1 favorite]


Police can require owners to open safes.

They can try. I believe the courts are currently split on whether they can compel you.

Police routinely do this with bank-owned safety deposit boxes.

And San Bernardino could have enabled this with the phone in question, since they own it. They did not. And they fucked up any possibility of going another route (seemingly at the FBI's request.)

At some point you just have to say they should go pound sand and think about what they've done. Not undermine everyone's privacy.
posted by ChurchHatesTucker at 1:01 PM on February 23, 2016 [1 favorite]


Because it's more than that.

So what? In a criminal case, all of those could be relevant, and would be allowed to be searched under current law if not on your phone with appropriate court permissions.
posted by bonehead at 1:02 PM on February 23, 2016


bonehead: Police can require owners to open safes. I am still unclear why a phone should be more inviolable than a safety-deposit box, for instance.

One, you are kind of eliding the key word there: owner. Apple doesn't own this phone. Second, the issue here is not that they are subpoenaing iCloud data or other things that Apple does have access to through normal routes; they are asking that Apple invent a new way of breaking into the phone. It's closer to a safe-maker being told to invent a new kind of drill to get through their safe.
posted by tavella at 1:02 PM on February 23, 2016 [1 favorite]


Bro, do you even search warrant?
posted by ChurchHatesTucker at 1:02 PM on February 23, 2016


Apple owns the OS. They never give up ownership.
posted by bonehead at 1:03 PM on February 23, 2016


I own my house. I'm not going to tear it down no matter what the FBI wants.
posted by ChurchHatesTucker at 1:06 PM on February 23, 2016


In a criminal case, all of those could be relevant, and would be allowed to be searched under current law if not on your phone with appropriate court permissions.

No, we have special protections for some of those. Your spouse can't be compelled to testify against you for one. Your medical records are privileged as well. Etc.
posted by cjorgensen at 1:07 PM on February 23, 2016


Bro, do you even search warrant?

No, I'm just a civilian. I am puzzled why encryption has seemingly enabled a whole new set of privacy rights though, for things that weren't protected even a few years ago.

Is this just a might-makes-right argument? Government can't break Apple's encryption so therefore they shouldn't be able to? Privacy rights only exist because of the government's lack of ability to break security? That seems to be the argument to me here.
posted by bonehead at 1:07 PM on February 23, 2016 [1 favorite]


Government can't break Apple's encryption so therefore they shouldn't be able to?

That's a tautology. What the Gov. wants is to make Apple weaken their encryption. Which imperils everyone using it. There's no such thing as "good enough" encryption. Or "only vulnerable to the good guys" encryption.

This fight has been going on for the past twenty-odd years. Do some googling.
posted by ChurchHatesTucker at 1:12 PM on February 23, 2016


Put it this way, would it be a worse outcome for Apple to build this tool, or for congress to react by making consumer encryption illegal? 'cause that's where I see this train going otherwise.
posted by bonehead at 1:13 PM on February 23, 2016 [1 favorite]


Same difference, really.
posted by ChurchHatesTucker at 1:14 PM on February 23, 2016 [2 favorites]


Apple owns the OS. They never give up ownership.
No, Apple owns the copyright to the OS. It's not the same thing.

Put it this way, would it be a worse outcome for Apple to build this tool, or for congress to react by making consumer encryption illegal? 'cause that's where I see this train going otherwise.

It would have the same long term effect. If Apple can be required to do this, they can be required to write a new version of the OS anytime the government can convince a judge it's necessary.

Given the volume of requests, Apple (and Google) would need to either make a non-phone-specific tool, which would eventually fall into the wrong hands (if you don't already consider the FBI the wrong hands), or just remove the encryption and protections entirely.
posted by notbuddha at 1:19 PM on February 23, 2016 [1 favorite]


That seems to be the argument to me here.

I can cite a lot of the relevant caselaw on search warrants and intellectual property and copyright and speech and the history of encryption, but that's a rabbit hole, so it's best if you figure out which part you are interested in and start there.

Law is built on precedent, which is why sometimes some of it doesn't make sense. That is why the government treats a physical key differently than a combination lock. (And that difference is how the laws governing passwords are affected.)

Here's a nice overview of some of these issues.
posted by cjorgensen at 1:21 PM on February 23, 2016


Given the volume of requests, Apple (and Google) would need to either make a non-phone-specific tool, which would eventually fall into the wrong hands (if you don't already consider the FBI the wrong hands), or just remove the encryption and protections entirely.

Or just write a backdoor into encryption (which is transparently the FBI's intent in the first place). This is why Apple is fighting, and it's sad that not every company is.
posted by cjorgensen at 1:23 PM on February 23, 2016


This fight has been going on for the past twenty-odd years.

And I've been following it for that whole length of time.

The problem is that, ultimately, I do think "good-enough" compromised encryption is all that's going to be allowed on civilian devices. There is very little Apple or any corporate entity can do if a sovereign power decides that they want something. If not the US, then the EU or India or Saudi or China.

In democracies, I can't see the right to privacy being held inviolable over the rights to security or property, ultimately. In less open regimes, I can't see the right to privacy being held as important at all. RIM has already been through this with their BBM encryption in a number of places.
posted by bonehead at 1:23 PM on February 23, 2016 [1 favorite]


Or just write a backdoor into encryption (which is transparently the FBI's intent in the first place).

True, but I doubt they would bother with that one, since it would very likely be discovered and exploited in short order.
posted by notbuddha at 1:25 PM on February 23, 2016


The problem is that, ultimately, I do think "good-enough" compromised encryption is all that's going to be allowed on civilian devices.

So all the bad guys need to do is use anything better. Solid plan.

In democracies, I can't see the right to privacy being held inviolable over the rights to security or property, ultimately.

This is literally about securing your property.
posted by ChurchHatesTucker at 1:29 PM on February 23, 2016


So all the bad guys need to do is use anything better.

And we're back to "unbreakable" encryption being re-militarized and perhaps even criminalized. Yep. That's a possible endgame, in my view.
posted by bonehead at 1:35 PM on February 23, 2016 [1 favorite]


Of course that's a possible endgame. That's what this is about.
posted by ChurchHatesTucker at 1:40 PM on February 23, 2016


Put it this way, would it be a worse outcome for Apple to build this tool, or for congress to react by making consumer encryption illegal? 'cause that's where I see this train going otherwise.

There is no real, functional difference between the two. If Apple builds this tool, this will effectively render security on all phones moot, not only for the benefit of US law enforcement, but for the benefit of any criminal or non-US government that wishes to crack a phone and who uses the same tool. If Congress makes encryption illegal, that disables security on all phones (sold in the US). In both cases, the individual can no longer rely on the phone to safely store sensitive data.

We need to be careful with analogies. While the authorities can compel the owner of a safe to unlock it, the combination used to open that safe cannot typically be used to open any other safe. What the FBI is asking is for a safe manufacturer — not the owner — to make a device which can indiscriminately open all its safes. This makes these products useless for safeguarding contents.
posted by a lungful of dragon at 1:47 PM on February 23, 2016


re: bill gates' take, fwiw (starts about the 32m30s mark...)
posted by kliuless at 1:49 PM on February 23, 2016 [1 favorite]


What the FBI is asking is for a safe manufacturer — not the owner — to make a device which can indiscriminately open all its safes.

Which governments do not do, because they have other recourse: a drill, for example. Apple has made an undrillable safe which they can't open. The discussion is becoming very absolute. I can't see the government, courts or police being completely ok with forever more giving up all ability to search anyone's information.

This isn't a principled argument, it's about what governments do when backed into a corner. They tend to outlaw things. I guess Apple wants to see how far they can push this and for how long. I do think it's a loser ultimately though, as RIM found out a few years ago too.
posted by bonehead at 1:58 PM on February 23, 2016 [1 favorite]


This isn't a principled argument, it's about what governments do when backed in a corner. They tend to outlaw things.

Outlawing encryption will put tech companies at a competitive disadvantage and compel them to leave the United States, which will have a very negative impact on an economy already under strain. I can't see the public getting behind a government that deliberately does this, but all kinds of fascist elements are coming out of the woodwork this election year, so it's difficult to know what the future holds.
posted by a lungful of dragon at 2:04 PM on February 23, 2016


Outlawing encryption will put tech companies at a competitive disadvantage and compel them to leave the United States,

Access either through encryption or sovereignty is already having effects: many states are already insisting on localized storage for their own citizens' data, in the EU and in Canada, for example. There the concern is that non-regional actors should not have legal authority over citizens' data.

However, this was the solution RIM was forced into in India as well, so that that state could have access to all encrypted traffic on the national BBM servers. Apple could be given the same choice for iMessage and other iPhone encrypted data as well, in markets like this: bluntly, play or get out.
posted by bonehead at 2:23 PM on February 23, 2016


Well, you might look at a company like RIM and see where they are positioned in the market as evidence that making an inferior or deliberately crippled product did not help them in the overall global marketplace, in the long term.

Granted, RIM ended up where it did because of its incompetence and lack of vision when it came to dealing with the iPhone's success, but when a company is also selling a product that is less functional (and security is as much a function of this hardware as any other feature), that makes it more difficult to make a sale.
posted by a lungful of dragon at 2:28 PM on February 23, 2016 [1 favorite]


At the time, 2012, India went after RIM because it was the local market leader.
posted by bonehead at 2:40 PM on February 23, 2016


We are getting far afield here, but is RIM still a meaningful part of the tech economy in the United States, in the way that Apple, Microsoft and Google are? Does the relative lack of security in their products endear them to customers here, or is that an impediment?
posted by a lungful of dragon at 3:15 PM on February 23, 2016 [1 favorite]


Zdziarski: On FBI’s Interference with iCloud Backups

Interesting speculation as to what the feds are doing w/r/t the court.
posted by ChurchHatesTucker at 4:33 PM on February 23, 2016 [1 favorite]




Last paragraph of that new Zdziarski piece:
FBI must clarify which of these two meanings their letter had. Either the FBI has recklessly interfered with the processing of evidence OR FBI has misled the courts on the amount and the nature of assistance required by Apple under the All Writs Act.
It is not clear to me why this has to be an OR and not an AND.
posted by fedward at 5:47 PM on February 23, 2016 [1 favorite]


It is not clear to me why this has to be an OR and not an AND.

Fair point, although I suspect he's thinking legally here. I.e., either one should throw up red flags.
posted by ChurchHatesTucker at 5:52 PM on February 23, 2016 [1 favorite]


Apple will argue that the FBI's court order violates its free-speech rights

One of many defenses available to them. I actually think the FBI intends to lose this one, but they want to damage Apple and send a message that resistance is expensive, and I think they also want to force Congress's hand.
posted by cjorgensen at 8:21 AM on February 24, 2016


I just had a couple of other random thoughts on this issue.

Let's say that the court doesn't let Apple off the hook. All appeals are exhausted and Apple is ordered to write this special version of iOS. Apple, the corporation, does not write software. People write software. What happens if all of Apple's qualified engineers refuse to write this software?

Does Apple have to threaten to fire them if they won't? Do they have to follow through?

What if instead of refusing outright, the engineers demand a huge bonus? Can Apple pay that and charge it to the court/FBI?

If Apple has to hire a new engineer to write this software, can they claim that engineer's entire salary+ benefits cost as part of the cost to provide this service?

Also, this order says that this special version of the iOS is to work only on this one phone. Apple is worried about the security implications of this code even existing, so it would make sense for them to delete all source code and version history of this once it has done its job. Now when the next court order for the same thing comes along, they have to develop the special version of iOS from scratch. This would rapidly become a very expensive proposition for the government.
posted by notbuddha at 8:28 AM on February 24, 2016


Apple Brand Could Become Casualty of FBI Tussle Over iPhone Hack. And yet the FBI says this is a marketing ploy by Apple….
posted by cjorgensen at 8:29 AM on February 24, 2016


I just had a couple of other random thoughts on this issue.

Let's say that the court doesn't let Apple off the hook. All appeals are exhausted and Apple is ordered to write this special version of iOS. Apple, the corporation, does not write software. People write software. What happens if all of Apple's qualified engineers refuse to write this software?


I had the same question. Or close enough. If Apple refuses, the court holds it in contempt and perhaps levies fines. Not sure how that works. If the people refuse they can also be held in contempt and jailed, but you can't use contempt to punish, only coerce, and once it's apparent that the engineers won't capitulate you'd have to let them go. (My layman's take.)

Does Apple have to threaten to fire them if they won't? Do they have to follow through?

No and no. That would be an internal HR matter.


What if instead of refusing outright, the engineers demand a huge bonus? Can Apple pay that and charge it to the court/FBI?

It would arguably be career-damaging to be known as the engineer who compromised security, even if at the orders of your employer. It's a weird position to put someone in. Also, compelling someone to labor for the government against their will is outlawed.

If Apple does this they would effectively become an agent of the state. Weirdness abounds!

If Apple has to hire a new engineer to write this software, can they claim that engineer's entire salary+ benefits cost as part of the cost to provide this service?

I think this goes to whether it's considered an "undue burden." The All Writs Act allows Apple to refuse an "undue burden," but what really counts as one to a company with billions in the bank? And at what point is it unreasonable to expect Apple to dedicate resources to being the government's bitch?

It would disincentivize Apple from making security a priority. Why would anyone want to spend time making something harder to break into knowing they may be asked to do so again?

Also, this order says that this special version of the iOS is to work only on this one phone. Apple is worried about the security implications of this code even existing, so it would make sense for them to delete all source code and version history of this once it has done its job. Now when the next court order for the same thing comes along, they have to develop the special version of iOS from scratch. This would rapidly become a very expensive proposition for the government.

Apple can't delete the code until it no longer bears relevance to the court. Let's say the government does find incriminating evidence and arrests someone else. That guy's lawyer is going to challenge the chain of custody and say there's no way to know an Apple engineer didn't insert the damaging data in the phone. The only defense the government would have would be to produce the code and an engineer to walk the jury through the methods used.

Also, they wouldn't write an OS for that phone. They would write an OS that could be used on any iPhone, then limit it to that phone. I don't think the government bears the expense; Apple would, and suddenly Apple is in the phone-cracking business without compensation.
posted by cjorgensen at 8:51 AM on February 24, 2016


If the people refuse they can also be held in contempt and jailed,

Wouldn't the individual engineers have to be named specifically in a court order, and get a chance to fight said order before they could be held in contempt? What would be the justification for requiring this work out of any specific engineer? At least with Apple, the corporation, you can say they are the only ones who could do this. But there is no one engineer about whom you could say that.

Apple can't delete the code until it no longer bears relevance to the court. Let's say the government does find incriminating evidence and arrests someone else. That guy's lawyer is going to challenge the chain of custody and say there's no way to know an Apple engineer didn't insert the damaging data in the phone. The only defense the government would have would be to produce the code and an engineer to walk the jury through the methods used.

I'm not sure that's true. Especially not for the source code. If they were making a physical tool, would they be required to maintain the dies used? And in any case, the order doesn't say that. The order says deliver this software. Once the order is fulfilled, does it have any further binding force?

Also, they wouldn't write an OS for that phone. They would write an OS that could be used on any iPhone, then limit it to that phone.

If they were feeling cooperative, sure. But why would they? It would be easy enough to write the code such that you couldn't easily change the targeted phone once it was compiled and signed, and that would actually be a perfectly legitimate way of fulfilling this particular order, since it specifies that the software only work on this phone.

I don't think the government bears the expense; Apple would, and suddenly Apple is in the phone-cracking business without compensation.

Good point. I read this: "5. Apple shall advise the government of the reasonable cost of providing this service." as indicating that Apple would be compensated, but that isn't actually what it says.

If Apple will not be compensated, that's another pretty big strike against them being compelled.
posted by notbuddha at 9:19 AM on February 24, 2016


It would be easy enough to write the code such that you couldn't easily change the targeted phone once it was compiled and signed

Apple is saying it is not, and I believe them. The problem is once that software exists, *it* becomes the only thing you have to attack. Once you've figured out, for example, how to spoof the ID of the phone you're using it on all bets are off.

Apple says such a thing is too dangerous to exist, and I tend to agree.
posted by ChurchHatesTucker at 10:19 AM on February 24, 2016


I hadn't read Apple's statement quite that way, but I haven't read all their statements, and I may have misinterpreted something. Either way, I agree completely that it's too dangerous to exist.

With the source code, it would be trivial to create a non-phone-specific version. Think of the temptation for an engineer within the company to compile, sign and smuggle that out. How much would something like that bring from a foreign government? Our government? And that is just one of the risks.
posted by notbuddha at 10:49 AM on February 24, 2016
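
For what it's worth, here is what "keyed to one phone" might look like in the simplest case (a hypothetical scheme with a made-up chip ID, and an HMAC standing in for the real signature): the target's unique ID is baked into the payload before signing, and the device compares it to its own at boot.

    # Hypothetical "keyed to one phone" scheme. The ECID is made up,
    # and an HMAC stands in for a real signature.
    import hashlib
    import hmac

    VENDOR_SECRET = b"stand-in for the vendor's signing key"
    TARGET_ECID = "00000ABCDEF12345"  # hypothetical unique chip ID

    def build_signed_image(code: bytes, ecid: str) -> dict:
        payload = ecid.encode() + b"|" + code
        tag = hmac.new(VENDOR_SECRET, payload, hashlib.sha256).hexdigest()
        return {"payload": payload, "tag": tag}

    def device_accepts(image: dict, own_ecid: str) -> bool:
        good = hmac.new(VENDOR_SECRET, image["payload"],
                        hashlib.sha256).hexdigest()
        if not hmac.compare_digest(image["tag"], good):
            return False  # signature check failed
        embedded = image["payload"].split(b"|", 1)[0].decode()
        return embedded == own_ecid  # the device-binding check

    img = build_signed_image(b"unlock helper", TARGET_ECID)
    print(device_accepts(img, TARGET_ECID))       # True: the target phone
    print(device_accepts(img, "00000FFFF99999"))  # False: any other phone

The binding is only as strong as the own_ecid the check reads, which is the spoofing worry above: once such an image exists, faking that one value is the new attack surface.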


It would disincentivize Apple from making security a priority. Why would anyone want to spend time making something harder to break into knowing they may be asked to do so again?

The alternative for Apple is to defend its decision to maintain absolute confidentiality for its customers in each case where the various governments make these requests. Apparently there are a few dozen waiting in the wings already.

Personally, I think Apple has painted themselves into a lose-lose situation here. Either they comply and break a customer promise or they don't and face these sorts of problems forever more (as well as becoming the convenient punching bag for every Trumped-up pol who wants to call them terrorist and criminal sympathizers).
posted by bonehead at 11:05 AM on February 24, 2016




I just got around to watching that *Charlie Rose* from the other night with Cyrus Vance and John Miller and it's really something. They are most definitely thinking about the implications for being able to bust street level drug dealers as opposed to the "terrorists" that are the headline.
posted by ob1quixote at 12:22 PM on February 24, 2016


Solid support for Apple in iPhone encryption fight: poll

Nearly half of Americans support Apple Inc's (AAPL.O) decision to oppose a federal court order demanding that it unlock a smartphone used by San Bernardino shooter Rizwan Farook, according to a national online Reuters/Ipsos poll.

Forty-six percent of respondents said they agreed with Apple's position, 35 percent said they disagreed and 20 percent said they did not know, according to poll results released on Wednesday.

posted by a lungful of dragon at 12:35 PM on February 24, 2016


Interesting write-up on the latest iPhone hardware and how the passcode stuff is encrypted. Depending on which chip is running, this will be either a difficult bit of code or a really difficult bit of code.
posted by humanfont at 12:37 PM on February 24, 2016
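
The rough idea in those write-ups, sketched conceptually below (PBKDF2 stands in for Apple's AES-based tangling inside the hardware, and the parameters are made up): the passcode is slow-hashed together with a per-device key that never leaves the chip, so every guess has to run on the phone itself.

    # Conceptual sketch of UID-entangled key derivation. PBKDF2 stands
    # in for Apple's AES-based tangling; the numbers are made up.
    import hashlib

    DEVICE_UID = bytes.fromhex("aa" * 32)  # fused into the chip, not readable

    def derive_unlock_key(passcode: str) -> bytes:
        # Iteration count tuned so one derivation takes tens of
        # milliseconds on the device, capping the guess rate.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                                   DEVICE_UID, 200_000)

    # Because DEVICE_UID can't be extracted, you can't image the flash
    # and brute-force offline on a GPU farm; every attempt has to
    # round-trip through this one chip.
    key = derive_unlock_key("1234")

Which is why the specific chip matters so much: the slow part, and the secret, both live in silicon.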




I'm not sure that's true. Especially not for the source code. If they were making a physical tool, would they be required to maintain the dies used?

It's come up before.
posted by cnelson at 12:59 PM on February 24, 2016


See also the Zdziarski piece linked above.
posted by ChurchHatesTucker at 1:08 PM on February 24, 2016


Techdirt: How Existing Wiretapping Laws Could Save Apple From FBI's Broad Demands

Basically, CALEA should preempt the All Writs Act.
posted by ChurchHatesTucker at 3:57 PM on February 24, 2016 [1 favorite]


Apple Is Said to Be Working on an iPhone Even It Can’t Hack (NYTimes)

Seems like the obvious next step.
posted by glhaynes at 4:12 PM on February 24, 2016 [3 favorites]


If you, like me, have hit your NYT limit for the month, google the title "Apple Is Said to Be Trying to Make It Harder to Hack iPhones" and you can get in.
posted by ChurchHatesTucker at 4:25 PM on February 24, 2016




WSJ: Justice Department Seeks to Force Apple to Extract Data From About 12 Other iPhones

What was that about 'just this one' again?
posted by mephron at 5:33 PM on February 24, 2016 [4 favorites]


I thought that everyone understood that the FBI wanted just this one favor in much the same sense that the mob wants just this one favor, and then you're in their good graces and free to go forever.
posted by DoctorFedora at 6:06 PM on February 24, 2016


The NYPD wants to use a similar court strategy to unlock 175 iPhones and counting.
posted by humanfont at 8:37 PM on February 24, 2016 [1 favorite]


The problem for Cook is Apple's desired outcome. Does he want to pick a fight with law enforcement and with Congress? Apple might win in the courts in this particular instance. They might get relief in this case. They might even build an unbreakable scheme for encrypting a phone. But phones will still be evidence in murders, criminal conspiracies, even civil trials like divorce. There's going to be continuing pressure for claimants to have access to records during legal proceedings.

So then what? Long obstruction of justice sentences? People held indefinitely for contempt of court?

Congress can overrule the courts, and will do so if Apple gets painted as the enemy of the public good, the defender of terrorists, criminals and cheating spouses.

So where, ultimately, does Apple want to go? Cook is calling for a government commission on this, but I don't think he's really thought this through. What compromise would work for law enforcement and for people fighting for civil discovery rights in the courts? An absolute right to privacy on personal data and communications isn't defensible to many people as an unarguable public good.
posted by bonehead at 9:04 PM on February 24, 2016 [1 favorite]


An absolute right to privacy on personal data and communications isn't defensible to many people as an unarguable public good.

As an outsider, I find grim amusement in the way some bits of the US Constitution are fetishised while other bits are deprecated.
posted by Joe in Australia at 9:49 PM on February 24, 2016 [4 favorites]


What compromise would work for law enforcement and for people fighting for civil discovery rights in the courts?

When terrorists buy assault weapons in the US, I think a good compromise would be to go after the people who sold terrorists their weapons, instead of going after a phone manufacturer which is innocent of any wrongdoing.
posted by a lungful of dragon at 3:49 AM on February 25, 2016


Guns don't kill people, people… iPhones kill people. Please, think of the children.
(No, not the children slaughtered at Sandy Hook and other schools--other, more hypothetical, children.)
posted by entropicamericana at 5:18 AM on February 25, 2016 [2 favorites]


Wired: Apple May Use a First Amendment Defense in That FBI Case. And It Just Might Work

Interestingly, the experts cited think that the code signing is more relevant than the software writing.
posted by ChurchHatesTucker at 7:36 AM on February 25, 2016 [1 favorite]


Effective immediately, the Maricopa County [Phoenix, AZ area] Attorney’s Office will discontinue providing iPhones as option for replacements or upgrades for existing employees, since Apple is "on the side of terrorists."
posted by ChurchHatesTucker at 9:03 AM on February 25, 2016 [1 favorite]


> Either they comply and break a customer promise or they don't and face these sorts of problems forever more.

Or they say they can't and won't and stand by that. Saying you won't is a lot easier when you can't. Which is where Apple will go. They will make a phone impossible for them to break into, then when asked, they will say no. Apple does not want to be a government lapdog, nor should they be.

> But phones will still be evidence in murders, criminal conspiracies, even civil trials like divorce. There's going to be continuing pressure for claimants to have access to records during legal proceedings.

What did the cops and courts do 15 years ago? Maybe we go back to that. You compel the criminal, not the device. In the case where that's not possible you move on to other methods. Like in this case: if you want to know who this guy called, pull his phone records and talk to those people. If you want to know where he was in the days leading up to it, you take the device ID and subpoena ISPs and phone companies. Etc. You know, police work.

> Cook is calling for a government commission on this, but I don't think he's really thought this through.

I am guessing Cook has thought this through and knows that if the issue is forced, Congress or the SCOTUS will rule on this and it will land toward the privacy of the individual being inviolate. Cook wants the clarification and the win, and in the event he loses, well, then at least he'll have a clarified set of rules. But he won't lose. We've pretty well established privacy as a founding principle. It is better that ten guilty persons escape than that one innocent suffer. Even Scalia defended the right to privacy, and believed that sometimes the Constitution protected the guilty in favor of privacy.
posted by cjorgensen at 9:22 AM on February 25, 2016


> Interestingly, the experts cited think that the code signing is more relevant than the software writing.

Same idea though. It would be compelled speech. "Here, sign your name to this loyalty oath."
posted by cjorgensen at 9:24 AM on February 25, 2016 [1 favorite]


What did the cops and courts do 15 years ago? ... You know, police work.

Searching locked private spaces for evidence, including opening safes and cracking phones, has been considered normal police work for as long as those technological blocks have existed. So much so that we have centuries of laws about the compromises police and the courts have to make regarding searches and seizures. In my mind, those are the current safeguards of our privacy rights. There are social contracts about what those rights should be, mediated through country constitutions, laws and court decisions. There are fights about this constantly, but the social contract works for most people most of the time in the US (and most other democratic nations), I'd wager.

I see this as Apple wanting to unilaterally change the social contract on privacy (largely for commercial reasons, but that's what companies do), without regard to knock-on consequences. Apple is claiming this is an unvarnished good, as most companies do when they've created an innovation. They've got a lot of supporters on that, as can be seen here and on most other tech-friendly sites on the net.

I don't think absolute privacy will be as easy or as consequence free to adjust to, but what I'm worried most about is over-reaction by the state(s) to this development. I think this is going to get a lot more nasty in the future. The courts are really just the beginning of this.

It is better that ten guilty persons escape than that one innocent suffer.

Disproportionately, those will be the ones who commit data-dependent crimes. I've already heard Cook called a terrorist and kiddie-porn supporter. The "think of the children" cards are already being played. Apple is going to be the favourite kick-ball for the police and their statist politician friends in the next decade or so.
posted by bonehead at 9:48 AM on February 25, 2016 [1 favorite]


There's always going to be evidence you can't obtain. Most of the planning for San Bernardino probably happened around their kitchen table. Fortunately the FBI has not yet tried to pressure IKEA into installing surveillance equipment.
posted by ChurchHatesTucker at 9:53 AM on February 25, 2016


The New Yorker: The Dangerous All Writs Act Precedent in the Apple Encryption Case

If the government can tell Apple to sit down and write a new OS for them, what can't they do?
posted by ChurchHatesTucker at 10:03 AM on February 25, 2016 [1 favorite]


There's always going to be evidence you can't obtain.

And having absolute privacy protections on a major communication and storage device won't change the current balance of where bad stuff and evidence of bad stuff accretes? Come on.
posted by bonehead at 10:07 AM on February 25, 2016


I see this as Apple wanting to unilaterally change the social contract on privacy (largely for commercial reasons, but that's what companies do), without regard to knock-on consequences. Apple is claiming this is an unvarnished good, as most companies do when they've created an innovation. They've got a lot of supporters on that, as can be seen here and most other tech-friendly site on the net.

I am an Apple fan (and investor), but I would be the first to suggest they didn't create this innovation. Encryption, email, cameras, etc.: pretty much everything on an iPhone existed in some manner prior to the iPhone. They brought it together and made it stylish and easy to use. Most of the technologies are decades old. Your phone is a computer in your hand that's encrypted. That's it.

If the government forces these to be insecure they will in effect end things like DRM, smart TVs, online banking, cloud storage, etc. Whole industries will need to move out of the US in order to continue to exist. Maybe I am alarmist, but I don't see how you could ever trust a US networking company (one that produces hardware), or ISPs, or web hosts, or storage like Amazon's AWS, and to that end how do you even trust Amazon. We've long believed there are protected classes of data (medical records, student records, banking transactions) and protected methods of communication (priest-penitent, spouse, etc.).

Apple won't lose this one, because Apple can't lose this one. The stakes are way too high.
posted by cjorgensen at 10:44 AM on February 25, 2016


If the government forces these to be insecure they will in effect end things like DRM, smart TVs, online banking, cloud storage, etc.

I agree, there are real technical problems with balancing encryption with the ability for lawful access. I disagree that those problems are a good enough reason to discard all concerns about abuses of privacy, or that the providers of those tools should be immune to the costs of those consequences.

I don't know what is fair here, but Apple, to me, looks like yet another big company that wants to unilaterally make big changes to the way society works without any consequence to them. They may not have been the first to invent the pieces, but they're the first to put them together in this package, present this sort of problem, and push that out to a large fraction of the population.

protected methods of communication (priest-penitent, spouse, etc.).

So a phone is the equivalent of a confession booth or a lawyer's office? You don't think that's stretching privilege a bit more broadly than it has been exercised before? How often do you talk to a lawyer as a client or to a priest under confessional? More or less than you use your phone? I'll bet there are lots of people who use their phone more than they talk to their spouse.

What happens when your spouse wants access? (After you die, even?)

These are changes to the social contracts. Big ones even, and I don't think it's wise to just breathlessly be herded into accepting them.
posted by bonehead at 11:12 AM on February 25, 2016 [1 favorite]




Did you coin "abuses of privacy?" Doubleplusgood.

Wired: Apple Hires Lead Dev of Snowden’s Favorite Messaging App
posted by ChurchHatesTucker at 12:13 PM on February 25, 2016 [2 favorites]


I see this as Apple wanting to unilaterally change the social contract on privacy (largely for commercial reasons, but that's what companies do), without regard to knock-on consequences.

… and …

unilaterally make big changes to the way society works

You keep saying this, but I think you're wrong on two counts: (1) this is not a unilateral action by Apple; if anything, it's a response to a unilateral action by the DOJ at the behest of the FBI, which is arguably a significant overreach under existing law; (2) evidence that Apple is in fact trying to change the social contract is lacking. Arguably they're just trying to hold up an existing contract with their users (said contract extending to Apple's certification that the devices are secure).

And if you want to talk about knock-on consequences, I think you have to examine the DOJ and FBI with a jaundiced eye, because the claim that it's "just this one device" is demonstrably false. The knock-on consequences are arguably what this case is all about.
posted by fedward at 12:17 PM on February 25, 2016


If your spouse wants access, and you want your spouse to have access, create provisions to grant such access (before you die). Most people die with some regrets. If your spouse can't access your phone because you failed to give over the ability to do so before you died, that's not Apple's fault. In fact, I would suggest some of us don't want our loved ones rummaging through our digital lives once we are gone. That's actually fairly clearcut in my mind. Your right to privacy doesn't end when you do.

I'm a fairly open person, but even I wouldn't want strangers crawling through my phone, and I for sure don't want family doing so. Again, I am fairly open (one need only read my comments on this site to see I talk about sexual and medical and mental health issues all the time), but I can see that a lot of people aren't as comfortable having their sexuality, their support groups, their medical histories, their reading history, their purchase and credit records, their browser searches, and pictures of their kids and spouse (or lovers) exposed, and often these things for their confidants as well. Sure, you might be fine with people knowing whether or not you are gay or have a tumor, but what about that friend who told you this in email in confidence? People do use their phones in ways that are even more intimate than what they would tell their priests or spouses or lawyers. I would suggest this means it should be even more protected, not less.

You seem to be under the impression you can somehow have porous encryption or security, that you can somehow have access levels. I'm saying that it's a binary option. Either you have the ability to keep things private, or you don't. Once you cripple security it's crippled. There isn't a "let's let in the good guys, and not the bad guys." It's do we allow it or not? Even if you suggest you can limit it to the good guys, what do you propose to do when this power is abused? Because if they can do it they will abuse it. Hell, in the FBI they even have a term for invading the privacy of people you are interested in romantically: LOVEINT. Or just look at how many times the cops have forwarded on nude photos from someone's phone (often just people pulled over for something like a traffic stop) as a game. Or when the DEA uses your social media accounts and the real photos of you and your kids to troll real and really dangerous drug dealers. Sure, these law enforcement people got in trouble, and maybe some were even fired, but it's a bit too late to protect privacy after the fact.

Think about the fact that the only reason you are even hearing about this is because the FBI decided to file a case against Apple in open court. This is only being debated at all because Apple decided to fight. Otherwise, the government would have continued to do these things in secret. Look at the Lavabit case. As far as I know he's still being precluded from discussing what the government was asking.

I would suggest Apple isn't making these decisions. The consumer is making these decisions. People are drawn, in part, to Apple products because Apple is promising security. If Apple goes back on this promise at this point it would be incredibly damaging to them.
posted by cjorgensen at 12:17 PM on February 25, 2016 [6 favorites]


Apple won't lose this one, because Apple can't lose this one. The stakes are way too high.

By the same measure Donald Trump can't win the Republican nomination and then can't win in the general election. Let's stay tuned to see how well that defense works out, shall we?
posted by fedward at 12:20 PM on February 25, 2016


(in case it's not clear, I agree that this case is too important for Apple to lose, but I think they need a better defense than that)
posted by fedward at 12:21 PM on February 25, 2016


HuffPo: The U.S. Has Lost Its Damn Mind
posted by ChurchHatesTucker at 12:26 PM on February 25, 2016


APPLE INC’S MOTION TO VACATE ORDER COMPELLING APPLE INC. TO ASSIST AGENTS IN SEARCH, AND OPPOSITION TO GOVERNMENT’S MOTION TO COMPEL ASSISTANCE (DocumentCloud web viewer or PDF)

@bradheath: Apple: If FBI wins, it could also force drug makers to sell death penalty drugs, reporters to produce false stories
posted by glhaynes at 1:33 PM on February 25, 2016 [1 favorite]


That's an interesting quote (from the motion):

For example, under the same legal theories advocated by the government here, the government could argue that it should be permitted to force citizens to do all manner of things “necessary” to assist it in enforcing the laws, like compelling a pharmaceutical company against its will to produce drugs needed to carry out a lethal injection in furtherance of a lawfully issued death warrant, or requiring a journalist to plant a false story in order to help lure out a fugitive, or forcing a software company to insert malicious code in its auto-update process that makes it easier for the government to conduct court-ordered surveillance.

That last sentence almost seems like they are burying the lede a little, making me wonder if the government has already gone to Apple to try to compel them to compromise their software update process, and that all of this nonsense has built up to a sort of "last straw" for Apple.
posted by a lungful of dragon at 1:59 PM on February 25, 2016 [2 favorites]


Techdirt: The FBI's Not-So-Compelling Pitch For Sacrificing Security For Safety

"Chekov's Device" Heh.
posted by ChurchHatesTucker at 2:05 PM on February 25, 2016


but I think you're wrong on two counts: (1) this is not a unilateral action by Apple... it's a response to a unilateral action by the DOJ... (2) evidence that Apple is in fact trying to change the social contract is lacking.

The demand happened because the old tools the FBI used to break previous versions of iOS don't work anymore. Apple chose to do that two years ago. Did Apple management choose this fight? They must have; they get these requests all the time. They've complied at least 70 times previously, according to some accounts.

The DOJ & FBI are being dicks about it, but law enforcement and prosecutors usually are when they think they've been deliberately thwarted. There's more than a little lèse majesté going on.

You seem to be under the impression you can somehow have porous encryption or security, that you can somehow have access levels. I'm saying that it's a binary option. Either you have the ability to keep things private, or you don't.

You can kind of. You can set up something that will keep out your kid sister or your ex-spouse. But I'll agree, you can't keep out an attacker with a few resources, like a professional criminal or a spy agency.

I am saying that I think there's a strong possibility that encrypted phones will not be legal for us as a result of this. And by "us," I don't necessarily mean just the US, but places like Europe and other parts of the OECD, let alone places like China and India, which are much less friendly to these ideas. China may be responsive to pokes on things like human rights, but they still have their firewall.

I would suggest Apple isn't making these decisions. The consumer is making these decisions.

I would rather frame it as citizens making these decisions, as I'm not comfortable with the basic rules of society being defined by what people buy. But I'll agree that ultimately this won't be decided in the courts or by Apple. This immediate issue will, but the question of the legality of encryption will probably have to be settled by writing new law.
posted by bonehead at 2:12 PM on February 25, 2016 [1 favorite]


Your right to privacy doesn't end when you do.

Ownership does, though. You cease to own your phone after your death. This isn't about the privacy of the terrorists, at least in part because they're dead.

This is one of the main problems with this issue: what is the right to privacy? What are its limits?
posted by bonehead at 2:16 PM on February 25, 2016


This isn't a privacy issue. It is a government phone.

Wired: Apple to FBI: You Can’t Force Us to Hack the San Bernardino iPhone
posted by ChurchHatesTucker at 2:18 PM on February 25, 2016


This specific case is not about privacy, but it's at the heart of the broader principles Apple is seeking to defend.
posted by Holy Zarquon's Singing Fish at 3:00 PM on February 25, 2016 [1 favorite]


This isn't a privacy issue. It is a government phone.

It most certainly is a privacy issue, and to suggest otherwise is missing the point. If it weren't for privacy considerations Apple would just comply.

Sure, the employee has no expectation of privacy on a government phone. So if that's what the government wanted, it should have set it up that way. There are provisions in place that allow for just this exact thing. The government even paid for them, but failed to utilize the services it had purchased (which is why IT infrastructure should be seen as an investment and not a line-item cost).

It's baffling that anyone can see this as not being about privacy. If Apple knew the passcode to this phone and was refusing to hand it over, the idea that no one's privacy is being invaded is one I could support. But this is like suggesting that Apple hand over the code to every phone, and claiming no one's privacy will be invaded because this one was a government phone.

Something something not seeing trees something something forest.
posted by cjorgensen at 3:07 PM on February 25, 2016 [1 favorite]


Oh, it's a privacy issue because of the abuse that will inevitably follow if the FBI is successful here. But Apple didn't cite the Fourth Amendment in its filing.
posted by ChurchHatesTucker at 3:14 PM on February 25, 2016


bonehead: Did Apple management choose this fight? They must have; they get these requests all the time. They've complied at least 70 times previously, according to some accounts.

Come on, surely you understand the difference between opening the closet door and building a locked closet door opener? The "70 times" they've complied have all been in response to subpoenas, and they've already provided the very same access to the FBI in this case - everything that was in iCloud has already been handed over.

The FBI wants more: they want the iPhone unlocked so that they can get whatever was not uploaded to iCloud, and the only way to unlock it now (thanks to the FBI's prior bumbling with the password reset - incompetence, or malice?) is for Apple to build and deploy a malicious operating system update to this phone.

If we're going to argue about this, let's at least agree on the facts at hand.
posted by RedOrGreen at 3:16 PM on February 25, 2016 [1 favorite]


Apple has chosen to tighten security with each new version of the operating system. They are continuing to do so even now. Their stated goal, as of at least two years ago, is to produce a phone which can't be compromised. They've indicated in the past week that they also want to do the same with the iCloud data.

Anyway, at least one expert thinks that the FBI might be able to crack the current phone by its own means, but the FBI has chosen this route instead. Does it really matter if it was this phone or another?

Are you arguing that Apple just stumbled into the conflict between privacy and access accidentally? That seems far-fetched. They appear to be a bit smarter than that. I don't think they chose this particular fight, or had this particular challenge in mind, but they must have been preparing to fight something like this for at least the last two years.
posted by bonehead at 3:40 PM on February 25, 2016


Humans are becoming cyborgs. Our brains and body parts are increasingly augmented by digital devices. If we do not more carefully restrict the courts' access to these devices, we will effectively lose the Fifth Amendment and our rights to privacy.
posted by humanfont at 3:43 PM on February 25, 2016 [1 favorite]


bonehead, your idea that Apple anticipated an unprecedented overreach by the Justice Department is... odd.

Techdirt: We Read Apple's 65 Page Filing Calling Bullshit On The Justice Department, So You Don't Have To
posted by ChurchHatesTucker at 5:02 PM on February 25, 2016


I invested in Apple not too long ago. I have also been a user of their products since 1987. I have professionally supported them for all of my adult life (minus three years when I was a full-time bookseller and a part-time writer). I have always used Macs. I have a $10,000 Mac Pro setup in my basement. I have every new Apple gadget you can imagine, from the iPad Pro to the newest Apple TV (only my 5s phone is older). I tend to buy the top-of-the-line options for storage and speed.

I will only continue to buy their products as long as I believe I can trust them to be secure.

I would rather lose every penny I have invested in Apple than to see them roll over in this case. I contribute to legal defense funds all the time. I sort of wish Apple would put up a gofundme page to cover legal costs so the companies and people supporting them could actually do so (I gave $200 to the defense fund of Lavabit).

I'm not sure what I will do if they lose. I for sure will stop using my phone (and iPad) for anything personal, and I'll consider pulling all my data out of AWS and hosting my websites in another country. I'm small fry when it comes to this stuff. Imagine if Amazon or Google or Microsoft felt like they could no longer operate in the US. The NSA started this; the FBI might as well finish it.
posted by cjorgensen at 6:04 PM on February 25, 2016 [2 favorites]


Silicon Valley was late to the lobbying game and it has hurt them.
posted by ChurchHatesTucker at 6:09 PM on February 25, 2016


For their latest filing Apple has brought in Ted Olson, who is a pretty heavy hitter in DC. Recently he was instrumental in the fight for gay marriage rights. He is also notable because his wife was killed in one of the plane crashes on 9/11.
posted by humanfont at 7:01 PM on February 25, 2016


Olson also won Bush v. Gore. He's good.
posted by cjorgensen at 7:36 PM on February 25, 2016


Wikipedia on Olson.
posted by cjorgensen at 7:36 PM on February 25, 2016


Boom! Just got mentioned in the Republican debate.
posted by ChurchHatesTucker at 7:37 PM on February 25, 2016


Debate thread.
posted by homunculus at 7:39 PM on February 25, 2016 [1 favorite]


Yeah, meant to say debate thread. We're not at a point where pols are citing Metafilter...
posted by ChurchHatesTucker at 7:44 PM on February 25, 2016


Ideology aside, since Apple runs iCloud, couldn't they change iCloud to allow the iPhone to perform the backup? Either by reverting to the old hash of the password (Apple doesn't know what the old password is, but they don't need to), or by changing the server-side code to blindly accept whatever password is used when coming from a specific whitelisted IP.
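
To make the second option concrete, here's a toy sketch of the kind of server-side change I mean (all names invented, since iCloud's internals obviously aren't public):

    # Hypothetical iCloud-style login check with a whitelist bypass bolted on.
    # Invented names throughout; this is the idea, not Apple's actual service.
    TRUSTED_IPS = {"198.51.100.7"}  # e.g. an FBI lab address, per court order

    def verify_login(supplied_password, client_ip, stored_hash, hash_fn):
        if client_ip in TRUSTED_IPS:
            return True  # blindly accept any password from the whitelisted IP
        return hash_fn(supplied_password) == stored_hash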

Or does iOS record the response from iCloud that login was denied, and always prompt in the future, even if the phone itself would be able to log into iCloud?

I don't think it's any different from an ideological standpoint, except that the phone is still locked and it becomes (more) publicly known that iCloud is leaky. It is also far easier to turn off iCloud backups than it is to delete the manufacturer's key from inside your phone.

The FBI's request is also interesting for its scope. They could have asked for less, and only ordered Apple to sign a file they supplied, or they could have asked for more, and demanded Apple write the brute-forcing tool for them as well, or for Apple to turn over the private key used for signing (which would be far worse).
posted by fragmede at 6:58 AM on February 27, 2016


Most software already has a “golden key” backdoor—it’s called auto update

So when Apple says the FBI is trying to "force us to build a backdoor into our products," what they are really saying is that the FBI is trying to force them to use a backdoor which already exists in their products. (The fact that the FBI is also asking them to write new software is not as relevant, because they could pay somebody else to do that. The thing that Apple can provide which nobody else can is the signature.)

My takeaway from this is that the FBI's request is too broad, but that what they want is still possible. All they need, in principle, from Apple is the ability to use Apple's key to sign their own FBI-written update.
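
To be explicit about why the signature is the crown jewel: device-side update acceptance boils down to a public-key check like the sketch below (generic code using the Python cryptography library, not Apple's actual update path; the public key ships baked into the device, so whoever holds the matching private key can make the check pass):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding

    def update_is_trusted(vendor_public_key, update_blob, signature):
        # The device installs whatever the private-key holder signed,
        # whoever that holder turns out to be.
        try:
            vendor_public_key.verify(signature, update_blob,
                                     padding.PKCS1v15(), hashes.SHA256())
            return True
        except InvalidSignature:
            return False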
posted by bonehead at 7:44 AM on February 27, 2016 [1 favorite]


Apple's likely response to this is to generate a cryptographic key that's long enough to be effectively unbreakable. Their challenge is to do so in a way that the average iPhone/iCloud user can enter it and, probably more importantly, not lose or forget it. No more 4-digit passcodes.
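
The arithmetic behind "effectively unbreakable" is stark (rough numbers, assuming an offline attacker with no rate limiting):

    import math
    print(math.log2(10**4))   # 4-digit PIN: ~13.3 bits of entropy
    print(math.log2(10**6))   # 6-digit PIN: ~19.9 bits
    print(math.log2(62**12))  # 12-char alphanumeric: ~71.4 bits
    # At a billion guesses per second a 6-digit PIN falls in a millisecond,
    # while a 128-bit key at the same rate takes on the order of 1e22 years.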
posted by bonehead at 7:47 AM on February 27, 2016


I think it's more likely that Apple will have the various PIN-code brute-force countermeasures hardwired in the chip. The current code puts in a progressive delay as more codes are tried. The result is that a phone without the wipe feature would still take a decade or more to unlock with a 6-digit passcode.
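
Back-of-envelope, assuming the chip enforced an average ten-minute delay per attempt once the back-off kicks in (the exact schedule would be Apple's to choose):

    attempts = (10**6) / 2            # expected guesses for a 6-digit code
    minutes = attempts * 10           # hardware-enforced delay per guess
    print(minutes / (60 * 24 * 365))  # ~9.5 years, no wipe feature needed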
posted by humanfont at 8:13 AM on February 27, 2016 [1 favorite]


That doesn't help the iCloud problem though.

Anything that relies on physical security would make me itchy, especially when crypto alone can be effectively invulnerable to any attack given a long enough key. Sure, it's hard to decap a chip and put readers on it now, but cracking technology gets better at roughly the same rate as chip manufacturing technology---they're very similar sets of tools.

At this level, you're really only talking about state-funded actors (law enforcement or intelligence gathering, either sovereign or sponsored). There are few criminal gangs with the capacity to blow a few tens of millions of dollars on a clean-room electronics forensics lab.
posted by bonehead at 8:22 AM on February 27, 2016


Their challenge is to do so in a way that the average iPhone/iCloud user can enter it and, probably more importantly, not lose or forget it. No more 4-digit passcodes.

Coming Q3 2017: iDent.

Filed under: drill, baby, drill
posted by flabdablet at 9:42 AM on February 27, 2016


Ideology aside, since Apple runs iCloud, couldn't they change iCloud to allow the iPhone to perform the backup?

If they could, they would. That's why they told the FBI to take the phone to a trusted network, and also why the FBI's instruction to reset the password is so suspicious. Apparently they don't log previous hashes (and why would you, normally?). One reason the phone might not have backed up recently was that its iCloud space was full, in which case all Apple would have had to do was grant that account more space.

Congressman Darrell Issa (in Wired) : Forcing Apple to Hack That iPhone Sets a Dangerous Precedent
posted by ChurchHatesTucker at 11:11 AM on February 27, 2016


OTM: Apple vs. the FBI (Included for the sake of completeness. bonehead is apparently not alone in thinking that Apple chose this fight.)
posted by ChurchHatesTucker at 5:35 PM on February 27, 2016


My takeaway from this is that the FBI's request is too broad, but that what they want is still possible. All they need, in principle, from Apple is the ability to use Apple's key to sign their own FBI-written update.

If you read Apple's response to the initial order, where they lay out what is required to make this happen, you'll see it's not just a matter of some intern knocking this out over lunch. Here's Techdirt on it: We Read Apple's 65 Page Filing Calling Bullshit On The Justice Department, So You Don't Have To

Relevant passage from that article:
The compromised operating system that the government demands would require significant resources and effort to develop. Although it is difficult to estimate, because it has never been done before, the design, creation, validation, and deployment of the software likely would necessitate six to ten Apple engineers and employees dedicating a very substantial portion of their time for a minimum of two weeks, and likely as many as four weeks.... Members of the team would include engineers from Apple’s core operating system group, a quality assurance engineer, a project manager, and either a document writer or a tool writer.... No operating system currently exists that can accomplish what the government wants, and any effort to create one will require that Apple write new code, not just disable existing code functionality.... Rather, Apple will need to design and implement untested functionality in order to allow the capability to enter passcodes into the device electronically in the manner that the government describes.... In addition, Apple would need to either develop and prepare detailed documentation for the above protocol to enable the FBI to build a brute-force tool that is able to interface with the device to input passcode attempts, or design, develop and prepare documentation for such a tool itself.... Further, if the tool is utilized remotely (rather than at a secure Apple facility), Apple will also have to develop procedures to encrypt, validate, and input into the device communications from the FBI.... This entire development process would need to be logged and recorded in case Apple’s methodology is ever questioned, for example in court by a defense lawyer for anyone charged in relation to the crime....
But even if all they needed was the key, that's an incredible ask. You would effectively be ending Apple's ability to compete in any market.
posted by cjorgensen at 7:34 AM on February 28, 2016


NYT: Apple Shareholders Show Their Support for Tim Cook

Literal applause.
posted by ChurchHatesTucker at 3:32 PM on February 28, 2016 [1 favorite]


My takeaway from this is that the FBI's request is too broad, but that what they want is still possible. All they need, in principle, from Apple is the ability to use Apple's key to sign their own FBI-written update.


And within hours of Apple writing the update for this warrant, they'll get a warrant demanding release of the update, with the means of changing which phones it can be installed onto, signed with the Apple key...

...AND delivered to them by the FISA court, with a gag order.

That's the prize being fought over. Apple chose wisely to pick this battle, which is in the public eye.
posted by ocschwar at 7:39 AM on February 29, 2016 [1 favorite]


The point is that Apple's update key is the weak link the FBI needs. If they want to take the next step and render their phones beyond this sort of request (and it is, in my view, a legal vulnerability), they need to make it impossible to update the phone's firmware, or at least the crypto-related parts of it. Apple really, truly needs to throw away the keys.
posted by bonehead at 11:36 AM on February 29, 2016 [2 favorites]


It's probably just as important that the feds be denied the ability to order companies to create things on demand.

The Verge: Read Apple's statement to Congress on the FBI warrant fight (in advance of the hearing Tuesday 1pm ET.)

WaPo: More than 25 major technology firms, media organizations and civil liberties groups are filing briefs this week in support of Apple.
posted by ChurchHatesTucker at 1:19 PM on February 29, 2016 [1 favorite]


Engadget: NY judge rules feds can't force Apple to unlock an iPhone (different case)
posted by ChurchHatesTucker at 4:27 PM on February 29, 2016 [1 favorite]


As Gruber points out, whether or not the phone holds any data worth having is irrelevant to the validity of the request.
posted by cjorgensen at 8:16 AM on March 1, 2016 [1 favorite]


I disagree with Gruber there. The Feds chose this phone to press the issue because they could claim "terrorism," and the fact that there's likely nothing of value on it lets Apple walk that back a bit. It's probably telling that in the drug case the judge ruled for Apple.
posted by ChurchHatesTucker at 8:48 AM on March 1, 2016 [1 favorite]


The Encryption Hearing just wrapped. You can watch it here or read The Guardian's Liveblog.
posted by ChurchHatesTucker at 3:06 PM on March 1, 2016 [1 favorite]


Basically, CALEA should preempt the All Writs Act.

But I would have been surprised if that press release made world-wide headlines.
posted by MikeKD at 2:43 AM on March 2, 2016


That Wired article: "The government, Apple wrote, “has not made any showing that it sought or received technical assistance from other federal agencies with expertise in digital forensics, which assistance might obviate the need to conscript Apple to create the back door it now seeks.”"

There are decent reasons why the NSA shouldn't work on US signals. Apple is saying here that that's less important (e.g., domestic use of the five/six/nine/fifteen-eyes program results, the things Snowden was talking about) than Apple's ability to keep its methods and keys secret.

I'm not saying that this is a simple trade-off between commercial and citizens' rights, but it should be recognized that this is a huge, huge ask by Apple. They're arguing that the legal limits and differences in spying between foreign and domestic sources should be erased, that the NSA should be allowed to operate on US soil, against US citizens, for Apple's commercial benefit.

This is exactly the sort of shit I'm worried about.
posted by bonehead at 11:58 AM on March 2, 2016


See, for example: How the NSA is Transforming Law Enforcement. Apple, it appears to me, is arguing that this exchange between the foreign NSA activities and the domestic FBI (and other LEOs) be expanded and accelerated, before it should be required to help the FBI.

That, to me, seems like the worst of many worlds.
posted by bonehead at 12:01 PM on March 2, 2016


From the Guardian: Congress tells FBI that forcing Apple to unlock iPhones is 'a fool's errand.' Legislators accuse Justice Department of overreaching and undermining privacy but warn Apple it’s ‘not going to like’ a congressionally mandated solution.

It's got a number of interesting admissions from the head of the FBI:
Comey said “there was a mistake made” in the FBI working with San Bernardino County officials in December to reset the phone’s password, which potentially cost law enforcement a way into the phone data and out of the impasse. But Comey said he had been assured that all the data from the phone was unlikely to migrate to iCloud.

Comey also testified that he had not considered that China might follow the US’s lead in compelling Apple to provide access to customer data, even as he is challenged to thwart Chinese-attributed cyber-attacks. “I have no doubt there are international implications,” Comey said.
posted by bonehead at 1:27 PM on March 2, 2016


This example from Brazil shows government overreach taken a bit too far - Brazil frees imprisoned Facebook exec who couldn’t decrypt WhatsApp messages:
"Dzodan was arrested after apparently refusing to provide WhatsApp messages that the Brazilian police sought in connection with a drug case. Since late 2014, all WhatsApp messages sent between Android devices are end-to-end encrypted, which means that not even parent company Facebook can access their plaintext contents.
...
Dzodan’s arrest came after Brazilian courts last month increased fines to $250,000 per day for not complying with the government’s data handover order. When Facebook would still not budge, Dzodan was arrested."
posted by cynical pinnacle at 5:27 PM on March 2, 2016 [1 favorite]


Wired: Top iPhone Hackers Ask Court to Protect Apple From the FBI
Vulnerabilities in Apple’s software have persisted for years even though Apple very much does not want them to. This is a lesson for this case
Apple, it appears to me, is arguing that this exchange between the foreign NSA activities and the domestic FBI (and other LEOs) be expanded and accelerated, before it should be required to help the FBI.

Yeah, the NSA isn't going to do that and Apple knows it.
posted by ChurchHatesTucker at 2:36 PM on March 3, 2016


before it should be required to help the FBI.

Yeah, the NSA isn't going to do that and Apple knows it.


Yeah, the DEA already called dibs.
posted by MikeKD at 12:25 AM on March 4, 2016


Parallel construction doesn't really cost them anything. They're not giving up a spycraft method.

Apple has picked up more support from tech companies, and at least one of the victims' family members (PDF).
posted by ChurchHatesTucker at 6:48 AM on March 4, 2016 [1 favorite]


WaPo: Apple VP: The FBI wants to roll back safeguards that keep us a step ahead of criminals
That’s why it’s so disappointing that the FBI, Justice Department and others in law enforcement are pressing us to turn back the clock to a less-secure time and less-secure technologies. They have suggested that the safeguards of iOS 7 were good enough and that we should simply go back to the security standards of 2013. But the security of iOS 7, while cutting-edge at the time, has since been breached by hackers. What’s worse, some of their methods have been productized and are now available for sale to attackers who are less skilled but often more malicious.
posted by ChurchHatesTucker at 7:24 PM on March 6, 2016 [1 favorite]


Conan: Steve Wozniak On Apple's Battle With The FBI (Woz was an EFF founder)
posted by ChurchHatesTucker at 7:12 AM on March 8, 2016


We're having the wrong conversation about Apple and the FBI - "We're having the wrong conversation about privacy in the U.S. The narrative is focused on how the bullies at the FBI are forcing the powers for good over at Apple to hand over data that they'd rather protect for the good of all. Here's the conversation I'd rather be having: why our government is not protecting our privacy. Instead, we are reduced to relying on an enormous, profit seeking corporation to make a stand for our rights."
posted by kliuless at 7:28 AM on March 8, 2016 [5 favorites]


Middleclasstool, that is an excellent article but the final paragraph is confusing.
posted by ChurchHatesTucker at 9:02 AM on March 8, 2016


Still, I’d hate to be the government official who has to explain this tradeoff to the mother of someone on Germanwings 9525.

And that's where I struggle with this. Apple is making this choice for everyone, in terms of the consequences of their customers' actions. Some poor schmoe at the FBI is going to be asked some day, "why couldn't you find X?" and the answer is going to be that they couldn't find any evidence of (pick your crime here) because a warrant isn't good enough to allow for legal search anymore. Real people are going to suffer because of these choices, and I think that tradeoff of privacy vs. security (or even property?) should be talked about before it's allowed to be forced on us, especially by a private company, for profit. It's not just the small cost of a few guilty going unpunished, but that some pretty heinous consequences may be enabled by perfect, unbreakable cryptography. And we're not being allowed to talk about this in a fully democratic way before it happens.
posted by bonehead at 10:35 AM on March 8, 2016


Gates, in his AMA of 8 Mar 16:
I think there needs to be a discussion about when the government should be able to gather information. What if we had never had wiretapping? Also the government needs to talk openly about safeguards. Right now a lot of people don't think the government has the right checks to make sure information is only used in criminal situations. So this case will be viewed as the start of a discussion. I think very few people take the extreme view that the government should be blind to financial and communication data but very few people think giving the government carte blanche without safeguards makes sense. A lot of countries like the UK and France are also going through this debate. For tech companies there needs to be some consistency including how governments work with each other. The sooner we modernize the laws the better.
posted by bonehead at 10:46 AM on March 8, 2016


Apple is making this choice for everyone, in terms of the consequences of their customers' actions.

This isn't a shade of gray issue. Your shit is either secure, or it's not. The FBI would love to be able to hack Jeeps on demand, but we'd be safer if no one could.
posted by ChurchHatesTucker at 12:04 PM on March 8, 2016 [1 favorite]


The FBI would love to be able to hack Jeeps on demand, but we'd be safer if no one could.

You keep assuming that's true, but that's exactly where we're disagreeing. Would the use of perfect crypto make us all safer? Would it keep more people than at present from being degraded, abused, and killed, and keep property safer and less likely to be stolen? We've not had that debate. It's not self-evident to me that perfect crypto is an unvarnished good which outweighs the need for any analysis or consideration of consequences.
posted by bonehead at 12:24 PM on March 8, 2016 [1 favorite]


In the United States, "Freedom" and "Liberty" are *supposed to be* the Default Settings.
posted by mikelieman at 12:49 PM on March 8, 2016


Naw, that's just the marketing copy.
posted by DoctorFedora at 3:03 PM on March 8, 2016 [3 favorites]


An unencrypted mobile phone didn't stop the Germanwings pilot from crashing the plane and killing all aboard. Nor does it provide much comfort for those who lost loved ones.
posted by humanfont at 5:01 AM on March 9, 2016


I'm thinking of things like the Rehtaeh Parsons case in Halifax recently, where the major evidence that led to convictions of her tormentors was contained on their cell phones.
posted by bonehead at 11:31 AM on March 9, 2016


Techdirt: The FBI Claims Failure To Guess Password Will Make Data 'Permanently Inaccessible,' Which Isn't True (The method described is what Darrell Issa alluded to during the Encryption Hearing.)
NBC: Americans Divided on Whether Apple Should Help FBI: NBC News/WSJ Poll (Dems slightly favor Apple, Repubs slightly favor FBI, Indies very much favor Apple.)
posted by ChurchHatesTucker at 12:13 PM on March 9, 2016


You keep assuming that's true, but that's exactly where we're disagreeing. Would the use of perfect crypto make us all safer? Would it keep more people than at present from being degraded, abused, and killed, and keep property safer and less likely to be stolen?


The design of the iPhone, with perfect crypto, means a stolen iPhone doesn't mean a stolen identity.

That puts a maximum of about $600 gained by robbing you of your iPhone, rather than also whacking you upside the head and using the opportunity to wire your money to places unknown.

So the case for the affirmative is pretty damn strong.
posted by ocschwar at 1:01 PM on March 9, 2016 [4 favorites]


In the Parsons case the photo that resulted in the conviction was not gathered by unlocking the cellphone. Instead an anonymous tipster told police about the photo which the perps were sending around. The latest encrypted iPhone would have made zero difference.
posted by humanfont at 1:41 PM on March 9, 2016


The latest encrypted iPhone would have made zero difference.

So what if the boys had locked phones and had refused to unlock them on the grounds of their right not to self-incriminate? If the police can't directly show possession (as they did in this case), that makes proving the case much harder.
posted by bonehead at 2:29 PM on March 9, 2016


Or, alternatively, are you saying that the right to privacy (in general) is stronger than the right to not testify against oneself---and that the courts should be OK to lock people up for not giving up their passcodes? Or immigrants at borders?
posted by bonehead at 2:32 PM on March 9, 2016


wat
posted by ChurchHatesTucker at 5:54 PM on March 9, 2016


IIRC, under Canadian law cellphone passwords are not protected and must be turned over by the user of the phone when presented with a warrant, just as you must turn over a physical key to a locked file cabinet. US courts are split on this question. Eventually the Supreme Court will have to weigh in.
posted by humanfont at 6:35 PM on March 9, 2016 [1 favorite]


It's not been tested in Canada yet either. There's a case in front of the courts right now on this very matter, in fact, regarding a refusal to turn over a password at customs. Philippon is currently on bail, facing a maximum penalty of $25k/1 year under the Customs Act.
posted by bonehead at 8:45 AM on March 10, 2016


BTW, the 2013 case I think is being referred to said simply that police needed a warrant to search a locked phone, but according to this the police cannot legally require the password from someone even with such a warrant. I think that's actually a bit optimistic given the Philippon charge above. But again, not tested in court yet.
posted by bonehead at 8:51 AM on March 10, 2016


the right to privacy (in general) is stronger than the right to not testify against oneself

To be clear, what I mean is this:

Assume that everyone has the right to privacy via perfect crypto on their phone.
Assume that the police have a warrant to search your phone, and that this is deemed reasonable by a judge.
Can you claim the right to not self-incriminate and withhold the password? Effectively, legally, is the phone part of your mind, or is it a thing which you are obligated to help the police search when asked through a legal process?
If you refuse, is it right that you should suffer some penalty, fines or imprisonment?
posted by bonehead at 10:05 AM on March 10, 2016


Business Insider: Apple exec: FBI could make us spy on Americans with iPhone cameras or microphones
"For example, one day [the FBI] may want us to open your phone's camera, microphone. Those are things we can't do now. But if they can force us to do that, I think that's very bad," Cue said, according to a translation provided by Apple.
Reuters: Senators close to finishing encryption penalties legislation: sources
posted by ChurchHatesTucker at 10:31 AM on March 10, 2016 [1 favorite]


Techdirt: White House Apparently Not Necessarily In Agreement With FBI's Position On Encryption Backdoors

Wired: Government Calls Apple’s iPhone Arguments in San Bernardino Case a ‘Diversion’
In its response today, the government accused Apple of deliberately raising technological barriers that prevent it from assisting authorities with a lawful warrant. “Apple alone can remove those barriers so that the FBI can search the phone, and it can do so without undue burden,” the government wrote. “Under those specific circumstances, Apple can be compelled to give aid. That is not lawless tyranny. Rather, it is ordered liberty vindicating the rule of law.”
posted by ChurchHatesTucker at 3:03 PM on March 10, 2016 [1 favorite]


Keep in mind that this only applies to a very narrow set of data that is only kept on the phone. The government still gets your phone calling records, stuff backed up to Google or iCloud, Facebook posts, bank records, email, etc.
posted by humanfont at 5:43 PM on March 10, 2016


"Ordered liberty" is a nicely Orwellian phrase.
posted by Joe in Australia at 7:39 PM on March 10, 2016


More Hobbesian than Orwellian.
posted by humanfont at 8:29 PM on March 10, 2016


Meanwhile the ACLU is claiming that the FBI could defeat the auto-erase feature with some clever circuit-board hacking.
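
The claimed approach amounts to imaging the flash chip that holds the attempt counter and re-flashing it before the wipe triggers. In rough pseudocode (the device methods here are hypothetical; nobody has published a working rig for this phone):

    def brute_force_with_nand_restore(phone, codes):
        backup = phone.dump_nand()          # image the flash before guessing
        for i, code in enumerate(codes):
            if phone.try_passcode(code):
                return code
            if (i + 1) % 10 == 0:           # about to trigger the auto-erase
                phone.restore_nand(backup)  # rewind the attempt counter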
posted by humanfont at 8:56 PM on March 10, 2016


Confiscating source code and private keys wouldn't disadvantage Apple at all, right?
*eyes roll out back of head*

Curious to see if Apple says anything about this at their product event on March 21st.
posted by strange chain at 4:22 AM on March 11, 2016 [1 favorite]


Is there something I'm missing that would draw a meaningful distinction here, such that pharmaceutical companies couldn't then be compelled by the government to make lethal injection drugs if the FBI wins this?

Also, man, I absolutely can't even imagine what happens to the entire computer and e-commerce industry if that happens. This feels like Net Neutrality but way more significant, inasmuch as it's effectively a referendum on whether meaningfully effective encryption should be illegal.
posted by DoctorFedora at 6:33 AM on March 11, 2016


The Right to Keep and Bear Property.
posted by mikelieman at 7:05 AM on March 11, 2016 [1 favorite]


In short: Help us before Congress ruins everything:

"We need the tech community to help us solve [this problem]," Obama said. "What will happen is, if everybody goes to their respective corners—if the tech community says, 'either we have strong, perfect encryption, or it's a Big Brother Orwellian world'—what you'll find is, after something really bad happens, the politics will swing. It'll be sloppy, it'll be rushed, and it'll go through congress in ways we haven't thought through. Then we'll have something really dangerous."
posted by bonehead at 2:46 PM on March 11, 2016


Wired: New Documents Solve a Few Mysteries in the Apple-FBI Saga
  • Farook May Have Changed the iCloud Password on His Phone (Which doesn't seem to be relevant)
  • Farook’s Phone Was Found Powered Off (Which probably is)
  • The County Had a Device Management System on iPhone (but incompletely/incompetently implemented)
  • The iPhone’s Password Was Just Four Digits (which actually makes the brute forcing via flashed memory approach a lot easier.)
  • Data Not Backed Up to iCloud Is Significant (In a very theoretical sense of "is")
posted by ChurchHatesTucker at 5:04 PM on March 11, 2016


The County Had a Device Management System on iPhone (but incompletely/incompetently implemented)
“I learned from [San Bernardino County Department of Health] personnel that the department had deployed a mobile device management (“MDM”) system to manage its recently issued fleet of iPhones, that the MDM system had not yet been fully implemented, and that the necessary MDM iOS application to provide remote administrative access had not been installed on the Subject Device,” Pluhar wrote in his affidavit. “As a result, SBCDPH was not able to provide a method to gain physical access to the Subject Device without Farook’s passcode.”
To be fair, MDM on iDevices is a by-design clusterfuck.

You can remote-reset a passcode via MDM, but only if the phone is connected to a wifi network. If the phone is turned off, and then turned on again, it will not connect to any wifi network until somebody enters the passcode.

You can remote-install an app on an iDevice via MDM, but only if the device is already logged onto the app store with a valid Apple ID. There is no way to do a completely silent over-the-air app installation. At a minimum, the device will show a popup saying that the MDM facility is about to install an app that the user's Apple ID won't be charged for, with Cancel and OK buttons; tap the Cancel button and the app doesn't install.

You can purchase apps in bulk at a discount and distribute them to your iDevice fleet, but the only way to get the app's licence tied to the device itself, rather than to whatever Apple ID the device is currently logged on with, involves having already assigned the device to a management server using the Apple Device Enrollment Program. Setting your device fleet up with the DEP is a fragile, non-automated, bureaucratic process that I have still not managed to get completed for our school in a month of trying.
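
For the curious, the commands themselves are just plists pushed at the device over APNs. A simplified sketch of a remote passcode clear (the real protocol also requires the device's UnlockToken captured at enrolment; with no MDM profile on the phone, there was never a token to use):

    import plistlib
    # Simplified MDM ClearPasscode command; a real deployment must also
    # present the UnlockToken captured when the phone first enrolled.
    command = {
        "CommandUUID": "9d1b1f0e-EXAMPLE",
        "Command": {"RequestType": "ClearPasscode"},
    }
    print(plistlib.dumps(command).decode())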
posted by flabdablet at 12:30 AM on March 12, 2016 [1 favorite]


Obama is wrong. The FBI has already sent the tech industry to "its corner." And it'll come back with a distributed approach like co-authorities and reproducible builds. It'll mean not just the FBI, but even the NSA, must reveal their hacks if they want to use code signing. And it could even mean the end of DNS seizures. It won't matter if Congress later dislikes it.
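
The co-authority idea in a nutshell: an update only installs if enough independent witnesses co-signed it, so no single signer (Apple, or an agency holding Apple's key) can quietly ship a targeted build. A toy version of the check (the real cothority/CoSi protocol aggregates Schnorr signatures rather than counting them one by one):

    from cryptography.exceptions import InvalidSignature

    def cosigned_update_ok(update_blob, cosignatures, threshold):
        # cosignatures: list of (Ed25519 public key, signature) pairs,
        # one per independent witness server.
        valid = 0
        for pubkey, sig in cosignatures:
            try:
                pubkey.verify(sig, update_blob)  # raises if signature is bad
                valid += 1
            except InvalidSignature:
                pass
        return valid >= threshold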
posted by jeffburdges at 5:02 AM on March 12, 2016 [3 favorites]


Then Apple needs to get off its high horse here and actually do that---I don't see any indication that they're budging from their perfect-encryption-vs-certain-doom rhetoric. If they can work something out that precludes the FBI telling Congress that Apple is the enemy of Truth, Justice and The American Way next time there's a horrible school shooting or whatever, the chance of a blindly stupid bill getting rammed down everyone's throats is much less.

The time for a sensible technical fix, perhaps one that is decentralized co-signing, is now, not after a 9/11 type of event. When legislatures are buffalo-scared, bad, terrible laws get made, what's technologically possible be damned. This isn't (just) a technical issue; it's mostly a legal and social one.
posted by bonehead at 12:16 PM on March 12, 2016 [1 favorite]


The "Two man rule" for code-signing? I don't see why that wouldn't be in the pipeline right now, even if kept quiet.
posted by mikelieman at 5:21 PM on March 12, 2016


The "Two man rule" for code-signing? I don't see why that wouldn't be in the pipeline right now, even if kept quiet.

I'm sure Apple does that already, albeit in-house. "Cothority" requires many, many more "witnesses." The FBI is not going to be a fan.

BTW, the discussion has largely moved here.
posted by ChurchHatesTucker at 6:10 PM on March 12, 2016 [1 favorite]


This thread has been archived and is closed to new comments