WhatsApp enables end-to-end encryption for a billion people
April 6, 2016 2:11 PM   Subscribe

WhatsApp is now the most widely used end-to-end crypto tool on the planet. Working with Moxie Marlinspike of Open Whisper Systems, the Facebook-owned messaging system is now using the Signal protocol for encryption.

Marlinspike spoke at Webstock last year about the recent history of cryptography and surveillance, more user-friendly implementations of cryptography, and his collaboration with WhatsApp.
posted by roolya_boolya (60 comments total) 21 users marked this as a favorite
 
We've been here before. Define "end" in "end to end". If the hardware/OS is compromised no amount of encryption in software that runs in the layer above will make you secure.
posted by lalochezia at 2:20 PM on April 6, 2016 [5 favorites]


Now if only they could let you decide who in your phone's address book is in your contact list. For example, I don't particularly care to have my landlord in my WhatsApp contacts.
posted by acb at 2:21 PM on April 6, 2016 [5 favorites]


If the hardware/OS is compromised no amount of encryption in software that runs in the layer above will make you secure.

As a serious question, are there any proposed schemes where that wouldn't be the case? Running a trusted client on untrusted hardware entirely using homomorphic encryption in memory, for example?
posted by figurant at 2:25 PM on April 6, 2016 [1 favorite]


Huh. I had no idea Facebook owned Whatsapp.
posted by feckless fecal fear mongering at 2:28 PM on April 6, 2016 [1 favorite]


If the hardware/OS is compromised no amount of encryption in software that runs in the layer above will make you secure.

Unless en/decryption is done in your brain, I should think this is an unresolvable issue.
posted by five fresh fish at 2:38 PM on April 6, 2016 [3 favorites]


Unless en/decryption is done in your brain, I should think this is an unresolvable issue.

cf. The Parliament of Birds from Gibson's The Peripheral.
posted by chimaera at 2:58 PM on April 6, 2016 [6 favorites]


If the hardware/OS is compromised ...

Yes, someone who can compromise your phone can still read all your top-secret encrypted whatsapp messages with only a little more effort than before. What they can't do, assuming we can trust the protocol itself, is feed all the text messages in the universe into a gigantic omnipresent mass-surveillance machine to archive forever. Not without a high probability of eventually getting caught.
posted by sfenders at 3:00 PM on April 6, 2016 [27 favorites]


If the hardware/OS is compromised no amount of encryption in software that runs in the layer above will make you secure.

As a serious question, are there any proposed schemes where that wouldn't be the case? Running a trusted client on untrusted hardware entirely using homomorphic encryption in memory, for example?


I don't think it matters what your app does with memory or encryption or Vault or secure enclave or anything. As a last resort, a compromised OS can ignore all that stuff and just log your keystrokes and read your screen. I'm pretty sure the answer has to be "don't have a compromised OS". So hopefully that's possible?
posted by aubilenon at 3:03 PM on April 6, 2016 [2 favorites]


aubilenon: " I'm pretty sure the answer has to be "don't have a compromised OS". So hopefully that's possible?"

It's not.
posted by namewithoutwords at 3:05 PM on April 6, 2016 [3 favorites]


For an encryption dummy, can anyone explain how you can avoid the analog hole when running on an untrusted platform?
posted by Jakey at 3:11 PM on April 6, 2016


if only they could let you decide who in your phone's address book is in your contact list.

i don't use whatsapp, but my partner does, and i was looking at her contacts (trying to get someone in her phone list onto whatsapp) yesterday. and i think this should be possible, on android. something to do with groups, that i didn't really understand. but it seems to be there.

(also, fwiw, being on your whatsapp contact list doesn't mean they can more easily decrypt your messages to another person).
posted by andrewcooke at 3:19 PM on April 6, 2016


looking at the spec, and the people involved, this seems to be the real deal (so presumably there will be complaints about aiding terrorists at some point).

(if anyone is curious, it's elliptic curve, with Curve25519, which is generally considered a good choice (RFC 7748) and gives (we all hope) 128-bit level protection. also with forward secrecy).
posted by andrewcooke at 3:21 PM on April 6, 2016 [1 favorite]
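The key-agreement idea behind the Curve25519 exchange mentioned above can be sketched with toy finite-field Diffie-Hellman. The modulus and generator below are illustrative assumptions with no security value; the real protocol uses X25519 (ECDH over Curve25519, RFC 7748), but the structural point is the same: both sides derive the same secret without ever transmitting it.

```python
import secrets

# Toy finite-field Diffie-Hellman. Illustrative only: the tiny prime below
# offers no security; WhatsApp/Signal use X25519 over Curve25519 instead.
P = 0xFFFFFFFB  # a small prime modulus (insecure, for demonstration)
G = 5           # a generator-ish base; any base shows the symmetry

def keypair():
    priv = secrets.randbelow(P - 2) + 1   # private scalar, kept on-device
    pub = pow(G, priv, P)                 # public value g^priv mod p
    return priv, pub

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# Each side combines its own private scalar with the other's public value.
# (g^a)^b == (g^b)^a mod p, so both arrive at the same shared secret.
alice_shared = pow(bob_pub, alice_priv, P)
bob_shared = pow(alice_pub, bob_priv, P)
assert alice_shared == bob_shared
```

An eavesdropper who sees only `alice_pub` and `bob_pub` has to solve a discrete-log problem to recover the secret; the elliptic-curve version makes that problem hard at practical key sizes.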


It's not.

There's at least one proposed way to detect "trusting trust" attacks. Beyond that, you can just avoid using compilers altogether. Write a tiny Forth in x86 assembly and build everything up from there, maybe.

For an encryption dummy, can anyone explain how you can avoid the analog hole when running on an untrusted platform?

You can't. This is what DRM tries to do, and it certainly hasn't worked yet.
posted by BungaDunga at 3:21 PM on April 6, 2016 [6 favorites]


I'm suspicious of systems that claim to be "end to end" encrypted when it isn't a pain in the ass to transfer your account to another device.

If your private key is stored on your phone and nowhere else, how is it possible to log into your Whatsapp account on a different phone? There should be a convoluted process where you have to directly copy the key while in physical possession of your old and new phones when you want to transfer your account. Otherwise there's a third party involved in the transfer of your private key.

I'm guessing in practice your private key is encrypted using your password and stored on the Whatsapp servers, which facilitates the private key transfer when you log into a new phone. That still leaves your password as the weak link in the encryption.

If the Whatsapp people get a National Security Letter and are required to furnish their users' private keys, the encryption would effectively be limited to the number of bits of entropy in your password, which for most people is far below whatever the current recommended standard is, and is particularly vulnerable to phishing attacks.

Along the same lines, it's also a bit suspicious when it's not a pain in the ass to add contacts. Who's to say that someone isn't doing a MITM and forwarding your messages if you're just adding a contact by their phone number? You're trusting that the public key in the Whatsapp database belongs to the person you're communicating with. Relying on Whatsapp for identity verification makes it not end to end.

I haven't read the whitepaper for this and am not a cryptography expert by any means, so I'm just speculating and I'm confident that these issues have been addressed to some degree. But it'd be nice to know exactly how and what the tradeoffs are.
posted by zixyer at 3:27 PM on April 6, 2016 [2 favorites]
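The "bits of entropy in your password" point above can be made concrete with a back-of-the-envelope calculation. This assumes characters are chosen uniformly at random, which real passwords rarely are, so these are upper bounds:

```python
import math

# Guessing entropy of a uniformly random password: length * log2(charset).
# A key wrapped under a password is only as strong as this number, however
# many bits the underlying cipher key nominally has.

def entropy_bits(charset_size: int, length: int) -> float:
    return length * math.log2(charset_size)

print(round(entropy_bits(26, 8), 1))   # 8 lowercase letters -> ~37.6 bits
print(round(entropy_bits(95, 12), 1))  # 12 printable ASCII chars -> ~78.8 bits
print(entropy_bits(2, 128))            # a true 128-bit random key -> 128.0
```

So a server-side wrapped key protected by a typical 8-character password would offer well under 40 bits against offline guessing, orders of magnitude weaker than the 128-bit level the curve itself provides.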


Of course, even if it were possible to have a fully trustworthy, un-hackable os/hardware device upon which to run your secure messaging applications, you'd still be stuck with the "analog hole" which ALSO includes the ability to compromise the HUMAN at one end or the other of a secure communication, which is why I see a lot of these technological safeguards as merely shifting the likely attack surface BACK to good old-fashioned humans.
posted by some loser at 3:28 PM on April 6, 2016


Facebook owned. Pass.
posted by Splunge at 3:35 PM on April 6, 2016 [6 favorites]


you'd still be stuck with the "analog hole" which ALSO includes the ability to compromise the HUMAN at one end or the other

This came up with the Panama Papers leaker, and they had an interesting setup. One person would ask an innocent question, and the response was some non sequitur. So, if you got the wrong question or a normal answer you knew the person was compromised or being impersonated. (They did not explain how the system was initially set up or the particulars of the question and response.)

They also used multiple services and while, again, they didn't get into particulars, it occurred to me that if they were using more than one at the same time it might make it harder to match traffic patterns.
posted by ChurchHatesTucker at 3:41 PM on April 6, 2016 [1 favorite]


Aw, you had me until "Facebook owned".
posted by hoodrich at 3:44 PM on April 6, 2016 [1 favorite]


Google's Project Vault seeks to address the untrusted platform issue.

Or you could buy an iPhone.
posted by indubitable at 3:59 PM on April 6, 2016 [1 favorite]


zixyer: just read the whitepaper -- it's short and pretty easy to understand if you know a bit about crypto. They do not store private keys on their servers. Logging in from a different device just creates a new session with the people you are chatting with. You won't be able to see the old sessions, of course.
posted by zsazsa at 4:02 PM on April 6, 2016


We've been here before. Define "end" in "end to end". If the hardware/OS is compromised no amount of encryption in software that runs in the layer above will make you secure.

It means it's encrypted when it leaves the sender and is only decrypted by the recipient. There is no central service in the middle decrypting messages. Nobody is claiming that this is the end of malware and software exploits. Can we please drop this nonsense and discuss the contents of the post?
posted by indubitable at 4:02 PM on April 6, 2016 [14 favorites]
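That definition can be sketched structurally. The "encryption" below is a one-time XOR pad standing in for Signal's actual AEAD scheme (an illustrative assumption, not WhatsApp's real cipher); the point is the data flow: only the endpoints hold the key, and the relay server only ever handles ciphertext.

```python
import secrets

# Structural sketch of end-to-end encryption. XOR with a random pad is a
# stand-in for the real cipher; what matters is who holds the key.

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

# Alice and Bob establish this key end to end (e.g. via X25519).
# The server never sees it.
shared_key = secrets.token_bytes(64)

message = b"meet at noon"
ciphertext = encrypt(shared_key, message)

# The relay server stores and forwards opaque bytes only.
server_sees = ciphertext
assert server_sees != message  # with overwhelming probability

recovered = decrypt(shared_key, server_sees)
assert recovered == message
```

Compare transport-only encryption (TLS to the server), where the server decrypts and re-encrypts and could therefore read or log plaintext in the middle.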


I'm guessing in practice your private key is encrypted using your password and stored on the Whatsapp servers

Nope. Here's the whitepaper with the details.

Also informative: WhatsApp's Signal Protocol integration is now complete (blog post from Open Whisper Systems)
posted by indubitable at 4:18 PM on April 6, 2016 [2 favorites]


zixyer: "I'm suspicious of systems that claim to be "end to end" encrypted when it isn't a pain the ass to transfer your account to another device."

Yes, your instinct is basically correct.

If my understanding of the paper is correct, this does not provide any more guarantees about the identity of the other party than WhatsApp already does (which is by trusting the server's verification of the other party's phone number) when two parties initiate contact for the first time. Each device belonging to a particular user has a different encryption key, and uninstalling and reinstalling the app on the same device generates a new one. Therefore, this can't really prevent man-in-the-middle attacks unless the server is trustworthy and the server's verification procedure (a phone number verification) is also trustworthy. As long as the server can be trusted to 'introduce' the two parties for the first time (with the same key) it is secure after that, and conversations in progress or in the past cannot be intercepted.
posted by WaylandSmith at 4:25 PM on April 6, 2016 [2 favorites]


If you want open source, avoid Facebook, etc., then you should use Signal, Splunge, hoodrich, etc. Or use ChatSecure if you distrust elliptic curves. At least ChatSecure runs fine over Orbot too; presumably Signal and WhatsApp work there for at least text.

You should however avoid all the unencrypted messengers like SMS, Facebook chat, Skype, etc. regardless because everything you send unencrypted makes the NSA's mass surveillance program look useful.

Now, if you're like me, then you probably have friends whom you cannot communicate with using ChatSecure, encrypted email, etc., or even Signal. I'd recommend communicating with those people using WhatsApp, because the alternatives include plenty of broken-ass bullshit crypto like Telegram.

We should not really trust closed-source software of course, but WhatsApp appears to be as secure as closed-source software operating in an opaque closed-source or semi-closed-source operating system can be.

Also : Hacking Team Has Lost Its License to Export Spyware
posted by jeffburdges at 4:25 PM on April 6, 2016 [1 favorite]


If my understanding of the paper is correct, this does not provide any more guarantees about the identity of the other party than WhatsApp already does (which is by trusting the server's verification of the other party's phone number) when two parties initiate contact for the first time.

Scroll down to the Verifying Keys section, they explain how you can optionally verify your contact's identity.
posted by indubitable at 4:29 PM on April 6, 2016


Yes WaylandSmith, both WhatsApp and Signal provides "deniable authentication" using their Axolotl ratchet, but the ratchet only assures that you are still talking to the same person. You must verify their identity and key fingerprint independently.
posted by jeffburdges at 4:32 PM on April 6, 2016 [1 favorite]


In order to prevent others from listening in on my phone conversations or tapping my texts, I switched over to smoke signals. My wife pointed out that anyone could see the clouds of thick black smoke that I would use to order pizza, but I was one step ahead of her - I only send my smoke signals at night!
posted by robocop is bleeding at 4:42 PM on April 6, 2016 [3 favorites]


I'd recommend communicating with those people using WhatsApp because there is plenty of broken ass bullshit crypto like Telegram.

Wait. What?
posted by ChurchHatesTucker at 4:50 PM on April 6, 2016


For those who pass on Facebook, Telegram is the alternative I found that functions similarly to Whatsapp, but isn't owned by a corporation (their ownership is actually unclear, reading through their FAQ). I use both, and like Telegram far more. It suffers from the same cloud-password security hole that was mentioned above, but that comes back to compromising the human, not the hardware or software that the human is using.
posted by thebotanyofsouls at 4:55 PM on April 6, 2016


Also, second Church Hates Tucker. Calling something broken without providing some evidence to your claim doesn't contribute to the discussion jeffburdges.
posted by thebotanyofsouls at 4:57 PM on April 6, 2016


We should not really trust closed source software of course,

Unless you want to go through the millions of lines of code in every shared library and asset that goes into a piece of open-source software by hand AND understand what each line is doing to verify it, the lecture contained in this post shows that open-source code is not as trustworthy as it is often presented to be. It is the target of ongoing efforts to compromise it at the most basic levels in incredibly subtle ways, as well as to stymie the production and release of projects with the potential to make things more difficult for certain interested third parties.
posted by chambers at 5:11 PM on April 6, 2016 [5 favorites]


A Crypto Challenge For The Telegram Developers [blog post from Moxie Marlinspike, noted cryptographer and co-designer of Signal and the Whatsapp features mentioned here]

A discussion on Hacker News following Telegram's announcement

On the CCA (in)security of MTProto [audit of Telegram's cryptographic protocols]
posted by indubitable at 5:11 PM on April 6, 2016 [1 favorite]


Look, Telegram is obviously broken ass bullshit crypto, or else WhatsApp wouldn't be intentionally crippling links to the service in a manner that would otherwise look to be rigging the market in facebook's favor, right?
posted by 7segment at 5:13 PM on April 6, 2016


...or you could read those links I posted.
posted by indubitable at 5:18 PM on April 6, 2016 [6 favorites]


indubitable, check your memail.
posted by 7segment at 7:31 PM on April 6, 2016


Creating a robust Ken Thompson hack is basically as hard as making and maintaining any other robust secure system, plus the added challenge that nobody can be allowed to detect that the system even exists. It has to understand and modify the behavior of code without incurring any measurable performance or memory overhead. And you better believe people do spend a lot of time analyzing and optimizing toolchain performance, looking at both the speed of the build itself, and the performance of the resulting binaries.
posted by aubilenon at 9:17 PM on April 6, 2016


Scroll down to the Verifying Keys section, they explain how you can optionally verify your contact's identity.

So apparently the server doesn't store private keys, but you have to re-verify the QR code to validate your contacts' identity in person whenever they switch devices.

Which is fine, but the tradeoff is that MITM attacks are possible and the bottom line is that your messages aren't protected from being visible to Whatsapp/Facebook (and by extension, any government that has the power to compel Facebook to act) until you do the contact verification process, which also is invalidated when your contact switches devices.

This seems like an important caveat that people should be informed about.
posted by zixyer at 9:38 PM on April 6, 2016


It costs real money to create or discover an operating system weakness and build an exploit around it, lalochezia, but using it risks losing it, depending upon the target. It follows that using an exploit costs money, either by amortizing the costs of the exploit itself, and/or by paying analysts to determine the risks by looking at a target's social circle.


A man-in-the-middle attack becomes visible when you verify keys, zixyer, and you can verify keys by comparing with your friend, old phone, twitter post, etc or simply reading the key out verbally over the audio connection. It follows that any investigation employing a man-in-the-middle attack has good odds of being exposed. I'd hope this gives the victim grounds for discovery proceedings, which might lead to them challenging the warrant or NSL.

We must teach people to verify keys to elevate those odds obviously, but realistically a man-in-the-middle attack costs more than exploiting the phone itself because human agents must monitor your meat space behavior more aggressively.


Afaik, Telegram has only engaged with the wider cryptographic community like a snake oil salesman, 7segment and thebotanyofsouls, as documented in indubitable's links. We trust WhatsApp far more basically because they've engaged by simply adopting serious cryptographers' work, even when that meant making small usability sacrifices around multi-device usage.

If Telegram cared, they could make the same small sacrifices by adopting the same cryptographic toolbox. If they really cared, they could go open source too. I believe those two moves would build enough trust to switch the cryptographic community's preference from WhatsApp to Telegram.
posted by jeffburdges at 1:31 AM on April 7, 2016 [1 favorite]


I'm doing a PhD in this area, so it's cool to see it come up on Metafilter. Signal, though still not formally analysed to academic standards, is pretty awesome and is actually bringing a lot of modern security notions to the large scale. The fact that WhatsApp is doing e2e encryption is really cool to me.

Telegram is full of crap. I'd love to see it go away.
posted by katrielalex at 3:11 AM on April 7, 2016 [1 favorite]


I'd rather Telegram did not just "go away" but instead decided to compete with WhatsApp on cryptography and trustworthiness.

All they'd need to do is hire some qualified engineers, like maybe some of Moxie's disciples, give them the run of the protocol, etc., and open source the results. At that point, there is a real advantage to them not being owned by facebook and not being based in the U.S.

It'd make business sense for them too because currently they're kinda a small player, but if they built themselves into a trustworthy open source social transport layer, then they could really become something other apps build upon.

In particular, if you want people to depend upon an API you maintain, which they do, then you either need money like Apple, Google, or Microsoft, so that people believe you'll stick around, or else you need to go open source.
posted by jeffburdges at 5:10 AM on April 7, 2016 [1 favorite]


Which is fine, but the tradeoff is that MITM attacks are possible and the bottom line is that your messages aren't protected from being visible to Whatsapp/Facebook (and by extension, any government that has the power to compel Facebook to act) until you do the contact verification process, which also is invalidated when your contact switches devices.

Yup, key management remains a difficult problem. On the other hand, everything is opportunistically encrypted by default for over 1 billion users. That's huge.

Now, if someone with the need to consider a state-level adversary in their opsec is using it, they have the tools to easily verify their contact's key via some side channel (meeting face to face, several of the methods jeffburdges mentioned, etc.). Whatsapp can't just MITM everything without being very quickly discovered once everyone who cares to verify finds out that their keys don't match up.
posted by indubitable at 5:34 AM on April 7, 2016 [3 favorites]
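The side-channel verification idea can be sketched as follows. Both parties derive a short "safety number" from the pair of public keys and compare it out of band (in person, over a call, via QR code). The derivation below is a hypothetical illustration, not WhatsApp's actual fingerprint format; the comparison logic is what matters.

```python
import hashlib

# Toy safety-number derivation. Both phones compute this locally from the
# two public keys; a mismatch reveals that someone substituted keys (MITM).

def safety_number(pub_a: bytes, pub_b: bytes) -> str:
    # Sort so both sides get the identical digest regardless of argument order.
    material = b"".join(sorted([pub_a, pub_b]))
    digest = hashlib.sha256(material).digest()
    # Render as 30 decimal digits in groups of 5 for easy reading aloud.
    num = str(int.from_bytes(digest, "big") % 10**30).zfill(30)
    return " ".join(num[i:i + 5] for i in range(0, 30, 5))

alice_pub = b"\x01" * 32  # placeholder public keys for illustration
bob_pub = b"\x02" * 32

# Symmetric: each side computes the same number from its own view.
assert safety_number(alice_pub, bob_pub) == safety_number(bob_pub, alice_pub)
```

A MITM who hands Alice and Bob different public keys cannot make both derived numbers match, so any out-of-band comparison exposes the attack.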


What I like about Marlinspike's stance is that he believes everyone deserves secure communications, not just people who are tech savvy. As with any security measure, if someone is determined to get into your stuff and has enough time/money then there's not a lot you can do about it. But that shouldn't mean we have no protection against casual invasions of privacy by our government or nosey roommates.
posted by harriet vane at 6:24 AM on April 7, 2016 [2 favorites]


We glossed over an important point on man-in-the-middle attacks: The Axolotl ratchet by Trevor Perrin and Moxie Marlinspike for TextSecure provides long-term forward secrecy, as opposed to the session level forward-secrecy of the ratchet in off-the-record messaging, or the single message forward-secrecy of non-ratchet based key exchanges. In other words, an attacker must run the man-in-the-middle attack from the first message and keep it going forever, or somehow convince your machines that they randomly lost contact.

A man-in-the-middle attack incurs some hourly rate because a relatively knowledgeable human agent must monitor live feeds of all your communication to ensure you do not do anything that might reveal the attack, pressing the big red "Crash WhatsApp" button if you do. There are many factors here, like several agents monitoring numerous people with machine help, but a human agent must still be able to dedicate all their time to you at a moment's notice, so they need very high availability... or very good machine learning.

These agents would still screw up occasionally, blowing some investigations, with indeterminate cost. If the program's managers were sloppy, these costs might entail political losses, like extra restrictions on NSLs, or undercover agents being killed. Ideally, they'd want electronic access to you even when you were offline too, but that entails stuff like drones, device exploits, and hardware implants. If they do not want to leave any evidence behind when they disconnect you, then they again need a device exploit... or to steal the device.

It's absolutely still necessary that you verify keys because if people do not verify keys then these costs become much cheaper, but assuming a reasonable verification rate, and a reasonable rate of existing users learning to verify, the costs of man-in-the-middle attack become extremely high with this sort of long-term ratchet. It's no longer like "We'll just man-in-the-middle this short OtR session" but "We must steal his phone so that he buys a new phone, spend many thousands watching him like a hawk for months, and then either hack or steal his phone again". It's cheaper, safer, and easier to just hack the phone in the first place.
posted by jeffburdges at 7:12 AM on April 7, 2016 [1 favorite]
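The forward-secrecy property described above comes from the ratchet's one-way key derivation. Here is a minimal symmetric hash ratchet sketching the idea (the real Axolotl/Double Ratchet additionally mixes fresh ECDH outputs into the chain on every round trip, which this toy omits):

```python
import hashlib
import hmac

# Minimal symmetric hash ratchet. Each message key is derived from a chain
# key that is then advanced and the old value discarded, so compromising
# today's state does not reveal yesterday's message keys.

def kdf(chain_key: bytes, label: bytes) -> bytes:
    return hmac.new(chain_key, label, hashlib.sha256).digest()

class SymmetricRatchet:
    def __init__(self, root: bytes):
        self.chain_key = root

    def next_message_key(self) -> bytes:
        msg_key = kdf(self.chain_key, b"msg")
        self.chain_key = kdf(self.chain_key, b"chain")  # old chain key is gone
        return msg_key

root = b"\x00" * 32  # shared secret from the initial key agreement (placeholder)
alice = SymmetricRatchet(root)
bob = SymmetricRatchet(root)

# Both sides stay in lockstep, deriving the same sequence of message keys...
keys_a = [alice.next_message_key() for _ in range(3)]
keys_b = [bob.next_message_key() for _ in range(3)]
assert keys_a == keys_b
# ...and no message key repeats, so each can be deleted after use.
assert len(set(keys_a)) == 3
```

Because `kdf` is one-way, an attacker who steals the current `chain_key` cannot run it backwards to recover earlier message keys, which is why a MITM must be in place from the first message and stay there.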


Moxie caught scabies from punks - I take that as a strong indicator of trustworthiness
posted by liliillliil at 8:17 AM on April 7, 2016 [1 favorite]


We must steal his phone so that he buys a new phone

I think you're assuming that there's nothing that Whatsapp can do on the server side to force a new session. I don't think there's anything in the spec that requires that; I wouldn't be surprised if removing your public key from their database will cause the client to silently start a new session.

Still kind of skeptical about the identity verification process. You might say that if the verification succeeds one day and fails when you try it again at a later date, it would indicate an ongoing MITM attack, but is there anything to prevent a short term attack? Just update the public key to what it was earlier after the surveillance is completed. Your contacts would switch back to the old session key and any attempted verification would succeed again. It would be possible to detect this but that's assuming that there's code in the app that does it.
posted by zixyer at 8:22 AM on April 7, 2016




Axolotl forces a new session to end the man-in-the-middle attack. There is no way to get the Axolotl root keys to converge on a common value without both a hash collision and breaking ECDH.

I'd expect that adding new devices, or restoring from backup, would be the only reason that Signal, WhatsApp, etc. start new sessions. Assuming that, any new session should warn the user that their contact is using a new device, or restoring from a backup, and that the user needs to reauthenticate.

I'm suddenly worried that, if you never authenticated before, then they do not warn you if a new session starts because the authentication state does not change. If so, that's bad.
posted by jeffburdges at 8:57 AM on April 7, 2016


Also, there are necessarily different properties for group chats, since these nice ratchets do not work there. I've heard WhatsApp uses pairwise ratchets for group chats, which beats all the group ratchets. It's still bad though if an attacker can coerce the interface into saying that you're chatting with Bob, while failing to mention that you're actually talking with both Bob and the tablet Bob lost last month. Arguably OMEMO fucked that up.
posted by jeffburdges at 9:12 AM on April 7, 2016


To address a point brought up in that article and in this thread re: open source:

There is no way to know whether WhatsApp's parent company, Facebook, has added backdoors -- or might be forced to add them at a later date. Strong crypto doesn't provide much protection if it has been subtly and invisibly compromised.

Open source is neither necessary nor sufficient to audit a cryptographic application. Cryptographic backdoors are subtle in ways that having source code will not help spot, and sometimes source code is even worse than nothing. Here's a good explanation.

re: metadata, that is true, there is not much here to obscure who you are communicating with and when.

That being said, it's important not to lose sight of the fact that this is still a huge upgrade from what came before, and it's done almost entirely automatically (upgrade to the latest version, have your contacts upgrade to the latest version, and that's all you need to get going). Regular people who are not cryptonerds get state-of-the-art encryption designed by some of the leaders in the field specifically for messaging. This will go a long way toward thwarting lazy dragnet surveillance and force intelligence agencies to exercise more discretion in who they target.

If you need the absolute cutting edge in security, you upgrade your opsec (use other free applications that don't make any tradeoffs against security, like Signal, Pond, PGP, OTR; use more secure endpoints like an iOS device, for example, etc.) and accept that you're going to deal with bigger hassles to maintain security for whatever it is that you're doing.
posted by indubitable at 10:05 AM on April 7, 2016 [1 favorite]


I'd expect that adding new devices, or restoring from backup, would be the only reason that Signal, WhatsApp, etc. start new sessions. Assuming that, any new session should warn the user that their contact is using a new device, or restoring from a backup, and that the user needs to reauthenticate.

I'm suddenly worried that, if you never authenticated before, then they do not warn you if a new session starts because the authentication state does not change. If so, that's bad.


It's the ambiguity that bugs me. You would think that new sessions are only created if someone changes devices or clears the app cache by reinstalling the app, but are these the only legitimate scenarios? The whitepaper just lists those as examples of scenarios where the session is restarted, but doesn't specify whether there are others.

I'd expect if you've never authenticated it's not going to show you the "this conversation may not be secure" message because that would be bad for branding. It would mean showing users a scary message in all of their chats (until they authenticate with their contact in person or via a back channel), and it also doesn't seem likely that it'll show a warning message if an unauthenticated session restarts.

I'd like to know what the scary message actually looks like if you have authenticated and a session has restarted: does it show up prominently in the chat window or do you have to pull up a property dialog to check? From what I've seen of modern app design trends, you'd probably just get a visually low-key open padlock icon somewhere in the interface.

That being said, it's important not to lose sight of the fact that this is still a huge upgrade from what came before, and it's done almost entirely automatically

I see your point, but my issue is that a commercial company is promoting this upgrade as "end to end" encryption while glossing over the fact that it isn't under typical usage patterns. The measures that users have to take to ensure that the encryption is truly end-to-end should be totally clear. Shouldn't be hard to add a line to the announcement like "To ensure complete security, you will need to authenticate your contacts and check for the padlock icon every time you send a sensitive message." or whatever the procedure actually is.

Instead, this is what they wrote:
"And if you're using the latest version of WhatsApp, you don't have to do a thing to encrypt your messages: end-to-end encryption is on by default and all the time."
posted by zixyer at 10:29 AM on April 7, 2016


Since everyone is just speculating -- here's what actually happens. I updated WhatsApp and did a little back-and-forth with Mr. antinomia. Nothing new. Then I navigated to Settings->Account->Security and enabled "Show Security Notifications". When I went back to continue chatting it gave me a message about things being encrypted with buttons to learn more. When the hubby updated his app, I got "Mr. antinomia's security code changed. Tap for more info." And we were able to verify by scanning QR codes on each others' phones.

I have a group chat session going with some other folks who I haven't verified with, and it only says that the session is secured (really?) and pops up a message that a security code has changed for anyone who upgrades (I'm assuming that's what prompts it).

So tl;dr, it would be good for them to not just say "This is now end-to-end encrypted." without letting you know that you might want to verify, or to say that a security code has changed without also letting you know that that means you should verify before you continue to chat to maintain security. And security notifications are not on by default.

I'm used to using Signal, so this for me is an upgrade since the UI is a little better, and I can use the Whatsapp web thing and type with a real keyboard when I'm by one, and otherwise the verification steps feel like signal but with the added convenience of QR codes.
posted by antinomia at 10:48 AM on April 7, 2016 [4 favorites]


Thank you very much for checking on that. It's about what I expected.
posted by zixyer at 10:54 AM on April 7, 2016


I'm maybe okay with either security notifications being off by default, or slightly overly simplified notifications, but if a security notifications setting exists then you should damn well do it properly once they turn it on.

In particular, an Axolotl-based system should notify a user whenever a session restarts. It's easy to have a message like "Your session has restarted. This probably means Joe bought a new phone!" Anything that gets users to discuss the session restart makes executing a man-in-the-middle attack much harder. It's hard for an attacker to know what a user does or does not know, much less what they'll know in the future.
posted by jeffburdges at 11:46 AM on April 7, 2016


Leak of Senate encryption bill prompts swift backlash - "Security researchers and civil liberties advocates on Friday condemned draft legislation leaked from the U.S. Senate that would let judges order technology companies to assist law enforcement agencies in breaking into encrypted data."

-Senate encryption bill draft mandates 'technical assistance'
-Tech and Privacy Experts Erupt Over Leaked Encryption Bill
-"The thing about end-to-end crypto is that it increases the importance compromising the *ends*."
posted by kliuless at 1:57 PM on April 8, 2016 [1 favorite]




Absolutely hilarious image about the various encrypted messengers and Telegram.
posted by jeffburdges at 1:47 AM on April 10, 2016 [1 favorite]




Uber gave U.S. agencies considerable amounts of data on riders
I read it as likely anonymized, but it does not say that, so who knows.
posted by jeffburdges at 7:01 PM on April 12, 2016




Chrome has had online revocation checking disabled by default since 2012 and instead uses its own push-based system that isn't vulnerable to the same attacks and privacy leaks.
posted by mbrubeck at 11:51 AM on April 13, 2016 [2 favorites]




This thread has been archived and is closed to new comments