How The Law Decrypts Your Phone’s Encryption

Law enforcement agencies around the world have been whining about the “going dark” problem at least since the early 1990s, when they tried really hard to put Phil Zimmermann in jail for creating encryption that mere mortals could use. There is no question that bad folks use encryption to hide stuff, but so do good folks, and it is impossible to create a master key that will only ever be used by the good guys for good. Not going to happen.

So that leaves the police with the option of hacking your phone, which is less impossible than they often claim.

Johns Hopkins cryptographer Matthew Green led a team of experts who tore apart the secrets of smartphone encryption to see what they could find.

They studied the available documentation and did some hacking of their own. They also reviewed every news report they could find about how the cops have broken into phones in the past.

Green thought, going in, that security on Apple and Google phones was pretty good, but coming out he realized that almost nothing is protected as well as it could be.

The researchers figured that it would be really difficult to steal any of the many levels of encryption keys that iPhones use, but that turns out not to be the case.

If your iPhone is powered off and someone turns it on, the security is pretty good – what Apple calls “Complete Protection”. But as soon as you unlock it for the first time, you move from “Complete Protection” to “Protected Until First User Authentication”. That is likely the state your phone is in 99.99% of the time.

The major difference between these two states is that after the first unlock, many of the encryption keys are kept in memory. At that point, if someone can exploit your phone, grabbing those keys and decrypting the data they protect is easy.
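To see why that matters, here is a toy Python sketch of the idea. This is an illustration only – the class names, states, and key labels are mine, not Apple’s actual implementation – but it captures why keys left in memory after the first unlock are the weak spot:

```python
# Toy model of smartphone data-protection states (illustration only;
# names and structure are simplified, NOT Apple's real implementation).

class Phone:
    # Two protection classes: "complete" keys are evicted on every lock;
    # "afu" (after-first-unlock) keys stay resident once the user has
    # unlocked the phone a single time since power-on.
    def __init__(self):
        self.keys_in_memory = set()   # decryption keys currently in RAM
        self.unlocked_once = False

    def first_unlock(self):
        # User types the passcode once; class keys get derived and cached.
        self.unlocked_once = True
        self.keys_in_memory = {"complete", "afu"}

    def lock(self):
        if self.unlocked_once:
            # Only the "complete protection" key is evicted on lock;
            # the AFU key stays in RAM. This resident key is what a
            # memory-level exploit can steal.
            self.keys_in_memory = {"afu"}

    def stealable_by_memory_exploit(self):
        return self.keys_in_memory


phone = Phone()          # freshly powered on: no keys in memory
assert phone.stealable_by_memory_exploit() == set()

phone.first_unlock()     # the one unlock you do every morning
phone.lock()             # screen locks again in your pocket
# A seized, locked phone still holds the AFU key in memory:
assert phone.stealable_by_memory_exploit() == {"afu"}
```

In this simplified model, a phone seized before its first unlock yields nothing, while the same phone seized after one unlock – even locked – gives up the keys for everything in the after-first-unlock class.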

This is likely how forensic tools like Cellebrite and Grayshift work.

Android works very similarly, except that while Apple gives apps a way to keep small pieces of data – like a banking password – more strongly protected even after the first login, Android has no equivalent feature. That means tools like Grayshift can grab more data from an Android phone once you have logged in.
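Continuing the toy model above (again, the item names and class labels are mine, purely for illustration – not real iOS or Android APIs), the difference looks like this:

```python
# Toy comparison (illustration only, not real APIs): which stored items
# a memory-level attacker can reach after the first unlock.

IOS_ITEMS = {
    "photos": "afu",                  # most app data: after-first-unlock class
    "banking_password": "complete",   # apps CAN opt into the stronger class
}

ANDROID_ITEMS = {
    "photos": "afu",
    "banking_password": "afu",        # no stronger per-item class to opt into
}

def exposed_after_first_unlock(items):
    # After first unlock, only "complete"-class keys get evicted on lock;
    # everything in the "afu" class is reachable via the resident keys.
    return sorted(name for name, cls in items.items() if cls == "afu")

print(exposed_after_first_unlock(IOS_ITEMS))      # ['photos']
print(exposed_after_first_unlock(ANDROID_ITEMS))  # ['banking_password', 'photos']
```

Same attack, same phone state – but on the Android side the banking password is sitting in the exposed pile too.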

Android also suffers from fragmentation: dozens of manufacturers, hundreds of models, and many phones that have not seen an upgrade or patch in years.

When the researchers explained what they had done to the folks at Apple, Apple basically said that it designed these protections to stop street thieves, not well-funded attackers, and that it chose user convenience over security (my words, not theirs). From a marketing standpoint that makes sense, but they don’t really tell people that up front.

Google, like Apple, said these attacks require physical access (like what might happen when you cross the border and the customs officer says “papers please” and “phone please”). They also require the attacker to know about bugs that have not been patched. Google said you can expect to see additional hardening in the next release of Android.

If you think it is only the FBI or NSA that buys Cellebrite and similar tools, you are very wrong. Researchers found nearly 50,000 examples of police in all 50 states using these tools between 2015 and 2019 – and that is just what they were able to uncover. Law enforcement has not exactly volunteered that it can hack your phone at the push of a button.

Given this, you might wonder why the police are still complaining about going dark. I think it is because they cannot snoop on anything, any time, anywhere – including over the air – and until they can do that, they will keep complaining. Credit: Wired
