Tag Archives: Crypto Backdoor

Crypto Backdoors are Good – Except When The Other Side Has Them

Attorney General Barr and FBI Director Wray have been lobbying strongly for companies such as Facebook and Google to add backdoors to their cryptography so that they can eavesdrop on conversations when they need to.

But there are problems with backdoors to encryption.

Mostly, you cannot control who uses them.

Case in point: Huawei.  The U.S. says that Huawei has a backdoor into their telephone gear – one which, I might add, the U.S. requires them, by law, to put there – so this is not the first crypto backdoor rodeo.

But now the U.S. says that Huawei is using that backdoor that we made them install.  Probably on behalf of the Chinese government.

It is not clear to me why the U.S. thinks that if we make Google or Facebook or some other company install a crypto backdoor that we will be the only ones that use it.  That puts companies in a bind when some non-friendly government makes them decrypt conversations that might get people killed.

All this is just a lead in to today’s post.

There is a Swiss company, Crypto AG, that built encryption hardware for governments.  Apparently the crypto was pretty strong. And the company, being neutral, sold it to countries that the U.S. was friendly to.  And not friendly to.

So how could we break the crypto?

Secretly, the CIA, in partnership with West German intelligence, bought the company.  This enabled them to do, well, whatever they might want to do.  Such as sabotaging the software so that Germany and the U.S., as well as some other governments, could read other governments' supposedly secure communications – ones that were protected by systems that those governments paid Crypto AG a lot of money to secure.

Talk about supply chain risk.  Holy cow.

Crypto AG sold their systems to as many as 120 countries, so, for the CIA, it was a target rich environment.  They knew what agencies in which governments were using their systems and had installed backdoors to allow them to decrypt those supposedly secure messages.

In this case, it was the good guys who had the master key, but they were reading the messages of our allies in addition to our adversaries.

If they didn’t sell their systems to the good guys, the bad guys would get suspicious.

But this is kind of how the spy business works.  Sometimes collateral damage is OK.

But this is also the problem with crypto backdoors.  Once you have them, it is hard to control how they are used.  Source: Washington Post

Security News for the Week Ending July 26, 2019

Equifax Agrees to Pay UP TO $700 Million to Settle Breach Lawsuits

First – the settlement hasn’t been agreed to by the court yet, so this is all speculation.

Of the $700 million pot, at least $300 million is set aside to pay damages to consumers.  Another $100 million plus is to pay for credit monitoring.

There are lots of details.  For the most part, unless you can prove damages and prove that those damages were caused by the Equifax breach and not some other breach, you probably will not get paid much.  You can get paid up to $250 if you file a claim, even without proof.  Everything past that requires proof.  With 150 million victims and a $300 million pot, that averages to $2 a person.
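The back-of-the-envelope arithmetic is simple enough to sketch (the real per-person payout would depend on how many claims are actually filed):

```python
# Figures from the reporting above: $300 million minimum set aside for
# consumer damages, roughly 150 million breach victims.
consumer_fund = 300_000_000
victims = 150_000_000

average_payout = consumer_fund / victims
print(f"${average_payout:.2f} per victim")  # prints $2.00 per victim
```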

BUT there is one thing you should do and that is get the free credit monitoring.    Go to EQUIFAXBREACHSETTLEMENT.COM and wait until it says that the court has approved it.  Note this is not a site owned by Equifax and given what a mess they are, this is good.  Read more details here.

The Next NSA Hacker Gets 9 Years

Harold Martin, the NSA contractor (employed by Booz, like Edward Snowden), was sentenced to 9 years for stealing 50 terabytes of data over the course of his 22-year NSA career.  The haul is something like 5 times the size of the Snowden leak.  He didn't sell it; he just liked data.  He had so much he had to store it in sheds in his back yard.  Many of the documents were clearly marked SECRET and TOP SECRET.

The fact that he was able to steal hundreds of thousands of documents doesn't say much for NSA security, which is sad.  Source: Nextgov.

Huawei – Bad – Not Bad – Bad?!

President Trump said that Huawei is a national security threat and needs to be banned and then he said that maybe we can trade that threat for a better deal with China on trade.

Now it is coming out that Huawei helped North Korea build out their current wireless network.  The equipment was shipped into North Korea by Chinese state-owned Panda International.  This has been going on since at least 2006, and Huawei is likely continuing to provide technical support to North Korea.

This seems like a national security threat and not a bargaining chip for the President to toss in to get a trade deal that he wants, but what do I know.  Source: Fox News.


AG Barr Says He Wants an Encryption Back Door – And Why Do You Need Privacy? Just Suck It Up.

Attorney General William Barr said this week that if tech companies don't provide a back door into consumer encryption, the government will pass a law forcing it.  And while this will allow hackers and Chinese spies to compromise US systems, he thinks it is worthwhile.

He said that they might wait for some terrorist event that kills lots of people and blame it on encryption (whether that is true or not).

He did seem to exclude “custom” encryption used by large business enterprises, whoever that might include.

Barr said that bad guys are using crypto to commit crimes that the police can't investigate.  If that were true, we would expect crime to be going up.  If it is a really bad problem, it would be going way up.

Only problem is that the statistics say crime is going down.

You may remember that Juniper added such a back door, likely at the request of the NSA, and it worked great until word got out about it and hackers had a field day.

This conversation is not over.  Source: The Register.


Proof A Government Crypto Backdoor Is A Really Bad Idea

It was really only a matter of time.  As the FBI (but interestingly NOT the CIA or NSA) keeps pressing for a crypto backdoor – or whatever they would prefer to call it to make it seem more palatable  – and security experts keep saying this is a really, REALLY bad idea, the universe decided this week that it was time to explain why it is such a bad idea.

In this case, it didn’t even involve a back door but rather a challenge in keeping secrets secret.  Keeping secrets secret is really what a crypto backdoor is all about.  What the government would like is a secret key that only they would have that would allow them to unlock any crypto that they wanted unlocked.

Ignore for the moment the warrant issue – that is to say, can we trust the government to only use that key when they should use it – and instead focus on whether the government – which seems to have trouble keeping anything important secret – can keep something as sensitive as that secret.  FOREVER!  If the secret key is compromised 10 years from now, then anything protected with that key in the previous ten years would be exposed.  And that assumes that somehow you could change the key going forward INSTANTLY in a couple of billion users' systems so that new data would be safe.  Not likely to be possible.
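To see why key exposure reaches backward in time, here is a toy sketch – SHA-256 in counter mode as a stand-in cipher, an illustration only and emphatically not real cryptography – of a message protected under a single hypothetical escrow key:

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy keystream: SHA-256 in counter mode. Illustration only."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))

decrypt = encrypt  # XOR stream cipher: the same operation works both ways

# Year 1: a message is protected under the hypothetical escrow key.
escrow_key = b"master-escrow-key"
ciphertext = encrypt(escrow_key, b"nonce-2015", b"attack at dawn")

# Year 10: the escrow key leaks. Every ciphertext ever recorded under it
# is now readable -- rotating the key going forward fixes nothing.
print(decrypt(escrow_key, b"nonce-2015", ciphertext))  # prints b'attack at dawn'
```

The recorded ciphertext never changes, so no amount of future key rotation protects traffic that was already captured.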

OK.  So what happened?

For the last few years, something called UEFI, or the Unified Extensible Firmware Interface, has allowed software vendors to better protect the software that is loaded into our computers.  Using the UEFI spec, vendors can make sure that the software that is loaded when we boot our computers hasn't been changed since it left the factory.

They do that with encryption and, as is usually the case, this requires something that is secret.  In this case, a secret key.  Anyone who has that secret key can sign any piece of software that they want and the UEFI boot process will say that the software is valid and secure, even though it is not.

As a result, that key has to be kept really, really secure.
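The signing check described above can be sketched in a few lines.  Real Secure Boot uses asymmetric (RSA) signatures and X.509 certificates; the HMAC here is just a stdlib-only stand-in, but the moral about the key is the same:

```python
import hashlib
import hmac

# Hypothetical stand-in for the vendor's signing key.  Whoever holds this
# key can "bless" any image the boot process will accept.
SIGNING_KEY = b"vendor-private-signing-key"

def sign_image(firmware: bytes) -> bytes:
    return hmac.new(SIGNING_KEY, firmware, hashlib.sha256).digest()

def boot_check(firmware: bytes, signature: bytes) -> bool:
    expected = hmac.new(SIGNING_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

legit = b"official bootloader v1.0"
sig = sign_image(legit)
assert boot_check(legit, sig)            # factory image boots

tampered = b"official bootloader v1.0 + rootkit"
assert not boot_check(tampered, sig)     # modified image is rejected

# But anyone who steals SIGNING_KEY can sign the rootkit themselves:
evil_sig = sign_image(tampered)
assert boot_check(tampered, evil_sig)    # malware now passes the check
```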

Only one problem.  Microsoft kind of blew it.

Without going into the technical details (if you are interested in those, click on the link for the source article), Microsoft let some “policies” loose in the wild that allow anyone who has access to them to bypass this cryptographically secure boot process for Windows computers.  Technically, this isn’t a “golden skeleton key” problem, but rather something related to it.

When the researchers contacted Microsoft in March, Microsoft, apparently, declined to do anything about it.

In July, after Microsoft decided that the golden skeleton key was, so to speak, out of the bag, they released a patch, MS16-094, which adds the policies that were accidentally exposed to a “revocation list” that is checked during boot.  The revocation list is there just in case something like this happens.
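The revocation-list idea itself is simple; the sketch below uses hypothetical names and made-up policy contents (the real mechanism lives inside the Windows boot manager):

```python
import hashlib

# Hashes of policies that must never be allowed to load again.
REVOCATION_LIST = set()

def policy_hash(policy: bytes) -> str:
    return hashlib.sha256(policy).hexdigest()

def load_policy(policy: bytes) -> bool:
    """Return True if the boot process should accept this policy."""
    return policy_hash(policy) not in REVOCATION_LIST

leaked_debug_policy = b"secure-boot: testsigning=on"   # made-up content

assert load_policy(leaked_debug_policy)        # before the patch: loads fine

# An MS16-094-style fix: add the leaked policy's hash to the list.
REVOCATION_LIST.add(policy_hash(leaked_debug_policy))
assert not load_policy(leaked_debug_policy)    # after the patch: refused
```

The catch, as the next paragraphs describe, is that revocation only works if you manage to revoke everything that leaked.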

It turns out that really doesn’t solve the entire problem, only part of it.  So this month Microsoft released patch MS16-100 to revoke more stuff, but, apparently, that doesn’t really solve the problem either.

So, we are told, Microsoft is going to release yet a third patch next month to try and get the genie back in the bottle.

Remember, all this time that Microsoft is trying to rebottle the genie, our systems are exposed.

According to sources, someone is scheduled to release a tool next week that will bypass this month’s patch and allow hacked code to be loaded.

This article is not really about Microsoft’s blunder.  While it is bad, it is not horrible, and most people don’t really care.

What the article is about is the challenge of keeping something that is really, really important secret for a really, really long time.  The reality is that it is not possible to do. To be fair, if you put the secret in a vault in Fort Knox and never opened that vault, then your odds of keeping the secret are much better.  But in this case, the secret will be used to unlock thousands or millions of data objects over the years, so the Fort Knox analogy won’t work.

If the government decides to go along with this, besides the fact that enforcing it against companies that make software and who are not located in the United States is going to be a tad bit difficult, I predict a breach will occur within the first couple of years that will make the Target, Home Depot, Anthem and OPM breaches look like no big deal.  Just sayin’.

Information for this post came from The Register.

How Would Congress’ Effort To Install Crypto Backdoors Actually Work?

While the question of how crypto backdoors would work is unknown, since there are no actual proposals on the table at this time, I am concerned that it will turn into a disaster.  Partly this is because Congress does not understand technology.  Out of 500-plus Congress critters, only five have a computer science degree.  While that is not surprising, it means that mostly lawyers will be writing laws about something they know almost nothing about.

Option 1 – Force Apple and Google to install secret backdoors into their phones.

One approach would be a skeleton key – one single key that unlocks all phones past, present and future.  That option would be a disaster, since if that key got into the wild, every phone ever made would be compromised.  Hopefully, that is not the option chosen.

Another option would be to have a key per phone.  When you make the phone, you create a key for it, put the key in a mayonnaise jar on Funk & Wagnalls’ back porch (to quote Johnny Carson) and open that mayonnaise jar if asked.  If this were done, we would need to securely store around two billion keys between Apple and Android phones, growing by hundreds of millions a year.  We could ask the government to store them.  I am sure that would be secure.  Maybe the OPM could do it for us?  Alternatively, the manufacturers might keep them.

The third option might be to have the key algorithmically derived, so that you would not have to store the keys at all.  That would mean keeping the algorithm secret – otherwise anyone could decrypt a phone – and that is not likely possible.
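A sketch of that third, derived-key option – all names here are hypothetical, and HMAC stands in as a simple key-derivation function:

```python
import hashlib
import hmac

# The master secret the whole scheme hinges on.  If this (or the algorithm
# it is baked into) ever leaks, every device's key can be derived.
MASTER_SECRET = b"escrow-master-secret"

def device_unlock_key(serial: str) -> bytes:
    """Derive a per-device key from the master secret and the serial number."""
    return hmac.new(MASTER_SECRET, serial.encode(), hashlib.sha256).digest()

key_a = device_unlock_key("PHONE-0001")
key_b = device_unlock_key("PHONE-0002")
assert key_a != key_b            # every device gets a distinct key, no database

# The catch: anyone holding MASTER_SECRET can re-derive any device's key.
assert device_unlock_key("PHONE-0001") == key_a
```

The appeal is that nobody has to store two billion keys; the flaw is that the entire scheme collapses to keeping one secret, forever.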

I don’t think that anyone has actually come up with a way to do this that would work.  I am open to possibilities, but haven’t heard one.  Neither have many, many cryptographers who are a lot smarter than I am.

How do we deal with the close to two billion phones that are out there?  In this situation, Apple is a little easier to deal with than Android.  Since Apple users tend to keep their software more current than Android users, you could, possibly, push an update to the close to a billion iPhones, installing the backdoor.  Not to mention the couple hundred million iPads.  NOT!

In the Android world the problem is harder.  There are still hundreds of millions of Android phones running version 2 of the operating system even though version 6 is the current version.  Do you really expect each phone manufacturer to dust off their software archives and update that antique software?  Not likely.

Then there is the question of who is going to pay for the creation – and more importantly – the ongoing maintenance of this huge intelligence network.  I assume Congress doesn’t want to pay for it, but I certainly don’t want to either.  The cost would likely be in the billions of dollars if not more.

And what about phones that are not made in the US?  Do we really have any leverage to force Chinese manufacturers that sell knock off Android and iPhone clones to do anything that the US wants?  I didn’t think so.  So maybe the objective is to reduce the sales revenue of US phone manufacturers?

But now the real problem.  Encryption is implemented in software in millions of applications.  These applications are written by tens of thousands of developers all over the world.  Many of them are open source meaning the developers don’t have any money to do anything and do not have a company to force to do anything – assuming you can even find these people.

If you don’t remove the encryption from software, cracking the iPhone or Android phone is basically useless.

Maybe Option 2 is to ban all software that does not have an encryption backdoor.  How exactly do you do that?  There are likely thousands of new applications released every week – some in the US, but many more outside the US.  Maybe we should block all non-US IP addresses so that we can make sure that terrorists don’t download software from non-US companies or developers.  Maybe we should rename the Internet to the USNet.  Maybe we should pay someone to check every new application that is available on the Internet to see if it has a backdoor.  That would be good for the economy – the government would have to hire tens of thousands of computer experts.  Nah, that’s not going to happen.

Another issue is cost.  When Congress did this the last time, in the 1990s, it was called CALEA.  It was Congress’ attempt to install a backdoor into all phone switches sold in the United States to commercial phone companies (the Ma Bells in particular).  There were a handful of phone companies and another handful of phone switch manufacturers.  Congress agreed to pay for the insertion of the backdoors.  They allocated a billion dollars in 1990s money and ran out.  They had to get another billion to finish the job.  And, I think, it took around 10 years to complete.

Fast forward to 2015.  Instead of 10 phone switch manufacturers you have, say, 100,000 software developers.  Instead of a product that is sold through a sales force, installed in known locations (the phone company central office) and maintained by a paid technical staff, you have products that are given away (open source) by people who do not have any paid staff, that are not physically delivered at all and come from all over the globe.  ASSUMING you could do this, how much would it cost?  Of course, you can’t do it.

And what about software made in other countries that don’t have laws like whatever this Frankenlaw might be?  A few countries – like England for example – might be persuaded to pass a similar law, but other countries – like Germany – are actually moving in the other direction saying that strong encryption is a good thing.

What about software made in Russia?  Ukraine?  China?  And many other countries that are not friendly to the US?  They are not likely to comply.

And, already ISIS has released their own software.  It is encrypted, of course.  Maybe we can ask Daesh (as they do not like to be called) to insert a backdoor for us and give us the keys.  Let me think about that.  Nope. Not gonna happen.

So, in the end, Congress will be able to thump their collective chests and say how wonderful they are and it will do nothing to help fight terrorism other than to make Bin Laden right even years after his death.  Remember that he said that he wanted to bleed us to death?  Well, he certainly is succeeding.  Even in death he is succeeding.

Stay tuned because no one knows how this play will end – tragedy or comedy?  Not clear.


Information for this post came from Network World.

The Government Wants Us To Believe They Can Keep A Crypto Back Door Secret …

FBI director James Comey has been telling everyone that the world will end unless every company around the world provides the FBI and only the FBI a back door to allow them to decrypt your communications.  This includes countries we like and ones we don’t like.
So far the world isn’t listening to him, but the Justice Department is not giving up the quest.

In their defense, they will have to use other tactics if they cannot browse through your digital life at will.  The evidence shows that they already have done that when they needed to.  It is just more complicated and time consuming.

There are two problems with their fantasy of crypto back doors.

The first is to think that they really can abolish software that does not provide a back door.  There is an article in Boing Boing, linked below, that talks about the challenge of policing billions of app downloads, some from well known app stores and some from app stores that don’t even have a web site name – only an IP address.  Do you think terrorists will voluntarily use software that they know the U.S. government can tap?  Maybe the FBI is that foolish, but I am not.  The Boing Boing article goes into great detail explaining why this is a pipe dream.

That of course doesn’t stop the FBI from asking for a back door.  They are apparently pretty smooth about it according to Nico Sell of Wickr;  she talks about it in the PC Magazine article linked below.  While Wickr told them to pound sand, apparently AT&T was more than cooperative with the NSA, going back ten years before 9-11 (see second ARS link below).  The deal with AT&T was so cozy that the NSA apparently told their agents to be very polite when visiting AT&T facilities.

The second is the fallacy that the government, any government, can keep a secret for any extended period of time.

This past week, the government gave us proof that their goal of keeping secrets secret is unlikely to be successful for very long.

Since 9-11, the TSA has required that passengers traveling by air only use padlocks that have a TSA bypass mechanism so that, if the TSA suspects there is a bomb in your suitcase, they can open it and look.  This is a back door into the physical world rather than the cyber one, but it is a perfect example of the problems with back doors.

There have been numerous complaints, lawsuits and payments by the TSA as a result of TSA employees stealing things out of passengers’ luggage using these physical back doors.

This past week, the TSA, in an amazing act of stupidity, allowed the news media to photograph these same master keys.  The media, doing what the media does, published the pictures on the web.  Within a few days, hobbyists created a CAD file that allows anyone with access to a $1,000 3D printer to print one of these master keys.

Compare this to the FBI accidentally or maliciously exposing the crypto back door keys.  The cost to use these accidentally exposed crypto keys is zero.

But there is a MUCH bigger problem with the crypto back door.  With the luggage locks, everyone now knows that these locks are no longer secure and can stop using them.  You can’t use those keys to open the suitcases that were in airports last month or last year.

HOWEVER, those purloined or accidentally exposed crypto keys could be used to decrypt files sent years ago.  Even if the government were to somehow discover that the keys had been exposed and magically snap their fingers and get new keys to every software manufacturer the next day (think about the logistics of that), every communication ever sent that could be opened using that compromised back door key is compromised.  AND, there is no way to undo that, because those files and communications are out of your control.

Which is why the idea of a crypto back door is insanity.

Ignoring the fact that the bad guys will continue to use crypto that doesn’t have a back door.  Ignoring that minor detail.

Luckily, Congress seems, for the moment, to understand this problem.

You decide.   Would you trust a government that can’t even keep a padlock key secret to keep a crypto key that opens billions of communications secret?


Information for this post came from ARS Technica, boing boing, PC Magazine, ARS Technica and Wired.