Government Forced Tech Companies To Hand Over Source Code and Private Keys

Unlike the very public fight between the FBI and Apple, the U.S. Government has made many quiet attempts to force tech companies to turn over source code and private encryption keys.

In some cases, this was done through civil cases sealed by the court; in others, through a secret order from a secret court, an order that, in many cases, even the CEO or the board of the company cannot be told about.

According to ZDNet, its source has “direct knowledge” but cannot be named because the information revealed is likely classified.

The source said that the tech companies are losing their cases in the FISA court “most of the time”.

The Justice Department did admit that it has demanded source code and private encryption keys before, which seems to validate what the source told the media.

One very public case was that of Lavabit, which decided to shut down its service and erase its disks rather than turn over the information.

A Justice Department spokesman declined to say whether the department would demand source code and encryption keys in the future.

While I doubt the Justice Department would give that source code or those keys to a rival, it is certainly possible that the code could be stolen by hackers.  After all, sensitive information in government custody has been breached on numerous occasions.

Depending on how the encryption is implemented, revealing the keys MAY allow the government to decrypt information captured in the past.  There are ways to mitigate that, such as forward secrecy (negotiating a fresh, throwaway key for each session), but most companies don’t use them.  Many companies, such as Google and Microsoft, want to be able to decrypt your data so that they can serve up better ads for you.
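The mitigation alluded to above is forward secrecy: each session negotiates its own short-lived key and then discards it, so a long-term key surrendered later is useless against recorded traffic. Here is a toy sketch of the idea using ephemeral Diffie-Hellman; the parameters are far too small for real use and are chosen only for illustration.

```python
import secrets

# Toy Diffie-Hellman parameters -- FAR too small for real security,
# used only to illustrate ephemeral (per-session) key agreement.
P = 0xFFFFFFFB  # largest prime below 2**32
G = 5           # generator (demo value)

def ephemeral_keypair():
    """Generate a fresh private/public pair for ONE session."""
    priv = secrets.randbelow(P - 2) + 1
    pub = pow(G, priv, P)
    return priv, pub

def shared_secret(my_priv, their_pub):
    """Both sides derive the same session secret."""
    return pow(their_pub, my_priv, P)

# Each session generates brand-new ephemeral keys...
a_priv, a_pub = ephemeral_keypair()
b_priv, b_pub = ephemeral_keypair()
session_key = shared_secret(a_priv, b_pub)
assert session_key == shared_secret(b_priv, a_pub)

# ...and then discards the private halves.  A long-term key handed
# over later cannot recover this session's secret.
del a_priv, b_priv
```

Because the private halves are thrown away after the session, there is nothing left to subpoena that would decrypt previously captured traffic.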

The Justice Department might use that source code to create a fake honeypot web site to lure in a suspect, or it might use it to look for security holes in order to obtain information.  It is unlikely that the government would tell the company if it did find any security holes.

While most of the tech companies contacted by ZDNet refused to comment, Cisco did say that it has not and will not hand over source code to any customer, governments included.

IBM said that it does not provide source code or encryption keys to the NSA or any other government agency for the purpose of accessing client data (emphasis added).  IBM would not say if source code had been handed over to the government for any other reason.

Apple said in court recently that it has never revealed its iOS source code to any government.  I am not sure what that means about OS X.  That document was related to a concern that Apple had agreed to security checks from China, including turning over source code.

FISA Court orders are so secretive that only those people necessary to execute the order may be told about it, and that may not include the C-suite or the board.

Documents leaked by Edward Snowden certainly indicate that companies seem to cooperate with the feds in placing backdoors in their code and then go “Oh, My!” when the backdoors are discovered.

Depending on your level of paranoia, you will need to make your own decisions about protecting yourself, but I would certainly suggest that if the vendor holds the encryption key, it is likely that it would turn the key over to the government if asked.  Whether vendors would do the same for foreign governments is less clear, but it is certainly a concern.

Information for this post came from ZDNet.


Feds Bust 100 for ID Theft – Unfortunately, A Drop in the Bucket

The FBI charged 104 people, operating in South Florida, with identity-theft-related offenses.  Among them was a former secretary for Jackson Health System, who played a key role in stealing patient records that were used in a tax fraud scheme.

The group stole records on 24,000 patients in an effort to obtain false tax refunds.  The secretary, Evelina Sophia Reid, was suspended in February 2016, a year ago, on suspicion of stealing patient records; she was arrested this month.  I don’t know, but I suspect it took the feds a year to connect the dots.  Possibly she was helping the feds.

The 104 defendants represent 81 cases where 30,000 people’s information was stolen.

In case you are wondering why these people were doing this, they planned to collect $60 million in fake tax refunds.

If you do the math, however, $60 million split among 104 defendants averages out to a bit over half a million per person.  That’s not very much given the risk, it seems to me.

Jackson Health is Miami-Dade’s public hospital system.  How, exactly, a secretary had permission to access and download records for 24,000 patients seems to represent a bit of a security hole.
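For illustration only, one control that might have caught this much sooner is a simple bulk-access alarm on the audit log: flag any user who touches far more distinct patient records than their job plausibly requires. The log format, names, and threshold below are all hypothetical.

```python
def flag_bulk_access(access_log, threshold=100):
    """Return users who accessed more than `threshold` distinct
    patient records.  A hypothetical audit check; real systems
    would also weigh role, department, and time window."""
    seen = {}  # user -> set of distinct record ids accessed
    for user, record_id in access_log:
        seen.setdefault(user, set()).add(record_id)
    return [u for u, recs in seen.items() if len(recs) > threshold]

# Example: one user touches far more records than everyone else.
log = [("alice", i) for i in range(12)] + \
      [("bob", i) for i in range(5000)]
print(flag_bulk_access(log))  # ['bob']
```

The point is not this particular heuristic but that routine review of access logs, which HIPAA already requires, tends to surface 24,000-record downloads long before the FBI does.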

The hospital, in a bit of a P.R. spin, said the secretary had finally been fired and that they had upgraded their security – better late than never, I suppose.

Jackson Health is in the middle of a $1-billion-plus makeover, much of it financed with public bonds, and it would just like this to go away.

Assuming there are some trials (and with a hundred-plus defendants, there will be trials), it will be years before it “goes away”.

I am sure that there was no ill intent on the part of the hospital with respect to protecting patient information, but there were choices made – spend money, or not.  Change processes, or not.  Prioritize security over convenience, or not. Things like that.  None of those priorities change until the feds are swarming all over town and your name is all over TV, radio, the Internet and even the newspaper.

Now the priorities change.

People figure the bad guys are going to attack someone else, not me.  That works until it doesn’t work.  That’s what Jackson Health is learning the hard way.

It is great that the FBI captured these people.  Hopefully, many of them will be convicted and spend a long time in the crossbar hotel.

In the grand scheme of things, however, this is just a drop in the bucket and will make absolutely no difference, except to those 100 or so people.

If people and businesses don’t take cyber security more seriously, the feds will continue to be completely overwhelmed and things will only get worse.  The most popular password is still 123456, and less than 10 percent of people use two-factor authentication.
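Two-factor codes of the kind most sites offer are not exotic; the common time-based variant (TOTP, RFC 6238) fits in a few lines of standard-library Python. This is a sketch for understanding how the codes are derived, not a vetted security implementation.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, step=30, digits=6):
    """RFC 6238 time-based one-time password (sketch).
    The shared secret is base32-encoded, as in QR-code setups."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((at if at is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)              # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                    # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# The RFC 6238 test vectors use the ASCII secret "12345678901234567890".
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, at=59, digits=8))  # RFC 6238 test vector: 94287082
```

Both the server and your phone run this same computation; only the 30-second window and the shared secret tie them together, which is why a stolen password alone is no longer enough.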

Security is inconvenient.  That is a reality.  But that inconvenience is trivial compared to the inconvenience of dealing with your company getting breached.  How much do you think Miami’s Jackson Health has spent over the last year, and will spend over the next several years, dealing with this?  It will likely be at least two or three years before all of these defendants make it through just the trial-court round.

Information for this post came from the Miami Herald.


1 Million (Likely More) Google Accounts Compromised by Gooligan

I am not sure what rock I have been hiding under, but somehow I missed this item.

About two months ago, the security company Check Point revealed a new Android malware family called Gooligan.

The malware can attack about 74% of Android phones worldwide.  The good news, if there is any, is that it only works (today) on old, obsolete versions of the Android OS: Version 4 (Ice Cream Sandwich, Jelly Bean and KitKat) and Version 5 (Lollipop), but not Version 6 (Marshmallow) or Version 7 (Nougat).

Many phone manufacturers dump support for a phone as soon as the next bright shiny object comes along to distract them, so, except in a few circumstances, whatever version of the Android OS came on the phone is what it will die with, years later.

This is somewhat different from iPhones in that there are far fewer models.  However, when Apple decides to end-of-life a phone model, the user has two choices: live with the fact that there are no more security patches for that iPhone, or buy a new phone.

So in a sense, there is not a huge difference in this respect between Apple and Google.

Users, on the other hand, have paid off the phone and don’t want to buy a new one until they have to, or can’t resist.

The problem is that if you are using a phone with known vulnerabilities, one that your phone vendor has decided to stop patching, you are walking around with a potentially large hole in your security net.

In the case of the Gooligan malware, hackers pay app developers to insert their malicious payload inside otherwise good apps.  Typically, these are apps that are distributed from shady app stores and not Google Play.
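One modest precaution if you must install an app from outside Google Play is to compare the downloaded file’s cryptographic hash against one published by the developer, assuming the developer publishes one. A minimal sketch of the check:

```python
import hashlib

def sha256_of(path):
    """Compute the SHA-256 hash of a downloaded file (e.g. an APK)
    so it can be compared against a developer-published hash.
    A hypothetical precaution; it only helps if the published hash
    itself comes from a trustworthy channel."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage sketch (hypothetical file name and hash):
# if sha256_of("app.apk") != published_hash:
#     raise SystemExit("Hash mismatch -- do not install")
```

This does nothing against a developer who was paid to embed the payload in the first place, but it does catch tampered copies served up by a shady mirror.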

Once the app runs, it downloads more malware after contacting its command and control server.

The newly downloaded malware is customized for the version of the Android OS that you are running and “roots” the phone, giving it full administrator (root) privileges.

Once the malware has root privileges, it downloads still more malware, in this case to steal your Google account information and security tokens, install more apps (to generate ad revenue and inflate app reputations) and install adware.  Of course, at this point it could do anything it wants, including “bricking” (killing) the phone.  Bricking it isn’t in the hackers’ best interest, though, because they want the phone to remain a zombie that does their bidding whenever they want.

Google has been working with the researchers to try to protect users, even to the point of suspending users’ access to Google services until they securely change their password, but if phone vendors don’t cooperate, it is hard.

It appears that most of the affected phones are in Asia, with some in Europe and a smaller share (about 20 percent) in the United States.

What this means is that both Apple and Android users need to understand that just because a phone can make and receive calls does not mean that it is a smart thing to keep using it.

For Android users, if you are not running Marshmallow or Nougat today, it might be time to buy a new phone.  And while some shiny new top-of-the-line $800 phone might be cool, there are many much cheaper phones available.  And almost all carriers (and Apple itself) will lease you a phone on a monthly payment plan.
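As a rough illustration, deciding whether a given Android release is past the Marshmallow cutoff discussed above is a simple version comparison. The “6.0” floor below mirrors this article’s advice at the time of writing and is an assumption, not any vendor’s actual support policy.

```python
def android_supported(version, minimum="6.0"):
    """Check whether an Android release string is at or above a
    minimum version.  The default "6.0" (Marshmallow) floor is an
    assumption taken from this article, not a vendor commitment."""
    def parts(v):
        return [int(p) for p in v.split(".")]
    return parts(version) >= parts(minimum)

for v in ["4.4", "5.1", "6.0.1", "7.0"]:
    status = "still patched" if android_supported(v) else "time for a new phone"
    print(v, "->", status)
```

You can find your version under Settings, About phone; if the check fails, the Gooligan family of malware considers your phone fair game.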

Companies that allow BYOD should consider a policy barring devices running unsupported versions of Android or iOS from accessing corporate resources, including email.  Allowing them puts the entire corporate network at risk.

One question to ask phone vendors, whether Apple or an Android manufacturer, is how long they will commit to issuing patches for a phone you are considering buying.  That length of time tells you when you will have to buy a new phone, again, to stay secure.  If they don’t have an answer you like, look for a different phone or a different vendor.  If people don’t vote with their wallets, the vendors will ignore the issue.

I never said that improved security will make you popular.

Information for this post came from Ars Technica.
