
Secure Software Development Lifecycle Process Still Lacking

In late 2015, Juniper announced that it had found two backdoors in the router and firewall appliances that it sells.  Backdoors are unauthorized ways to get into these systems while bypassing their security – kind of like going around to the back of the house and finding the kitchen door unlocked when no one is home.  Researchers said there were telltale signs that this was the work of the NSA, although the NSA would never say, of course.  If these backdoors were the work of the intelligence community, let's at least hope it was OUR intelligence community and not the CHINESE.  Whether these backdoors were intentionally installed in the software with the approval of Juniper management at the request of (and possibly with payment from) the NSA is something we will never know (see the article in Wired here).

At the time, Cisco, Juniper’s biggest competitor, said that they were going to look through their code for backdoors too.  They claimed that they did and that they didn’t find any.

Fast forward two years and now the shoe is on the other foot.

In May, Cisco announced its FOURTH SET of backdoors in the last FOUR months.  Possibly the code audit it started in 2015 is still going on, but if so, it has now lasted more than 30 months, which seems like a long time.

The most recent SET includes three bugs that are rated 10 out of 10 on the CVSSv3 severity scale.

The first of the three is a hardcoded user ID and password with administrative permissions.  What could a hacker possibly do with that?

The second provides a way to bypass authentication (AKA "we don't need no stinkin' passwords") in Cisco's DNA Center software.

The third is another way to bypass authentication, this time in some of Cisco's APIs that programmers use.

In fairness to Cisco, they do have a lot of software.

But to beat Cisco up – WHAT THE HELL WERE THEY THINKING TO ALLOW HARD CODED PASSWORDS IN THE SOFTWARE IN THE FIRST PLACE?
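To make that concrete, here is a minimal Python sketch – purely illustrative, not Cisco's actual code, and every name in it is made up – of what the hardcoded credential anti-pattern looks like, next to a safer approach where no credential is baked into the source at all:

```python
# Illustrative only - NOT Cisco's code. The anti-pattern: a credential baked
# into the source, identical on every device shipped, and impossible for the
# customer to change. Anyone who extracts the firmware has admin everywhere.

ADMIN_USER = "admin"            # hardcoded user ID
ADMIN_PASS = "SuperSecret123"   # hardcoded password

def login_bad(user: str, password: str) -> bool:
    return user == ADMIN_USER and password == ADMIN_PASS


# A safer pattern: the device ships with NO built-in credential. An admin
# password is set per device at provisioning time, stored only as a salted
# hash, and checked with a constant-time comparison.

import hashlib, hmac, os

def set_password(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)
    return salt, hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def verify_password(password: str, salt: bytes, stored_hash: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored_hash)
```

The point is not the specific hashing choices; it is that the first version hands every customer on the planet the same master key, and the second hands them none.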

Source: Bleeping Computer

Okay, now that I am done beating up Cisco (actually, not quite, I have one more), what lessons should you learn from this?

First (and the last time today that I am going to beat Cisco up): in order to get these security patches – patches that plug holes that should never have been there in the first place – a Cisco customer who already paid a lot of money for the equipment has to PAY for software maintenance.  If you let the maintenance lapse, you can re-up, but Cisco charges you a penalty for letting it lapse.  For this policy alone, I refuse to recommend Cisco to anyone.

Second, if you are a Cisco user, because of this very user-unfriendly policy, you must buy software maintenance and not let it expire.  If you let it expire, you will not be able to get any Cisco security patches.  Remember that, as one of the biggest players in the network equipment space, Cisco is constantly under attack, so the odds of more bugs turning up are essentially 100%.

Third, no matter whose network equipment you use, you must stay current on patches.  These flaws were being exploited within days, and since hackers know that many Cisco customers do not pay for maintenance, those holes – which are now publicly known – will stay open forever on unpatched devices.

Only half in jest, my next recommendation would be to replace the Cisco equipment.  There are many alternatives, some even free if you have the hardware to run them on.

Okay, that handles the end user.

But there is an even bigger lesson for software developers here.

How did these FOUR sets of backdoors get into the software in the first place?

Only one possible answer exists.

A poor or non-existent secure software development lifecycle program (known as an SSDL) inside the company.

AS AN END USER CUSTOMER, WHEN IT COMES TO SECURITY SOFTWARE ESPECIALLY, YOU SHOULD BE ASKING ABOUT THE VENDOR’S SECURE SOFTWARE DEVELOPMENT LIFECYCLE PROGRAM.  

IF YOU GET AN EVASIVE ANSWER, FIND A DIFFERENT VENDOR.  VOTE WITH YOUR CREDIT CARD.

As a developer or developer manager, it is your responsibility to make sure that customers don’t vote with their credit cards.

IMPLEMENT a secure software development lifecycle program.

CREATE and MONITOR security standards.

TEST for conformance with those standards.

EDUCATE the entire development team – from analysts to testers – about the CRITICALITY of the SSDL process.
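As one small example of the TEST step – a minimal sketch, not a substitute for real static analysis or a full SSDL toolchain – a build could be failed automatically whenever source files appear to contain hardcoded credentials:

```python
# Minimal sketch of one automated conformance check: fail the build if any
# source file looks like it contains a hardcoded credential. The patterns
# are illustrative, not exhaustive; a real SSDL program layers full static
# analysis, code review and penetration testing on top of checks like this.

import re
import sys
from pathlib import Path

BANNED = [
    re.compile(r'password\s*=\s*["\'][^"\']+["\']', re.IGNORECASE),
    re.compile(r'(api_key|secret|token)\s*=\s*["\'][^"\']+["\']', re.IGNORECASE),
]

def scan(root: str) -> int:
    findings = 0
    for path in Path(root).rglob("*.py"):
        for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            if any(p.search(line) for p in BANNED):
                print(f"{path}:{lineno}: possible hardcoded credential")
                findings += 1
    return findings

if __name__ == "__main__":
    # Exit non-zero so a CI pipeline treats any finding as a failed build.
    sys.exit(1 if scan(sys.argv[1] if len(sys.argv) > 1 else ".") else 0)
```

Run something like this as a required step in the build pipeline and a hardcoded password at least has to be deliberately waived rather than quietly shipped.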

Advertisement: we can help you with this.

While Cisco is big enough to weather a storm like this, smaller companies will not be so lucky.  The brand damage could be fatal to a smaller company – and to the jobs of all of its employees.



The Challenge Of Encryption Backdoors

In the wake of the recent London Bridge terrorist attack, in which a vehicle was used as a weapon, British Prime Minister Theresa May has renewed her call for software vendors to provide the government with an encryption backdoor so that British law enforcement can look at messages from iPhones and Facebook's WhatsApp, among other software.

In the U.S., some law enforcement officials, most notably at the FBI, have asked for similar backdoors, while the U.S. spy agencies – the NSA and CIA – have said that would be a really bad idea.

This past week we had a real-world example of why giving any government what is referred to as a "golden key" would be a bad idea.

A hacker attempted to break into the email accounts of British Members of Parliament using a brute force attack (just keep trying passwords until something works).  Parliament's IT folks detected the attempt and solved the problem by turning off Parliament's email servers – sort of a self-inflicted denial of service attack.  This, they claimed, was part of "robust measures" to protect their accounts and systems.

The mistake they made, apparently, was turning the servers back on.

Now they are saying that some number – they say less than 1% but they are still looking – of the accounts on the Parliament email server were compromised.

They blamed users' poor choices of passwords.  Likely true, but blaming the user won't get you many brownie points.

That brute force attack came just days after reports surfaced of Russians selling MPs' credentials stolen from other breaches – working on the assumption, I guess, that people reuse passwords.  Which, of course, they do.

Apparently their systems did not have very robust protections against simple brute force hacking: they did not require users to change their passwords in light of the Russian report, they did not force users to choose strong passwords, and they did not implement what is becoming the new norm – two factor authentication.
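To show how basic the missing protection is, here is a minimal Python sketch of brute force throttling – purely illustrative, and certainly not what Parliament's IT actually runs – that locks an account after a handful of failed logins in a short window:

```python
# Illustrative sketch of basic brute force throttling: lock an account after
# too many failed logins within a time window. A real deployment would
# persist this state, log the events and add two factor authentication.

import time
from collections import defaultdict, deque

MAX_FAILURES = 5        # failed attempts allowed...
WINDOW_SECONDS = 900    # ...within a 15 minute window
LOCKOUT_SECONDS = 1800  # then lock the account for 30 minutes

failures: dict[str, deque] = defaultdict(deque)
locked_until: dict[str, float] = {}

def allow_attempt(account: str) -> bool:
    """Return False while the account is locked out."""
    return time.time() >= locked_until.get(account, 0.0)

def record_failure(account: str) -> None:
    now = time.time()
    window = failures[account]
    window.append(now)
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()  # drop failures that have aged out of the window
    if len(window) >= MAX_FAILURES:
        locked_until[account] = now + LOCKOUT_SECONDS
        window.clear()
```

Twenty-odd lines of logic, and a password-guessing attack goes from thousands of tries per account to a handful – which is exactly why its absence is hard to excuse.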

But they want us to trust them to protect a golden encryption key.  If they cannot protect an email server, what makes you think they can protect what is likely a much more coveted target – a golden encryption key?

On this side of the pond, both the NSA and CIA – organizations that rightfully pride themselves on being among the most security-conscious organizations in the world – continue to be the source of leaked hacking tools.

The CIA continues to be embarrassed by WikiLeaks' disclosure of more and more hacking tools as part of what WikiLeaks calls Vault 7, and the NSA has to deal with the likes of Edward Snowden and Hal Martin, both Booz Allen contractors to the NSA who stole massive amounts of highly sensitive data from the agency.  In Martin's case, it is said to amount to tens of terabytes of highly classified information.

But we should trust these folks – and much lower-echelon folks such as city police departments – with golden encryption keys.

I am not beating them up.  If one person knows something, it is a secret.  As soon as two people know it, it is not a secret any more.

In the case of encryption keys, reality says that tens, hundreds or thousands of people – whether government employees or vendors to the government, like Booz – will have to know these keys.  Keeping a secret among that many people is just hard to do, and those keys will be HIGHLY prized by hackers.

If one of these keys is compromised, what do you do then?  There is likely NO WAY to undo the damage to any communications that were protected using the compromised key.  No way at all.  You just can't get that genie back in the bottle.  You might be able to change the key, but that would require updating every copy of the software anywhere in the world – not a simple task.

This is all in the name of solving what some people call the "Going Dark" problem – law enforcement losing visibility because people use encryption.

At the same time, the NSA has built a data center – over one million square feet – near Bluffdale, Utah.  Forbes estimates that it will have a storage capacity of between 3 and 12 exabytes of data in the short term.  Of course, the real number is classified, so do not expect the NSA to confirm or deny that figure.  And that capacity, whatever it is, will only grow over time.

An exabyte is 1,000,000,000,000,000,000 bytes of data.  A somewhat large number.

Even with that massive capacity, reports are that the NSA can only store what it currently collects for a few days, quickly filtering what it wants to keep while trashing the rest.

It is, as they say, an interesting problem.  One which I am sure that politicians – and likely NOT computer security folks – will try to solve by passing a law.

Stay tuned;  this has just begun.

Information for this post came from Bleeping Computer.



The Problem With Buying Chinese Electronics

Electronics made in China are often less expensive than products sold by Western companies such as Cisco and Juniper.  But there may be a cost associated with that price.

The Chinese security firm Boyusec is working with China's Ministry of State Security intelligence service to conduct cyber espionage, according to the Pentagon.  That would not be a surprise, except that Boyusec is also working with Huawei, the Chinese network equipment manufacturer that the Pentagon banned from DoD purchasing a few years ago.

While Huawei denies this, the Pentagon says that Huawei/Boyusec is putting backdoors in Huawei networking gear so that the Chinese government can spy on purchasers of Huawei equipment.  In addition to spying on customers' phone and network traffic, these backdoors also allow the Chinese to take control of the devices – likely to subtly reprogram them for even more effective spying.

This follows a report earlier this month that software written by Shanghai Adups, and used by Huawei among others, was found on more than 700 million phones, cars and other smart devices.  The software phoned home every three days and reported on users' calls, texts and other data.  Another Chinese technology manufacturer, ZTE, also uses the software.

The moral of the story is that you should weigh the reputation of the vendor before making your purchase decision.

Sometimes that vendor is hard to identify.  Those security web cams that took out Amazon and hundreds of other companies last month, for example, were sold under many different brand names, but their software and internal parts were made by a vendor that did not care about security – and that vendor's name was not the one on the outside of the camera.

Unfortunately, those brand-name vendors are price sensitive, so if they can find software for a few cents per device sold, they may decide to use it and not ask any questions about security.  After all, there is no liability in the United States if a company sells a product with poor or even no security.  That is up to the customer to figure out, and 99% of customers have no idea how to figure out whether a web cam or baby monitor is secure.  What is needed is for companies to be held accountable for the security of these products.  This doesn't mean that they should be clobbered for every bug found, but if they are ignoring reasonable commercial security practices, well, then, that might be a different story.  My two cents, for what it is worth.

Information for this post came from the FreeBeacon.
