Category Archives: Security Practices

Warning For Symantec Customers

As I have reported before, Symantec has had problems with its server SSL certificate business for years and was on double-super probation.  Symantec bought that business, mostly from Verisign, in 2010 for about $1.2 billion.  It also bought the certificate businesses of Thawte, Equifax and others.

Last month it sold that business to Digicert for $950 million plus a minority stake in Digicert – a move designed to preserve its equity.

But now the other shoe is dropping.

The reason Symantec was in trouble was that the browser vendors didn’t trust the security of the certificates that were issued before June 2016.

OK, so what is there to do?

First, each browser maker does its own thing.  That said, Chrome has the largest share of the browser market, so what Chrome does matters more than what anyone else does and, for the most part, everyone else will follow Chrome's lead in this case.

As of December 1 of this year, Chrome will no longer trust any NEW certificates issued by Symantec.  That means that if your web server uses a Symantec certificate issued on December 2 or later, when a user visits that site, Chrome will pop up a warning saying that the site is not to be trusted.

Starting with Chrome version 66 which should be released around April 1, 2018, no Symantec certificate issued before June 1, 2016 will be trusted.

Finally, When Chrome 70 is released in October 2018, NO Symantec certificates will be trusted at all.

So, for those of you webmasters who bought Symantec certificates: if your certificates were issued before June 2016, you have until early next year to replace those server certificates; if they were issued after June 2016, you have until late 2018 to replace them.

Since most people buy certificates that last one, two or three years, some of this will be solved by attrition – but we were examining one certificate today that doesn't expire for TEN years.
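If you don't know who issued a certificate or when it expires, both facts are in the certificate itself.  Here is a minimal sketch using Python's standard ssl module – the sample dict below mirrors the structure that getpeercert() returns for a live connection, but the values are made up for illustration:

```python
from datetime import datetime

def cert_summary(cert):
    """Extract issuer organization and expiry date from the dict
    that ssl.SSLSocket.getpeercert() returns."""
    # "issuer" is a tuple of RDNs, each itself a tuple of (key, value) pairs
    issuer = {k: v for rdn in cert["issuer"] for (k, v) in rdn}
    # ssl formats certificate dates like "Jun  1 23:59:59 2027 GMT"
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    return issuer.get("organizationName"), expires

# Hypothetical certificate data in getpeercert() format:
sample = {
    "issuer": ((("countryName", "US"),),
               (("organizationName", "Symantec Corporation"),),
               (("commonName", "Symantec Class 3 Secure Server CA"),)),
    "notAfter": "Jun  1 23:59:59 2027 GMT",
}

org, expires = cert_summary(sample)
print(org, expires.date())
```

Against a live site, you would open a TLS connection with ssl.create_default_context().wrap_socket(...) and pass the result of getpeercert() to the same function.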

If you don’t know what vendor your certificates came from please reach out to us and we will be happy to assist you.

Information for this post came from ZDNet.

 


Another Day, Another Amazon Data Exposure – And How Not To Handle It

Last week I wrote about an incident with a vendor to the City of Chicago who left close to two million voter records exposed on Amazon and how the vendor, in spite of the initial mistake of exposing the data, handled the breach very well (see blog post).

Today we have another case and, this time, an example of how not to handle it.

Today’s case also came from researcher Chris Vickery and the data in question was an Amazon storage bucket with resumes for what the news is calling “mercenaries”.  In fact, the company is Tigerswan, a private security firm.

Like many private security firms that cater to the military or paramilitary world, many of the employees and applicants are ex-military and hold or have held high level security clearances.

On July 20th, Vickery discovered an Amazon S3 bucket named TigerswanResumes with almost 10,000 resumes of veterans and others who were interested in working for Tigerswan.  As is typical for resumes, they included a lot of personal details including former activities in the military and clearance information.  This data was totally exposed to anyone who happened on it – including, potentially, agents of foreign powers who might want to blackmail (or worse) these people.
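This kind of exposure is easy to check for from the outside: an S3 bucket with public listing enabled answers an anonymous HTTP GET with a 200 and an XML listing of its contents, while a locked-down bucket returns 403.  A rough sketch (the function name and classification strings are my own; the status-code behavior is standard S3):

```python
import urllib.request
import urllib.error

def check_bucket(bucket, fetch=None):
    """Anonymously probe an S3 bucket's listing endpoint and classify
    the response.  `fetch` is injectable so the logic can be exercised
    without network access; by default it performs a real HTTP GET."""
    url = f"https://{bucket}.s3.amazonaws.com/"
    if fetch is None:
        def fetch(u):
            try:
                with urllib.request.urlopen(u, timeout=10) as resp:
                    return resp.status
            except urllib.error.HTTPError as err:
                return err.code
    status = fetch(url)
    return {
        200: "EXPOSED: anyone can list this bucket's objects",
        403: "bucket exists; anonymous listing denied",
        404: "no bucket by that name",
    }.get(status, f"unexpected status {status}")

# e.g. check_bucket("somecompanyresumes")  # hypothetical bucket name
```

Note that a 403 on the listing endpoint does not prove individual objects are private – only that the bucket index itself is not enumerable anonymously.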

On July 21st Chris emailed Tigerswan about the situation.  He followed up on the 22nd with a phone call and email and was told they were working with Amazon to secure the data.

On August 10th, with the data still exposed, Chris reached out to Tigerswan again and was told that they were unsure as to why the data was exposed and would bring it to the IT director’s attention.

Finally, on August 24th, a month after being notified, Tigerswan secured the data.

THE ONLY REASON THAT THE DATA WAS SECURED ON AUGUST 24TH WAS BECAUSE CHRIS WAS ABLE TO GET AMAZON TO INTERVENE.

Tigerswan blamed the situation on a former recruiting vendor – in other words, the data was effectively abandoned and unprotected.  No one "owned" that data.

Chris's blog post provides a lot of examples of the backgrounds of people whose information was exposed and, it would seem, this information would be attractive to intelligence agents.  Included in the resumes were police officers, sheriff's deputies, people who worked at Guantanamo and many others.

Also on some of the resumes were references with contact information including one former director of the CIA clandestine services.  You kind of get the idea.

The fact that this took a month to secure the data is an indication of a lack of an effective incident response program and also a lack of a program to manage the location and ownership of data inside the company.  The fact that Amazon finally had to intervene makes the situation even worse.  Unfortunately, neither of these is unusual.

While it does take some work to build and maintain the data maps that document data storage locations – which should include data managed by vendors and ex-vendors on behalf of the company – the cost is low compared to taking a month to fix a problem like this.  Very low.  For the veterans who were affected, the cost is high, assuming this data is now in the hands of our adversaries (and I can only assume that if Chris could find it, so could the Russians or the Chinese), and those veterans and others will have to deal with it.  That could, realistically, be sufficient grounds for a class action lawsuit against Tigerswan.

Information for this post came from Upguard and ZDNet.

 

 


The Insider Threat Cost One Mortgage Company $25 Million

This case of intrigue may seem like it belongs in a spy novel, but in this case, it is winding up in the Board Room and the court room.

Here is the story.  Chicago based Guaranteed Rate courted an employee of a much smaller rival mortgage company, Benjamin Anderson.  While still employed at the smaller company, Mount Olympus Mortgage, Anderson signed an employment contract with Guaranteed Rate.  While an employee considering a move wants assurances that if he or she quits the current job, there will be a job waiting at the new company, this is usually done via a written offer letter, not a signed employment agreement.  Once he signed the agreement he was, in fact, working for two competing mortgage companies at the same time.

While this may be unethical – and possibly a violation of his contract with Mount Olympus – it may not be illegal.  What happened next, however, was illegal.

Over a period of weeks, Anderson downloaded and transferred loan files – hundreds of them – to his new employer.  Anderson’s new contract with Guaranteed Rate paid him a much higher commission during his first few months, encouraging him to close as many loans as possible during that time-frame.  Some of those loans closed before he even left Mount Olympus.

Eventually, Mount Olympus discovered what he was doing and sent cease and desist letters and then, ultimately, filed a lawsuit.  It is certainly possible that if Anderson had been less greedy and only transferred tens of loans, he might not have ever gotten caught.

Even though Mount Olympus was small, they were able to detect what was happening.  One way this can surface is when the company contacts a borrower and the borrower says they are no longer working with that company.

The judgment, with a total value of around $25 million, includes $13 million in punitive damages, $5.6 million in lost profits and $4.6 million in lost business value.  For a company as big as Guaranteed Rate, which funded $18 billion in loans last year, this is a blip, but for smaller companies it could be a death sentence.

There are several messages in this verdict –

First, if you are luring an employee away from a competitor, make sure that they are not working for both you and the competitor at the same time.  One strike against Guaranteed Rate.

Second, make sure that compensation is not structured to encourage a new employee to steal intellectual property from the employee’s former company.  Strike two against Guaranteed Rate.

Third, make sure that employees understand that bringing their former employer's (stolen) intellectual property with them will not be tolerated and will be grounds for immediate dismissal.  This has to be a policy with teeth.  As Uber is learning right now in a lawsuit it is fighting, saying one thing while winking that you don't mean it will land you in court.  Strike three and $25 million later…

Finally, for all companies, the ability to deter and detect the insider threat scenario is critical.  The theft of intellectual property can ruin a company.  Failing that, it can cost large legal fees on both sides and in some cases multi-million dollar judgments.

In this case the theft was likely easy to detect, but in many cases you don't have an obvious smoking gun, which means that logging and alerting become much more important.
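What might that alerting look like?  One simple approach is to log every document download per user and flag anyone whose volume is a statistical outlier against the rest of the workforce – someone pulling hundreds of loan files stands out against colleagues pulling a handful.  A toy sketch (the thresholds are illustrative assumptions, not a recommendation):

```python
from collections import Counter
from statistics import mean, stdev

def flag_bulk_downloaders(events, z_threshold=3.0, min_count=20):
    """Flag users whose download counts are anomalously high.
    `events` is an iterable of (user, file_id) pairs, e.g. parsed
    from a document-management system's audit log."""
    counts = Counter(user for user, _ in events)
    if len(counts) < 3:          # not enough users for a baseline
        return []
    mu = mean(counts.values())
    sigma = stdev(counts.values()) or 1.0
    # Flag users well above both an absolute floor and the peer baseline
    return [user for user, c in counts.items()
            if c >= min_count and (c - mu) / sigma > z_threshold]
```

A real deployment would baseline per role and per time window, but even this crude version would have lit up on hundreds of loan files moving out the door over a few weeks.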

Unfortunately, it is likely more common than you might guess that employees take at least some intellectual property with them when they leave an employer.  Strong policies and good insider threat detection can slow that theft down.

Information for this post came from the Chicago Tribune.


Browser Makers Doing What Needs to be Done – Finally

When you log on to a "secure" web site – one that you access via HTTPS:// instead of HTTP:// – you do that because the web site bought a certificate from a certificate authority.  Those certificates work because the browsers – all of them – "trust" the makers of those certificates.

How do those certificate authorities become trusted?  The certificate authorities apply to each of the browser makers and those browser makers each decide who to trust.

If one or more browsers decide not to trust a certificate authority, then any time a user goes to a web site that uses that authority's certificates they will get an error message saying the certificate is not trusted.  Every single time.

What that means is that if any of the major browser vendors don’t trust you, then you cannot sell your certificates.

If you look at any browser or computer, if you know where to look, you can find a list of all of the certificate authorities that the browser or computer trusts.  That used to be a handful of companies, but over time it has mushroomed to a ridiculous number – 150 or more.  For some reason the browser makers have made it incredibly hard for Joe or Jane User to see what certificates are installed or to delete one of them.
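If you're curious how big that list is on your own machine, Python's standard library can show you the trust store it uses.  A quick sketch (this reads the operating system's default CA bundle, which may differ somewhat from the list your particular browser ships):

```python
import ssl

# create_default_context() loads the system's default CA certificates
ctx = ssl.create_default_context()
cas = ctx.get_ca_certs()          # list of decoded certificate dicts

# Pull the organization name out of each root's issuer field
orgs = sorted({dict(pair for rdn in ca["issuer"] for pair in rdn)
               .get("organizationName", "(unknown)")
               for ca in cas})

print(f"{len(cas)} trusted root certificates from "
      f"{len(orgs)} organizations")
```

The exact count depends on your OS and its patch level, but on most systems it lands well north of a hundred roots – which is the author's point.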

There is a group called the CA/Browser Forum and they set standards for certificate authorities to follow.  The process of disciplining a CA can take years, but recently the CAB Forum started getting tough.

Two Chinese certificate authorities were not following the rules so the CAB Forum scolded them.  Then they didn’t change their actions.  So finally, one by one, the browsers started the process of the death sentence.  This week, the last major browser maker said that come September they are no longer going to trust certificates made by WoSign and StartCom.

Of course smart people would be asking why the <bleep> we were trusting security certificates from China in the first place.

My answer?  Beats me.  I guess they want to be inclusive.

I would appreciate it if they allowed me to make that decision.  But they figure that I am not smart enough to decide whether I want to trust certificates from China.

For a certificate authority, losing the trust of the browser makers is basically a death sentence – which is why they keep giving certificate authorities that screw up another chance.  Personally, I vote for ONE strike and you are out.

On a related front, one of the biggest U.S. certificate authorities, Symantec (formerly Verisign), just sold its certificate business to Digicert.

Symantec/Verisign has been in CAB Forum "time out" for a year or two now because of oopsies it has made, like issuing certificates for Google.Com to someone other than Google and stuff like that.  Symantec has been given several chances to clean up its act but does not appear to be getting it right.  Fearing that it was going to go down the same path that WoSign went down and pour a billion-plus-dollar investment down the sewer, this week it sold that business, for $950 million plus some stock, to Digicert.  This is good for users because Digicert is well respected, unlike Symantec.

So, while certificate authorities have historically almost never received the death penalty, it appears that the browser makers have had their fill of it and ARE NOT GOING TO TAKE IT ANY MORE!!!

I hope this is the beginning of a trend.  I could do with maybe a dozen trusted certificate authorities.  That would be enough for me.  3 down, one hundred plus to go.

Information for this post came from ZDNet and eWeek.


MasterPrints

Until a few days ago, I had never heard of MasterPrints.  Of course, there are many things that I have never heard of.  MasterPrints, as you might guess, have to do with security.  Have you ever heard of them?

Here is the story.  Everyone is familiar with fingerprint sensors on cell phones and other devices.  They are used to make credit card payments, unlock phones, disable alarms and perform other sensitive transactions.

But how do these fingerprint sensors work?

Well it turns out that the sensor is so small that it cannot capture the entire fingerprint, so, instead, it captures multiple partial pieces of the fingerprint – say maybe 6.  If the system allows you to “enroll” more than one finger, you might have 12 or 18 partials.

Since fingerprint security is more about convenience than it is about security, the system will consider it a match if any piece matches any one of the stored pieces.

The Apple Touch ID is said to have a 1 in 50,000 chance of a false match.  That makes it more secure than the old, discarded, 4 digit Apple PIN – and twenty times less secure than a 6 digit PIN.
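The "any partial matches any stored partial" rule is what makes these numbers interesting: each additional stored partial multiplies the chances of an accidental match.  A back-of-envelope sketch (the per-partial rate here is reverse-engineered from the published 1-in-50,000 figure purely for illustration; it is not a published Apple internal number):

```python
def overall_false_match(p_partial, num_stored):
    """Probability that SOME stored partial falsely matches, given
    `num_stored` stored partials that each independently false-match
    with probability `p_partial`:  1 - (1 - p)^k."""
    return 1 - (1 - p_partial) ** num_stored

# To land at a system-wide rate of ~1 in 50,000 with 12 stored
# partials, each individual partial comparison has to be roughly
# a 1-in-600,000 event:
rate = overall_false_match(1 / 600_000, 12)
```

In other words, enrolling more fingers (more stored partials) quietly loosens the effective security of the whole system, even though each individual comparison stays just as strict.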

There have been many ways that people have tried to attack password authentication before, but this is a new way.

What if one part of one stored fingerprint was a match to many fingerprints – some of which are yours and some of which are not?  The MasterPrint concept is born.

Not much research has been done regarding these small fingerprint sensors on phones.  Yet.

What if an attacker was able to lift a partial fingerprint from the owner of the device – say, off a glass?  What happens to the probability of success then?

What if a researcher – or a hacker – could synthesize a MasterPrint – kind of like a skeleton key, but for fingerprints.  What then?

The team ran a series of tests and was able to create matches where none should have existed for a significant percentage of their tests (around 7 percent).  That is a much higher false positive rate than the 1 in 10,000 of a 4 digit PIN or the 1 in 1,000,000 of a 6 digit PIN.

This area of fingerprint science is relatively new.  Compared to traditional fingerprint forensics where the scientist has most or all of the fingerprint, this is very different.  In one test they used 12 partials.  That means that if only 8% of the total finger matches, that would be considered a match.

Of course you could tighten the matching threshold; that would improve security, but it would also reject more fingerprints that should have matched.  Security or convenience – pick one.  Likely the smart thing to do in high security situations is to tighten the threshold – assuming the system even allows the user to control it.  Most do not.

Alternatively, vendors could build larger and more precise fingerprint sensors.  That likely will happen, but it will take time.

In the meantime, users and system security pros need to consider the consequences of a false match and make decisions based on those consequences.  Security is never simple.

Information for this post came from a team at Michigan State University.

 


Is Kaspersky Software a Russian Spy Front?

Gene Kaspersky, CEO of Russian software firm Kaspersky Labs

Some in Congress and the Intelligence Services are concerned that Kaspersky’s security software could be co-opted by the Russian government and be used to spy on American companies who use the software.

Fundamentally, this is no different than concerns that people have that the U.S. spy agencies could or already have forced U.S. companies to insert back doors into their software to allow U.S. spies to use U.S. software to spy on people as well.

We already know that Yahoo did that by running all email through filters and feeding the data to the Intelligence Community.

The challenge in both cases – Russia and the United States – is that any efforts on the part of the respective spy agencies to do that would be highly classified and those agencies would not admit that they are doing so, even if they are.

Since it is the job of spy agencies to spy on people, it is not unreasonable to assume that they would do that if they could.

Some people, including me, have been concerned for a long time that Gene's software could be used for no good.  Even though I think he makes good products, I find it hard to trust him.  He has had very close ties to the KGB and FSB for a long time, including being trained at a school run by the KGB.

Kaspersky's software, they say, is used by 400 million people worldwide, including many people in the United States.  There is a bill working its way through Congress right now that would ban the DoD from using it.  It is used in some places inside U.S. government agencies.

While suspicions have run wild for years, there has been no hard evidence.  Now a media outlet has found something unusual in a document that Russian companies need to have in order to operate in Russia.  That document has a military intelligence unit number attached to it.  While some people are making a big deal of this, it could be legit – no different than, say, a U.S. defense contractor having certain government ID numbers.  But some former spies say that this MI unit number is a pretty unusual thing.  Stay tuned.

Kaspersky has offered to let the government look at his source code to verify that there are no back doors.  Of course, no back doors today does not equal no back doors after the next update.

In the U.S. Verizon and AT&T shared call data with the intelligence community and there are thousands of FISA court orders issued every year.  Those are all classified so we have no clue what they might entail.

Kaspersky IS the company that paid General Flynn those consulting fees that he forgot to declare.

While I don't know if his software has been compromised, my theory is that it isn't worth the risk.  There are plenty of American and European software products that would seem to me, on the face of it, less risky.

Listening to the rumblings of the U.S., Britain, Germany, France and others, I am not sure HOW much less risky, but probably at least somewhat less risky.

Information for this post came from MSN.

 
