Viacom is the Newest Company to Leave Data Unprotected on Amazon

Viacom is playing down the significance of this, but that could just be damage control.

One of our favorite security researchers, Chris Vickery, discovered yet another unprotected Amazon S3 storage bucket.

In this case, the bucket did not contain non-public personal information of customers, according to Viacom.  They touted this as a good thing.  After Equifax, it probably is a good thing FOR US, but what was in there is arguably worse for Viacom.

For those of you not familiar with Viacom, they own the likes of Paramount, MTV, Nickelodeon and Comedy Central, among other brands.

What was in there was the access key and secret key to an Amazon web services account owned by Viacom.  Whether it is their main corporate Amazon web services account or maybe a test account, we don’t know (yet), but the attempt to deflect the question leads me to believe that it is the main corporate account.  If it was, it would allow anyone who had that key to totally own the account, all the servers in it and all of the data associated with it.  Likely nothing important.

But that is not all that was there.

The Amazon storage bucket also contained the GPG (the open-source implementation of PGP) data encryption/decryption keys.  Depending on what those keys were used to encrypt, having the decryption keys would have allowed an attacker to read any data protected with those keys.  Generally speaking, encryption keys don’t protect the lunch menu.  If you go to the trouble to encrypt something, it is likely important and sensitive.
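Finding this kind of exposure on your own side comes down to auditing bucket ACLs.  Here is a minimal sketch of the grant-checking logic you might run against the output of S3's get_bucket_acl call; the two group URIs are the real AWS "global" grantee groups, but the sample ACL itself is hypothetical:

```python
# Sketch: flag S3 ACL grants that expose a bucket to the world.
# The two grantee URIs below are AWS's real "global" groups; the
# sample grants at the bottom are made up for illustration.

PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def public_grants(grants):
    """Return the subset of ACL grants readable by the world.

    `grants` has the shape returned by boto3's
    s3.get_bucket_acl(Bucket=...)["Grants"].
    """
    return [
        g for g in grants
        if g.get("Grantee", {}).get("Type") == "Group"
        and g["Grantee"].get("URI") in PUBLIC_GROUPS
    ]

# Hypothetical ACL: one private owner grant, one world-readable grant.
sample = [
    {"Grantee": {"Type": "CanonicalUser", "ID": "owner"},
     "Permission": "FULL_CONTROL"},
    {"Grantee": {"Type": "Group",
                 "URI": "http://acs.amazonaws.com/groups/global/AllUsers"},
     "Permission": "READ"},
]
print(len(public_grants(sample)))  # → 1
```

Any non-empty result for a bucket holding keys or configuration scripts is exactly the Viacom situation.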

Chris contacted Viacom on August 31st and within a few hours, the data was gone.

The Amazon subdomain in question was called mcs-puppet.  MCS likely refers to Viacom’s Multi-platform Compute Services.  Puppet likely refers to the devops automation tool Puppet that allows IT operations teams to automate the deployment and management of corporate compute services.
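Credentials sitting in automation configuration like Puppet manifests are exactly what secret-scanning tools look for.  A simplistic sketch of that idea: long-lived AWS access key IDs start with "AKIA" followed by 16 upper-case alphanumerics, so a one-line regex catches the obvious cases (real scanners cover many more credential formats).  The sample manifest line is hypothetical; the key ID is AWS's own documented example value:

```python
import re

# Sketch: grep config files / Puppet manifests for embedded AWS access
# key IDs before they ever land in a public bucket. Long-lived AWS
# access key IDs start with "AKIA" plus 16 upper-case alphanumerics.
AWS_KEY_ID = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

def find_aws_key_ids(text):
    """Return any substrings that look like AWS access key IDs."""
    return AWS_KEY_ID.findall(text)

# AWS's documented example key ID, embedded in a fake manifest line.
manifest = 'aws_access_key_id => "AKIAIOSFODNN7EXAMPLE"'
print(find_aws_key_ids(manifest))  # → ['AKIAIOSFODNN7EXAMPLE']
```

Running something like this in a pre-commit hook is a cheap way to keep keys out of repositories and buckets in the first place.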

While Viacom attempted to deflect the seriousness of the matter, without knowing what those Puppet scripts controlled, what the PGP keys protected and what the AWS private keys were used for, we really don’t know how much damage could have been done.

We also don’t know whether Chris was the first outsider to find the stash or whether it was downloaded many times.

Viacom’s attempts to make it go away would suggest to me that the damage was worse than they wanted to let on.

In the bigger picture, this is just one more case of a company not understanding where their data is, how it is protected and who has access to it.  In this case, access was restricted to anyone who could find the data.  Not a great plan for your private encryption keys and configuration scripts.  Not a great plan at all.

But Viacom is hardly unique.

Does your company actively track the location, access controls and identities of all data and users who can access that data, whether it is located in a company owned data center or some cloud service that an employee set up without asking or telling anyone?  I didn’t think so.  THAT is what needs to happen and it is not a one time event;  it needs to be managed in real time, FOREVER.

Or, your company could become the next Viacom.  Your choice.

Information for this post came from Gizmodo.

An Equifax Lesson For Everyone To Learn

One of the MANY lessons to be learned from the Equifax breach is how not to handle a breach.  Here is just one of those lessons and it is a lesson for BOTH users and webmasters.


When the breach finally became public – months after it happened – they created a web site for victims to go to in order to find out about the breach.  That web site looks like this:

You will notice that it has the Equifax logo on it and that it has the little green padlock indicating that it is encrypted, but, of course, anyone can steal the Equifax logo and put it anywhere they want – like right here, for example:

But that doesn’t mean that the site belongs to Equifax.

You will notice that the web site URL includes the name Equifax, but so does the look-alike domain discussed below (yup, a real site.  Totally benign, but real).  So, just because the word Equifax is in the web site name does not mean that it is owned by Equifax.

In this case, since the word Equifax is probably a trademark, they can, eventually, get this site taken down if they want.  But Equifaxx is not a trademark (note that there are two Xs, not one).  That site is real and, curiously, it seems to belong to EXPERIAN, their biggest competitor.  Why Equifax didn’t buy up similar sounding web sites for $10 a year each is beyond me and a lesson to learn from this.
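To see how cheap that defensive registration would have been, you can enumerate the obvious doubled-letter variants of your own domain.  This is a deliberately simplistic sketch (real typosquat tooling also covers character swaps, omissions, homoglyphs and alternate TLDs):

```python
# Sketch: generate doubled-letter look-alikes of a domain name, the
# cheapest class of typosquat to register defensively.

def doubled_letter_variants(domain):
    """Double each letter of the name part in turn: equifax -> equifaxx, ..."""
    name, _, tld = domain.partition(".")
    return sorted(
        {name[:i + 1] + name[i] + name[i + 1:] + "." + tld
         for i in range(len(name))}
    )

print(doubled_letter_variants("equifax.com"))
# 'equifaxx.com' is among the seven variants
```

Seven domains at roughly $10 a year each is a rounding error next to the cost of one hijacked look-alike.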

But that is not the worst failure.

Why wouldn’t they send you to a site that you KNOW is theirs?  Send people to Equifax.com/BREACH or something like that.  At least then people know that they are going to a site owned by the company that they are looking for.  In fact, this site was hastily set up and, initially, if you looked, it wasn’t even owned by Equifax; it was owned by an Equifax vendor.

Still, that is not the worst failure.

Here is the worst failure and the lesson for everyone – users and webmasters both.

While they secured the site with HTTPS – what we geeks call an SSL (or more correctly a TLS) certificate protected site – they used the cheapest, least secure certificate they could find: what is called a DOMAIN VALIDATION certificate.  All that certificate proves is that the person who requested it – you, me, my kid, whoever – had sufficient access to the web site to store a file on it.  If the site had been hacked, a hacker could buy that kind of certificate.


Now let’s look at Apple’s website for a minute.

Note that the address bar is different from the address bar on Equifax’s breach web site.  This has the name Apple Inc. [US] in green in front of the URL.  This is an EXTENDED VALIDATION certificate.  In order for Apple (or Equifax) to get this, they had to prove they were Apple and not Mitch.  This is a higher level of verification and a more expensive certificate.

It is designed to give the user a higher level of confidence that they really have landed on an Apple – or Equifax – web site.
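The difference also shows up programmatically in the certificate's subject: a DV certificate typically carries only a commonName, while an EV certificate also names the organization plus jurisdiction/business-category fields.  Here is a rough sketch over a parsed subject, such as one you might assemble from Python's ssl.SSLSocket.getpeercert(); the field names are real X.509 attribute names, but the heuristic and sample subjects are my own illustration:

```python
# Sketch: rough EV-vs-DV heuristic over a parsed certificate subject.
# A DV cert usually carries only commonName; an EV cert also names the
# organization and jurisdiction/business-category fields.

EV_HINTS = {"jurisdictionCountryName", "businessCategory", "serialNumber"}

def cert_grade(subject):
    """Classify a subject dict as 'EV-like', 'OV-like', or 'DV-like'."""
    if "organizationName" in subject and EV_HINTS & subject.keys():
        return "EV-like"
    if "organizationName" in subject:
        return "OV-like"
    return "DV-like"

dv = {"commonName": "www.example.com"}
ev = {"commonName": "www.example.com",
      "organizationName": "Example Inc",
      "businessCategory": "Private Organization",
      "jurisdictionCountryName": "US"}
print(cert_grade(dv), cert_grade(ev))  # → DV-like EV-like
```

The browser makes exactly this kind of distinction for you, which is why the green company name appears for EV sites and not for DV ones.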

Why is this important?

One more time, Equifax is the poster child for how to screw up.

Equifax’s official Twitter account tweeted, not once, not twice, but three times, an incorrect web site for people to go to.

Instead of sending people to their own breach site, they sent people to a look-alike site with the words in the domain name transposed.

Now it turns out that this alter ego site was set up by a security researcher, so even when Equifax’s crisis communications team sent people to the wrong site, it didn’t infect their computers.  But if it had been a hacker’s web site, it certainly could have – or asked for and stolen even more information.  The site proved its point and has since been taken down, but the Internet never forgets; a copy survives on the Wayback Machine at the Internet Archive.

Notice that this web site ALSO had a green padlock and was accessed using HTTPS.

Which is why, as users, we need to look for the company name in the address bar and why, as webmasters, we need to pay a little bit more for an extended validation or EV certificate.

In this case, if, say, a phishing campaign got people to click on a link to a bogus web site, the extended validation certificate is much harder to forge.

Be a smart Internet user.  Look for the extended validation certificate.

Now that you are aware, as you surf the web, notice what companies have extended validation certificates.  And which ones do not.

Information for this post came from The Verge.


Bluetooth Vulnerability Does Not Require Any User Interaction

Similar to the WiFi bug we reported on in July, this Bluetooth bug does not require the user to interact with the hacker, does not require the user to connect to an infected Bluetooth device or anything like that.  All it requires is that Bluetooth is turned on in the device.

The good news, if there is any, is that this is not a hardware problem and it is not a protocol problem, it is a software implementation error.  A plain old bug.  Which means that it can be patched.

Of course, every COOL bug has to have a name;  this one is called BlueBorne.

ASSUMING, that is, that the manufacturer of your phone is still releasing patches for the model of phone that you have.  For example, most Android 4 and earlier users are not getting any patches and many Android 5 users are not getting patches.  iPhone 4 users are not going to get patched, and the newest version of iOS will deliver the last patches for the iPhone 5 and 5c.

And, this is not limited to phones.

While Apple has patched this bug in iOS 10 (so most recently purchased iPhone users are good), Microsoft just released a Windows patch in July.  This means that Windows users are safe IF they are running on a supported version of Windows and have installed the July patch release.  Google says that the September patch release fixes the bug, but that has to wind its way through the manufacturer’s release process and then your carrier’s release process UNLESS you are using a Google Pixel phone, in which case, you should already have the patch.  Linux teams are working on a patch, but that has not been released yet.

The bigger issue is all of those Internet of Things appliances from light bulbs to TVs that will likely NEVER be patched and will, therefore, always be an opportunity for a hacker.

Of course, as with all Bluetooth connections, the attacker has to be within 30-100 feet or so, depending on the equipment that the hacker is using.  That makes Starbucks a perfect place to launch an attack on unsuspecting users.

For those of you who do not have the patch yet – users of obsolete Android phones or Linux-based IoT devices, for example – the only possible defense is to disable Bluetooth.  That may not be what you want to hear, but it will protect your device.

Information for this post came from Wired.

You’re Not Gonna Believe This – Another Equifax Breach

Apparently Equifax had another, separate breach in March of this year, 5 months before the breach that they have already announced.

Equifax hired the security firm Mandiant to check into both breaches, but since they have not said anything about this first breach, we really don’t know much about it.

One assumes that this secret earlier breach will only fuel the fires behind the dozens of lawsuits and separate dozens of investigations.

It will also make people wonder about those executive stock sales – the ones NOT on the SEC sale schedule and which occurred a couple of days before the announcement of the second breach but months after the first breach.

It is possible that they discovered the first breach before any data was stolen, but if that was the case, how do you explain how the second breach, only a few months later, went undetected for several months?  There is no logic that can explain this.

We have also seen cases where the breached company didn’t want to find any evidence of something that would require them to notify anyone.  Breach?  Breach?  What breach?  I don’t see any breach.  If you tell the investigators to only look in one corner where nothing happened, they likely won’t find any problems.  The company said that they have complied with all mandatory notifications regarding the March breach.

The fact that Equifax was lobbying Congress to reduce their breach reporting requirements at the same time that they were investigating the first breach is, shall we say, a bit problematic.  And it has terrible optics.

Is this the final straw that gets the board to fire the CEO?  I don’t know, but I would not be surprised.

Another source is saying that the goal of the attackers may have been to use Equifax to breach some of Equifax’s large banking partners.  At least one bank appears to have been compromised and Equifax says that it is working with its banking partners to mitigate damage.

Information for this post came from Bloomberg.

Legal Risks of Cloud and Collaboration Tools

Many employees use consumer grade, unmanaged cloud services such as Dropbox and Google Drive as part of their work.  This is sometimes called BYOC for Bring Your Own Cloud.  It is convenient, but is it a good idea for the business?

Loss/theft of intellectual property

One of the obvious risks of BYOC is the loss of control (AKA theft) of corporate intellectual property.  These personal cloud services make it quick and easy to steal hundreds to thousands of confidential files by merely dragging and dropping.  AND, since the account does not belong to the company, the only way the company can force an employee to let them into their account is via a court order – an expensive and dicey proposition.  By the time that order is granted and appeals are exhausted, any evidence is likely gone.

Data breach and regulatory violations

Just because your company chooses to allow (or not stop) employees from using BYOC does not mean that the company has no liability if the data on the employee’s personal cloud – data the company does not control – is breached.  In fact, the company is likely fully liable even though it has no authority over that data.  Violations of regulations such as HIPAA also fall on the company.

Litigation risk and electronic discovery exposure

If a company allows users to use BYOC and is involved in litigation, it is very difficult to preserve evidence that could exist on employees’ personal clouds.  If it is discovered that evidence has been destroyed or compromised, the judge could hold the company in contempt or even instruct the jury to assume the worst – that whatever was destroyed would have helped the plaintiffs and hurt the company.  A Florida court recently faulted a company for allowing an employee to destroy files in a personal Box account.  Also, depending on what an employee does with the files in the BYOC account, the company may lose the ability to assert attorney-client privilege.

So what is a company to do?

There are only a couple of options –

Allow BYOC and deal with the risk.  This doesn’t seem like a great solution, but it is what many companies are doing today – understanding that they are going to lose corporate intellectual property even in the best of circumstances.

Outlaw BYOC.  Done right, this can work.  After all, the employee just wants to get his or her job done; done wrong, it can really annoy employees.

Allow but regulate.  This is likely more complicated.  The company has to decide what BYOC services are OK, create rules for using them and then enforce these rules, but it is possible for this option to work.

For most companies, providing a corporate owned solution that works at least as easily as the employee owned consumer grade solution is probably the best solution, but every company will need to decide for itself.

Information for this post came from JDSupra.

Warning For Symantec Customers

As I have reported before, Symantec has had problems with its server SSL certificate business for years and was on double-super probation.  Symantec bought its certificate business mostly from Verisign in 2010 for about 1.2 billion dollars.  It also bought the certificate businesses of Thawte, Equifax and others.

Last month it sold that business to DigiCert, a move that was designed to preserve its equity.  It sold the business for $950 million plus a minority stake in DigiCert.

But now the other shoe is dropping.

The reason Symantec was in trouble was that the browser vendors didn’t trust the security of the certificates that were issued before June 2016.

OK, so what is there to do?

First, each browser maker does its own thing.  But Chrome has the largest share of the browser market, so what Chrome does matters more than what anyone else does and, for the most part, everyone will follow Chrome’s lead in this case.

As of December 1 of this year, Chrome will no longer trust any NEW certificates issued by Symantec.  That means that if your web server uses a Symantec certificate issued on December 2 or later, when a user visits that site, Chrome will pop up a warning saying that the site is not to be trusted.

Starting with Chrome version 66 which should be released around April 1, 2018, no Symantec certificate issued before June 1, 2016 will be trusted.

Finally, When Chrome 70 is released in October 2018, NO Symantec certificates will be trusted at all.

So, for those of you webmasters who bought Symantec certificates: for certificates bought before June 2016, you have until early next year to replace those server certificates, and for those of you who bought Symantec certificates after June 2016, you have until late 2018 to replace yours.
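The timeline above reduces to a simple rule on the certificate's issue date.  Here is a minimal sketch encoding just the schedule described in this post (the dates and Chrome version numbers come from the paragraphs above; nothing else is assumed):

```python
from datetime import date

# Sketch of the distrust timeline described above: given a Symantec
# certificate's issue date, report which Chrome release stops trusting
# it and roughly when you need it replaced.

CUTOFF = date(2016, 6, 1)

def symantec_distrust(issued: date) -> str:
    if issued < CUTOFF:
        return "Chrome 66 (~April 2018) - replace by early 2018"
    return "Chrome 70 (October 2018) - replace by late 2018"

print(symantec_distrust(date(2015, 3, 1)))
print(symantec_distrust(date(2017, 2, 1)))
```

Checking the notBefore date of each certificate in your inventory against this cutoff tells you which replacement deadline applies.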

Since most people buy certificates that last one, two or three years, some of this will be solved by attrition, but we were examining one certificate today that expires TEN years in the future.

If you don’t know what vendor your certificates came from, please reach out to us and we will be happy to assist you.

Information for this post came from ZDNet.