Hackers Shut Down Entire School District For Days

All schools in Flathead County, Montana were closed on September 14 and 15, and all extracurricular activities and athletic events were cancelled, as a result of a ransom threat from the well-known hacker(s) called The Dark Overlord.

This was not a ransomware attack, where the district’s data would have been encrypted and a ransom demanded to decrypt it.

Instead, the hackers broke into the district’s server (the district has 15,000 students; I suppose it is possible that it has only one server, or at least that the server they hacked held those records) and stole addresses, medical records, behavioral records, and other data on past and present students, staff and parents.

They sent threatening messages to parents saying that the hackers would kill as many people as possible if the ransom was not paid.

The hackers demanded $75,000 in Bitcoin if paid quickly, $100,000 in Bitcoin if someone wrote an embarrassing letter and $150,000 in Bitcoin if paid out over a year.

Given that the ransom notes were sent to parents, the cat was out of the bag.  The Sheriff decided, as a result, to release the ransom note sent to the District Board.

Historically, The Dark Overlord – if that is who is really doing this – has not resorted to threatening to kill people.  This would be a new low.

After several days, the police, working with other law enforcement agencies, concluded that the hacker(s) were not local to northern Montana and therefore could not realistically carry out the threat to kill children.  Schools resumed after being closed Thursday and Friday, with sports and extracurricular events cancelled on Saturday and Sunday as well.

The hacker(s) contacted the Flathead Beacon, the local newspaper, and in that conversation said the goal was to kill as many people as possible in a place where no one would expect it.

The hacker said that he wanted people to live in a state of fear before he made his move.

When asked if this was politically motivated, the hacker claimed that the goal was to exterminate human life and smear the government.

Law enforcement said that all district schools were taking necessary precautions to ensure that no data breach occurs.  I am somewhat skeptical of this claim, unless they turned off and unplugged all the other computers, since the district was already breached.

Law enforcement said that they feel that there is no threat to the physical safety of our children.

This is totally a crap shoot on their part.  The odds are in their favor, which is a good thing, but there are no guarantees.

That fact is a problem.  I am going to side with them and hope this is an empty threat.  At least this time.

As long as organizations make breaking into their computer networks as easy as taking candy from a baby, the hackers will keep doing it.  Once hackers control an organization’s data (whether by encrypting it or actually stealing it), they have many more options than before.

Hopefully, this is a one-off and not a trend, and hopefully it is the work of one mentally deranged individual, but whether that is true is unknown.

Whatever this is, it is certainly an escalation of hostilities.  *IF* this is an indication of what hackers might do in the future, that represents a scary future.

Assuming this was a target of opportunity, and it likely was  – a small school district in rural Montana is unlikely to be a strategic target – then our objective has to be to make it difficult for that random cyber attack to succeed.

Information for this post came from the Flathead Beacon and Naked Security.


Viacom is the Newest Company to Leave Data Unprotected on Amazon

Viacom is playing down the significance of this, but that could just be damage control.

One of our favorite security researchers, Chris Vickery, discovered yet another unprotected Amazon S3 storage bucket.

In this case, it did not contain non-public personal information of customers, according to Viacom.  They touted this as a good thing.  After Equifax, it probably is a good thing FOR US, but what was there is definitely worse for Viacom.

For those of you not familiar with Viacom, they own the likes of Paramount, MTV, Nickelodeon and Comedy Central, among other brands.

What was in there was the access key and secret key to an Amazon web services account owned by Viacom.  Whether it is their main corporate Amazon web services account or maybe a test account, we don’t know (yet), but the attempt to deflect the question leads me to believe that it is the main corporate account.  If it was, it would allow anyone who had that key to totally own the account, all the servers in it and all of the data associated with it.  Likely nothing important.

But that is not all that was there.

The Amazon storage bucket also contained the GPG (open source version of PGP) data encryption/decryption keys.   Depending on what those keys were used to encrypt, having the decryption keys would have allowed an attacker to read any data protected with those keys.  Generally speaking, encryption keys don’t protect the lunch menu.  If you go to the trouble to encrypt something, it is likely important and sensitive.
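Leaked credentials like these are exactly what automated secret scanners look for.  As a generic illustration (this is just a sketch of the technique, not anything tied to Viacom’s setup), AWS access key IDs have a recognizable shape – a prefix such as AKIA followed by 16 upper-case letters and digits – so even a few lines of Python can flag them:

```python
import re

# Long-term AWS access key IDs start with "AKIA"; temporary (STS) key IDs
# start with "ASIA".  Either prefix is followed by 16 upper-case
# letters/digits, for 20 characters total.
AWS_KEY_ID = re.compile(r"\b(?:AKIA|ASIA)[0-9A-Z]{16}\b")

def find_key_ids(text: str) -> list[str]:
    """Return any strings in `text` shaped like AWS access key IDs."""
    return [m.group(0) for m in AWS_KEY_ID.finditer(text)]

# Example, using the placeholder key ID from the AWS documentation:
print(find_key_ids("aws_access_key_id = AKIAIOSFODNN7EXAMPLE"))
```

Scanning every bucket, repo and config file for key-shaped strings is a cheap control; it would have caught a file like this long before a researcher did.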

Chris contacted Viacom on August 31st and within a few hours, the data was gone.

The Amazon subdomain in question was called mcs-puppet.  MCS likely refers to Viacom’s Multi-platform Compute Services.  Puppet likely refers to the devops automation tool Puppet that allows IT operations teams to automate the deployment and management of corporate compute services.

While Viacom attempted to deflect the seriousness of the matter, without knowing what those Puppet scripts controlled, what the PGP keys protected and what the AWS private keys were used for, we really don’t know how much damage could have been done.

We also don’t know whether Chris was the first outsider to find the stash or whether it was downloaded many times.

Viacom’s attempts to make it go away would suggest to me that the damage was worse than they wanted to let on.

In the bigger picture, this is just one more case of a company not understanding where their data is, how it is protected and who has access to it.  In this case, access was restricted to anyone who could find the data.  Not a great plan for your private encryption keys and configuration scripts.  Not a great plan at all.

But Viacom is hardly unique.

Does your company actively track the location, access controls and identities of all data and all users who can access that data, whether it is located in a company-owned data center or some cloud service that an employee set up without asking or telling anyone?  I didn’t think so.  THAT is what needs to happen, and it is not a one-time event; it needs to be managed in real time, FOREVER.

Or, your company could become the next Viacom.  Your choice.

Information for this post came from Gizmodo.


An Equifax Lesson For Everyone To Learn

One of the MANY lessons to be learned from the Equifax breach is how not to handle a breach.  Here is just one of those lessons and it is a lesson for BOTH users and webmasters.


When the breach finally became public – months after it happened – they created a web site for victims to go to in order to find out about the breach.  That web site, equifaxsecurity2017.com, looks like this:

You will notice that it has the Equifax logo on it and that it has the little green padlock indicating that it is encrypted, but, of course, anyone can steal the Equifax logo and put it anywhere they want – like right here, for example:

But that doesn’t mean that the site belongs to Equifax.

You will notice that the web site URL includes the name Equifax, but so does www.equifaxsucks.com (yup, a real site.  Totally benign, but real – see below).  So, just because the word Equifax is in the web site name does not mean that it is owned by Equifax.

In this case, since the word Equifax is probably a trademark, they can, eventually, get this site taken down if they want.  But Equifaxx is not a trademark (note the two x’s rather than one).  That site is real (see below) and, curiously, it seems to belong to EXPERIAN, their biggest competitor.  Why Equifax didn’t buy up similar-sounding web sites for $10 a year each is beyond me, and it is a lesson to learn from this.  Here is Equifaxx.com.
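Enumerating the obvious lookalikes is cheap.  Real tools such as dnstwist generate far more variant classes, but a toy sketch (the function name is my own, not a standard API) shows the idea:

```python
def lookalike_domains(name: str, tld: str = "com") -> set[str]:
    """Generate simple typo variants of a domain label: doubled letters
    (equifax -> equifaxx) and adjacent-letter swaps (equifax -> qeuifax).
    A tiny sketch; real typosquatting tools cover many more cases."""
    variants = set()
    for i in range(len(name)):            # double each letter in turn
        variants.add(name[:i] + name[i] + name[i:])
    for i in range(len(name) - 1):        # swap each adjacent pair
        variants.add(name[:i] + name[i + 1] + name[i] + name[i + 2:])
    variants.discard(name)                # drop the original spelling
    return {f"{v}.{tld}" for v in variants}

print(sorted(lookalike_domains("equifax")))
```

Running this on your own brand name and registering the handful of results costs a few hundred dollars a year – far less than cleaning up after someone else registers them.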

But that is not the worst failure.

Why wouldn’t they send you to a site that you KNOW is theirs?  Send people to BREACH.equifax.com or Equifax.com/BREACH or something like that.  At least then people know that they are going to a site owned by the company they are looking for.  In fact, this site was hastily set up and, initially, if you looked, it wasn’t even owned by Equifax; it was owned by an Equifax vendor.

Still, that is not the worst failure.

Here is the worst failure and the lesson for everyone – users and webmasters both.

While they secured the site with HTTPS – what we geeks call an SSL (or, more correctly, TLS) certificate-protected site – they used the cheapest, least secure certificate they could find: what is called a DOMAIN VALIDATION certificate.  All that certificate proves is that the person who requested it – you, me, my kid, whoever – had sufficient access to the web site to store a file on it.  If the site had been hacked, a hacker could buy that kind of certificate.

THAT IS WHAT A GREEN PADLOCK PROVES.  NOTHING MORE.

Now let’s look at Apple’s website for a minute (see below).

Note that the address bar is different from the address bar on Equifax’s breach web site.  This has the name Apple, Inc [US] in green in front of the URL.  This is an EXTENDED VALIDATION certificate.  In order for Apple (or Equifax) to get this, they had to prove they were Apple and not Mitch.  This is a higher level of verification and a more expensive certificate.

It is designed to give the user a higher level of confidence that they really have landed on an Apple – or Equifax – web site.

Why is this important?

One more time, Equifax is the poster child for how to screw up.

Equifax’s official Twitter account tweeted, not once, not twice, but three times, an incorrect web site for people to go to.

Instead of sending people to EquifaxSecurity2017.com, they instead sent people to SecurityEquifax2017.com.

Now it turns out that this alter-ego site was set up by a security researcher, so even when Equifax’s crisis communications team sent people to the wrong site, it didn’t infect their computers.  But if it had been a hacker’s web site, it certainly could have.  Or it could have asked for and stolen even more information.  Here is a look at the wrong web site.  The site proved its point, so it has been taken down, but the Internet never forgets, so here is a copy from the Wayback Machine at the Internet Archive.

Notice that this web site ALSO had a green padlock and was accessed using HTTPS.

Which is why, as users, we need to look for the company name in the address bar and why, as webmasters, we need to pay a little bit more for an extended validation or EV certificate.

In this case, if, say, a phishing campaign got people to click on a link to a bogus web site, the attacker would find an extended validation certificate much harder to forge.

Be a smart Internet user.  Look for the extended validation certificate.

Now that you are aware, as you surf the web, notice what companies have extended validation certificates.  And which ones do not.

Information for this post came from The Verge.

 


Bluetooth Vulnerability Does Not Require Any User Interaction

Similar to the WiFi bug we reported about in July (see post), this Bluetooth bug does not require the user to interact with the hacker, does not require the user to connect to an infected Bluetooth device or anything like that.  All it requires is that Bluetooth is turned on in the device.

The good news, if there is any, is that this is not a hardware problem and it is not a protocol problem, it is a software implementation error.  A plain old bug.  Which means that it can be patched.

Of course, every COOL bug has to have a name;  this one is called BlueBorne.

That is, ASSUMING that the manufacturer of your phone is still releasing patches for the model of phone that you have.  For example, most Android 4 and earlier users are not getting any patches, and many Android 5 users are not getting patches.  iPhone 4 users are not going to get patched, and the newest version of iOS will be the last to receive patches for the iPhone 5 and 5c.

And, this is not limited to phones.

While Apple has patched this bug in iOS 10 (so most recently purchased iPhone users are good), Microsoft just released a Windows patch in July.  This means that Windows users are safe IF they are running on a supported version of Windows and have installed the July patch release.  Google says that the September patch release fixes the bug, but that has to wind its way through the manufacturer’s release process and then your carrier’s release process UNLESS you are using a Google Pixel phone, in which case, you should already have the patch.  Linux teams are working on a patch, but that has not been released yet.

The bigger issue is all of those Internet of Things appliances from light bulbs to TVs that will likely NEVER be patched and will, therefore, always be an opportunity for a hacker.

Of course, as with all Bluetooth connections, the attacker has to be within 30-100 feet or so, depending on the equipment that the hacker is using.  That makes Starbucks a perfect place to launch an attack on unsuspecting users.

For those of you who do not have the patch yet, such as those with obsolete Android phones and Linux-based IoT devices, the only possible defense is to disable Bluetooth.  That may not be what you want to hear, but it will protect your device.
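On Linux systems you control, one common way to do that is the rfkill utility (assuming it is installed; exact commands vary by distribution and IoT vendor):

```shell
# Show radio devices and their current block state
rfkill list bluetooth

# Soft-block (disable) all Bluetooth radios
sudo rfkill block bluetooth

# Re-enable later, once a patch has been applied
sudo rfkill unblock bluetooth
```

A soft block survives until it is explicitly unblocked, which makes it a reasonable stopgap while waiting for a patch.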

Information for this post came from Wired.


You’re Not Gonna Believe This – Another Equifax Breach

Apparently Equifax had another, separate breach in March of this year, five months before the breach that they have already announced.

Equifax hired the security firm Mandiant to check into both breaches, but since they have not said anything about this first breach, we really don’t know much about it.

One assumes that this secret earlier breach will only fuel the fires behind the dozens of lawsuits and separate dozens of investigations.

It will also make people wonder about those executive stock sales – the ones NOT on the SEC sale schedule and which occurred a couple of days before the announcement of the second breach but months after the first breach.

It is possible that they discovered the first breach before any data was stolen, but if that was the case, how do you explain how the second breach, only a few months later, went undetected for several months?  There is no logic that can explain this.

We have also seen cases where the breached company didn’t want to find any evidence of something that would require them to notify anyone.  Breach?  Breach?  What breach?  I don’t see any breach.  If you tell the investigators to only look in one corner where nothing happened, they likely won’t find any problems.  The company said that they have complied with all mandatory notifications regarding the March breach.

The fact that Equifax was lobbying Congress to reduce their breach reporting requirements at the same time that they were investigating the first breach is, shall we say, a bit problematic.  And it has terrible optics.

Is this the final straw that gets the board to fire the CEO?  I don’t know, but I would not be surprised.

Another source is saying that the goal of the attackers may have been to use Equifax to breach some of Equifax’s large banking partners.  At least one bank appears to have been compromised and Equifax says that it is working with its banking partners to mitigate damage.

Information for this post came from Bloomberg.


Legal Risks of Cloud and Collaboration Tools

Many employees use consumer-grade, unmanaged cloud services such as Dropbox and Google Drive as part of their work.  This is sometimes called BYOC, for Bring Your Own Cloud.  It is convenient, but is it a good idea for the business?

Loss/theft of intellectual property

One of the obvious risks of BYOC is the loss of control (AKA theft) of corporate intellectual property.  These personal cloud services make it quick and easy to steal hundreds to thousands of confidential files by merely dragging and dropping.  AND, since the account does not belong to the company, the only way the company can force an employee to let them into their account is via a court order – an expensive and dicey proposition.  By the time that order is granted and appeals are exhausted, any evidence is likely gone.

Data breach and regulatory violations

Just because your company chooses to allow (or not to stop) employees from using BYOC does not mean that the company has no liability if the data on an employee’s personal cloud – which the company does not control – is breached.  In fact, the company is likely fully liable even though it has no authority over that data.  Violations of regulations such as HIPAA also fall on the company.

Litigation risk and electronic discovery exposure

If a company allows users to use BYOC and is involved in litigation, it is very difficult to preserve evidence that could exist on employees’ personal clouds.  If it is discovered that evidence has been destroyed or compromised, the judge could hold the company in contempt or even instruct the jury to assume the worst – that whatever was destroyed would have helped the plaintiffs and hurt the company.  A Florida court recently faulted a company for allowing an employee to destroy files in a personal Box account.  Also, depending on what an employee does with the files in a BYOC account, the company may lose the ability to assert attorney-client privilege.

So what is a company to do?

There are only a couple of options –

Allow BYOC and deal with the risk.  This doesn’t seem like a great solution, but it is what many companies are doing today – understanding that they are going to lose corporate intellectual property in the best of circumstances.

Outlaw BYOC.  Done right, this can work.  After all, the employee just wants to get his or her job done; done wrong, it can really annoy employees.

Allow but regulate.  This is likely more complicated.  The company has to decide what BYOC services are OK, create rules for using them and then enforce these rules, but it is possible for this option to work.

For most companies, providing a corporate-owned solution that works at least as easily as the employee-owned, consumer-grade one is probably the best approach, but every company will need to decide for itself.

Information for this post came from JDSupra.
