Is Your Cybersecurity Program Working?

That’s kind of a loaded question, but still important.

After all, you are spending a bunch of money on it;  how do you know if you are getting your money’s worth?

Or maybe you are not spending very much at all – in that case how do you know if you are adequately protecting your company?

Given those questions, Larry Ponemon (the researcher who will do research for almost anyone who pays him, though there is no evidence that his research is skewed because of that) and AttackIQ, a security tool vendor, conducted a study.

Larry's study says that on average, enterprises spend around $18 million on cybersecurity every year (what is included in that is, of course, somewhat variable) and more than half of them plan to increase that by as much as 14% next year.

53 percent of those responding said that they have no idea how well the tools are working in their corporate networks.

On average, these IT folks say that they have almost 50 cybersecurity tools installed.  Larger companies sometimes run as many as a couple hundred.  How could you know whether the tools are working if you have that many?

A little over a third think they are getting “full value” from their investments.

Worse yet, over 60% said that they have actually seen a tool report that it blocked a security threat when, in fact, it had not.

Almost 60% of the respondents said that lack of visibility was the reason there were still breaches, even though they have almost 50 tools installed.

40 percent think that their teams are effective at finding and plugging security holes.  That means 60 percent do NOT think their teams are effective at their primary mission.

Almost two thirds said that there is no set schedule for penetration tests.

Click here to see the full report.

So what does all of this mean?

It likely means that buying more tools will not fix the problem.

That does not mean you should halt your security program, however.

It does mean that you have to have a robust cybersecurity governance program.  That should not come as much of a surprise.  At some levels, cybersecurity is a hard problem.  At other levels, it is very straightforward.

The basics need to be done –  governance, planning, training, policies, backups, incident response, endpoint protection, encryption and so on.

What requires more analysis are the very expensive tools that some of the vendors are selling.  Some of those tools cost tens of thousands of dollars – or more.

It is fair to say that companies need to assess the security programs they have in place, no different than any other program a company runs.

The challenge is how to measure whether the program is working.  Is it working because you didn't get hacked today?  At some level, yes, but at other levels, no.  How do you measure success?

I don’t have all the answers.  I wish I did.  But every company needs to consider what they are doing.  If you are just doing the basics then that analysis is pretty simple.   But if you are looking, like enterprises are, at spending $18 million a year, then you need to figure out how to define success.

Most of our clients are not in the league of spending that kind of money on security, but security is a $125 billion a year business according to Gartner, and growing, so for every company that is spending way less than that $18 million, there are some that are spending way more.

Cybersecurity is a big investment for every company.  Make sure that you are spending that money wisely.  Start with the basics.  Do those basics right.  Then look at the advanced things.  Set up metrics.  Brief management.  Ask questions.  It is, after all, something that could take down your company if you do not do it right.
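
To make "set up metrics" a little more concrete, here is a minimal sketch of the kind of measurement I have in mind.  The incident records and field names are hypothetical and Python is only used for illustration; the point is to track something measurable, such as how long it takes your team to detect and resolve incidents, and report the trend to management.

    from datetime import datetime

    # Hypothetical incident records - in practice these would come from your
    # ticketing or SIEM system.  The field names here are made up.
    incidents = [
        {"occurred": "2019-07-01 08:00", "detected": "2019-07-01 09:30", "resolved": "2019-07-01 17:00"},
        {"occurred": "2019-07-10 13:00", "detected": "2019-07-12 10:00", "resolved": "2019-07-13 16:00"},
    ]

    def hours_between(start, end):
        fmt = "%Y-%m-%d %H:%M"
        return (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds() / 3600

    # Two basic program metrics: mean time to detect and mean time to resolve.
    mttd = sum(hours_between(i["occurred"], i["detected"]) for i in incidents) / len(incidents)
    mttr = sum(hours_between(i["detected"], i["resolved"]) for i in incidents) / len(incidents)

    print(f"Mean time to detect:  {mttd:.1f} hours")
    print(f"Mean time to resolve: {mttr:.1f} hours")

If numbers like those are not moving in the right direction quarter over quarter, that is a conversation to have with management, no matter how many tools you own.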

Again, the Ponemon study is available here.


Apple Contractors “Regularly Hear Confidential Details” on Siri Recordings

Apple uses contractors to listen to Siri recordings to figure out whether Siri responded correctly.  Apple says that these contractors are under non-disclosure agreements and the Siri conversations are not directly tied to the person’s iPhone or Apple credentials.

Still, these people hear about:

  • Confidential medical conversations
  • People having sex
  • Drug deals
  • Other likely illegal activities
  • Business deals

While they grade Siri on its responses, they don't have to grade it on the subject matter of those conversations.

Apple does not specifically disclose that they hire contractors to listen to your requests, but they did not deny it either.  They say only about one percent of the conversations per day are reviewed by humans.  Still, that is likely millions of sound bites.  Per day.

You are probably asking why someone would ask Siri a question while having sex.  Well, the short answer is that they do not.  But Siri can get confused and think that you said the activation word when you did not, hence the recordings.

If you have an iPhone or other Siri enabled Apple device around you, you implicitly consent to Apple recording you and humans listening to that conversation sometimes, whether you asked it to or not.  Siri can be activated accidentally, apparently, by the sound of a zipper.  Really?!

Another way that Siri can be activated is if an Apple Watch detects it has been raised, which could easily happen during drug deals. Or during sex.

So let's assume that you are OK with the possibility, maybe even likelihood, that Siri may record you in compromising or private situations.

Does that mean that other people in the room are okay with that?  Like your sex partner.  Who may use your name.

Are other people in the room even aware that they are being recorded?

Is that even legal?  Answer: probably not in states that require two-party consent, but I am not aware of a court decision yet.

In some companies, you are not allowed to bring your electronic devices into the building.  You may remember that Snowden required reporters to put their iPhones in the refrigerator to block signals to them.

If you are concerned about the confidentiality of a conversation you are having then you need to ask these questions.  Samsung was forced to put a disclosure on their TVs to this effect after a lawsuit.

Remember, it is not your device that you have to be worried about, it is everyone else within earshot that you should be concerned about.

Not only does this include Siri devices, but it includes any other smart device that has the capability to covertly record.

Source: The Guardian


Security News for the Week Ending July 26, 2019

Equifax Agrees to Pay UP TO $700 Million to Settle Breach Lawsuits

First – the settlement hasn't been approved by the court yet, so nothing here is final.

Of the $700 million pot, at least $300 million is set aside to pay damages to consumers.  Another $100 million plus is to pay for credit monitoring.

There are lots of details.  For the most part, unless you can prove damages and prove that those damages were caused by the Equifax breach and not some other breach, you probably will not get paid much.  You can get paid up to $250 if you file a claim, even without proof.  Everything past that requires proof.  With 150 million victims and a $300 million pot, that averages out to $2 a person.

BUT there is one thing you should do, and that is get the free credit monitoring.  Go to EQUIFAXBREACHSETTLEMENT.COM and wait until it says that the court has approved the settlement.  Note that this is not a site owned by Equifax, and given what a mess they are, that is good.  Read more details here.

The Next NSA Hacker Gets 9 Years

Harold Martin, the NSA contractor (employed by Booz, like Edward Snowden), was sentenced to 9 years for stealing 50 terabytes of data over the course of his 22-year NSA career.  The leak is something like 5 times the size of the Snowden leak.  He didn't sell it; he just liked data.  He had so much that he had to store it in sheds in his back yard.  Many of the documents were clearly marked SECRET and TOP SECRET.

The fact that he was able to steal hundreds of thousands of documents doesn't say much for NSA security, which is sad.  Source: Nextgov.

Huawei – Bad – Not Bad – Bad?!

President Trump said that Huawei is a national security threat and needs to be banned, and then he said that maybe we can trade that threat away for a better deal with China on trade.

Now it is coming out that Huawei helped North Korea build out its current wireless network.  The equipment was shipped into North Korea by Chinese state-owned Panda International.  This has been going on since at least 2006.  Huawei is likely continuing to provide technical support to North Korea.

This seems like a national security threat and not a bargaining chip for the President to toss in to get a trade deal that he wants, but what do I know.  Source: Fox News.


AG Barr Says He Wants an Encryption Back Door, and Why Do You Need Privacy Anyway – Just Suck It Up.

Attorney General William Barr said this week that if tech companies don't provide a back door into consumer encryption, the government will pass a law forcing them to.  And while this would allow hackers and Chinese spies to compromise US systems, in his view it is worthwhile.

He said that they might wait for some terrorist event that kills lots of people and blame it on encryption (whether that is true or not).

He did seem to exclude “custom” encryption used by large business enterprises, whoever that might include.

Barr said that bad guys are using crypto to commit crimes that the police can't investigate.  If that were true, we would expect crime to be going up.  If it is a really bad problem, it would be going way up.

The only problem is that the statistics say crime is going down.

You may remember that Juniper added such a back door, likely at the request of the NSA, and it worked great until word got out about it and hackers had a field day.

This conversation is not over.  Source: The Register.


New Android Spyware Found, Created by Russian Company That Interfered with 2016 Elections

Researchers have found a new piece of Android spyware that was likely developed by a Russian contractor that has been sanctioned for interfering with the 2016 U.S. Presidential elections.

The spyware, called Monokle, has an amazing range of spying capabilities and can steal data, even without having root access on the phone.

The spyware, distributed as seemingly legit copies of popular apps such as Signal, Google Docs, Facebook Messenger, WhatsApp, WeChat and others, reads data on the screen.  It also looks at the predictive dictionary to see what the user might be interested in.

If the spyware can get root access, it installs a security certificate so that it can intercept encrypted traffic.

The spyware is very sophisticated.  It is built around “modules,” so new capabilities can be added to it.  Some of the modular functionality includes:

  • Tracking the device’s location
  • Recording audio in the room
  • Recording phone calls
  • Recording what is on the screen
  • Recording keystrokes
  • “Fingerprinting” the device
  • Stealing browser and call histories
  • Stealing emails, text messages and other messages
  • Stealing contacts
  • Stealing calendar info
  • Making calls pretending to be the user
  • Sending texts pretending to be the user
  • Running arbitrary commands if root access is available

It has 78 separate commands that it can run.

Many of the infected apps even have the regular functionality of the real app.

The company that wrote it, STC, has been sanctioned by the U.S. and is known to create drones and other RF equipment for the Russian military and government customers.

The researchers also found samples of iOS malware, so STC is likely working on an iPhone version.

From a user's standpoint, there are a few things that you can do to protect yourself.

Only install apps from the app store.  While this is not foolproof as both Apple and Google have been known to distribute infected software, both try to keep infected software out.

If you get an email or text message telling you to click on a link to install a fix for an app that you have, do not click on it.  Go to the app store directly and look for any updates.

Many endpoint protection software products have a mobile version for phones.  While these typically cost money, that is better than being infected.

Right now it appears that the spyware is targeting high value targets, but of course that could change and there could be knockoffs of the software.

Bottom line, be vigilant.

Source: The Hacker News


How Long Does It Take for a Public RDP Server to be Hacked?

Even though we keep telling people not to enable Microsoft's Remote Desktop Protocol (RDP) on Internet-facing servers, a recent check showed there were still a million vulnerable servers exposed.

“In recent years, criminals deploying targeted ransomware like BitPaymer, Ryuk, Matrix, and SamSam have almost completely abandoned other methods of network ingress in favor of using RDP,” say Sophos researchers Matt Boddy, Ben Jones, and Mark Stockley.

Hackers use password cracking tools and buy passwords for already cracked servers in order to get in.

To see how long it took for servers to be compromised, researchers set up 10 geographically dispersed Windows Server 2019 installations in the Amazon cloud.  Those servers had RDP enabled.

To make life interesting, the servers were set up with extremely strong passwords.

The first server was hit with an attempted login ONE MINUTE AND TWENTY-FOUR SECONDS after it was brought online.

The last one was attacked in a little over fifteen hours.

The test servers were live for a month.  During that time period, there were over 4 million attempted logins to those servers.

The hackers are creative in their attacks so as not to get detected or blocked.  Sometimes people claim that the search engine SHODAN is the reason for these attacks, but these 10 servers were never listed in SHODAN.

Given this, what should you do?

First, unless you have no other viable alternative, do not expose RDP publicly on the Internet.
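
If you are not sure whether a server you manage answers RDP from the outside, a quick connection test from a machine outside your network will tell you.  This is only a minimal sketch, not a security scanner; the host name is a placeholder, and all a successful connection proves is that port 3389 (RDP's default) is reachable from the Internet, which is exactly what the attackers in this study were hunting for.

    import socket

    # Placeholder - substitute the public IP or DNS name of a server you manage.
    HOST = "server.example.com"
    RDP_PORT = 3389  # default Remote Desktop Protocol port

    # Attempt a plain TCP connection from OUTSIDE your network; if it succeeds,
    # the RDP port is reachable from the Internet and attackers will find it.
    try:
        with socket.create_connection((HOST, RDP_PORT), timeout=5):
            print(f"{HOST}:{RDP_PORT} answered - RDP appears to be exposed to the Internet")
    except (socket.timeout, OSError):
        print(f"{HOST}:{RDP_PORT} did not answer - RDP does not appear to be exposed")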

Security teams have been trying for years to get everyone to use strong passwords but that really has not worked.  Not at all.

You can make the hackers' job harder by turning on two-factor authentication, but if you do, make sure that the second factor is strong – not a text message.  Installing client-side security certificates is one good idea because once they are installed, they are invisible to the user.

If you absolutely must use RDP, the preferred method is to require users to connect to the company network through a strong VPN solution first.

Source: HelpNet Security


Cloud Service Providers Are Not Immune from Ransomware

You moved your applications to the cloud.  Now you don’t have to worry about managing IT systems.  The headaches are someone else’s.

Well sort of.

When customers of QuickBooks cloud hosting provider iNSYNQ try to log on, they can't get in.  That is what they have been seeing for the last three days.

The hosting provider experienced the ransomware attack on July 16.

The company's web site says that they are now beginning to restore users' data, but the process will take a while.

They are saying that some files (they are not saying how many) were encrypted and they hope that you made your own backups.  They are trying to figure out how to deal with those encrypted files.

And, oh yeah, from now on you should probably make your own backups.

And what, exactly, am I paying you for?

So what does this mean for you?

Let's assume for the moment that you are not an iNSYNQ customer, since most of the planet is not.  And, I suspect, many of their current customers will not be their current customers for long.

First, DO NOT assume that because you moved something to the cloud, it is not your responsibility any more.  It is kind of like your self-driving car: you had better be ready to stomp on the brakes in case the car makes a mistake.

Check your cloud service provider’s TERMS OF SERVICE.  Likely it says that they are not responsible for many things.  Make sure that, for those things, you have a plan.

Many cloud service providers have a “shared responsibility” model at the core of their offerings.  That means that they acknowledge that they are responsible for some things, but you are responsible for others.  Make sure that you know who is responsible for what.
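
On the backup question in particular, one way to hold up your side of that shared responsibility is to keep your own dated copies of anything the provider lets you export.  Here is a minimal sketch, assuming your provider gives you some way to export data to a local folder; the paths are placeholders and the details will vary by provider.

    import shutil
    from datetime import datetime
    from pathlib import Path

    # Placeholders - point these at the folder where you export data from the
    # cloud provider and at a backup destination the provider does not control.
    EXPORT_DIR = Path("exports/quickbooks")
    BACKUP_ROOT = Path("/mnt/offsite-backups")

    # Copy today's export into a dated folder so a ransomware event at the
    # provider (or locally) cannot silently overwrite every copy you have.
    stamp = datetime.now().strftime("%Y-%m-%d")
    destination = BACKUP_ROOT / f"quickbooks-{stamp}"
    shutil.copytree(EXPORT_DIR, destination)
    print(f"Backed up {EXPORT_DIR} to {destination}")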

Understand what the provider's guarantee is regarding uptime.  iNSYNQ has been down for 7 days and says that it will be more days before they are back up – possibly minus your data.  Most of the time the terms say only that they will get things working again as best they can, but with no time frame.  Is that going to work for your business?  In this case, it is the clients' accounting software.  Is not being able to write checks a problem?  Is not being able to run payroll going to bother anyone?  Is losing years' worth of financial data going to upset your investors, your regulators and your customers?

DO YOU HAVE A PLAN FOR WHAT TO DO IN A CASE LIKE THIS?

Lastly, does the provider offer a guarantee?  Often they will not charge you for the time they were down.  Let's say they charge you $200 a month for their service and they are down for two weeks.  Likely that means that they want you to pay your bill for the month, but they will very generously give you a $100 credit on that bill.

DOES THAT COVER YOUR PAIN?  I DIDN’T THINK SO.

Maybe your accounting software is not terribly important to you?

What about your web site?

Or your manufacturing software?

Or whatever else you moved to the cloud.

Understanding the risk is a good thing.  I strongly recommend it.

Source:  The iNSYNQ website, here and here.
