Are You Ready for California’s New Privacy Law?

Security vendor ESET interviewed 625 business owners and executives to understand their readiness for California’s new privacy law, which goes into effect on January 1, 2020.  What most businesses are missing is that Nevada’s version of the law goes into effect on October 1, 2019.  Most of the respondents were from small businesses, some of which are exempt from the requirements of the law.  Here are the results:

  • 44% had never heard of the law
  • 11% know whether the law applies to them or not
  • 34% say that they don’t know if the law will require them to change the way they collect and store data (it likely does)
  • 22% say they don’t care if they break the law (great if you can get away with that)
  • 35% say they don’t need to change anything to be in compliance (very unlikely)
  • 37% say that they are very confident that they will have the required security in place by January 1.  Another third say that they do not know if they will have security in place
  • Half said that they did not modify their behavior or processes to bring their businesses into compliance with GDPR (most likely because they don’t know what GDPR requires)

40% of the businesses said that they did not have anyone responsible for security or privacy in their company and another 18% said they didn’t know if they had someone.

9% said they are moving to avoid having to comply with CCPA, the new California law.  Those people need to understand that they would also need to block Californians from visiting their website and refuse to ship products or deliver services in California.  None of that is realistic for most businesses.

Given the law goes into effect in less than 6 months and Nevada’s version goes into effect in two months, this lack of knowledge is concerning.  However, attorneys, especially those that specialize in class action lawsuits, are thrilled.

There is one aspect of the law that should be a cause for concern for those businesses that think they understand the law – and likely do not.

Any California resident can sue any California business that has a breach that compromises their personal information.

They do not have to show that they have been damaged to sue.

The maximum you can sue for is $750 per person.  A breach of say 10,000 records – a tiny breach by today’s standards (the Capital One breach last week compromised 106 million people) – would generate a potential lawsuit asking for $7,500,000.

Are you prepared for that?

A one million record breach – still small by today’s standards – translates to a $750 million lawsuit.
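The back-of-the-envelope math is easy to sketch (a minimal example assuming the statutory maximum of $750 per consumer per incident; courts could award far less):

```python
def ccpa_statutory_exposure(records: int, per_record: int = 750) -> int:
    """Worst-case CCPA statutory damage exposure for a breach.

    $750 per consumer per incident is the statutory ceiling;
    treat this as an upper bound, not a predicted award.
    """
    return records * per_record

print(f"${ccpa_statutory_exposure(10_000):,}")     # $7,500,000
print(f"${ccpa_statutory_exposure(1_000_000):,}")  # $750,000,000
```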

My suggestion to small businesses – think again about whether you are prepared.  If you need help, contact us.  Source: HelpNet Security.


Security News for the Week Ending August 2, 2019

Capital One Breached – 100+ Million Applicants Compromised

Among the data compromised are 140,000 US social security numbers and 80,000 bank account numbers.  Also in the mix were one million Canadian social security numbers plus names, addresses, phone numbers, birth dates and incomes.

The data included applicants who applied between 2005 and 2019.  Yes, some 14 years’ worth of applicant data, floating around in the cloud.  I ask WHY?

The hackers were inside between March and July and the breach was discovered in July.  In this case, a U.S. person was identified as the source of the hack and arrested.  She is still in jail.

The feds say a configuration error allowed her to access their data which was stored in the cloud.  See more information at The Register.

 

Florida Senator Admits He Hasn’t Read the Report on Russian Hacking of Florida’s Election Systems

After the Republican-controlled Senate Intelligence Committee released the first volume of its report on Russian hacking of the 2016 Presidential elections, Florida Senator Rick Scott – who was Florida’s Governor at the time of the election – said on national TV that he had not read the report.  The report, which is heavily redacted, talks about Russian efforts to hack “State-2”, which is widely believed to be Florida.

The report is only 67 pages – much less if you skip the redacted portions – but Scott has only gotten the CliffsNotes version from his staff.  At the time, Scott was adamant that his state was not hacked.  Florida’s other Senator, Marco Rubio, has been working hard to sound the alarm bells on the report.  Perhaps the report hit a little too close to Scott’s denials for comfort.  Source: The Tampa Bay Times.

 

Honda Exposes the Family Jewels

134 million rows of sensitive data were accidentally exposed.  Wait.  Guess.  On an unprotected Elasticsearch database.

Information on the company’s security systems, network, technical data on workstations, IP addresses, operating systems and patches was all exposed.  Basically, these are directions for even an inexperienced hacker to attack Honda.

Honda is being pretty quiet about this, but it is one more case of corporate governance gone wrong.  Or missing.  Source: Silicon Republic.

 

Apple Suspends Program Of Listening to Siri Recordings

After it was reported last week that Apple had contractors listening to people’s Siri recordings, including sensitive protected health information, Apple announced it was suspending the program and will conduct an investigation.  Apple said it will provide an option for people to opt in or out of the program in a future software release.  Source: The Guardian.

 

On Eve of Amazon Getting Awarded $10 Billion DoD Contract, Capital One Happens

Amazon and Microsoft are locked in mortal combat over a $10 billion DoD cloud contract called JEDI.  Now the Capital One breach happens, exposing information on 100 million customers, and it turns out the person who is accused of doing it is a former Amazon tech employee who may have hacked other Amazon customers as well.

So Congress wants some answers – and probably so does Microsoft.  $10 billion could be hanging in the balance.

This is a message for cloud customers to ask some hard questions of their cloud vendors, even though this particular attack was helped by a configuration error. Source: Bloomberg.


Is The Encryption Debate Over?

Attorney General Barr said that he wants an encryption back door and if it compromises your privacy, well, we are not talking about protecting nuclear launch codes.  So we know where he stands.

What came as a bit of a surprise is that Facebook says that they are going to build a back door into WhatsApp.  Not sure why.  Where is the pressure?  Who has the compromising pictures? Likely it is just greed.  They want to be able to operate in every country and since there are a number – a small number right now – that won’t let them operate without allowing those governments to spy on their users, the simple answer is to cave.

Here is what Facebook says they are going to do.  They are not going to, technically, insert a back door.  They might even claim this is a service to their users.

Think about this for a moment.  Right now WhatsApp cannot read your messages so they can’t target ads at you.  If they did know what you are saying, they could use or sell that data to advertisers.  That is just one possible use.

They are going to modify their app to do “content moderation”.  Content moderation is a euphemism for censorship.  If China, for example, doesn’t want anyone to say anything bad about Xi, the moderation software will look for people saying bad things about him and stop it.

Since this happens on the user’s device, the encryption is not an issue because the user can decrypt stuff on their device.

Then, to make sure that the government will allow them to operate, they will send any banned content to a central moderation facility (AKA the government censors) to figure out who the local goon squad should come visit.

Obviously, the country can tell Facebook what they want them to look for.
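To make the mechanism concrete, here is a minimal, hypothetical sketch of on-device screening (not Facebook’s actual code): the client compares a hash of the outgoing message against a supplied blocklist before the message is ever encrypted, so the end-to-end encryption itself is never technically touched.

```python
import hashlib

# Hypothetical blocklist, distributed to the client as hashes
# so users cannot easily read what is being screened for.
BANNED_HASHES = {
    hashlib.sha256(b"banned phrase").hexdigest(),
}

def allowed_to_send(plaintext: str) -> bool:
    """Screen a message on-device, BEFORE end-to-end encryption.

    Returns True if the message may be encrypted and sent.
    A real deployment would report flagged content to a central
    moderation facility, as described above.
    """
    digest = hashlib.sha256(plaintext.encode()).hexdigest()
    return digest not in BANNED_HASHES

assert allowed_to_send("hello")              # passes the screen
assert not allowed_to_send("banned phrase")  # flagged before encryption
```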

Now say that you decide that you don’t like that and you switch to Signal.

The government could go to Signal and say “if you don’t want to be blocked you have to do content moderation.  It has nothing to do with your encryption.  Don’t say you can’t do it, because Facebook is doing it”.  At that point, privacy is pretty much done with.

It is *possible* that Signal, since it is not a commercial profit making company, might say go for it, block us.  That is not great for Signal, but, it might be better than compromising their principles.  Who knows.

Any government, no matter how repressive, now has a way to demand that software vendors give them a back door.

Facebook won’t say when this will be deployed – assuming it is not already deployed.  Why?  Because it might cause their customers to leave and that would, kind of, defeat the purpose.  I can already see the handwriting on the wall, so I am working to migrate away from WhatsApp and delete the application.

The total end game here could be to force Apple and Google to add “content moderation” to the operating system.  That is really what China and other repressive regimes (including, apparently, the US) would like to happen.

Stay tuned.  It is not clear how this is going to come down, but we certainly have a roadmap.

Source: Forbes.


Is Your Cybersecurity Program Working?

That’s kind of a loaded question, but still important.

After all, you are spending a bunch of money on it;  how do you know if you are getting your money’s worth?

Or maybe you are not spending very much at all – in that case how do you know if you are adequately protecting your company?

Given those questions, Larry Ponemon, the researcher who performs research for almost anyone who pays him (but there is no evidence that his research is skewed because of that) and AttackIQ conducted a study.  AttackIQ is a security tool vendor.

Larry’s study says that on average, enterprises spend around $18 million on cybersecurity every year (what is included in that is, of course, somewhat variable) and more than half of them plan to increase that by as much as 14% next year.

53 percent of those responding said that they have no idea how well the tools are working in their corporate networks.

On average, these IT folks say that they have almost 50 cybersecurity tools installed.  Larger companies run sometimes as many as a couple hundred.  How could you know if the tools are working if you have that many?

A little over a third think they are getting “full value” from their investments.

Worse yet, over 60% said that they have actually experienced a tool that said that it blocked a security threat, when, in fact, it had not.

Almost 60% of the respondents said that lack of visibility was the reason there were still breaches, even though they have almost 50 tools installed.

40 percent think that their teams are effective at finding and plugging security holes.  This means that 60 percent do NOT think their teams are effective at their primary mission.

Almost two thirds said that there is no set schedule for penetration tests.

Click here to see the full report.

So what does all of this mean?

It likely means that buying more tools will not fix the problem.

It doesn’t mean that you should halt your security program either, however.

It does mean that you have to have a robust cybersecurity governance program.  That should not come as much of a surprise.  At some levels, cybersecurity is a hard problem.  At other levels, it is very straightforward.

The basics need to be done –  governance, planning, training, policies, backups, incident response, endpoint protection, encryption and so on.

What requires more analysis is some of the very expensive tools that some of the vendors are selling.  Some of the tools cost tens of thousands of dollars – or more. 

It is fair that companies need to assess the programs that they have in place.  No different than any other program that a company runs.

The challenge is how do you measure whether the program is working or not?    Is it working because you didn’t get hacked today?  At some level, yes, but at other levels no.  How do you measure success?
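One way to start (a hypothetical sketch, not the Ponemon methodology) is to run scripted attack simulations against each control and track what fraction each tool actually stops – which would also catch the tools that claim to block threats but do not:

```python
# Hypothetical validation log: tool -> (threats simulated, threats actually blocked)
RESULTS = {
    "endpoint_av":  (20, 18),
    "email_filter": (20, 20),
    "waf":          (20, 11),
}

def effectiveness(log: dict) -> dict:
    """Fraction of simulated threats each tool actually stopped."""
    return {tool: blocked / simulated
            for tool, (simulated, blocked) in log.items()}

for tool, rate in sorted(effectiveness(RESULTS).items()):
    print(f"{tool}: {rate:.0%}")
```

Even a crude metric like this gives management something concrete to track quarter over quarter, instead of “we didn’t get hacked today.”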

I don’t have all the answers.  I wish I did.  But every company needs to consider what they are doing.  If you are just doing the basics then that analysis is pretty simple.   But if you are looking, like enterprises are, at spending $18 million a year, then you need to figure out how to define success.

Most of our clients are not in the league of spending that kind of money on security, but security is a $125 billion a year business according to Gartner and growing, so for every company that is spending way less than that $18 million, there are some that are spending way more.

Cybersecurity is a big investment for every company.  Make sure that you are spending that money wisely.  Start with the basics.  Do those basics right.  Then look at the advanced things.  Set up metrics.  Brief management.  Ask questions.  It is, after all, something that could take down your company if you do not do it right.

Again, the Ponemon study is available here.


Apple Contractors “Regularly Hear Confidential Details” on Siri Recordings

Apple uses contractors to listen to Siri recordings to figure out whether Siri responded correctly.  Apple says that these contractors are under non-disclosure agreements and the Siri conversations are not directly tied to the person’s iPhone or Apple credentials.

Still, these people hear about:

  • Confidential medical conversations
  • People having sex
  • Drug deals
  • Other likely illegal activities
  • Business deals

While they grade Siri on its responses, they don’t have to grade it on the subject matter of those conversations.

Apple does not specifically disclose that they hire contractors to listen to your requests, but they did not deny it either.  They say only about one percent of conversations per day are reviewed by humans.  Still, that is likely millions of sound bites.  Per day.

You are probably saying why would someone ask Siri a question while having sex?  Well, the short answer is that they do not.  But Siri can get confused and think that you said the activation word when you did not, hence the recordings.

If you have an iPhone or other Siri enabled Apple device around you, you implicitly consent to Apple recording you and humans listening to that conversation sometimes, whether you asked it to or not.  Siri can be activated accidentally, apparently, by the sound of a zipper.  Really?!

Another way that Siri can be activated is if an Apple Watch detects it has been raised, which could easily happen during drug deals. Or during sex.

So let’s assume that you are OK with the possibility, maybe even likelihood, that Siri may record you in compromising or private situations.

Does that mean that other people in the room are okay with that?  Like your sex partner.  Who may use your name.

Are other people in the room even aware that they are being recorded?

Is that even legal?  Answer: probably not in states that require two-party consent, but I am not aware of a court decision yet.

In some companies, you are not allowed to bring your electronic devices into the building.  You may remember that Snowden required reporters to put their iPhones in the refrigerator to block signals to them.

If you are concerned about the confidentiality of a conversation you are having then you need to ask these questions.  Samsung was forced to put a disclosure on their TVs to this effect after a lawsuit.

Remember, it is not your device that you have to be worried about, it is everyone else within earshot that you should be concerned about.

Not only does this include Siri devices, but it includes any other smart device that has the capability to covertly record.

Source: The Guardian


Security News for the Week Ending July 26, 2019

Equifax Agrees to Pay UP TO $700 Million to Settle Breach Lawsuits

First – the settlement hasn’t been agreed to by the court yet, so this is all speculation.

Of the $700 million pot, at least $300 million is set aside to pay damages to consumers.  Another $100 million plus is to pay for credit monitoring.

There are lots of details.  For the most part, unless you can prove damages and prove that those damages were caused by the Equifax breach and not some other breach, you probably will not get paid much.  You can get paid up to $250 if you file a claim and without proof.  Everything past that requires proof.   With 150 million victims and a $300 million pot, that averages to $2 a person.

BUT there is one thing you should do and that is get the free credit monitoring.    Go to EQUIFAXBREACHSETTLEMENT.COM and wait until it says that the court has approved it.  Note this is not a site owned by Equifax and given what a mess they are, this is good.  Read more details here.

The Next NSA Hacker Gets 9 Years

Harold Martin, the NSA contractor (employed by Booz, like Edward Snowden) was sentenced to 9 years for stealing 50 terabytes of data over the course of his 22-year NSA career.  The leak is something like 5 times the size of the Snowden leak.  He didn’t sell it;  he just liked data.  He had so much he had to store it in sheds in his back yard.  Many of the documents were clearly marked SECRET and TOP SECRET.

The fact that he was able to steal hundreds of thousands of documents doesn’t say much for NSA security, which is sad.  Source: Nextgov.

Huawei – Bad – Not Bad – Bad?!

President Trump said that Huawei is a national security threat and needs to be banned and then he said that maybe we can trade that threat for a better deal with China on trade.

Now it is coming out that Huawei helped North Korea build out their current wireless network.  The equipment was shipped into North Korea by Chinese state owned Panda International.  This has been going on since 2006 at least.  Huawei is likely continuing to provide technical support to North Korea.

This seems like a national security threat and not a bargaining chip for the President to toss in to get a trade deal that he wants, but what do I know.  Source: Fox News.

 

AG Barr Says He Wants Encryption Back Door And Why Do You Need Privacy – Just Suck It Up

Attorney General William Barr said this week that if tech companies don’t provide a back door into consumer encryption, the government will pass a law forcing it.  And while this will allow hackers and Chinese spies to compromise US systems, in his view it is worthwhile.

He said that they might wait for some terrorist event that kills lots of people and blame it on encryption (whether that is true or not).

He did seem to exclude “custom” encryption used by large business enterprises, whoever that might include.

Barr said that bad guys are using crypto to commit crimes that the police can’t investigate.  If that were true we would expect that crime would be going up.  If it is a really bad problem, it would be going way up.

Only problem is that the statistics say crime is going down.

You may remember that Juniper added such a back door, likely at the request of the NSA and it worked great until word got out about it and hackers had a field day.

This conversation is not over.  Source: The Register.

 


Privacy, Security and Cyber Risk Mitigation in the Digital Age
