Category Archives: Security Practices

Presidents’ 2020 Apps Not Secure

I am not sure whether this is a surprise or not.

The apps for both Biden and Trump are not secure. Does that come as a surprise to you?

Let’s start with Biden’s App.

Biden's iOS app did not even validate email addresses, so anyone, say someone in North Korea, could download and abuse the app.
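To make "validate" concrete, here is a minimal sketch of server-side email verification: don't trust an address until its owner clicks a one-time link. This is my own generic illustration, not code from either campaign, and the commented-out send_email helper is hypothetical.

```python
# Minimal sketch of email verification; all names are illustrative, not from the real app.
import secrets

pending = {}  # token -> email; a real system would store this in a database with an expiry


def start_verification(email: str) -> str:
    """Create a one-time token and (hypothetically) mail the owner a confirmation link."""
    token = secrets.token_urlsafe(32)
    pending[token] = email
    # send_email(email, f"https://example.org/verify?token={token}")  # hypothetical helper
    return token


def confirm(token):
    """Accept the address only if it round-trips through the owner's inbox."""
    return pending.pop(token, None)
```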

They take your contact information and merge it with information from TargetSmart's VoterBase, using your data to enrich their profiles of 250 million consumers. While some of the fields are not exposed in the user interface, they are available to anyone who reverse engineers the app. The starting point is public voter roll data, but it becomes valuable when they can add your information (where "your" means the thousands or millions of people who download the app) to their database.

Of course, a bad actor could also download the app and corrupt the database with millions of fake contacts.

When the researchers notified Joe’s team, they fixed the flaws (whatever that means) almost immediately.

Now let’s move on to Trump’s app.

Their first problem was a bit worse: the app exposed hardcoded secret keys for the campaign's Twitter and Google accounts.
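For a mobile app, the real fix is to keep such keys on a server you control, but as a minimal sketch of the general pattern, secrets should be supplied at runtime rather than compiled into code that ships to users. The variable names below are hypothetical, not taken from the actual app.

```python
# Minimal sketch: read API credentials from the environment (or a secrets manager)
# at runtime instead of hardcoding them in source. Variable names are hypothetical.
import os

TWITTER_API_KEY = os.environ.get("TWITTER_API_KEY")
GOOGLE_API_KEY = os.environ.get("GOOGLE_API_KEY")

if not TWITTER_API_KEY or not GOOGLE_API_KEY:
    raise RuntimeError("API keys must be provided at runtime, not baked into the app")
```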

In addition, Don's app learned a lesson from TikTok: it scrapes every piece of user data off the phone that it can find. I believe he called that a national security threat when TikTok did it.

In a very smart move (and perfectly legal), Trump's app turns raising money for the campaign into a game. People get points for raising money and can wind up on a leaderboard if they raise enough money OR if they get their friends to install the app.

In both cases, the exposure comes from taking public data and, as the data scientists call it, "enriching" it with non-public data collected by friends, or polluting it with data planted by foes. It appears that it may be possible for folks to steal some of that enriched data.

The exposed security keys are a different story, of course. That is just a problem.

It just shows that political apps are not any more secure than any other app. That should not be much of a surprise, but it means users should not let their guard down.

No politician wants to spend money on tech, although every politician uses tech. In fact, these days, tech is critical, but so is cost containment.

It also points out that politics these days is all about the data. Both the red team and the blue team are trying their best to collect the most data, while hoping that no one corrupts it, either maliciously or accidentally. Or complains about their practices. Credit: Bleeping Computer

It All Starts With Physical Access

Sometimes we focus on the details of cybersecurity protections. And ignore the core issues.

In a lot of cases, when companies lease space in multi-tenant office buildings, the Internet comes into a shared area of the building that is not part of the company's leased space. This is called the demarc, for point of demarcation. The demarcation point is where the Internet provider's responsibility ends and your company's responsibility starts.

But this is not in your space. It could be in a closet or in the building's basement. You may not even have access to that space. If you do have access, other people may also have access. It may not even be locked. I used to have an office in a building where all of the communications connections came into the basement, and that space didn't even have a door, never mind a lock.

Many times it is more convenient to put your company’s network gear such as switches and firewalls in this area. That way you don’t have to allocate any space in your area.

But why is this a problem?

Because now a hacker doesn't have to attack your network from the outside; he or she can just come in and be on the inside. He or she can pay a janitor a few bucks to be let in at night, for example, or pick a lock. When only the cleaning crew is there, is someone taking 60 seconds to pick the lock on a hall closet going to be noticed?

Come into the building at night when the cleaning crew is there and insert a probe into your network. The cleaning crew is not going to stop anyone. At that point the hacker may be able to see, capture, and transmit all of your network data to any place they want. They can come back at some point in the future and retrieve their gear. Or consider it a throwaway.

So what should you be doing?

Number one is that YOUR demarc should be inside your office space and it should be locked in a cabinet. The cabinet can have a tamper seal on it (since locks are for honest people) to make it more likely that you can detect if someone tries to get into it.

Hackers sometimes masquerade as cleaners or maintenance people, and even if the equipment is in your space, if it is easily accessible, that is still a problem. Other times they just bribe the real ones.

No one wants to think that an employee would go rogue, but it does happen. Ask the NSA. They “vetted” Edward Snowden. It didn’t work out very well for them.

If you lock the equipment up – and I am talking all network gear – you at least make it more difficult for the hackers.

You still have to deal with that common-area demarc, but for a one-time fee the utility will typically extend it into your space. Then they are responsible for that wire. If you have to extend it yourself, you really should put your firewall at the end of the wire inside your space. That way, anything outside your firewall is not trusted and is not a whole lot different from what a hacker sees from the Internet – untrusted and with no sensitive data.

If you have questions about how your network gear is protected, reach out to us. We can do a virtual inspection and make recommendations for improvements, if needed.

Beware: Changes to HTTPS Certificate Requirements

This is a follow-up to yesterday's newsletter alert, and sorry, it is a bit technical, but I will try to make it as non-technical as possible.

Up until a few years ago, if you ran a website, you could buy an HTTPS (also known as a TLS or SSL) certificate that didn't expire for 10 years. The problem is that if something happened, a malicious actor could continue to use that certificate and masquerade as the legitimate website owner, possibly for another nine and a half years.

There was a certificate revocation process to stop compromised certificates from being used any more, but it never really worked.

As a result, a few years ago, the industry group made up of the browser makers and the certificate authorities that issue certificates (called the CA/Browser Forum) reduced the allowed lifetime of a certificate to three years. This was a lot better than 10 years, but a malicious actor could still use a compromised certificate for several years.

As the CA/Browser Forum continued to wrestle with how to deal with compromised certificates, they invented something called OCSP, the Online Certificate Status Protocol. The idea is that the user's browser can look inside the certificate to find the OCSP web site run by the certificate authority that issued it, and ask that site whether the certificate is still good. The problem is that this process doubles the number of requests required to load a web page. For example, as I write this, the home page of Fox News requires 84 separate calls just to load that one page. Some of those are images or videos; others are code. If you have to check whether the certificate behind each of those loads is valid, now you have to make 168 calls, significantly increasing the time it takes to display the results to the user.
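To make the "look inside the certificate" step concrete, here is a minimal sketch using Python's third-party cryptography package that reads a certificate from disk and prints the OCSP responder URL embedded in it. The file name is a placeholder.

```python
# Minimal sketch: find the OCSP responder URL that a certificate advertises.
# Requires the "cryptography" package; "site.pem" is a placeholder file name.
from cryptography import x509
from cryptography.x509.oid import ExtensionOID, AuthorityInformationAccessOID

with open("site.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

# The Authority Information Access extension lists the issuing CA's OCSP responder.
aia = cert.extensions.get_extension_for_oid(
    ExtensionOID.AUTHORITY_INFORMATION_ACCESS
).value

ocsp_urls = [
    desc.access_location.value
    for desc in aia
    if desc.access_method == AuthorityInformationAccessOID.OCSP
]
print("OCSP responder(s):", ocsp_urls)
```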

And what do you do if that web site is down, overloaded, or takes too long to respond? Do you just not display the page?

During this time the CA/Browser Forum reduced the allowed lifetime of a certificate to just two years. A bad actor can still do damage for a year or more, but each time, we reduce the window for malicious activity.

Then they came up with yet another standard called OCSP Stapling. With stapling, the website owner is responsible for checking whether the certificate is still valid. The web server gets a signed OCSP response from the certificate authority, say, every few hours. That response is then "stapled", securely, to the HTTPS certificate that is sent to the user's browser. When there is, say, an hour left in the life of that OCSP response, the web server orders a new one. It has an hour, say, to get it, and that is an eternity in browser time. For a while not all browsers understood stapling, but now they do.

BUT there is nothing to force a web site to support either OCSP or STAPLING, and many support neither.

Somewhere along the way came Let's Encrypt. Let's Encrypt offers a lower-security (but okay for many users) certificate, but it is free and it only lasts 90 days before it expires. Now we have really reduced the bad actor's window of opportunity.

But Let's Encrypt came with a new standard called ACME (which has nothing to do with the Road Runner 🙂 ). With ACME, once you get Let's Encrypt set up on your server, the certificate AUTOMATICALLY renews itself every 90 days. This completely eliminates the certificate management work for administrators, and Let's Encrypt has now issued a BILLION certificates.

Of course the certificate authorities aren’t thrilled with someone giving away their product for free, even if it is a slightly lower security product.

There was an effort in February to reduce the lifetime of certificates to one year, but it failed to get approved at the CA/Browser Forum meeting. Administrators and certificate authorities complained about the workload, but if everyone implemented ACME or something like it, that problem would go away.

OK, so now you are up to date. Fast forward to 2020.

Like Google, Microsoft and others, Apple has a lot of clout. After the move to reduce the certificate life to one year failed earlier this year, Apple said you guys can do whatever you want, but we are not going to display any web page that has a certificate (and this is important) THAT WAS ISSUED AFTER SEPTEMBER 1, 2020 AND HAS A LIFETIME OF MORE THAN A YEAR PLUS A MONTH GRACE PERIOD.

This means that if you have a new certificate that has a two year life and someone visits your website from an iPhone, iPad or Mac after September 1, they will get an error message.
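If you want to check whether a certificate you already have falls on the wrong side of that line, here is a minimal sketch using the same cryptography package. The 398-day limit is my reading of "a year plus a month grace period", and the file name is a placeholder.

```python
# Minimal sketch: flag certificates that exceed roughly "a year plus a month".
# Requires the "cryptography" package; "site.pem" is a placeholder file name.
from datetime import timedelta
from cryptography import x509

MAX_LIFETIME = timedelta(days=398)  # about one year plus a one-month grace period

with open("site.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

lifetime = cert.not_valid_after - cert.not_valid_before
print(f"Issued {cert.not_valid_before:%Y-%m-%d}, lifetime {lifetime.days} days")

if lifetime > MAX_LIFETIME:
    print("Too long: Safari (and Chrome) will reject this if issued after Sept 1, 2020")
```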

So basically, Apple forced the issue.

Once this was a done deal, Google dogpiled and said Chrome will enforce the same limit.

This means that if you get a new certificate with a two-year life after September 1, about 80% of the world's users will no longer be able to get to your website.

THIS is why the change is kind of important.

Got questions? Contact us. Credit: ZDNet

Is Your Computer Spying on You?

It is pretty interesting what you find when you rummage around your computer.

Most computers these days have cameras and microphones. Do you know which applications can access your camera? What about your microphone? I didn’t. In fact, I didn’t even know where to look to find the answer to that question. When I looked, I was surprised what I found.

Both of these device controls can be found in the Windows SETTINGS app.

In Settings, go to Privacy and click on CAMERA.

That screen shows which apps on my computer have access to my camera. I understand why Skype needs access (maybe – it depends on whether you are a Skype user), but why does 3D Viewer need it? I am not even sure what that is. Microsoft Photos? I ONLY use it to look at pictures. Disable any app that you do not want to have access to your camera. You can always turn access back on if you want to.

Now move on to your microphone. It is in the same Settings area, just further down.

Again, there are apps I don't even recognize that have access to my microphone. What is the Feedback Hub anyway?

Note that Microsoft’s Cortana is disabled. That is because I don’t use it. If you do use it, it needs to be on.
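If you prefer to check these permissions from a script instead of clicking through Settings, the same camera and microphone grants appear to be recorded in the per-user registry under the CapabilityAccessManager consent store. Here is a hedged sketch; the registry path reflects my understanding of current Windows 10/11 builds and may differ on other versions.

```python
# Hedged sketch (Windows only): list which apps have been granted camera or
# microphone access by reading the CapabilityAccessManager consent store.
# The registry path is an assumption based on current Windows 10/11 builds.
import winreg

BASE = r"Software\Microsoft\Windows\CurrentVersion\CapabilityAccessManager\ConsentStore"


def list_consent(device: str) -> None:
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, f"{BASE}\\{device}") as key:
        index = 0
        while True:
            try:
                app = winreg.EnumKey(key, index)
            except OSError:
                break  # no more subkeys
            try:
                with winreg.OpenKey(key, app) as sub:
                    value, _ = winreg.QueryValueEx(sub, "Value")
                print(f"{device}: {app} -> {value}")  # typically "Allow" or "Deny"
            except OSError:
                pass  # e.g. container keys with no Value of their own
            index += 1


list_consent("webcam")
list_consent("microphone")
```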

It is unlikely that these apps are evil, but they do increase the attack surface.

Every app has the possibility of being compromised or having bugs that allow hackers to take over the app and take control of your devices.

You have probably seen people who put tape or little slides over their cameras. That pretty effectively stops people from seeing things that they should not see.

There is no equivalent way to stop apps from hearing what is going on. Tape does not solve this problem.

In some cases there is a way to handle this.

After using a laptop for many years, last year I switched to a desktop. I wanted to have a more powerful computer – multiple disk drives, an amazing amount of memory, etc.

One thing that happened as a result of that was that I no longer had a built in camera. My camera sits on top of my monitor and plugs into a USB port.

For me – and this won’t work for everyone – I unplug my camera when I am not on a video conference. That camera, an inexpensive Logitech unit, is also my computer’s microphone. When I unplug the camera, the microphone is unplugged as well.

Highly effective. I don’t know how to hack a camera or microphone that are not connected and not powered on. Consider that.

Just food for thought.

Feds Fine Capital One for Shoddy Cloud Security

Dial back your wayback machine to September of last year. Capital One announced a hack of their Amazon environment by an ex-Amazon employee the previous July, which was possible due to an incorrect configuration of their security settings.

Fast forward to today and the feds announced an $80 million fine for bad cloud hygiene.

The feds (the OCC) fined Capital One for "failure to establish effective risk management processes" prior to migrating some of their systems to the cloud.

The OCC said that they considered the bank’s notification and remediation processes favorably in assessing the fine, meaning that the fine would likely have been larger if they hadn’t responded as well after the breach as they did.

On the other hand, they said that the bank's internal audit glossed over numerous weaknesses.

On top of that, the OCC said that the bank didn't appropriately report the flaws it found to the Board's audit committee. This means that internal processes were not sufficient to allow the Board to perform its fiduciary responsibility. Rather than blaming the Board, in this case they blamed management.

They also claim that Capital One failed to patch security vulnerabilities, violating regulations that banks must follow (GLBA).

After Capital One got caught, the bank decided this was a good time to spend some money on cybersecurity and start fixing the problems.

There is a moral here, I think.

This is a bank, so the expectations for security are high, but still …..

You could wait for a breach and the ensuing regulatory scrutiny and lawsuits. And fines. Or you can start looking at cyber risk management as a business problem and decide that it is probably cheaper to spend the money pre-breach. Last year Capital One said the breach could cost them $150 million. Whether this $80 million fine is in addition is not clear. Credit: The Register

Here is a Match – Lawyers+Security Pros

There are an amazing number of misconfigured Amazon S3 buckets. I have no clue why. No company should be in this boat any more.

Truffle Security said that a team of their security pros STUMBLED across about 4,000 of them.

What was in them?

Login credentials – not great.

Security keys – even worse.

API keys – worse yet.

Also SQL Server passwords and Coinbase API keys. Even login info for other AWS S3 buckets.

But what I like is capitalism.

Some enterprising researchers are teaming up with law firms. Why?

The researchers find the leaky buckets.

The law firms sue the owners (and pay the researchers a commission).

Sounds like a win-win-win deal. Win 1 – the lawyers get a payday. Win 2 – the researchers get a commission. Given there are so many leaky buckets, everyone gets rich.

Who gets the third win? Win 3 – the companies get to close their leaky buckets.

Mind you it might have been cheaper if they just used the tools that Amazon has made available, but whatever gets the job done.
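For reference, one of those Amazon-provided tools is S3 Block Public Access, which can be turned on in a few lines with boto3 (AWS's Python SDK). This is a minimal sketch; the bucket name is a placeholder and credentials are assumed to come from your normal AWS configuration.

```python
# Minimal sketch: enable S3 "Block Public Access" on a bucket with boto3.
# "your-bucket-name" is a placeholder; AWS credentials come from your environment.
import boto3

s3 = boto3.client("s3")

s3.put_public_access_block(
    Bucket="your-bucket-name",
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)

# Confirm the settings took effect.
resp = s3.get_public_access_block(Bucket="your-bucket-name")
print(resp["PublicAccessBlockConfiguration"])
```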

I am only being slightly a smartass. If this isn’t a great reason to hunt for leaky S3 buckets, I can’t think of a better one. Find those leaks. And close them. Avoid those lawsuits. P-L-E-A-S-E!!!!!! Credit: The Register