Category Archives: Google

The End of the Road for HTTP://

Google has decided to lead the way on the web, as it often has.  In this case, Google has announced that as of January 1, 2017, web pages that collect credit card numbers or ask for passwords over HTTP (vs. HTTPS) will be marked with this flag in the address bar:

[Image: Chrome's "Not secure" label in the address bar]

Some of you will say that this is as it should be, and I will be the first to agree with you. Any web site that asks for your userid and password over an insecure connection needs to be flogged appropriately.  Likewise, if a web site asks for credit card information in clear text, it is, at the very minimum, in violation of the merchant agreement that the company signed with its bank.  It too needs to mend its ways.

My guess is that way too many sites will get scooped up in this NOT SECURE net come January 1.  It will likely be like the changeover to chip-based credit cards.  When that deadline arrived last September, people said “crap” – or something to that effect – “they aren’t kidding; they really are going to leave this deadline in place,” and companies started doing what they should have been doing a year prior.  They then discovered that fixing the problem was harder than they thought.  As a result, almost a year past that deadline, there are still hundreds of thousands of businesses that have not converted.  I do predict that almost every single major site will have this handled well in advance.  No doubt Google is already talking to major web properties privately.

In this case, people may think that Google will blink.  While no one knows for sure, I would not bet on that outcome.

But this is not where it ends.  It ends with, in Google’s view, the death of HTTP.

The next step is to label all pages that are loaded without encryption when the user is in incognito mode as NOT SECURE.

Finally, the last step is to label all pages loaded over HTTP as NOT SECURE.  Google has not provided a date for this, but it may well be during 2017.

Of course, this only affects users who use a Google browser on their computer or phone, but according to W3Schools, that is over 72% of users right now – and growing.  Last August, that percentage was only 64% (see stats here).

Since most businesses do not want their customers to see that message when going to their web site, they will finally, reluctantly, migrate all traffic to HTTPS.

And to be clear, this does not mean optionally HTTPS;  this means mandatory HTTPS.

The biggest challenge will be for companies that have hundreds or thousands of web sites.  They will need to touch each one of them.  They may need to order an SSL certificate for each one.  It will require some work.
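For companies facing that migration, a reasonable first step is an inventory: which sites still answer over plain HTTP without redirecting to HTTPS?  Below is a minimal sketch of such an audit in Python, using only the standard library; the host names are placeholders for your own list, and passing this check is a starting point, not a full audit (you still need a valid certificate and HTTPS-only content).

```python
import http.client

# Hypothetical site list -- replace with your own inventory.
SITES = ["example.com", "shop.example.com"]

def redirects_to_https(host: str) -> bool:
    """Request the home page over plain HTTP and check for a redirect to HTTPS."""
    conn = http.client.HTTPConnection(host, timeout=10)
    try:
        conn.request("GET", "/")
        resp = conn.getresponse()
        location = resp.getheader("Location", "") or ""
        return resp.status in (301, 302, 307, 308) and location.startswith("https://")
    finally:
        conn.close()

for site in SITES:
    verdict = "redirects to HTTPS" if redirects_to_https(site) else "STILL PLAIN HTTP"
    print(f"{site}: {verdict}")
```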

My recommendation is to start now and avoid the New Year’s Eve rush.


Information for this post came from Google’s security blog.




Google Fixes Over 100 Bugs In Android


It appears that Google is getting serious about Android security.  For the past several months, it has been releasing patch updates every month – like other software companies.  While I have no visibility into AT&T and Verizon, Sprint has been religious about pushing those updates out to my phone.

This month they released patches covering over 100 bugs in both the Android core OS and in chipset drivers from various component chip manufacturers.

Phone vendors have a choice between two different update packages to distribute to their customers.

Android’s Mediaserver component is the recipient of 16 patches, including 7 rated as critical.  These bugs, like Stagefright before them, allow hackers to attack your phone just by sending it specially crafted text (MMS) messages or audio and video files.  This works because the Android OS, in an effort to speed things up when a user wants to open a picture, audio or video file, pre-processes those files in the background without asking or telling you.  If those files are infected, so is your phone.  It has been so bad that Google Hangouts, for example, no longer passes media files to this component automatically.

Another critical vulnerability is in the built-in crypto libraries, OpenSSL and BoringSSL.

The first of the two patch options, labelled 2016-07-01 when you go to SETTINGS|ABOUT in Marshmallow, fixes 32 bugs, 8 of which are critical, 15 high and 9 moderate.  These bugs apply to the core Android OS.  32 bugs starts to rival Microsoft patches, but doesn’t reach the level of Adobe Flash patches.

The other patch option, labelled 2016-07-05 in ABOUT, fixes 75 additional bugs that are device specific, meaning some may affect this device while others may affect a different device.

These fixes are in modules such as the Qualcomm GPU driver, the MediaTek WiFi driver, the Qualcomm performance component, the NVIDIA video driver, the kernel file system (not sure why this is device specific though), the USB driver and other unspecified drivers.

Since these are running in a privileged process, a compromise of these modules is a serious problem.  In fact, some of these compromises may only be repairable by reflashing the device firmware, something most users cannot do even if they wanted to.

There are an additional 54 high severity bugs in various drivers that can also lead to a complete device compromise. The difference here is that an attacker would have had to already compromise the phone in order to exploit these 54 bugs.

Google has already released these patches to Google branded Nexus phones – possibly the most important reason to buy a Nexus phone.  How long it will take the various phone manufacturers to get off their collective butts and release them is unknown.
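If you want to know whether your phone has received these fixes without tapping through menus, the patch-level date shown under SETTINGS|ABOUT is also exposed as a system property.  A small sketch, assuming the Android SDK's adb tool is on your PATH and USB debugging is enabled on the phone:

```python
import subprocess

# Read the security patch level shown under Settings | About.
# The property below is standard on Marshmallow and later.
result = subprocess.run(
    ["adb", "shell", "getprop", "ro.build.version.security_patch"],
    capture_output=True, text=True, check=True,
)
level = result.stdout.strip()
print("Security patch level:", level)

# "2016-07-01" covers the core OS fixes; "2016-07-05" adds the
# device-specific driver fixes. ISO dates compare correctly as strings.
print("Includes device-specific fixes:", level >= "2016-07-05")
```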

In the meantime, hackers around the world have access to these patches and are busy reverse engineering them to figure out how to attack your phone – it is a race between patching and exploitation.

While this is the biggest Android patch release I have ever seen Google put out in a single month, I think, maybe, it is a good thing.  I am hoping that it means that Google is getting serious about upgrading the security of Android and not just trying to cram as many features as possible into the next release.

What this does mean is that users who are running Lollipop (Android 5), Jelly Bean (Android 4.1), Ice Cream Sandwich (Android 4.0) and earlier are at significant risk of compromise because these versions of the Android OS will never be patched.

As of June 1st, 2016, only 10 percent of Android phones were running Marshmallow.  Apple is quite a bit better at FORCING adoption of new versions of the OS because it owns both the OS and the phone, but this may change, as Congress is looking at passing a law forcing phone vendors to patch the phones they sell.  If you make money from it, you have to patch it.  Since Google isn’t releasing patches for older versions, this would force the phone makers, if the law is enacted, to upgrade the phones to the current version.  From a user standpoint, this would be a good thing.

If you are a consumer concerned about the security of your data, or a business concerned about the security of company systems accessed by employee phones, you need to consider replacing phones on a regular basis.  Even combined, Android 5 and 6 still represent less than half of Android phones.  Many of the phones running Android 4 and earlier are likely outside the U.S., but companies, especially, need to be proactive about dealing with this.

Information for this post came from Infoworld.


Yet Another Major Open Source Program Flaw Discovered – After 8 Years

Some people are big advocates of open source because, they say, since people can look at the source, bugs are found more quickly.

I am a supporter of open source, but I am not a big supporter of that theory: just because people CAN look at the source doesn’t mean that they will, and just because they DO look at it doesn’t mean they will find the bugs.

On a side note, OpenSSL, the super popular open source SSL software package used in many apps and on many web sites will be releasing patches on March 1st for multiple vulnerabilities.

Google announced this week another major open source software package vulnerability.  The package, Glibc, provides basic functionality for C language software developers.  While not used by every C developer, it is an extremely popular library – likely used in tens of thousands of applications.

Going back to the open source conversation, this bug was introduced in 2008 – 8 years ago.  And it was only discovered by accident, when a Google developer’s system kept crashing.  After some work, the Google team discovered that the crashes were caused by a bug in Glibc.

And the bug is pretty serious.  It allows a hacker to intercept a DNS request and respond with a specially crafted response which allows the attacker to take over the computer by inserting an arbitrary program up to 64,000 bytes and then running it.

The problem with these two bugs – and the fact that the packages are open source doesn’t really change this – is that every developer who uses them needs to release an update, and every single user needs to install that update.

In fact, these two open source packages are ATYPICAL because they both have teams that support them.  Many open source software packages don’t have formal support teams.

For major developers, such as many Linux distributions, there are likely patches already in the works and users will likely install them.

The problem comes with smaller software packages and dedicated hardware devices that use it – companies that may no longer be supporting that version of the software or hardware or even companies that have gone out of business.

Since Glibc is a large library, many Internet of Things developers don’t use it.   For us, that is a good thing.

But as end users, we likely have no clue which software packages on our devices use the affected library.  Since the bug has been around for 8 years, any software product that uses the library likely uses a flawed version.
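On a Linux machine, one rough first step is to ask the running C library what version it is.  A minimal sketch follows; the bug described here is tracked as CVE-2015-7547 and was fixed upstream in glibc 2.23, but many distributions backport fixes without changing the version number, so treat the result as a prompt to check your distribution’s advisories, not a verdict.

```python
import ctypes
import platform

# Linux only: ask the running glibc for its version string.
if platform.system() == "Linux":
    libc = ctypes.CDLL("libc.so.6")
    libc.gnu_get_libc_version.restype = ctypes.c_char_p
    version = libc.gnu_get_libc_version().decode()
    print("glibc version:", version)

    major, minor = map(int, version.split(".")[:2])
    if (major, minor) < (2, 23):
        print("Possibly running a flawed version -- check vendor advisories.")
else:
    print("Not a glibc-based system; this check does not apply.")
```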


The OpenSSL announcement – minus details, as is their standard policy – can be found here.

Information on the Glibc bug can be found in Ars Technica, here.


Open Source Software Does Not Solve All Of The World’s Problems

While I am not a Linux user personally, I am a big fan of it.  However, I am not delusional enough to think that just because a piece of software is open source, it is secure and bug free.

Anyone who thought that should have had those delusions ripped away when the Heartbleed bug was publicized.  For those readers not familiar with it, Heartbleed is the name given to the bug in the wildly popular open source software that implements SSL (the protocol behind HTTPS), which is used to secure many web sites.

It was thought that the bug affected around a half million to one million ecommerce web sites, many of which still have not been fixed 18 months later.

As popular as this software is, many, many people looked at it and even made contributions to it.  Still, this bug lived in the software from December 31, 2011 until a fix was released on April 7, 2014 (and of course, released does not mean that people have integrated it into software that used the flawed version).

To me, this proves that open source software, no matter the goals and desires of developers, may have security holes in it.

Fast forward to this week.

All versions of Linux released since kernel version 3.8 (released in early 2013 – about 3 years ago) have a bug in the OS keyring, where encryption keys, security tokens and other sensitive security data are stored.

Whether hackers and foreign intelligence agents knew about this over the last few years or not is unknown, but we expect many Linux variants will release a patch this week.
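For a quick triage of whether a given machine is even in the affected range, the kernel version is enough for a first pass.  A minimal sketch; as with any version check, distributions backport fixes without changing the major.minor number, so pair this with your vendor’s advisory.

```python
import platform

# Rough triage only: the keyring bug affects kernels 3.8 and later
# that have not yet received the fix.
release = platform.release()              # e.g. "4.2.0-25-generic"
major, minor = map(int, release.split(".")[:2])

if (major, minor) >= (3, 8):
    print(f"Kernel {release} is in the affected range -- make sure the keyring patch is installed.")
else:
    print(f"Kernel {release} predates the buggy keyring code.")
```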

More importantly, at least some versions of Android, which is based on Linux, also have this bug.  The researchers who found the bug said it affected tens of millions of Linux PCs and servers and 66% of all Android phones and tablets.

Google says that it does not think that Android devices are vulnerable to this bug being exploited by third parties, and that the total number of devices impacted is significantly smaller than the researchers thought.  In this case, I trust Google’s researchers.  Google will have a patch available within 60 days, but getting that patch through the phone carrier release process could take a while.  I call this patch process TOTALLY BROKEN.  The only phones that we know will be patched quickly are Google Nexus phones, because Google releases those patches directly.

So, one more time, a major and highly visible piece of open source software is found to have a significant security hole for years.  This post talks about two examples, but there are many, many others.

If open source software as popular as Linux and OpenSSL has security holes, imagine the holes that MIGHT live in other, less popular open source software.  Some open source software might only be used by tens of people and only be looked at by one person.

The moral of this story is NOT that you should not use open source software;  it is no more or less risky than closed source software.  The moral is that you should ALWAYS consider the potential risks in using software and, to the maximum degree possible, test for and mitigate potential security bugs.  And be ready to deal with the new ones when they are found.

Information on the OS Keyring bug can be found here.

Information on Heartbleed can be found here.


How Would Congress’ Effort To Install Crypto Backdoors Actually Work?

While how crypto backdoors would work is unknown – there are no actual proposals on the table at this time – I am concerned that it will turn into a disaster.  Partly this is because Congress does not understand technology.  Out of 500-plus Congress critters, there are 5 that have a computer science degree.  While that is not surprising, it means that mostly lawyers will be writing laws about something they know almost nothing about.

Option 1 – Force Apple and Google to install secret backdoors into their phones.  One option would be a skeleton key – that is, one single key that unlocks all phones past, present and future.  That option would be a disaster, since if that key got into the wild, every phone ever made would be compromised.  Hopefully, that is not the option chosen.  Another option would be to have a key per phone.  When you make the phone, you create a key for it, put the key in a mayonnaise jar on Funk & Wagnalls’ back porch (to quote Johnny Carson) and open that mayonnaise jar if asked.  If this were done, we would need to securely store around two billion keys between Apple and Android phones – growing by hundreds of millions a year.  We could ask the government to store them.  I am sure that would be secure.  Maybe the OPM could do it for us?  Alternatively, the manufacturers might keep them.  The third option might be to have the key algorithmically derived, so that you would not have to store the keys at all.  That would mean keeping the algorithm secret – otherwise anyone could decrypt any phone – and that is not likely possible.
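To make that third option concrete, here is a minimal sketch of what “algorithmically derived” usually means in practice: a per-device key computed from a master secret and a device identifier.  Everything in it (the secret, the serial number format, the key size) is hypothetical; the point is that the entire scheme is only as strong as the one master secret.

```python
import hashlib
import hmac

# Hypothetical master secret. Anyone who obtains it (or the algorithm,
# if the secret is embedded in the algorithm) can derive the unlock key
# for EVERY phone -- which is exactly the problem described above.
MASTER_SECRET = bytes(32)  # placeholder; a real one would be random

def derive_device_key(serial_number: str) -> bytes:
    """Derive a per-device key from the master secret and the device serial."""
    return hmac.new(MASTER_SECRET, serial_number.encode(), hashlib.sha256).digest()

# Each device gets a different key, so nothing needs to be stored per phone...
print(derive_device_key("PHONE-0001").hex())
print(derive_device_key("PHONE-0002").hex())
# ...but one leaked master secret compromises all two billion of them.
```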

I don’t think that anyone has actually come up with a way to do this that would work.  I am open to possibilities, but haven’t heard one.  Neither have many, many cryptographers who are a lot smarter than I am.

How do we deal with the close to two billion phones that are already out there?  In this situation, Apple is a little easier to deal with than Android.  Since Apple users tend to keep their software more current than Android users, you could, possibly, push an update to the close to a billion iPhones, installing the backdoor.  Not to mention the couple hundred million iPads.  NOT!

In the Android world the problem is harder.  There are still hundreds of millions of Android phones running version 2 of the operating system, even though version 6 is the current version.  Do you really expect each phone manufacturer to dust off their software archives and update that antique software?  Not likely.

Then there is the question of who is going to pay for the creation – and more importantly – the ongoing maintenance of this huge intelligence network.  I assume Congress doesn’t want to pay for it, but I certainly don’t want to either.  The cost would likely be in the billions of dollars if not more.

And what about phones that are not made in the US?  Do we really have any leverage to force Chinese manufacturers that sell knock off Android and iPhone clones to do anything that the US wants?  I didn’t think so.  So maybe the objective is to reduce the sales revenue of US phone manufacturers?

But now the real problem.  Encryption is implemented in software in millions of applications.  These applications are written by tens of thousands of developers all over the world.  Many of them are open source, meaning the developers have no money to make changes, and there is no company to force to do anything – assuming you can even find these people.

If you don’t remove the encryption from that software, a backdoor in the iPhone or Android operating system is basically useless.

Maybe Option 2 is to ban all software that does not have an encryption backdoor.  How exactly do you do that?  There are likely thousands of new applications released every week – some in the US, but many more outside the US.  Maybe we should block all non-US IP addresses so that we can make sure that terrorists don’t download software from non-US companies or developers.  Maybe we should rename the Internet to the USNet.  Maybe we should pay someone to check every new application that is available on the Internet to see if it has a backdoor.  That would be good for the economy – the government would have to hire tens of thousands of computer experts.  Nah, that’s not going to happen.

Another issue is cost.  When Congress did this the last time, in the 1990s, it was called CALEA.  It was Congress’ attempt to install a backdoor into all phone switches sold in the United States to commercial phone companies (the Ma Bells in particular).  There were a handful of phone companies and another handful of phone switch manufacturers, and Congress agreed to pay for the insertion of the backdoors.  They allocated a billion dollars in 1990s money and ran out.  They had to get another billion to finish the job.  And, I think, it took around 10 years to complete.

Fast forward to 2015.  Instead of 10 phone switch manufacturers you have, say, 100,000 software developers.  Instead of a product that is sold through a sales force, installed in known locations (the phone company central office) and maintained by a paid technical staff, you have products that are given away (open source), by people that do not have any paid staff, that are not physically delivered at all and come from all over the globe.  ASSUMING you could do this, how much would this cost?  Of course, you can’t do it.

And what about software made in other countries that don’t have laws like whatever this Frankenlaw might be?  A few countries – like England for example – might be persuaded to pass a similar law, but other countries – like Germany – are actually moving in the other direction saying that strong encryption is a good thing.

What about software made in Russia?  Ukraine?  China?  Or any of the many other countries that are not friendly to the US?  They are not likely to comply.

And ISIS has already released its own software.  It is encrypted, of course.  Maybe we can ask Daesh (the name they do not like to be called) to insert a backdoor for us and give us the keys.  Let me think about that.  Nope.  Not gonna happen.

So, in the end, Congress will be able to thump their collective chests and say how wonderful they are and it will do nothing to help fight terrorism other than to make Bin Laden right even years after his death.  Remember that he said that he wanted to bleed us to death?  Well, he certainly is succeeding.  Even in death he is succeeding.

Stay tuned because no one knows how this play will end – tragedy or comedy?  Not clear.


Information for this post came from Network World.


What Happens When Online Services Go Down?

This afternoon, Google Apps went down for a few hours.  Judging by the activity on the Twitterverse, you would have thought the world had ended.  You can check the outage yourself by going to Google’s AppsStatus page on the web (google.com/appsstatus).

[Image: Google's tweet acknowledging the outage]

It appears that Google Docs, Sheets, Drive and other parts of the Google Apps universe were down for 2-4 hours this afternoon, depending on which app and which user.

While that is not the end of the world, it certainly is inconvenient and if you needed to either work on or deliver a file which is stored in the cloud, it was probably a problem for you.

Most users probably just left early on a Friday, especially on the East Coast, where sanity didn’t return until 5 PM.

There is a moral here.  Having a business continuity plan is always a good thing.

While storing things in the cloud is convenient – I do it myself – it does mean that if the vendor has an outage – and every one of them will at some point in time – you may well not be able to get to that file or service until it is repaired.

This is true for Amazon Web Services, Google Apps, Microsoft Azure, Salesforce and everyone else – nothing is 100% available.
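Since outages are a when, not an if, even a little automation beats hitting refresh on a status page.  Here is a minimal sketch of a dependency check; the endpoint URLs are placeholders for whatever services your business actually relies on.

```python
import urllib.error
import urllib.request

# Hypothetical endpoint list -- substitute the cloud services you depend on.
ENDPOINTS = [
    "https://www.google.com/appsstatus",
    "https://status.example.com",
]

def check(url: str, timeout: float = 10.0) -> str:
    """Return a rough up/down verdict for a single endpoint."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return f"UP (HTTP {resp.status})"
    except (urllib.error.URLError, OSError) as err:
        return f"DOWN ({err})"

for url in ENDPOINTS:
    print(url, "->", check(url))
```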

Also remember that the cloud is likely more reliable than your own internal servers.  If your laptop, tablet or server crashes, assuming a reboot doesn’t fix it, how long will you have to go without?  For most vendors, if you pay a lot, you may get a technician on site in, say, 4 hours.  That does NOT mean that the part you need will be there with him – that might not arrive until tomorrow or the next day.

So this doesn’t mean that the cloud is bad.  Or good.  It means that technology is imperfect and you need to consider the consequences of an outage, assume that it is going to happen and have a “Plan B”.

For some people, Plan B might mean call it a day.  However, if the outage affects the way that your customers connect with you or how your team supports your customers, that particular Plan B might not be the best answer.

THAT is why you need a business continuity PLAN.  For some applications, waiting is probably a perfectly acceptable plan – for a certain amount of time.  An hour.  A day. A week.  Likely not a month.  For other applications, that might be a terrible plan.

And planning is usually way better than running around the house or office doing your best Chicken Little imitation.  No, the sky is not falling.  But it might be very cloudy.  Or not cloudy enough.
