Category Archives: Privacy

“Smart Cities” Need to be Secure Cities Too

For hundreds of years, government has been the domain of the quill pen and parchment or whatever followed on from that.

But now, cities want to join the digital revolution to make life easier for their citizens and save money.

However, as we have seen, that has not always worked out so well.

Atlanta was recently hit by a ransomware attack – just one example out of hundreds.  It appears the attack was facilitated by the city’s choice not to spend money on IT and IT security.  Now the city plans to spend about $18 million to fix the mess.  Atlanta can afford that; smaller towns cannot.

We are hearing of hundreds of towns and cities getting hit by hackers – encrypting data, shutting down services and causing mayhem.  In Atlanta, for example, the buying and selling of homes and businesses was shut down for weeks because the recorder could not reliably tell lenders how much was owed on a property being sold or record liens on property being purchased.

But what if, instead of not being able to pay your water bill, not having any telephones working in city hall or not being able to do things on the city’s web site – what if instead, the city-owned water delivery system stopped working because the control system was hacked and the water was contaminated?  Or what if all of the traffic lights went green in all directions?  Or red?  What if the police lost access to all of the digital evidence for crimes and all of the people being charged had to be set free?  You get the general idea.

As cities and towns, big and small, go digital, they will need to upgrade their security capabilities or run the risk of being attacked.  Asking a vendor to fill out a form about its security and then checking the box that says it’s secure does not cut it.  Neither does failing to test software for security bugs, both before the city buys it and periodically afterward.  We are already seeing that problem with city web sites that collect credit cards being hacked, costing customers (residents) millions.  Not understanding how to configure systems for security and privacy doesn’t cut it either.

Of course the vendors don’t care, because cities are not requiring vendors to warrant that their systems are secure or to provide service level agreements for downtime.  I promise that if a vendor has to sign a contract saying that if its software is hacked and it costs the city $X million to deal with it, the vendor pays for it, vendors will change their tune.  Or buy a lot of insurance.  In either case, the city’s taxpayers aren’t left to foot the bill, although the other issues are still a problem.  We have already seen information permanently lost.  Depending on what that information is, that could get expensive for the city.

In most states governments have some level of immunity, but that immunity isn’t complete and even if you can’t sue the government, you can vote them out of office – something politicians are not fond of.

As hackers become more experienced at hacking cities, they will likely do more damage, escalating the spiral.

For cities, the answer is simple but not free.  The price of entering the digital age includes the cost of ensuring the security AND PRIVACY of the data that their citizens entrust to them as well as the security and safety of those same citizens.

When people die because a city did not do appropriate security testing, lawsuits will happen, people will get fired and politicians will lose their jobs.   Hopefully it won’t take that to get a city’s attention.

Source: Helpnet Security


The Feds Take Another Run At Getting Rid of Encryption

O P I N I O N

This is not really an opinion piece, but some people might think it is, so I will go for over disclosure and call it that.

The Feds really don’t like encryption.  It gets in their way when they want to do mass surveillance or even targeted surveillance.

For hundreds of years the Feds could listen in to any conversation that they wanted to, whether it was planting someone in the local pub to overhear your conversation, tapping your phone or more recently reading your email.

In concept, when done appropriately, this is a necessary evil.  I would not say it is a good thing, but there are bad people out there and you have to keep them in check.

In the 1990s a guy named Phil Zimmermann wrote a piece of software called PGP.  It was free, and it brought encryption to a lot more people than had it before.  It was far from easy to use, so most people didn’t use it, but the government still didn’t like it.  For several years the government tried to get Zimmermann locked up for writing it (technically, they said that encryption was governed by the International Traffic in Arms Regulations (ITAR), so you could not export it, and since it was available on the Internet, he was exporting it).  The public never bought the argument and finally, in 1996, the government gave up.

Once the government realized that it could not put Phil’s genie back in the bottle, it came up with another idea called the Clipper chip.  The Clipper chip had a built-in back door so the Feds could decrypt anything that was encrypted using it.  People realized that encryption done that way wasn’t really private and never signed on to buying Clipper chips.
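The core idea behind Clipper, key escrow, is easy to sketch in toy Python.  This illustrates the concept only – it is not the actual Skipjack/LEAF design Clipper used, and the XOR construction here is not a real cipher:

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # Derive a keystream from the key in counter mode.
    # Toy construction for illustration, NOT a real cipher.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR stream cipher: the same operation both ways

# Alice generates a session key; under key escrow, a copy is
# deposited with the escrow agent.
session_key = secrets.token_bytes(32)
escrow_copy = session_key  # the government's "spare key"

ciphertext = encrypt(session_key, b"meet at the pub at 9")

# The intended recipient can decrypt...
assert decrypt(session_key, ciphertext) == b"meet at the pub at 9"
# ...but so can anyone holding the escrowed copy -- which is exactly
# why a leaked escrow database breaks everyone's privacy at once.
assert decrypt(escrow_copy, ciphertext) == b"meet at the pub at 9"
```

The asserts are the whole point: the escrowed copy works just as well as the real key, for the cops or for whoever steals the escrow database.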

In the mid 1990s the Feds noticed that phone companies were implementing digital central office phone switches, which meant agents could no longer walk into a phone company office and put a couple of alligator clips on your home phone line to listen to the mob, so Congress passed CALEA in 1994.  CALEA gave the phone companies hundreds of millions of dollars to install digital back doors in their central offices.

Things got sort of quiet after that, with the FBI complaining to anyone who would listen, but for some reason Congress never listened.

Part of the logic might have been that if encryption were really such a problem, crime should have been going crazy, but that wasn’t true.  For the most part, crime was level or maybe even going down a little – of course there were exceptions, but nothing massive to indicate that crooks were really smart and hiding all of their actions.

Over the last ten years or so, the FBI and various Justice Department folks said that we needed to put a back door in encryption to find terrorists.  For whatever reason, people still didn’t believe them and Congress has been unwilling to mandate an encryption backdoor.

All during this time, encryption was becoming more and more ubiquitous, including encrypted phones from both Apple and Google.  The Feds said that the world was “going dark” because of all of this encryption, yet they continue to find and arrest cyber criminals and terrorists.  Maybe not all of them, but a lot of them.

But the Feds are not giving up.  They want Facebook, Google, Apple and others to build in back doors to their messaging applications.

The reason they now want to add encryption back doors?  It’s the children.  Poor.  Defenseless.  Children.  After all, the child molesters and kiddie porn freaks – surely they must be using encryption.  I guess they are.  I mean, what if they catch a kiddie porn pervert and his phone is encrypted?  Surely he will get off scot-free.

Well it turns out that even that isn’t quite true.  The New York City District Attorney signed a deal about two years ago with the Israeli company Cellebrite.  Cellebrite claims to be able to get the data off almost any phone, Android or iPhone.  Probably pretty accurate.  Now it has come out that New York is offering this phone-hacking-as-a-service to other law enforcement agencies as well.  But this is not as easy as vacuuming up all of the data from everyone and looking for anything that seems interesting.

Still, the government does have tools.  Harris Corporation makes a box called a Stingray.  Originally it was designed for the military to use in the Middle East and other hot spots to watch terrorists, but money wins out and Harris will sell it to law enforcement everywhere.  Recently, we have been watching a spy-vs-spy game as it has come out that people have found numerous Stingray or Stingray-like devices all over DC, including around the White House.

That is the problem with stuff.  You can’t keep the genie in the bottle.  If we create an encryption back door and say that only the cops can use it, that will last a few months at best before the secret is no longer secret.

If you think we have all of this cyber crime now, with all of this encryption, you can’t imagine what it might be like if we don’t have secure encryption.  And this is definitely a genie that you will not be able to get back in the bottle.

Just my opinion.



Mactaggart Gets Ready to Launch New Ballot Initiative – CCPA 2

Alastair Mactaggart, who is pretty much single-handedly responsible for the California Consumer Privacy Act, is on the warpath again.

CCPA 2, another ballot initiative, would grant California residents new rights in their health and financial records and also their precise location.  It would require consumers to opt in to companies selling that data and would also allow them to block the use of that data for targeted ads.

It would also establish a California privacy agency since it seems that the current AG isn’t real excited about enforcing the current CCPA law.

It would create stronger penalties for violating this law with data on kids under 16 (California already has a stronger law than the feds do for kids called CalOPPA).

It would also require companies to explain how their algorithms work in certain cases like determining employment prospects.

Given that he was able to collect 600,000 signatures very quickly for CCPA and that he is willing to spend his own money for CCPA 2, I would watch what happens closely.

If he collects enough signatures, this will go on the ballot in 2020, with an effective date sometime after that.

Source: WaPo


Coworking and Shared Work Spaces Are A Security and Privacy Nightmare

Coworking and shared office spaces are the new normal.  WeWork, one of the coworking space brands, is now, apparently, the largest office space tenant in the United States.

In these coworking spaces are startups and small branches (often one or two people) of larger companies, among others.

Most of these folks have a strong need for Internet access and these coworking spaces offer WiFi.  Probably good WiFi, but WiFi.  And WiFi is basically a party line, at least for now.

Look for WiFi 6 with WPA3 over the next couple of years – assuming the place you are getting your WiFi from upgrades all of its hardware and software.  And YOU do too.

A couple of years ago a guy moved into a WeWork office in Manhattan and was concerned about security given his business, so he did a scan.  What did he find but hundreds of unprotected devices and many sensitive documents.
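A scan like the one he ran doesn’t take sophisticated tooling.  Here is a minimal sketch in standard-library Python – the address and port list below are hypothetical examples, and a real survey would use a purpose-built tool like nmap:

```python
import socket

def scan_host(host: str, ports: list[int], timeout: float = 0.5) -> list[int]:
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 when the connection succeeds,
            # an errno otherwise (no exception raised).
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# Example (192.168.1.50 is a hypothetical coworking-network neighbor;
# the ports are commonly exposed services: SSH, SMB, RDP, VNC):
#   scan_host("192.168.1.50", [22, 139, 445, 3389, 5900])
```

On a flat, shared WiFi network, anyone in the building can run exactly this against every neighbor’s machine – which is the point.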

When he asked WeWork if they knew about it, the answer was yes.

Four years later, nothing has changed.

Fundamentally, it is a matter of money.  And convenience.

But, if you are concerned about security, you need to think about whether you are OK with living in a bit of a glass house.

For WeWork in particular, this comes at a bad time because they are trying – off and on – to do an initial public offering, and the bad press from publications like Fast Company on this security and privacy issue doesn’t exactly inspire investor confidence.

Fundamentally, using the Internet at a WeWork office or one of its competitors is about as safe as using the WiFi at a coffee shop that is owned by the mob and is in a bad part of town.  Except that you are running your business there.

In their defense, WeWork does offer some more secure options (although you might be able to do it yourself for less).  A VLAN costs an extra $95 a month plus a setup fee and a private office network costs $195 a month.  That might double the cost of a one person shared space (a dedicated desk costs between $275 and $600 a month, depending on the location).

And clearly they do not promote the fact that you are operating in a bit of a sewer if you do not choose one of the more expensive options.  The upsell here is not part of their business model.

For users of shared office spaces like WeWork (but likely anywhere else too, so this is not a WeWork bug), the question is whether they are dealing with anything private and whether they care that their computer is open to hackers.  If not, proceed as usual.

If so, then you need to consider your options, make some choices and spend some money.  Sorry.  Source: CNet.


Business Roundtable Lobbying Group Wants Weak National Privacy Law

O P I N I O N

50 Very Data Hungry CEOs (Out of About 30 Million) Try to Fool Congress into Letting Them Abuse Your Data

A group of big data CEOs wrote a letter to Congressional leaders requesting a federal privacy law that would usurp the states’ right to protect their consumers as they see fit.

A spokesperson for Facebook responded several months ago to a reporter’s question about a New York bill requiring companies to be a data fiduciary with the response that if the bill passed (it didn’t), Facebook might as well shut down in New York.  The spin doctors tried to walk that back the next day, but the reality is, if that law passed, it would require Facebook and companies like them to change their business models.

In fairness, it is difficult for companies to keep up with all the privacy laws (we help companies do that), but unless your business model requires selling your customers’ data to stay in business, complying is manageable – it just takes work.  Unfortunately, the Facebooks and Googles of the world have made things more complex for everyone else.

The state of data privacy is roughly in the same place that cybersecurity was after California passed its landmark security bill (CA SB 1386) in 2003.  SB 1386 is the model that every other state drew from to enact its security laws.  Now CA AB 375 (the new California Consumer Privacy Act) has begun that same process over again with privacy laws.

Even though they don’t say this, what they really want is for Congress to pass a law because they know that their lobbying billions will allow them to buy a very weak law that will nullify laws like the ones in California, New York, Nevada, Vermont and other states.

The longer Congress doesn’t act, the more states will pass strong privacy laws – because that is what consumers want – and the harder it will be to get votes at the national level to obliterate rights people already have.  Hence the urgency from these CEOs.

The California law would allow people to sue businesses that have breaches, which would dramatically change the economics of lax security practices – right now, at the federal court level, you have to prove that you have been tangibly damaged to sue after a breach.  The defense some companies are using is that with so many breaches out there, how do you know your damage came from ours?  The California law removes the requirement to prove that the consumer suffered tangible damages.  That alone scares the crap out of the Facebooks and Googles – and it should.

They are trying to pass this off as stopping consumers from being confused about their rights (like the right to tell Facebook not to sell your data – that is certainly confusing and hard to understand), but that is complete bull.  The six rights that the California law gives consumers are each spelled out in one sentence and are easy to understand. For example:

  • The right to know what data a company has and to get a copy of it
  • The right to request that my data be deleted subject to a list of exclusions
  • The right to stop a company from selling my data
  • The right to equal price and service even if I tell you not to sell my data

And a couple more rights.  These rights are easy to understand, and the real problem for CEOs like Amazon’s Jeff Bezos is that people will likely actually use these rights, and that might force companies like Amazon to change their business models.

If companies are transparent about their data collection practices, then this is a pretty simple choice.  People can choose to do business with companies that want to sell their data.  Or not.

One thing that makes this conversation different than the conversation around security in 2003 is that places like Europe, Japan and a significant number of others have already given their consumers these rights, so the big data companies already have to deal with this.  No matter what happens in the US, this will happen in the rest of the world.

At that point, as we are already beginning to see, the lack of a strong national privacy law in the US makes it MORE difficult and MORE expensive for US companies to compete in the rest of the world.

In Europe, the first EU/US privacy agreement, Safe Harbor, was struck down by the EU courts as not protecting EU citizens’ rights.  It was replaced by Privacy Shield (which many people say was just Safe Harbor with lipstick), and Privacy Shield is now being attacked in the EU courts.  We do not know the outcome of that court battle, but we will soon.  If the courts strike down or force substantial changes to Privacy Shield, that will make the arguments of these 50 CEOs even weaker.  Many companies have already decided that it is cheaper, simpler and better PR to have one set of consumer-friendly privacy policies worldwide.

Stay tuned;  this will not end any time soon.

Source: CNet.

NOTE:  This is likely a hot button topic for folks.  Please post your comments to this.  I promise to approve any comment that is moderately sane and rated PG or less.


The Challenge of Privacy

Everyone has heard about the Federal Trade Commission fining (tentatively) Facebook $5 billion for sharing your data – with Cambridge Analytica  – without your permission.

The FBI has sought proposals for third parties to hoover up everything that is visible on social media and build a database the FBI can search for information on activities you do that it thinks are sketchy.

The FBI wants to search your stuff by location (neighborhood), keywords and other functions.

Which seems to me to be precisely what Facebook was fined $5 billion for allowing Cambridge Analytica to do.

Except the FBI wants to do this not just with Facebook, but with all social media platforms combined.

Not to worry.  I am sure that it will be secure.  And not abused.  And not used for political purposes.  After all, we are from the government and…

The FBI wants to capture your photos as well.

Of course, doing so would violate the terms of service of every social media platform, so unless they do it secretly or Congress passes a law nullifying those terms of service, it is likely that the platforms will terminate the accounts involved if they detect it.  *IF* they detect it.  Given the relationship between social media and DC, they may well be motivated to stop it.

However, it is already being done by private companies, in spite of the prohibition, to sell to marketers, so who knows.

Facebook and Instagram actually have a ban on using the platform for surveillance purposes.

From a user perspective, there is likely nothing you can do other than stop using social media.  It is POSSIBLE that if you stop making posts public (and instead make them visible only to your friends), that MIGHT stop them from being hoovered up.

If you stop using the platforms, that will make Facebook, Twitter and other platforms sad.

Smart terrorists will shift to covert platforms to make detection harder.

The good news is that there are not very many smart terrorists.

Source: ZDNet
