
25% of Web Apps Are Vulnerable to 8 of the OWASP Top 10

Let that title sink in for a minute. A quarter of all web apps fail security miserably. That does not mean that the other 75% are secure; it means that the other 75% are merely less insecure. For the 25%, it means that things are pretty hopeless.

For a quick cheat sheet on the OWASP top 10, click here.

The study continues to dissect the state of insecurity:

  • 69% of web applications have vulnerabilities that could lead to exposure of sensitive data
  • 55% have cross-site request forgery flaws
  • 41% have broken authentication and session management issues
  • 37% have security misconfiguration issues
  • 33% have missing or ineffective function-level access control
  • 80% of the applications tested contained at least one vulnerability
  • The average application has 45 vulnerabilities

So just a question – does it concern you that 80 percent of the web applications tested had at least one vulnerability and 25 percent had 8 out of the top 10?

The only way to know is to test for it. The best way to know is to have an independent third party test for application vulnerabilities. Think of this as a network penetration test, but for your applications.

While you can test the applications that your team writes, you can't test applications on the public Internet – the owner might frown upon it. As a business, if you have to use a particular web application as part of your business AND you have a business relationship with the application's owner (such as a supplier or a business partner), you can make an independent third party penetration test of that application a requirement for doing business. This is easier for larger companies, but if you don't ask, you won't get it.

This also means that you should be careful about which applications you use and which applications you enter sensitive data into. Since there is no equivalent of a "Good Housekeeping Seal" for security (although Underwriters Laboratories is working on one), there is no easy way to know which applications are secure and which ones are not.

Unfortunately, at the moment, there is no good solution to this problem.  In almost all cases, developers have no liability at all – the user shoulders all of the responsibility.  The best that I can say is be cautious.

Information for this post came from Help Net Security.


The SEC is Coming, The SEC is Coming!

For Financial Service firms, the message is clear.  Both FINRA and the SEC are looking over your shoulder to make sure that you are taking cyber security seriously.

And the fines are not small.  From hundreds of thousands to millions of dollars, firms big and small are getting whacked with fines.

In 2014, the SEC's Office of Compliance Inspections and Examinations released a risk alert describing its new initiative to assess cybersecurity preparedness. Among the requirements outlined in the program are:

  • Inventory of physical devices and systems
  • Inventory of platforms and applications
  • Map of network resources, connections and data flows
  • The map includes locations where customer data is housed
  • External connections are cataloged
  • Resources are prioritized for protection based on their sensitivity and business value
  • Logging capabilities and practices are assessed
  • A written information security policy is available
  • Periodic risk assessments are conducted and findings mitigated
  • Periodic physical security risk assessments are conducted
  • Cyber security roles in the company are explicitly assigned and communicated
  • A written cyber business continuity plan has been implemented
  • The firm has a CISO or equivalent

This is only part of the list.  The list goes on for 8 pages.

Check out the end of this post for a list of references to FINRA and SEC documents describing these programs.

John Reed Stark of John Reed Stark Consulting has come up with some recommendations. While the paper is 12 pages long, here is the gist of the recommendations. A link to the paper appears below.

  1. Review overall cyber security policies for adequacy
  2. Eliminate red flags (DUH!)
  3. Create the team (Now, not after a breach)
  4. Protect against identity theft
  5. Get private (protect private data)
  6. Choose the right monitoring technology
  7. Watch out for insiders (Chase learned the hard way)
  8. Consider cyber insurance (Don’t consider it, buy it)
  9. At the first sign of trouble, investigate

There is a ton of information in the articles listed below.

If your head is swimming after reading the articles, contact outside experts (yes, that is self-serving; we do that for financial service companies, but it is very hard to do it yourself). I liken fixing cyber security in a running business to paving a road while you are driving on it. Not easy.

Each year the SEC and FINRA visit more businesses and each year their examiners get more knowledgeable about cyber, so don’t think you are going to fool them.

If you start early and have an active program, you are much more likely to get a friendly reception when the examiners come to visit.

It will take quite a while to put together an entire program, so we really do recommend starting early.  It is much easier to put together a program over a year or two rather than trying to get it done in a couple of months after you get that examination report.  If you wait, not only do you have to pay someone like us, but you also have to pay the fines.

LINKS to useful articles:

Cybersecurity and Financial Firms: Bracing for the Regulatory Onslaught by John Reed Stark

SEC National Exam Program risk alert.

SEC examination sweep results summary.

FINRA Report on cyber security practices.

FINRA cyber security report with small business checklist.


Feds to Increase Audits Of Doctors’ Protection Of Your Information

The Inspector General (OIG) at Health and Human Services reported that the HHS Office for Civil Rights (OCR) is not effectively auditing HIPAA covered entities. A covered entity includes the doctors and hospitals that have primary ownership of your health records. As a result, OCR is establishing a permanent audit program and working to identify potential audit targets.

One place OCR is apparently going to be looking is at business associates, or BAs. In HIPAA speak, BAs are the vendors that a doctor or hospital uses that have access to your information. Under the rules, your doctor not only needs a written agreement with each such vendor, but also has to use reasonable diligence to make sure that the security of your information is protected.

Also, the rules are changing regarding what is a breach.  It used to be that you only had to report a breach if there was significant risk of financial or reputational harm – as evaluated by the doctor or hospital.  Needless to say, most lost data did not present significant risk.  Now any breach has to be reported.

Unless the data is encrypted in a way that there is no reasonable way for the hacker to be able to read the data.

And, this includes mobile devices (PHONES!) that contain patient data, so just encrypt patient data wherever it lives.
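As a sketch of what "encrypt it wherever it lives" can look like at the file level, here is a minimal example using the third-party Python `cryptography` package (my choice of library here is an assumption; any vetted encryption tool satisfies the same goal):

```python
# Minimal sketch: encrypting a record blob before it touches disk.
# Assumes the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

def encrypt_record(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt a patient record blob; the result is safe to store at rest."""
    return Fernet(key).encrypt(plaintext)

def decrypt_record(token: bytes, key: bytes) -> bytes:
    """Decrypt a previously encrypted blob; raises if it was tampered with."""
    return Fernet(key).decrypt(token)

key = Fernet.generate_key()   # keep this in a key manager, never beside the data
blob = encrypt_record(b"patient: Jane Doe", key)
assert decrypt_record(blob, key) == b"patient: Jane Doe"
```

The hard part in practice is key management: a key stored on the same lost laptop gives you no safe harbor.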

A Massachusetts dermatology clinic discovered this the hard way when they lost a thumb drive.  Their wallet is now $150,000 lighter.

Doctors that use computerized record keeping systems, called EHRs, now need to provide copies of those records within 30 days of a request, down from the old 90-day window. That could challenge doctors and hospitals that don't have a system in place to do that.

And, there are many other rules that both doctors and their service providers need to comply with.

Now that OCR is finally going to have an active audit program, expect more violations. It's not that the violations weren't happening before; it is just that no one was looking.

Those doctors and hospitals that do not have an active program for monitoring their HIPAA compliance may find themselves with a problem.  HIPAA and its cousin HITECH have been around for years.  One of the goals of HITECH was to put teeth in the enforcement of HIPAA.  That goal may have just been accomplished.

If you are a doctor, hospital or service provider to one, don’t say you did not know.

Information for this post came from Family Practice News.


Holy Cow! Alert For Juniper Netscreen Firewall Users

UPDATE: According to the Wired article below, the remote access issue was caused by a hard-coded master password. Of course, now that people know there is one, they can look at the code and find it, which means that if you have not patched your Juniper firewalls, you are at high risk of being owned.

The article also says that the VPN issue may allow an attacker to decrypt any traffic that they have captured in the past.  So if the Chinese, for example (or US or Russian or …) had captured traffic hoping that they might be able to decrypt it some time in the future, now is that time.

This is one of those STOP THE PRESSES! kind of alerts.  Juniper announced yesterday that there are two separate compromises to Juniper Netscreen firewalls that would allow an attacker to gain administrative access to company firewalls and also to decrypt VPN traffic.  Together, this would allow an attacker to completely own your network.

If you are running a Juniper firewall with ScreenOS 6.2.0r15 through 6.2.0r18 or 6.3.0r12 through 6.3.0r20, you need to patch it immediately.
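If you manage more than a handful of devices, a quick script can flag which ones fall in the affected ranges. This sketch assumes version strings of the form "6.3.0r17"; the revision ranges are the ones described above:

```python
import re

# Revision ranges flagged as vulnerable (inclusive), keyed by base version.
VULNERABLE_RANGES = {
    "6.2.0": (15, 18),
    "6.3.0": (12, 20),
}

def is_vulnerable(version: str) -> bool:
    """Return True if a ScreenOS version string like '6.3.0r17' falls
    inside a vulnerable revision range."""
    m = re.fullmatch(r"(\d+\.\d+\.\d+)r(\d+)", version)
    if not m:
        raise ValueError(f"unrecognized version string: {version!r}")
    base, rev = m.group(1), int(m.group(2))
    bounds = VULNERABLE_RANGES.get(base)
    return bounds is not None and bounds[0] <= rev <= bounds[1]
```

Feed it the version reported by each firewall's management interface and patch anything that comes back True.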

Juniper has been amazingly open about this, unlike some vendors.  I suspect that they figured that this exploit is so bad that customers may run away from their products, so the lesser of the evil is to be honest about it.  In reality, my guess is that they are no better or no worse than any other vendor.  Some vendors, under the same situation, might have just said “hey, we fixed some bugs, you should patch your firewall”.  The patches are available on Juniper’s web site (see link in Network World article).

A couple of notes that Juniper made:

  • There is no workaround other than applying the patches
  • They discovered this via an internal code review.  This MAY be good as hackers may not have found the problem.  HOWEVER, that being said, every attacker in the world knows about it now and since it is an OWN THE COMPANY bug, you need to patch this ASAP.  I was at a meeting yesterday where an FBI Special Agent was speaking about security and he interrupted his presentation to tell us about it.  It is that kind of high priority.
  • Juniper said that the bug is a result of unauthorized code in ScreenOS. While they did not explain what this unauthorized code is, to me, that indicates their development environment was compromised. If this is true, their entire code base is suspect at this time. Hopefully they are scurrying around looking at all code in all products for backdoors. Juniper says they don't think that Junos devices (their other operating system) are affected.
  • The first bug allows someone to get unauthorized remote administrative access.  From there, you own the device, can wipe the logs, change the configuration or do anything else you might want to do.
  • The second bug – which is separate from the first – would allow an attacker who could monitor your VPN traffic to decrypt it.  Also, not good.  There would be no indication that an attacker was decrypting your traffic.
  • Juniper has not said how long these devices have been infected, but some of the code being patched dates back to 2012.
  • While Juniper has not said how this “unauthorized code” got into the devices, one candidate, based on Snowden documents, is the NSA.  They apparently have an interest in listening to organizations using Juniper hardware.

Whether this is the result of an NSA covert op, some other intelligence agency's handiwork, or some random hacker, it points to the fact that companies need to proactively monitor changes to their software to make sure that unauthorized changes are not being made. For all organizations, this should be a wake-up call for internal security.
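One minimal form of that monitoring is a hash baseline: record a cryptographic digest of every file in a source or build tree, then diff against it later. A sketch (the layout and names here are illustrative):

```python
import hashlib
from pathlib import Path

def sha256_file(path: Path) -> str:
    """Hash a file in chunks so large binaries don't load into memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_baseline(root: str) -> dict:
    """Map every file under root to its digest."""
    return {str(p): sha256_file(p)
            for p in sorted(Path(root).rglob("*")) if p.is_file()}

def diff_baselines(old: dict, new: dict) -> dict:
    """Report files added, removed, or modified since the baseline was taken."""
    return {
        "added": sorted(new.keys() - old.keys()),
        "removed": sorted(old.keys() - new.keys()),
        "modified": sorted(k for k in old.keys() & new.keys()
                           if old[k] != new[k]),
    }
```

The baseline itself must be stored somewhere the attacker can't reach, or an intruder who can plant a backdoor can also rewrite the hashes.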

This is a very interesting development.

Information for this post came from Network World.

Another article with more details can be found in Wired.


The Target Breach Story – How Did They Let This Out?

Krebs On Security has extensive reporting on an investigation by Verizon that began a few days after the Target breach was announced.

Target has refused to confirm or deny the report.

One thing to consider.  We do not know how Brian (Krebs) got the report, so all we can do is speculate.

This report, in my opinion, is a wonderful tool for the banks and consumers who are suing Target.  It shows all the things that Target was not doing or was doing wrong.  This report makes it so much easier to show Target was not treating cyber security consistent with even reasonable industry practices, never mind best industry practices.

What Target should have done is have their outside counsel manage the engagement of Verizon so that this report could have been shielded by attorney-client privilege.

It is certainly possible that they did that, but then, how did the report get out to a reporter?  Part of engaging the attorneys to manage this is to control the distribution of the final work product.

Any way you look at it, in my opinion, letting this report out of their control is yet another FAIL! by Target.  

While Target spokesperson Molly Snyder said that Target believes that sharing information will make everyone stronger – thereby basically validating that the report is real – it doesn’t make sense to release this kind of detail while there are so many lawsuits pending.

You can go to Brian’s web site (see link below) for the long gory details, but here is the short version:

  • Once the Verizon hacking team was inside Target's core network, there was nothing stopping them from communicating directly with the cash registers – violating every principle of segmentation known to IT. They should never have been able to do that.
  • Target had guessable passwords on Microsoft SQL servers and weak passwords for system accounts.
  • Target had a password policy, but it was not being followed. Verizon found clear text password files for system accounts on several servers.
  • Verizon was able to create domain administrator accounts and dump all of the password hashes.
  • Within one week, the consultants were able to crack 472,000 (86%) of the passwords.
  • Patches to systems and services were not applied consistently.
  • Verizon said that Target, which was using Tenable's vulnerability scanning system, had a comprehensive scanning program in place but was not acting on the vulnerabilities discovered.
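Several of the findings above come down to guessable or weak credentials. As an illustration (this is a hypothetical screen, not Target's actual policy, and the deny-list entries are made up), a few lines of code can catch the worst offenders:

```python
# Hypothetical password screen: length plus a deny-list of common choices.
# Real programs check against large breach corpora, not a five-entry set.
COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein", "admin"}

def password_findings(password: str, min_length: int = 12) -> list:
    """Return a list of policy findings; an empty list means the password passes."""
    findings = []
    if len(password) < min_length:
        findings.append(f"shorter than {min_length} characters")
    if password.lower() in COMMON_PASSWORDS:
        findings.append("appears on the common-password deny list")
    if password.isalpha() or password.isdigit():
        findings.append("uses only one character class")
    return findings
```

Running a screen like this against system and service accounts, and actually acting on the findings, addresses the gap Verizon documented between having a policy and following it.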

There is more in the report, but you get the idea.

If you are a security person, the report is a fascinating indictment of Target and a roadmap of what not to do.

If you are a CEO, the leak of a report like this falls into the worst nightmare category.

Information for this post came from KrebsOnSecurity.


Office Of Civil Rights At HHS Starting Up Audits Again

The Office Of Civil Rights (OCR) has been pretty quiet these last couple of years regarding HIPAA audits, but that may be about to change.

OCR’s staff is small, so they have hired a contractor, FCI,  according to the Federal Register. In an interview, deputy director Deven McGraw says that they will be starting up random audits again early next year.

FCI’s contract for a little under a million dollars is very small by federal standards.  This means that they will be doing narrowly focused remote audits.

Recently, OCR fined a small oncology clinic $750,000 after a laptop and a server were stolen but not encrypted.

Deven said that anything that is not nailed to the floor (her words) should be encrypted – laptops, storage devices, servers and desktops, for example.

She said that even though encryption is "addressable," that does not mean it is optional, even for the smallest health care providers and business associates. OCR expects you to address encryption of data at rest, and if you don't encrypt, you must implement an alternative control in its place as well as document the reasoning.

Iliana Peters, senior advisor for compliance and enforcement at OCR, said that there really aren't any other great options besides encryption.

They also said that lost devices, even encrypted ones, that have to be reported are indicators of other problems at the organization.

Deven also said that it all starts with a HIPAA risk analysis. I suspect that reviewing your risk analysis document is something that could easily be done remotely, leading to more questions if you do not have one or if the one you do have indicates problems. The message regarding risk analysis is to stop procrastinating.

While it remains to be seen what OCR will do starting in 2016, this might be a good time for covered entities to make sure that their HIPAA house is in order, as well as the houses of their Business Associates, since CEs are now liable for the errors of their BAs.

Small providers – ones for whom a $750,000 fine for having two devices stolen out of an employee's car would be devastating – should probably start looking now to see if they have their HIPAA security rule act in order.

Information for this post came from two articles at Data Breach Today, here and here.
