Tag Archives: HIPAA

More Healthcare Breaches, Record Fines and Other Issues

Another day, another healthcare ransomware attack.  Erie County Medical Center and Terrace View Long Term Care in Buffalo, New York have been dealing with a ransomware attack for about 10 days now.  On April 9th, a Sunday, their computers were hit by what they are only calling a virus, but according to someone I talked to today, it is, in fact, a ransomware attack.  They have not paid the ransom and do not intend to, but from April 9th to the 15th, all systems were down.  They hoped to have the patient data portion of their systems operational by the 15th, at which point they would need to start entering the backlog of patient data and any data that was lost.

According to local media, the email system is also supposed to be up by that time.

After that is complete, they planned on working to restore systems such as payroll.

According to the person I talked to this morning, as of today, they are still working on recovering.

I am sure that they will complete a lessons learned exercise once people get some sleep, but from the outside, a couple of questions are obvious.  Their disaster recovery plan seems to be lacking if they are still recovering 10 days later.  We don’t know if their business continuity plan is sufficient.  They didn’t have to close the hospital, which is good, but what is the impact on patient care and staff workload?  Finally, how did this ransomware spread so widely in the organization that it is taking them more than 10 days to recover?

As a side note, the Beazley cyber insurance company says that ransomware attacks that were reported to them quadrupled in 2016 and they expect that to double again in 2017.  Half of the attacks were in healthcare.

The FDA is also shifting its focus to medical devices, like the ones from St. Jude Medical that the agency slammed the firm over last month.


As if that wasn’t enough to worry about, the Health and Human Services Office of Civil Rights levied more fines in 2016 than in any other year against organizations that were breached.  They announced 12 settlements averaging $2 million in 2016 and three more in the first two months of 2017, PLUS a fourth case that carried a fine of $3.2 million.

Some of these cases required the appointment of an external monitor or babysitter, indicating that OCR didn’t trust those organizations to fix the problems without oversight.

This handful of cases, while significant, represents a tiny fraction of the roughly 17,000 cases filed with OCR each year.

In addition, OCR is finishing up a series of desk audits of covered entities and is about to start on auditing business associates.

While it is unclear what will happen under the Trump administration, OCR is funded mainly by the fines they levy, so it may well be the case that things run as they have for the last few years.  Stay tuned.

Putting all of this together should be a red flag to anyone in healthcare that they need to get very serious about cyber security.  It is not likely to get any better or easier any time soon.


Information for this post came from Disruptive Views and hrdailyadvisor.


Hospital System Fined $5.5 Million For Not Controlling Access

Memorial Healthcare Systems in Florida was fined $5.5 million for allowing the information of about 115,000 patients to be accessed “impermissibly”.

Memorial, which operates 6 hospitals, an urgent care center, a nursing home and other healthcare facilities in South Florida, reported the breach in 2012 – 5 years ago – after it discovered the problem.  Exactly why it should take Health and Human Services 5 years to complete an investigation is a mystery to me.

The information taken includes names, birth dates and social security numbers.

Apparently, two employees who worked in an affiliated physicians’ office accessed the hospital’s systems for a year, stole patient records of over 100,000 patients and used that data to file fraudulent tax returns.

After discovering that employees had been stealing data for a year, Memorial worked with federal law enforcement, which ultimately led to the conviction of the people who filed the false tax returns using that stolen data.

Apparently, even though Memorial had been told for the six years prior to discovering the breach that reviewing employee data access records was a risk, they still did not review those records.

As part of the settlement, Memorial did not admit any guilt.  It seems to me that, if they had been told for six years that something was a risk and chose not to deal with it, they have some degree of guilt.  Not admitting guilt is fairly typical in these deals, so as to avoid giving plaintiffs who might be suing them any additional leverage.

It appears that the credentials used to access these records were legitimate, but it is unclear to me how the physician’s office staff got access to them.

This brings up the bigger issue of logging and auditing – something that affects all businesses: the thieves were not using credentials assigned to them when they stole the data, and nobody noticed for a year.

We are seeing more regulators requiring businesses to maintain more comprehensive audit logs and processes.  Besides the HIPAA regulators, DoD and some state regulators have issued new rules or opinions.

But in addition to creating audit logs, you also need to review them and generate alerts based on that review.  For a business like Memorial, that likely requires reviewing millions or even tens of millions of audit records.  That requires both software and people, and those require money.  That is likely at the root of the issue.  After they discovered the breach, they did implement a review process, but apparently, that decision not to review data access records cost them a $5.5 million fine as well as having to implement a multi-year corrective action plan with the HIPAA regulator.
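
To make that concrete, here is a minimal sketch of what automated review might look like.  The log file name, its columns and the threshold are all assumptions on my part – a real shop would feed this into a SIEM or user behavior analytics tool – but the idea is simple: count how many records each user touched and flag anyone far outside the norm.

```python
import csv
from collections import Counter
from statistics import mean, stdev

# Hypothetical export of EHR access logs: one row per record access.
# Assumed columns: user_id, patient_id, timestamp
LOG_FILE = "access_log.csv"

def flag_unusual_access(log_file=LOG_FILE, sigma=3.0):
    """Flag users whose record-access counts are far above the norm.

    This is a toy check (mean plus N standard deviations), not a
    substitute for a real audit/alerting product.
    """
    counts = Counter()
    with open(log_file, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["user_id"]] += 1

    if len(counts) < 2:
        return []

    threshold = mean(counts.values()) + sigma * stdev(counts.values())
    return [(user, n) for user, n in counts.items() if n > threshold]

if __name__ == "__main__":
    for user, n in flag_unusual_access():
        print(f"ALERT: {user} accessed {n} patient records - review this account")
```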

This represents a great opportunity for businesses in general to review their auditing processes – what audit data are we collecting, does that audit data meet the regulatory requirements, how long do we store it, and how do we analyze it – to verify that they are appropriate for both compliance reasons and business requirements.

Information for this post came from the Sun Sentinel and Health and Human Services.


Health and Human Services Issues New Guidance on Ransomware

The U.S. Department of Health and Human Services Office of Civil Rights, the government entity that manages the privacy of health care information that you share with doctors and others, has issued new guidance on ransomware.

While technically, it only applies to organizations that they regulate, in reality, almost everything they said applies equally to all businesses.

The U.S. Government says that, on average, there have been 4,000 daily ransomware attacks, a 300% INCREASE over last year. 

They say that businesses should:

(a) Conduct a risk analysis to identify threats and vulnerabilities.  In the case of HHS OCR, they are only worried about protecting health information, but in reality, every business should be conducting a risk analysis at least annually.

(b) Once you have conducted a risk analysis, create a plan to mitigate or remediate those risks and then execute that plan.

(c) Implement procedures to safeguard against malicious software (like ransomware).

(d) Train ALL users on detecting malicious software and what to do if they detect it or accidentally click on something.

(e) Limit access to information to only those people with a need for it and, if possible, grant them read-only access.  Ransomware can’t encrypt files that it doesn’t have write access to (see the sketch after this list).

At least one ransomware attack that I am familiar with became a full blown crisis because a user had write access to a whole bunch of network shares and they ALL got encrypted.  Not a good day at that non-profit.

(f) Create and maintain an overall incident response plan that includes disaster recovery, business continuity, frequent backups and periodic full drill exercises.
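
To put some teeth behind item (e), here is a minimal sketch – the share paths are placeholders, and os.access is only an approximation of real Windows ACLs – that walks the folders an account can reach and reports which ones are writable.  That is a rough measure of how much a single infected user could encrypt.

```python
import os

# Placeholder list of network shares / folders this account can reach.
SHARES = [r"\\fileserver\patients", r"\\fileserver\billing", r"\\fileserver\shared"]

def writable_paths(roots):
    """Yield every directory under the given roots that this account can write to.

    Note: os.access() is a rough check; on Windows it does not fully
    evaluate NTFS ACLs, so treat the result as an estimate.
    """
    for root in roots:
        for dirpath, _dirnames, _filenames in os.walk(root, onerror=lambda e: None):
            if os.access(dirpath, os.W_OK):
                yield dirpath

if __name__ == "__main__":
    exposed = list(writable_paths(SHARES))
    print(f"{len(exposed)} writable folders reachable from this account")
    for path in exposed[:20]:  # print a sample
        print("  " + path)
```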

There is a lot of language that ties the specifics of what they recommend to the HIPAA/HITECH regulations, which is important if you are a covered entity or business associate, but even if you have no HIPAA information, these recommendations are right on.

If you are not doing all of these things today, you should consider making them a priority.  Ransomware is messy stuff, even if you have backups of everything.  Assuming you have not implemented a full disaster recovery/business continuity solution (and if you have not, you have a lot of company), recovering from your backups is a very time consuming and labor intensive process, and in the meantime, you are working off of pencil and paper.

Information for this post came from the Health and Human Services web site.


Feds to Increase Audits Of Doctors’ Protection Of Your Information

The Health and Human Services Office of Inspector General (OIG) reported that the Office for Civil Rights (OCR) is not effectively auditing HIPAA covered entities.  A covered entity includes doctors and hospitals that have primary ownership of your health records.  As a result, OCR is establishing a permanent audit program and working to identify potential audit targets.

One place OCR is, apparently, going to be looking, is at business associates or BAs.  In HIPAA speak, BAs are those vendors that a doctor or hospital uses that have access to your information.  Under the rules, your doctor needs to not only have a written agreement with that vendor, but doctors have to use reasonable diligence to make sure that the security of your information is protected.

Also, the rules are changing regarding what is a breach.  It used to be that you only had to report a breach if there was significant risk of financial or reputational harm – as evaluated by the doctor or hospital.  Needless to say, most lost data did not present significant risk.  Now any breach has to be reported.

Unless, that is, the data is encrypted in a way that leaves the hacker no reasonable way to read it.

And, this includes mobile devices (PHONES!) that contain patient data, so just encrypt patient data wherever it lives.

A Massachusetts dermatology clinic discovered this the hard way when they lost a thumb drive.  Their wallet is now $150,000 lighter.
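
For what it is worth, encrypting a file before it goes onto a laptop or thumb drive is only a few lines of code.  Here is a minimal sketch using the Python cryptography package’s Fernet recipe; the file names are made up and key management – the genuinely hard part – is not shown.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Generate a key once and store it somewhere OTHER than the device itself
# (a key taped to the drive defeats the purpose).
key = Fernet.generate_key()
fernet = Fernet(key)

# Hypothetical export of patient data headed for a thumb drive.
with open("patient_export.csv", "rb") as f:
    ciphertext = fernet.encrypt(f.read())

with open("patient_export.csv.enc", "wb") as f:
    f.write(ciphertext)

# Later, anyone holding the key (and only them) can recover the data.
plaintext = fernet.decrypt(ciphertext)
```

Full disk encryption (BitLocker, FileVault and the like) is even less work for laptops and phones – turn it on and it is transparent to the user.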

Doctors that use computerized record keeping systems called EHRs now need to provide copies of those records within 30 days of a request, down from the old 90-day window.  That could challenge doctors and hospitals that don’t have a system in place to do that.

And, there are many other rules that both doctors and their service providers need to comply with.

Now that the OCR is finally going to have an active audit program, expect more violations.  It’s not that the violations weren’t happening before, it is just that no one was looking.

Those doctors and hospitals that do not have an active program for monitoring their HIPAA compliance may find themselves with a problem.  HIPAA and its cousin HITECH have been around for years.  One of the goals of HITECH was to put teeth in the enforcement of HIPAA.  That goal may have just been accomplished.

If you are a doctor, hospital or service provider to one, don’t say you did not know.

Information for this post came from Family Practice News.


4 Health Care Breaches Reported This Week Alone

The Examiner reported about 4 health care data breaches on the 20th.  See if you can find a common element.

Information on 21,000 California Blue Shield customers, including health care info, was compromised when a vendor call center employee was socially engineered, their login information compromised, and customer data stolen.

Last week Montana’s New West Health Services said an unencrypted laptop with data on 25,000 patients was stolen. It included patient information, bank account information, health information and other information.  On an unencrypted laptop out in the field.

Also last week, at St. Luke’s Cornwall Hospital in New York, a USB drive was stolen with information on 29,000+ patients which included patient names, services received and other information.  The drive, it would appear, was not encrypted.  The reason I assume it was not encrypted is that if it was encrypted and the encryption key was not taped to the device, the hospital would not have to report this event.

Finally, Indiana University Health Arnett Hospital lost a “storage device” with information on 29,324 patients containing information such as patient name, birthdate, diagnosis, treating physician and other information.  Again, likely this information was not encrypted.

Anyone figure out the common element?  All of these events would have been non-events if these companies had reasonable cyber security practices in place.

The call center employee was socially engineered.

An unencrypted laptop was stolen (where was it left?).  Why was it unencrypted?  Why did it have patient data on it?

A flash drive with patient data was lost.  Why was it not encrypted, and did the data need to be on the flash drive at all?

And, a storage device was stolen.  That happens.  Why was it not encrypted?

How much training did the call center do to train employees about social engineering?  Why was the laptop not encrypted?  Why was the flash drive not encrypted?  And why was the storage device not encrypted?

I keep pointing to encryption because if you have a breach but the data is not readable by the thief, you don’t have to warn customers.  It is a very simple step to take.  JUST DO IT.

Only in the flash drive case could the encryption cause a problem, and only if you need to be able to share the drive with someone else.  In the other two situations, the encryption would be transparent to the user.

Especially when it comes to health data, you need to be careful.  AND this does not only mean hospitals and doctors.  Sony lost protected health information when they were hacked.  PHI has been lost in other hacks too.  Most organizations store PHI somewhere (often it is HR or in risk management).

While some things in cyber security are hard to do, many things are not hard to do.  If we start with the easy stuff, we do make the job harder for the bad guys.  Not impossible, just harder.  Let’s start doing the simple stuff.  We can worry about the hard stuff a little later.

Information for this post came from The Examiner.



Office Of Civil Rights At HHS Starting Up Audits Again

The Office Of Civil Rights (OCR) has been pretty quiet these last couple of years regarding HIPAA audits, but that may be about to change.

OCR’s staff is small, so they have hired a contractor, FCI, according to the Federal Register. In an interview, deputy director Deven McGraw says that they will be starting up random audits again early next year.

FCI’s contract for a little under a million dollars is very small by federal standards.  This means that they will be doing narrowly focused remote audits.

Recently, OCR fined a small oncology clinic $750,000 for a laptop and a server that were stolen but not encrypted.

Deven said that anything that is not nailed to the floor (her words) should be encrypted – laptops, storage devices, servers and desktops, for example.

She said that even though encryption is “addressable”, that does not mean it is optional, even for the smallest health care providers and business associates.  OCR EXPECTS you to address encryption of data at rest and, if you don’t encrypt, you must implement an alternative control in its place as well as document the reasoning.

Iliana Peters, senior advisor for compliance and enforcement at OCR, said that there really aren’t any other great options besides encryption.

They also said that lost devices, even encrypted ones, that have to be reported are indicators of other problems at the organization.

Deven also said that it all starts with a HIPAA risk analysis.  I suspect that reviewing your risk analysis document is something that could easily be done remotely and lead to more questions if you do not have one, or if the one that you do have indicates more problems.  The message regarding risk analysis is to stop procrastinating.

While it remains to be seen what OCR will do starting in 2016, this might be a good time for covered entities to make sure that their HIPAA house is in order, as well as the houses of their Business Associates, since CEs are now liable for the errors of their BAs.

Small providers – ones for whom a $750,000 fine for having two devices stolen out of an employee’s car would be devastating – should probably start looking now to see if they have their HIPAA security rule act in order.

Information for this post came from two articles at Data Breach Today, here and here.
