Tag Archives: HIPAA

HIPAA Privacy Rules and High Tech Services

Health IT Security wrote an article beating up Amazon over its HIPAA compliance process.  The article was not favorable, but it was interesting.

The issue they were writing about was a medic-alert style bracelet that a woman bought on Amazon.  After she bought it, the vendor put a picture of the bracelet, showing her name, birth date and medical condition, in an ad on Amazon.  The customer found out about it when her physician called her to say he had seen it.

When the buyer contacted Amazon, she was told they would investigate.  She later received an email from Amazon saying that they would not release the outcome of the investigation.

So the lady reached out to her local NBC TV affiliate.  It is amazing what a little bad PR can do.  The TV station contacted the Amazon vendor, which apologized and said it would fix the problem.  The TV station confirmed that the offending material was removed.

But this post is not about health jewelry.

It is to clear up a possible misunderstanding on the part of the average consumer.

While Amazon may yet get into trouble over this, it will not be for failing to comply with HIPAA, because this is not a HIPAA issue.

For consumers that use apps and other tech products there is an important lesson here.

Amazon does *NOT* have a HIPAA problem.

In fact, as of today, Amazon's web site does not need to be HIPAA compliant because Amazon is neither a covered entity nor a business associate under the terms of HIPAA.  Covered entities include organizations like doctors, hospitals and insurance companies.  Business associates are companies that handle HIPAA-type information on behalf of one or more covered entities.

That means that they have no HIPAA requirement to protect your personal information.

They *MAY* have a requirement to protect it under state law in your state, but they also may not.  This depends on the particular law in your state.  In this case they may be in more trouble for publishing her birth date (which may be covered under her state’s privacy law) than her medical condition.

It does mean that they have no requirement to protect your healthcare information under Federal law, because other than HIPAA, which does not apply here, there is no Federal law that I am aware of requiring anyone to protect your healthcare information.

This also includes Apple, Google and any app that is available in either the Apple or Android app stores.  Apple and Google are likely covered entities because of the way their employee health insurance plans work, but that is completely separate from iPhones, Android phones and apps.

So, if one of those apps collects information from a hospital on your behalf and makes it available to you, it can certainly use a diagnosis, say that you have diabetes, to show you ads for diabetes medicine or supplies.

It is also possible (although I think this may be pretty dicey) that they could sell your healthcare data.  Depending on the state that you live in, healthcare data may not be protected AT ALL under the state's privacy laws.  This is likely because legislators are usually lawyers, and lawyers rarely understand tech, often don't understand privacy, and assume that your healthcare data is protected under HIPAA.  It is, but only under certain circumstances.  The net effect is that it MAY BE perfectly legal to sell your healthcare information.

If anyone thinks differently, please post a reply and I will publish it.

Information for this post came from Health IT Security.


Email Breach at Oxygen Equipment Maker Affects 30,000

Oxygen equipment maker Inogen announced that information on 30,000 customers was exposed after an attacker compromised the credentials of an employee.

In the grand scheme of breaches, this one barely registers.  Yes, HIPAA-protected information was taken (and Health and Human Services may come after them in, say, 2021), but it is another example of a totally preventable, self-inflicted wound.

OK, now that I have sufficiently beaten them up, let's look at what they did wrong.

The company is publicly traded, so they need to be SOX compliant.  They should have a board advising them on issues like cybersecurity, but they likely do not; they are totally silent on the issue.

The breach went from January 2 to March 14 – certainly not the longest breach, but certainly not the shortest.  I know of an incident recently where a company received indicators of a breach at 6:30 AM one day and had contained and mitigated the breach before 9:00 AM the same day and they are looking to shorten that window.  What kind of monitoring and alerting did Inogen have?  Over two months for the hacker to do the dastardly deed?  Obviously, not good enough.

The stolen emails contained name, address, phone number, email address, date of birth, date of death, Medicare ID number, insurance information and type of equipment.  What is that doing in email?  That belongs inside a secure application or web portal.  Not only is this a HIPAA violation before the breach, it is a privacy breach after the event.  The company is based in California, so the Attorney General may be rattling their cage as well.

The worker’s credentials were compromised and then the attacker logged in. From another country.  Two factor authentication would have neutered the attack and, failing that, conditional access geo-fencing would have stopped the attacker cold.  Where was their CISO?  Do they even have one?
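To make that concrete, below is a minimal Python sketch of the kind of geo-fence plus second-factor check a login service could apply before accepting a password.  This is purely illustrative and not Inogen's actual setup; the allowed-country list, the fake GeoIP table and the function names are all assumptions, and in practice this kind of conditional access would be configured in the identity provider rather than hand-rolled.

```python
# Illustrative sketch only: block logins from unexpected countries or without MFA.
# The country list, IP-to-country table and alerting are hypothetical stand-ins.

ALLOWED_COUNTRIES = {"US"}  # countries the business actually operates in

# Stand-in for a real GeoIP lookup (normally an IP-to-country database or service).
FAKE_GEOIP = {
    "203.0.113.7": "US",
    "198.51.100.9": "XX",  # somewhere the company has no employees
}


def lookup_country(ip_address: str) -> str:
    return FAKE_GEOIP.get(ip_address, "UNKNOWN")


def allow_login(username: str, ip_address: str, mfa_passed: bool) -> bool:
    """Reject a login attempt from an unexpected country or without a second factor."""
    country = lookup_country(ip_address)
    if country not in ALLOWED_COUNTRIES:
        print(f"ALERT: blocked login for {username} from {country} ({ip_address})")
        return False
    if not mfa_passed:
        print(f"ALERT: blocked login for {username}: no second factor presented")
        return False
    return True


if __name__ == "__main__":
    # A stolen password alone is not enough: wrong country or missing MFA gets blocked.
    print(allow_login("employee1", "198.51.100.9", mfa_passed=True))   # False
    print(allow_login("employee1", "203.0.113.7", mfa_passed=False))   # False
    print(allow_login("employee1", "203.0.113.7", mfa_passed=True))    # True
```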

One thing they did right – they disclosed the breach in their latest SEC filings. In light of the SEC’s new cybersecurity transparency rules, that is probably a very smart move (to disclose).  One less party out to sue them.

In the SEC filing the company said they hired a forensics firm and made users change their passwords.  Definitely impressive (not).

They have also turned on two factor authentication.  A little late, but better late than never.

Oh, yeah, they have started training.  Nice.  Would have been nicer years ago.

One challenge is that the founders are a few young kids who did not, until this, have many battle scars.

I am guessing they are getting those scars now.

Finally, they say in the SEC filing that they have insurance but it may not cover the costs.  Cyber insurance is good, but you had better have enough of it and the right options.  Depending on what lawsuits happen and which regulators (such as California and HHS) go after them, this could cost them a couple of million dollars or more.  Depending on what coverage they have, they could be writing all or part of that check themselves.

As a side note, Airway Oxygen, likely a competitor, told HHS last June that they had a breach affecting 500,000 customers.

Cardionet paid a fine to HHS last year of $2.5 million.  That is just the fine and doesn’t cover any other costs.  With a fine like that, Inogen’s total costs could be in the $3-$5 million range.  If they have a $1 million cyber policy, they will be writing a large check.

Other companies could learn from their lessons.  The learning part is free.  OR, they can wait until their story is in the news.  That can be a tad more expensive!

Information for this post came from Careers Info Security.


The Times They Are A Changin

In spite of all of the data breaches that we see on an almost daily basis, we have seen time and again that the courts have dismissed lawsuits for a variety of reasons.  In many cases, the reason is called lack of standing.

Under U.S. Federal law, standing is based on Article III of the U.S. Constitution.  Article III requires that you have an injury in fact to your own legal interests, in other words, that you have suffered some sort of actual harm.  That only applies to lawsuits filed in Federal court.  This is one reason why credit card companies credit you for fraudulent charges: no lost money, no harm, no ability to sue.

But judges have been loosening the definition of actual harm over the last few years in light of all of the breaches.

Now the Connecticut State Supreme Court has ruled that there is a DUTY of confidentiality between doctor and patient and patients may sue in cases of unauthorized disclosure of protected health information or PHI.

In this case, the plaintiff was pregnant and asked the doctor not to release information to the father of the child, with whom the plaintiff was no longer in a relationship.

The practice received a subpoena and in response mailed a copy of the patient’s medical records to the court.

Only problem is, that wasn’t what the subpoena told the doctor to do.  All it said was that the custodian of the records had to appear before the attorney who requested the subpoena.

HIPAA, which governs the disclosure of medical records, says that records may be disclosed in the case of a subpoena, but only if the patient has received adequate notice or a qualified protective order has been issued.

The doctor did neither of those things.

Other state courts are also wrestling with these issues.

So now, at least in Connecticut, patients have an expectation of privacy in their medical records and if doctors and hospitals don’t take that expectation seriously enough, patients do have the ability to sue.

It seems to me that the courts are chipping away at this standing question, understanding that people are actually being harmed, even if it is not in a measurable, financial way.

While the Connecticut Supreme Court ruling is not binding in any other states, that does not mean that judges won’t be looking at that ruling.

An important note here – this lawsuit is not based on a breach or a hack.  It was based on an inappropriate action by a staff member in the doctor's office.  It seems unlikely that the answer from the court would have been any different if the disclosure had been due to a breach, but of course, we do not know.

Information for this post came from Health IT Security.


The Price of a Breach? Bankruptcy?

21st Century Oncology, which bills itself as the world's largest operator of cancer treatment centers with 179 locations, suffered a breach in 2015, losing control of 2+ million patient records.

According to law firm Motley Rice, they found out about the breach when the FBI notified them – not a great way to start your day (see here).  The breach, they say, happened a month prior, in October 2015.

While 21st Century is a bit of a high flyer – started in 1983, they sold out to Vestar Partners for $1 billion in 2008, planned to go public in 2014 but changed their mind and raised $325 million privately instead – they have all the problems of any business.

They filed for bankruptcy earlier this year, citing a bunch of reasons including uncertainty in the health insurance market as a result of the new administration, but also the cost of litigation and the cost of complying with regulations regarding electronic health records – in other words, the costs resulting from the breach, including settling lawsuits from patients whose data was compromised and settling claims from Health and Human Services regarding the breach.

Health and Human Services said that 21st Century:

  • Failed to conduct an accurate and thorough assessment of the potential risks and vulnerabilities to the confidentiality, integrity, and availability of the electronic protected health information.
  • Failed to implement security measures sufficient to reduce risks and vulnerabilities to a reasonable and appropriate level.
  • Failed to implement procedures to regularly review records of information system activity, such as audit logs, access reports, and security incident tracking reports.
  • Failed to have a written business associate agreement before disclosing protected health information to third-party vendors.

In other words, failing to have any kind of reasonable cyber security program.

Last month 21st Century agreed to pay a fine of $2.3 million in lieu of what HHS could have whacked them with, which is many times that number, and to:

  • Complete a risk assessment and create a risk management plan
  • Revise policies and procedures
  • Educate its workforce
  • Create and maintain Business Associate Agreements (BAAs) with people it shares patient data with
  • Submit to an internal monitoring plan – HHS’s version of an ankle monitor.

Also, if they fail to execute the corrective action plan, all bets are off and HHS can go after them for real civil money penalties.

HHS will supervise this corrective action plan and if they don’t like something that 21st Century has done, like their security policies, for example, 21st Century has 30 days to fix it.

They are also required to engage and pay for an external third party to monitor their progress.  HHS gets to interview and approve this third party.  The assessor will submit a plan to play nanny to 21st Century within 60 days of selection, and HHS must approve this plan.  The assessor, according to the terms of the corrective action plan, must make unannounced site inspections during the term of the agreement.  The third party must provide an annual compliance report to HHS.

A copy of the agreement can be found here.

While there are other business reasons for filing for bankruptcy, the after effects, including settlements and lawsuits related to the breach are likely an important part of that filing.

While it is not clear to me what the effect of the bankruptcy filing is on lawsuits that have not yet come to trial, there is certainly a short term effect of staying them while the bankruptcy court figures things out.  I am also not clear what effect the bankruptcy filing will have on lawsuits that were not filed prior to the bankruptcy filing date.  This could be a way to dramatically reduce their liability, although it certainly would not make them any friends among investors who were affected by the bankruptcy.  21st Century has also been involved in a number of lawsuits related to over-billing, fraudulent billing and fees paid to doctors for referring patients to company-owned facilities.  Clearly security is only one of many problems they are dealing with.

Apparently the bankruptcy did not stop HHS’ actions including fines and the corrective action plan.

Information for this post came from Dark Reading.



More Healthcare Breaches, Record Fines and Other Issues

Another day, another healthcare ransomware attack.  Erie County Medical Center and Terrace View Long Term Care in Buffalo, New York have been dealing with a ransomware attack for about 10 days now.  On April 9th, a Sunday, the computers got hit by what they are only calling a virus, but according to someone I talked to today, it is, in fact, a ransomware attack.  They have not paid the ransom and do not intend to, but from April 9th to the 15th, all systems were down.  They hoped to have the patient data part of their systems operational by the 15th, at which point they would need to start entering the backlog of patient data and any data that was lost.

According to local media, the email system is also supposed to be up by that time.

After that is complete, they planned on working to restore systems such as payroll.

According to the person I talked to this morning, as of today, they are still working on recovering.

I am sure that they will complete a lessons-learned exercise once people get some sleep, but from the outside, a couple of questions are obvious.  Their disaster recovery plan seems to be lacking if they are still recovering 10 days later.  We don't know if their business continuity plan is sufficient.  They didn't have to close the hospital, which is good, but what is the impact on patient care and staff workload?  Finally, how did this ransomware spread so widely in the organization that it is taking them more than 10 days to recover?

As a side note, the Beazley cyber insurance company says that ransomware attacks that were reported to them quadrupled in 2016 and they expect that to double again in 2017.  Half of the attacks were in healthcare.

The FDA is now shifting its focus to medical devices, like the St. Jude devices that it slammed the firm over last month.


As if that wasn't enough to worry about, the Health and Human Services Office for Civil Rights levied more fines in 2016 on breached organizations than in any other year.  They announced 12 settlements averaging $2 million in 2016 and three more in the first two months of 2017, PLUS a fourth case that had a fine of $3.2 million.

Some of these cases required the appointment of an external monitor or baby sitter, indicating that OCR didn’t trust those organizations to fix the problems without oversight.

This handful of cases, while significant, represents a fractional percentage of the roughly 17,000 cases a year that are filed with OCR.

In addition, OCR is finishing up a series of desk audits of covered entities and is about to start on auditing business associates.

While it is unclear what will happen under the Trump administration, OCR is funded mainly by the fines they levy, so it may well be the case that things run as they have for the last few years.  Stay tuned.

Putting all of this together should be a red flag to anyone in healthcare that they need to get very serious about cyber security.  It is not likely to get any better or easier any time soon.


Information for this post came from Disruptive Views and hrdailyadvisor.


Hospital System Fined $5.5 Million For Not Controlling Access

Memorial Healthcare Systems in Florida was fined $5.5 million for allowing the information of about 115,000 patients to be accessed “impermissibly”.

Memorial, which operates 6 hospitals, an urgent care center, a nursing home and other healthcare facilities in South Florida, reported the breach in 2012 – 5 years ago – after it discovered the problem.  Exactly why it should take Health and Human Services 5 years to complete an investigation is a mystery to me.

The information taken includes names, birth dates and social security numbers.

Apparently, two employees who worked in an affiliated physicians’ office accessed the hospital’s systems for a year, stole patient records of over 100,000 patients and used that data to file fraudulent tax returns.

After discovering  that employees had been stealing data for a year, Memorial worked with federal law enforcement which ultimately led to the conviction of the people who filed the false tax returns using that stolen data.

Apparently, even though Memorial had been told for the six years prior to discovering the breach that reviewing employee data access records was a risk, they still did not review those records.

As part of the settlement, Memorial denied any guilt.  It seems to me that, if they had been told for six years that something was a risk and chose not to deal with it, they have some degree of guilt.  Not admitting guilt is fairly typical in these deals so as to avoid giving plaintiffs who might be suing them any additional leverage.

It appears that the credentials used to access these records were legitimate, but it is unclear to me how the physician’s office staff got access to them.

This brings up the bigger issue of logging and auditing – something that affects all businesses; the employees were not using credentials assigned to them when they stole the data.

We are seeing more regulators requiring businesses to maintain more comprehensive audit logs and processes.  Besides the HIPAA regulators, DoD and some state regulators have issued new rules or opinions.

But in addition to creating audit logs, you also need to review them and generate alerts based on that review.  For a business like Memorial, that likely requires reviewing millions or even tens of millions of audit records.  That requires both software and people, and those require money.  That is likely at the root of the issue.  After they discovered the breach, they did implement a review process, but apparently, that decision not to review data access records cost them a $5.5 million fine as well as having to implement a multi-year corrective action plan with the HIPAA regulator.
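To show what generating alerts from that review can look like, here is a minimal Python sketch that flags accounts touching an unusually large number of distinct patient records in a single day.  The record layout, the threshold and the function name are all hypothetical; a real environment would baseline thresholds per role and feed logs like these into a SIEM or the EHR's own audit tooling rather than a one-off script.

```python
"""Illustrative sketch: flag accounts that access an unusual number of patient records."""

from collections import defaultdict
from typing import Iterable, List, Tuple

# Hypothetical audit record layout: (date, user_id, patient_id)
AuditRecord = Tuple[str, str, str]

# Threshold is made up for illustration; real reviews baseline this per role.
MAX_PATIENTS_PER_DAY = 50


def find_suspicious_access(records: Iterable[AuditRecord]) -> List[Tuple[str, str, int]]:
    """Return (date, user, distinct patient count) for users exceeding the threshold."""
    per_user_day = defaultdict(set)
    for date, user, patient in records:
        per_user_day[(date, user)].add(patient)
    return [
        (date, user, len(patients))
        for (date, user), patients in per_user_day.items()
        if len(patients) > MAX_PATIENTS_PER_DAY
    ]


if __name__ == "__main__":
    # Toy data: one account touches far more charts than a normal workload explains.
    demo = [("2012-03-01", "clerk42", f"patient{i}") for i in range(200)]
    demo += [("2012-03-01", "nurse07", "patient1"), ("2012-03-01", "nurse07", "patient2")]
    for date, user, count in find_suspicious_access(demo):
        print(f"ALERT: {user} accessed {count} distinct patient records on {date}")
```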

This represents a great opportunity for businesses in general to review their auditing processes – what audit data are we collecting, does that audit data meet the regulatory requirements, how long do we store it for and how do we analyze it – to verify that it is appropriate for both compliance reasons and business requirements.

Information for this post came from the Sun Sentinel and Health and Human Services.
