Federal regulations are usually complex and can have unintended consequences. If you are a lawyer, they are a goldmine, whether you are prosecuting violations or advising clients. When it comes to the Health Insurance Portability and Accountability Act (HIPAA), however, there are just a few (relatively) simple steps you can take to avoid violations, fines, and bad publicity for your healthcare organization. HIPAA is not a paper tiger: healthcare organizations have already been fined millions of dollars for breaches and violations.
What Is A Violation? What Is A Breach?
For once, there’s a nice, simple distinction which does not require a legal mind to understand. The definition of a violation is:
“A failure to do what HIPAA requires to keep protected health information (PHI) secure.”
Basically, PHI is data that HIPAA requires to be protected, and the steps suggested below are how to protect it. These are a few simple steps that help you avoid violations, and they are things you should already be doing to protect data in general; all HIPAA does is spell them out. They are:
- Records must be secured. This means limiting access to the data that employees and providers need to do their jobs. Records can be secured by passwords, biometric identifiers, swipe cards, fingerprint readers, etc.
- PHI must be encrypted against hacking and breaches. (While you're at it, what rationale do you have for not encrypting all data?) Encryption is not a get-out-of-jail-free card, however: if there is a breach and encrypted records are stolen, you may still be liable, because encryption can be broken, and every method of securing data, short of quantum key distribution (which is not generally available), has vulnerabilities. A minimal sketch of encrypting a record at rest appears after this list.
- Devices must be secured. This generally means that portable devices, such as smartphones and laptops, which can be lost or stolen, must be secured via encryption and passwords or have other limitations on access. The theft of a device should not let whoever stole it (or finds it) access PHI. If the chief of surgery leaves his or her smartphone in a cab, there must be a way to remotely erase all data on it.
- Finally, employees must be trained, not only on the importance of security, but also on the organization’s specific methods of maintaining PHI.
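To make the encryption requirement concrete, here is a minimal sketch of encrypting a single record at rest in Python. It assumes the third-party `cryptography` package; the record fields and key handling are illustrative only, and a real deployment would pull keys from a key-management system rather than generating them in application code.

```python
# Minimal sketch: encrypting a PHI record at rest with symmetric encryption.
# Assumes the third-party "cryptography" package (pip install cryptography).
# The record fields and key handling here are illustrative assumptions.
import json
from cryptography.fernet import Fernet

# In practice the key comes from a key-management service, never from source code.
key = Fernet.generate_key()
cipher = Fernet(key)

record = {"patient_id": "12345", "diagnosis": "example"}  # hypothetical PHI record
ciphertext = cipher.encrypt(json.dumps(record).encode("utf-8"))

# Only the ciphertext is written to disk; the plaintext stays in memory.
with open("record.enc", "wb") as f:
    f.write(ciphertext)

# Decryption is limited to code paths that hold the key.
plaintext = json.loads(cipher.decrypt(ciphertext).decode("utf-8"))
```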
A breach is disclosure of PHI to parties that are not authorized by HIPAA to access it. A breach has occurred if PHI is accessed by an unauthorized party, even if no data is actually stolen.
How Do I Prevent A Breach?
- Train employees on why security is important and how to minimize risks.
- Maintain physical possession of mobile devices and keep their security controls in place.
- Enable firewalls and encryption (a sketch of sending PHI over an encrypted channel follows this list).
- Ensure that files are encrypted and stored correctly.
- Move towards a paperless operation and properly dispose of paper files.
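As a companion to "enable firewalls and encryption," here is a minimal sketch of sending PHI to an internal API over a TLS-encrypted channel, assuming the third-party `requests` package; the endpoint URL and payload are hypothetical.

```python
# Minimal sketch: sending PHI over an encrypted (TLS) channel.
# Assumes the third-party "requests" package; URL and payload are hypothetical.
import requests

payload = {"patient_id": "12345", "result": "example"}  # hypothetical PHI payload

# verify=True (the default) rejects servers whose certificates cannot be
# validated, so PHI is never sent over an unauthenticated channel.
response = requests.post(
    "https://records.example-hospital.internal/api/results",  # hypothetical endpoint
    json=payload,
    timeout=10,
    verify=True,
)
response.raise_for_status()
```

Rejecting unverifiable certificates is the design choice that matters here: it keeps PHI off channels that are encrypted but not authenticated.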
How Do I Plan Ahead?
HIPAA regulations require that a healthcare organization regularly audit its security and have a risk mitigation action plan. If you take the following steps, you will have gone a long way towards ensuring that your organization can pass audits and show that you have already taken steps toward risk mitigation:
- Encrypt PHI both in storage and transmission.
- Use secure access controls, including strong passwords, access limited to job functions, auto timeouts, and screen locking (see the sketch after this list).
- Use firewalls and antivirus software on all desktops and mobile devices.
- Keep track of incoming, as well as outgoing, data. Know where data comes from.
- Keep your risk mitigation plan updated to deal with new threats. The “threat surface” is constantly changing.
- Keep software and firmware updated. Many new attacks target firmware and hardware as well as software vulnerabilities.
- Keep employee training up-to-date.
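To make the access-control items above concrete, here is a minimal sketch of limiting PHI access to job functions, enforcing an idle timeout, and recording each granted access. The role names, timeout value, and audit trail are assumptions for illustration, not a prescribed design.

```python
# Minimal sketch: access limited to job functions, with an idle timeout.
# Role names, session handling, and the audit trail are illustrative assumptions.
import time

ROLE_PERMISSIONS = {
    "physician": {"read_phi", "write_phi"},
    "billing": {"read_phi"},
    "front_desk": set(),  # this job function does not need PHI access
}

IDLE_TIMEOUT_SECONDS = 15 * 60  # lock the session after 15 minutes of inactivity


class Session:
    def __init__(self, user, role):
        self.user = user
        self.role = role
        self.last_activity = time.time()

    def check_access(self, permission):
        # Auto-timeout: an idle session can no longer reach PHI.
        if time.time() - self.last_activity > IDLE_TIMEOUT_SECONDS:
            raise PermissionError("Session timed out; re-authentication required")
        # Job-function check: only roles that need the data can touch it.
        if permission not in ROLE_PERMISSIONS.get(self.role, set()):
            raise PermissionError(f"Role '{self.role}' may not {permission}")
        self.last_activity = time.time()
        # Each granted access is recorded so data flows can be traced later.
        print(f"AUDIT: {self.user} ({self.role}) granted {permission}")


session = Session("jdoe", "billing")
session.check_access("read_phi")    # allowed for the billing role
# session.check_access("write_phi")  # would raise PermissionError
```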
How Can My Organization Respond to Breaches?
Your organization should have a written post-breach action plan that is reviewed and updated at least annually; it is foolish to assume that a breach will never occur. What is most important is transparency. HHS must be notified. Those whose data has been exposed must be notified. Your legal staff should be notified. And anything resembling a cover-up must be avoided.
Concealment of a breach is itself a violation of the law and regulations, and it is all but guaranteed to fail. Breaches are eventually exposed, and delays in reporting or attempts at concealment only make the organization look worse than it otherwise would.
Any breach – or even detection of an attack – should be used as a lesson for future security efforts. The first task is to figure out what went wrong. Where did your security measures fail? Once that is known, you need to determine the root cause. The root cause is usually a bit of a surprise.
In one case, the organization's own security practices were solid, but it used a cloud vendor whose practices were much laxer. Data was encrypted, and communications between the organization and the cloud vendor were secure and encrypted, but the vendor stored decrypted data on one of its own servers that was not even password-protected and was exposed to the public internet. The lesson here is that security audits should cover not just your own organization but all of your vendors as well.