If you own or run a healthcare organization, you probably have someone on staff who acts as your security compliance officer. However, is it their primary job or area of expertise? Having a knowledgeable and experienced security compliance officer or resource is very important since the consequences of violating privacy regulations can be quite serious.
For example, did you know that federal regulators can fine an organization up to $50,000 per HIPAA violation, up to an annual maximum of $1.5 million, for releasing a patient’s protected health information (PHI)?
That’s why you need to understand what a security compliance officer does and whether it makes sense to work with an external company to help your organization comply with security regulations and avoid hefty fines.
What are a security compliance officer’s responsibilities?
According to the American Health Information Management Association (AHIMA), a healthcare security compliance officer oversees activities for developing, implementing, maintaining, and following an organization’s privacy policies and procedures. The goal is to keep patients’ PHI secure and ensure your organization complies with federal and state privacy laws.
Some of the compliance officer’s responsibilities include:
- Understanding government privacy regulations, especially HIPAA rules, to make sure your organization is complying with them.
- Assessing your organization’s risks and what steps are necessary to prevent and minimize exposure of your patients’ PHI.
- Creating, testing, and reviewing an organization’s information security systems to protect PHI.
- Setting up a security awareness program to meet HIPAA reporting requirements.
- Overseeing a reporting and management system to record and investigate a data breach, and prevent future violations.
- Maintaining a budget to fund information security management programs and processes.
- Managing a training program for employees to help prevent a privacy breach.
Who should be your security compliance officer?
Since this is such an essential role in your organization, it’s critical to have the right person for this job. It shouldn’t be just a part-time or extra job for one of your employees, such as an office manager or human resources director. As mentioned, the consequences of a data breach can be very serious and expensive.
While IT experience can be helpful, this position also involves auditing, training, incident handling, and managing business associate agreements with external partners and vendors. Other responsibilities may include creating and updating a disaster recovery plan and overseeing facility security.
An ideal candidate is someone who is organized, understands HIPAA and other privacy rules, and is knowledgeable about IT and computer systems.
In addition to having the relevant experience, the person in this position should have the authority to implement the changes needed to ensure compliance with HIPAA and other privacy rules.
What if you use a cloud-based IT service?
You might assume that if you use a cloud-based service for your IT systems, you don’t need to worry about HIPAA compliance. In fact, your organization must ensure such services are secure and perform a risk analysis before using a cloud service to store or transmit electronic protected health information (ePHI).
In 2015, St. Elizabeth’s Medical Center in Brighton, MA had to pay $218,400 in penalties for violating the HIPAA Security Rule when they uploaded data without doing a risk analysis of the cloud service. An organization needs to set up risk management policies to lower the chances of a data breach as much as possible, even if they use a cloud-based service.
If you manage a healthcare organization, a cloud service provider is considered a HIPAA “business associate.” This means the provider must sign a business associate agreement (BAA) before patient data is uploaded to its service. You must have a signed BAA even if the information you upload is encrypted and the cloud service doesn’t hold a decryption key.
What can happen if you don’t have a signed BAA from a cloud-based service provider? In one case, Oregon Health & Science University was fined $2.7 million by the Department of Health and Human Services’ Office for Civil Rights because they didn’t get a signed BAA from a cloud-based IT vendor.
The business associate agreement should specify how ePHI may be used and disclosed and confirm that both parties maintain security procedures to prevent the unauthorized release of PHI. This includes verifying that the cloud service vendor:
- Has reliable systems so information is readily available to a healthcare organization.
- Maintains a backup and data recovery system in case of a natural disaster, ransomware attack, or other emergency.
- Allows you to obtain data from their systems if you stop using their cloud services.
- Keeps information as secure as possible.
- Limits the use, retention, and disclosure of PHI.
Should you work with a consultant or IT provider?
In some cases, you may decide that you need to work with an IT professional or consultant to assess your IT systems and infrastructure for potential weaknesses that can lead to a privacy breach.
Also, it may not be ideal for internal staff to perform a risk assessment, since it can be difficult for them to objectively evaluate their own practices and identify weaknesses. If you contract with a third party for a risk assessment, make sure they’re experienced and knowledgeable about HIPAA and privacy rules.
Another option is using compliance software that’s customized for your organization’s needs and structure to help perform a risk assessment, train employees, and handle other functions.