What are the rules regarding IMD safety? That all depends on whom you ask.
Imagine being a patient and having to worry about someone hacking into your newly implanted pacemaker. Could this really happen, or is this just an “old wives’ tale” with a millennial flavor? Back in 2007, doctors disabled the wireless function of then Vice President Dick Cheney’s implanted defibrillator because they feared terrorists would hack into the device and kill him.1 Digital devices have impacted healthcare in ways that few could have predicted. Half of all smartphone owners use their devices to access health information, and one-fifth have health-related apps on their devices.2 Additionally, smartphones are being paired with wearable biosensors to capture data for electrocardiograms, glucose monitoring, blood pressure, heart rate, and respiration rate, among many other uses.3 The newest implantable medical devices (IMDs) incorporate more complex communication and networking functions (typically referred to as “telemetry”).4 Examples of IMDs include subcutaneous blood glucose monitoring by biosensor, defibrillators, pacemakers, insulin systems, cochlear and ocular implants, and neurostimulators. In the future, our phones will have hardware attachments such as microfluidic chips able to perform routine and specialized lab tests, carbon nanotubes for analysis of breath and body fluid, and nanopore technology for DNA sequencing.3 Mobile apps such as these can also meet the definition of a medical device, according to the U.S. Food & Drug Administration (FDA). In addition to digital technology in smartphone apps and attachments, there are now numerous IMDs that utilize digital technology. This article will discuss how HIPAA and other factors impact IMD usage, as well as what wound care providers must know about compliance and safety.
LEGAL ISSUES INVOLVING IMDs
The LifeCare PCA™ Infusion System by Hospira, a leading provider of infusion technologies headquartered in Lake Forest, IL, was found by independent researchers to have vulnerabilities that could allow malicious outsiders to remotely modify drug dosages, causing under-infusion or over-infusion with potentially deadly consequences.5 At a security conference in 2011, an expert hacked into his own insulin pump to show how the device could be accessed remotely and how his insulin doses could be changed.6 Hackers have also used compromised medical devices as an entry point, deploying malware to gain access to hospital networks and steal confidential data.7 The increasing amount of health information that’s digitally generated, transmitted, and stored raises continued concerns about how to protect patient privacy and, in some cases, patient safety. Both smartphone apps and IMDs collect sensitive patient information, or what HIPAA law refers to as protected health information (PHI). PHI is any information about one’s healthcare, health status, or payment for healthcare services held by a covered entity (CE) or business associate (BA). CEs are defined by HIPAA as health plans, healthcare clearinghouses, and healthcare providers who electronically transmit any health information in connection with transactions for which standards have been adopted by the U.S. Department of Health and Human Services. BAs are defined by HIPAA as persons or entities that perform certain functions or activities involving the use or disclosure of PHI on behalf of, or provide services to, a CE.
HIPAA’s security regulations cover all electronic PHI (ePHI). The HIPAA Security Rule requires implementation of three types of safeguards: 1) administrative, 2) physical, and 3) technical. The HIPAA-required security risk analysis and risk management protocol serve as tools to assist in the development of a CE’s or BA’s strategy to protect the confidentiality, integrity, and availability of ePHI. (The security risk assessment process will be covered in a future column.) Here, we will specifically discuss the HIPAA technical safeguards, defined at 45 CFR 164.304 as the technology, and the policy and procedures for its use, that protect ePHI and control access to it. (Note that the security risk analysis will inform this process.) The first standard of HIPAA’s technical safeguards is “access control,” which refers to the ability or the means necessary to read, write, modify, or communicate data/information (or otherwise use any system resource). One relevant implementation specification of access control is “encryption and decryption” [45 CFR 164.312(a)(2)(iv)]. CEs and BAs must have policies and procedures in place governing how encryption and decryption will be managed. This specification is often confusing because the regulations designate encryption and decryption as “addressable.” Addressable does not mean optional; the designation was created to give CEs and BAs flexibility in how they comply. A CE or BA must either (a) implement the addressable implementation specification, (b) implement one or more alternative security measures that accomplish the same purpose, or (c) implement neither, if neither is reasonable and appropriate. The choice must be documented, and the CE must decide whether a given addressable implementation specification is a reasonable and appropriate security measure to apply within its own practice or organization.
Given the decrease in cost and the increase in risk to ePHI, it is reasonable to assume that, in most cases, encryption should be implemented. Each CE or BA must ask themselves which of their ePHI should be encrypted to prevent access by persons or software programs that have not been granted access rights. What is considered to be “reasonable” will depend on a variety of factors, such as one’s risk analysis and remediation program, which security measures are already in place, and the cost of implementation. However, in the aforementioned examples, when patient safety could be at risk, the CE will be hard-pressed to justify why he or she did not implement encryption.
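For practices with technical support staff, the encryption-at-rest concept can be illustrated in a few lines of code. This is a minimal sketch, not a compliance prescription: it assumes the widely used third-party Python `cryptography` package, and real key management (key storage, rotation, access control) is far more involved than shown here.

```python
# Illustrative sketch only; requires the third-party "cryptography" package
# (pip install cryptography). Record contents and key handling are assumptions.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, held in a key-management system
cipher = Fernet(key)

ephi = b"Patient A1: glucose 112 mg/dL"
token = cipher.encrypt(ephi)  # ciphertext is what gets stored at rest

assert cipher.decrypt(token) == ephi  # authorized access recovers the record
assert token != ephi                  # stored form is unreadable without the key
```

The essential point for a risk analysis is the last line: without the key, the stored form of the record reveals nothing, so a stolen disk or database backup does not by itself expose ePHI.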
Another HIPAA-required standard is “audit controls” [45 CFR 164.312(b)]. Audit controls require CEs and BAs to implement hardware, software, and/or procedural mechanisms that record and examine activity in the information systems that contain or use ePHI. Information systems typically provide some level of audit controls with a reporting method, such as audit reports, that allows the user to record and examine information system activity and to determine whether a security violation has occurred. The HIPAA-required standard of “integrity” [45 CFR 164.312(c)(1)], which means assuring that data and/or information have not been altered or destroyed in an unauthorized manner, is also of interest in these examples. An addressable specification under this standard is the “mechanism to authenticate ePHI.” This means that, when reasonable and appropriate, a CE must implement electronic mechanisms to corroborate that ePHI has not been altered or destroyed in an unauthorized manner.
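The “mechanism to authenticate ePHI” specification can likewise be sketched in code. The example below is illustrative only (the record fields and key handling are assumptions, not anything prescribed by HIPAA): it uses a keyed hash (HMAC) from Python’s standard library to make any unauthorized alteration of a record detectable.

```python
import hashlib
import hmac
import json

# Hypothetical secret, kept separate from the record store (e.g., in a key vault).
INTEGRITY_KEY = b"example-key-not-for-production"

def seal_record(record: dict) -> str:
    """Compute an HMAC-SHA256 tag over a canonical JSON form of the record."""
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    return hmac.new(INTEGRITY_KEY, payload, hashlib.sha256).hexdigest()

def verify_record(record: dict, tag: str) -> bool:
    """Return True only if the record is unchanged since it was sealed."""
    return hmac.compare_digest(seal_record(record), tag)

record = {"patient_id": "A1", "device": "pacemaker", "reading": 72}
tag = seal_record(record)
assert verify_record(record, tag)      # untouched record passes

record["reading"] = 140                # unauthorized alteration
assert not verify_record(record, tag)  # alteration is detected
```

A system built this way would also log every failed verification, which is exactly the kind of event the audit controls standard expects to be recorded and examined.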
The final relevant standard is “transmission security” [45 CFR 164.312(e)(1)]. This requires that CEs and BAs implement technical security measures to guard against unauthorized access to ePHI that is being transmitted over an electronic communications network. This standard has two addressable specifications, “integrity controls” and “encryption”; encryption was discussed previously, but here it applies specifically to ePHI in transit. Integrity controls [45 CFR 164.312(e)(2)(i)] require CEs and BAs to implement security measures, when reasonable and appropriate, that ensure electronically transmitted ePHI is not improperly modified without detection until it has been disposed of. CEs will want to evaluate this standard as part of their security risk assessments to be sure that ePHI is being adequately protected.
It is important to note that medical devices generally do not fall under the HIPAA regulations, but instead fall under the purview of the FDA. The FDA can only recommend that devices be cybersecure; its main job is to certify the safety and efficacy of devices. Thus, there is little impetus for device manufacturers to ensure their products are secure, and cyber risks have resulted. For example, devices that were developed using Windows XP carry numerous vulnerabilities: Microsoft no longer supports that operating system, so security patches and other updates are unavailable. If a manufacturer were to update a device’s operating system, the device must go through an extensive (and expensive) approval process, which further discourages manufacturers from securing their devices. Unfortunately, no entity can enforce security requirements on a manufacturer that falls outside the definition of a BA, which most do. (An entity is a BA only if it has access to PHI.) CEs should ensure that vendors of new devices offer secure applications, regardless of whether those devices obtain FDA classification.
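In practice, the transmission security standard is most often met with Transport Layer Security (TLS). As an illustration only, the following sketch uses Python’s standard library to configure a client-side TLS context that refuses legacy protocol versions and requires certificate verification before any ePHI is sent:

```python
import ssl

# Illustrative sketch: a TLS context suitable for transmitting ePHI.
# create_default_context() enables certificate checking and hostname
# verification by default; we additionally refuse legacy protocol versions.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

assert context.verify_mode == ssl.CERT_REQUIRED  # server certificate required
assert context.check_hostname                    # hostname must match the cert
```

A connection wrapped with such a context provides both encryption in transit and integrity protection, since TLS detects data that has been modified or truncated along the way.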
Additionally, it falls upon the wound care professional, as the CE, to ensure that HIPAA security standards are met. Telemetry, IMDs, and related devices should be assessed using HIPAA’s administrative safeguards as well as the technical safeguards discussed in this article. CEs should also ensure these devices are included in their security risk assessment, which, as noted, will be discussed in more detail in a future column. Too much is at stake when these breaches occur, whether the cost is patient safety or the protection of PHI. Mr. Cheney was ahead of the curve in understanding the risks; it’s time for the rest of us to catch up!
Roger Shindell is chief executive officer of Carosh Compliance Solutions, Crown Point, IN, which specializes in HIPAA compliance consulting for small to midsize practices and their business associates. He is also chairman of the HIMSS Risk Assessment Work Group and a member of AHIMA’s privacy and security council. Shindell has more than 30 years of multidisciplinary experience in healthcare and has served as an advisor and principal in healthcare, technology, and service companies. He may be reached at firstname.lastname@example.org.
1. Peterson A. Yes, Terrorists Could Have Hacked Dick Cheney’s Heart. Washington Post. 2013. Accessed online: www.washingtonpost.com/news/the-switch/wp/2013/10/21/yes-terrorists-could-have-hacked-dick-cheneys-heart
2. Fox S, Duggan M. Mobile Health 2012. Pew Research Center. 2012. Accessed online: http://pewinternet.org/~/media//files/reports/2012/pip_mobilehealth2012_final.pdf
3. Topol E. Digital medicine: empowering both patients and clinicians. Lancet. 2016;388(10046):740-1.
4. Camara C, Peris-Lopez P, Tapiador JE. Security and privacy issues in implantable medical devices: a comprehensive survey. J Biomed Inform. 2015;55:272-89.
5. LifeCare PCA3 and PCA5 Infusion Pump Systems by Hospira: FDA Safety Communication - Security Vulnerabilities. FDA. 2015. Accessed online: www.fda.gov/safety/medwatch/safetyinformation/safetyalertsforhumanmedicalproducts/ucm446828.htm
6. Kaplan D. Black Hat: Insulin Pumps Can be Hacked. SC Media. 2011. Accessed online: www.scmagazine.com/black-hat-insulin-pumps-can-be-hacked/article/559187
7. Smith M. MEDJACK 2: Old Malware Used in New Medical Device Hijacking Attacks to Breach Hospitals. CSO. 2016. Accessed online: www.networkworld.com/article/3088697/security/medjack-2-old-malware-used-in-new-medical-device-hijacking-attacks-to-breach-hospitals.html