Editor’s Note: Recognizing the growing complexities of sharing data inside—and outside—the healthcare system, Manatt Health and the eHealth Initiative Foundation (eHI) hosted two executive advisory board meetings on privacy and security in the age of wearable technologies. The roundtables brought together privacy and security experts to explore data sharing within and among organizations, examining the relationships providers have with business associates and application (app) developers; data-sharing implications for the bioeconomy; and state, federal and international policies that seek to guide organizations through the murky terrain. A new issue brief, summarized below, captures highlights from the advisory board discussions. To download the full issue brief free, click here.
__________________________________________________
The Health Insurance Portability and Accountability Act (HIPAA) was enacted in 1996, when healthcare providers and payers were still using paper-based medical records. The iPhone, iPad and other mobile devices did not emerge until almost a decade later. Although HIPAA was amended in the 2009 Health Information Technology for Economic and Clinical Health (HITECH) Act to address challenges arising from the use of electronic health records (EHRs), numerous issues remain with the aging legislation. A significant amount of health data is now generated by apps and consumer devices that are not governed by HIPAA.
Covered vs. Non-Covered HIPAA Entities
Organizations that are legally required to follow the privacy rules laid out in HIPAA (such as health plans, healthcare clearinghouses and providers) are called “covered entities.” Protected health information (PHI) found in medical records, claims information and lab results is covered under HIPAA. Organizations that create, receive, maintain or transmit PHI on behalf of covered entities also need to follow HIPAA. These “business associates” are required to sign “business associate agreements” with the covered entity and must comply with HIPAA privacy and security standards.
If a business associate experiences a breach of app data, breach notification is required under HIPAA. In today’s world of mHealth, web portals and smartphones, covered entities are struggling to understand the parameters for working with app developers that may or may not be business associates. Determining whether an app developer is a business associate is critical, as it determines whether data shared between the covered entity and the app developer is regulated under HIPAA.
Determining Who Is a Business Associate
Determining whether an organization is a business associate can be complicated. At the heart of the determination is whether the app is being offered on behalf of the covered entity. Factors contributing to the determination include:
- How is the app branded?
- Do consumers access the app through the covered entity or a separate channel?
- Is the app (or an enhanced version) available only through the covered entity (for example, only to patients or members of the covered entity)?
- How does data flow between the covered entity and app developer?
- Does the app developer provide related services to the covered entity?
For example, if a provider contracts with an app developer for patient management services and the app is the means for providing those services, the app developer is likely a business associate and a business associate agreement is required. In this scenario, the patient downloads the health app at the direction of the provider, and information the patient enters is automatically incorporated into his or her EHR.
Understanding Protected Health Information
PHI is difficult and expensive for app developers, artificial intelligence analysts and other data miners to obtain. Therefore, non-covered groups who want to commercialize PHI often court payers and providers who control vast amounts of PHI. In the absence of authorization by the subject of the PHI, app developers that are business associates of covered entities may only use or disclose the PHI they collect for HIPAA-permitted purposes and as authorized in the business associate agreement. The PHI must be returned to the covered entity or destroyed upon termination of the relationship.
Consumers Acting on Their Own Are Not Business Associates
If a company offers a direct-to-consumer version of an app that is not provided on behalf of a covered entity, it is not subject to HIPAA. An arrangement where a company provides an app or service directly to a consumer and transmits data on behalf of the consumer does not create a business associate relationship with the covered entity.
Federal Guidance and Regulations for Covered Entities and App Developers
To help bridge the technology gap with the industry, federal government agencies are offering guidance to business associates and covered entities.
The HITECH Act permits state attorneys general (AGs) to bring civil actions on behalf of state residents for HIPAA violations, obtain damages and enjoin further violations. The Office for Civil Rights (OCR) developed HIPAA Enforcement Training to aid AGs in investigating and seeking damages for HIPAA violations that affect residents of their states.
The Department of Health and Human Services’ (HHS) Guidance on HIPAA & Cloud Computing, while intended for cloud computing, has also proven relevant for business associates and covered entities. This guidance presents 11 key questions and answers to assist HIPAA-regulated cloud service providers (CSPs) and their customers in understanding their responsibilities under the HIPAA Rules when they create, receive, maintain or transmit electronic PHI (ePHI) using cloud products and services.
The National Institutes of Health (NIH) created guidance for covered entities, business associates and hybrid entities that perform both covered and noncovered functions as part of their business operations. In 2015, the Food and Drug Administration (FDA) issued guidance replacing its 2013 version, informing manufacturers, distributors and other entities about the subset of mobile app platforms and software applications over which the FDA intends to apply its regulatory authority.
The California Consumer Privacy Act
The California Consumer Privacy Act (CCPA) gives consumers the right to know what personal data companies are collecting about them, as well as the right to have that data deleted upon request and the right to opt out of any sale of their data to a third party. (Consumers under age 16 must opt in to any such sale.) The CCPA was signed into law on June 28, 2018, and goes into effect in January 2020.
The CCPA is the nation’s strictest consumer privacy and data protection measure. The law will apply to any for-profit entity doing business in California that (1) collects California residents’ personal information (PI) solely or jointly with others, and (2) either (i) exceeds $25 million in annual gross revenues; (ii) annually transacts in the PI of 50,000 or more consumers, households or devices; or (iii) derives half or more of its annual revenues from PI sales. The law applies to businesses that collect, use or share the PI of California residents, including residents who are outside the state for temporary or transitory purposes (e.g., travelers).
Companies already regulated under either the California Confidentiality of Medical Information Act (CMIA) or HIPAA should continue to comply with those rules when handling medical information. The CCPA does not supersede those laws. A significant portion of California’s hospitals are not-for-profit, which means they may not be subject to the CCPA at all. Although the law exempts businesses and providers covered by HIPAA, it will have an enormous impact on a wide range of consumer-directed healthcare companies, including digital health companies, pharmaceutical and medical device manufacturers, healthcare technology companies, wearable manufacturers, and mHealth app developers.
General Data Protection Regulation (GDPR)
The GDPR—which took effect on May 25, 2018—is industry agnostic and applies to any company in the world that processes the personal data of anyone physically located in the EU. Several obligations apply, including the need to establish a legal basis for processing personal data and, if sensitive personal data is involved, the need to satisfy additional special conditions. The GDPR requires organizations to conduct due diligence on their own activities, as well as those of their business partners and vendors, to determine what data is being collected, from whom and how.
The GDPR is designed to harmonize data privacy laws across Europe and give greater protection and rights to individuals. Companies covered by the GDPR are accountable for their handling of personal data, and those with more than 250 employees must document why personal data is being collected and processed, what information is held, how long the data is kept, and what technical security measures are in place. Additionally, companies that conduct “regular and systematic monitoring” of individuals at a large scale or process large volumes of sensitive personal data must appoint a data protection officer (DPO). Businesses are also required to use a “positive opt-in” process for obtaining consent.
Implications of the CCPA and the GDPR
Knowing what information is being gathered and for whom it is being collected, and then examining the business case for that information, including whether it is truly needed, will help companies comply with both the GDPR and the CCPA. Although the CCPA helps California, it does not address larger, nationwide issues with privacy, security and technology. Additionally, the GDPR was not crafted with the U.S. as a frame of reference, and no guidance has been issued on how the GDPR and HIPAA interact. Entities conducting business overseas with U.S. consumers, however, still need to follow U.S. rules.
Although the U.S. government is behind the curve in creating comprehensive, national legislation that handles all aspects of consumer privacy and anticipates future technological advances, each state mandating its own privacy laws would be disastrous for consumers and stifling for industry. Discrepancies in breach notification timelines are one example of why a national solution is needed. Although companies have advocated for notification periods as long as 90 days, the CCPA requires breach notification within five days and the GDPR requires notification within 72 hours. A national standard would alleviate the confusion.
Consumers Relinquishing Their Rights to Personal Data
A growing number of consumers voluntarily give away their personal genomic data, without any restrictions, to the DNA market. There are now more than 200 direct-to-consumer genealogy sites, with the collection of biological data being the true return on investment. The average consumer is not contemplating the ramifications of providing his or her DNA to a genealogy company, nor reading the FTC’s guidance on direct-to-consumer genetic tests.
Most consumers share genomic data on these sites to answer questions about their own genealogy. Online genomics companies, however, see another use for the data they collect. Use of DNA data for clinical research is a lucrative market. In July 2018, 23andMe announced that GlaxoSmithKline (GSK) invested $300 million in the company to gain exclusive access to 23andMe’s genetic database of more than 5 million people.1 Pharmaceutical companies are able to custom-make drugs with DNA data, and GSK intends to use 23andMe’s database to develop an experimental Parkinson’s drug. As more pharmaceutical companies begin to partner with genealogy groups, there are likely to be legal and ethical questions about the appropriate use of consumer data.
Final Thoughts on Developing a Values Framework
Despite its shortcomings, HIPAA has survived more than two decades. Now policymakers and industry leaders need to ensure that new privacy and security regulations can evolve with the explosion of new technologies.
New technologies have pushed society to reconsider current models for privacy and ethics and are raising important questions about individual liberty, dignity and autonomy. Given the rapid speed of technology development, it may be impossible for legislators to ensure federal and state policies address all consumer concerns. To make matters more complex, consumer concerns about the privacy of their data vary greatly. Before developing strict privacy policies, policymakers and industry leaders may want to first focus on developing a values framework to guide the future use of personal health information.
1 https://www.healthcareitnews.com/news/23andme-lands-300-million-investment-glaxosmithkline