This past Tuesday the Federal Trade Commission (FTC) hosted its 5th annual PrivacyCon, which I listened to as I traveled back home from a much-needed weekend get-away with my family . . . and I am glad that I did; it was a GREAT event! The full-day event covered a wide range of cutting-edge issues concerning the privacy of data in this day and age of rapidly accelerating technology. However, it was the morning session on Health Apps that interested me the most. Here is what I learned from the panel of experts who discussed the ins and outs of Health Apps and the potential direction of future enforcement by the FTC:
- In his opening remarks, the Director of the FTC’s Bureau of Consumer Protection, Andrew Smith, came out of the gate pointing out that earlier this year HHS issued rules that will make it easier for consumers to access their medical records through the app of their choice. While this expanded access to health information can be an enormous benefit to consumers, wherever data flow opportunities increase, the opportunities for data compromise increase as well. Director Smith concluded his opening remarks by stating, “We at the FTC will not hesitate to take action when companies misrepresent what they are doing with consumers’ health information or otherwise put health data at undue risk . . .”
- The panel of experts assembled to discuss Health Apps and data privacy included researchers from Harvard, the University of Toronto, and Beth Israel Deaconess Medical Center. They were joined by two FTC attorneys serving as moderators.
- First up was Quinn Grundy from the University of Toronto. Her study looked at the top 100 paid and free Health Apps, identified 24 that used medication data in particular, and found that those apps shared user data outside of the App with companies that conduct targeted consumer advertising. These apps also sold “de-identified” data to third parties, like pharmaceutical companies. The study concluded the following: (1) sharing of user health data is routine (i.e., apps do it regularly) but not transparent (i.e., consumers are not told how the apps share the information and with whom); (2) clinicians should be conscious of privacy risks in their own use of Apps and, when recommending Apps, explain the potential for loss of privacy as part of informed consent; (3) privacy regulation should emphasize the accountability of those who control and process user data; and (4) developers should disclose all data sharing practices and allow users to choose precisely what data are shared and with whom. Accordingly, Ms. Grundy warned, “[d]espite de-identifying data, we need to consider wider harms from sharing of health data.” You can review her entire research study on Health Apps here: Data Sharing Practices of Medicines Related Apps and the Mobile Ecosystem: Traffic, Content and Network Analysis; and Commercialization of User Data by Developers of Medicines-Related Apps.
- Next was Ken Mandl from Boston Children’s Hospital and Harvard Medical School, presenting his paper on consumer protections for EHR-connected Apps. He pointed out, “[P]olicymakers are grappling with concerns that data crossing the API and leaving a HIPAA covered entity are no longer governed by HIPAA. Instead, commercial apps and the data therein fall under oversight of the Federal Trade Commission (FTC) under Section 5(a) of the FTC Act (FTCA) which prohibits ‘unfair or deceptive acts or practices in or affecting commerce.'” Mr. Mandl’s study explored two pathways for strengthening the FTC’s capacity to protect patients. The first approach would be to standardize the terms of service and privacy policies presented to consumers when interacting with EHR-connected apps. To this end, he noted [as I have in my previous Legal HIE post here] that the ONC strongly recommended that Health Apps develop and publish specific privacy practices covering at least five key issues. The author also suggested that patient-facing privacy notices should focus on the privacy and security risks posed by the technology or third-party developer, and be provided in a manner that is non-discriminatory, factually accurate, unbiased, objective, and not unfair or deceptive. The study also analyzed ONC’s 2018 Model Privacy Notice and other sources to suggest a streamlined approach for Apps developing a patient privacy notice. Processes by which a Health App “registers” with an EHR could also present an opportunity to capture these privacy terms through a “yes/no” algorithm. You can review his paper here: A Technical Approach to Shore up FTC Consumer Protections for Electronic Health Record-Connected Apps.
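To make the “yes/no” registration idea concrete, here is a minimal sketch of how an EHR might gate an app’s registration on a checklist of privacy-practice disclosures. To be clear, this is my own illustration, not anything from Mr. Mandl’s paper: the disclosure names and the pass/fail rule are invented assumptions.

```python
# Hypothetical sketch of a "yes/no" privacy-terms check an EHR might run when a
# health app registers. All field names here are invented for illustration.
from dataclasses import dataclass, fields

@dataclass
class PrivacyDisclosures:
    """Yes/no answers an app developer supplies at registration time."""
    publishes_privacy_policy: bool
    discloses_data_sharing: bool
    allows_data_deletion: bool
    states_secondary_uses: bool
    describes_security_safeguards: bool

def registration_gate(d: PrivacyDisclosures) -> list:
    """Return the disclosures still answered 'no'; an empty list means the app passes."""
    return [f.name for f in fields(PrivacyDisclosures) if not getattr(d, f.name)]

app = PrivacyDisclosures(True, True, False, True, False)
missing = registration_gate(app)
if missing:
    print("Registration blocked; missing disclosures:", missing)
```

The point of the sketch is simply that a structured checklist, captured at registration, can be evaluated mechanically rather than buried in a free-text privacy policy.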
- Dena Mendelsohn from Elektra Labs presented next on Evaluating and Securing the Connected Sensor Technologies that Power Health Apps. She believes that there are real potential benefits to using connected sensor technology to better monitor health, especially in the time of COVID-19. She pointed out, however, that there has not been enough public discussion about the risks such technology poses, most importantly cybersecurity protections and privacy rights issues. She proposed that connected sensor health technologies should offer “nutrition label-type” disclosures that detail the protections they offer.
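To picture what a “nutrition label-type” disclosure might look like, here is a toy rendering sketch. The categories and values below are entirely made up for illustration; the presentation did not prescribe any specific format.

```python
# Toy sketch of a "nutrition label"-style privacy disclosure for a connected
# sensor. The categories and example values are invented, not from the talk.
def privacy_label(facts: dict) -> str:
    """Render disclosure facts as a simple aligned, label-style block of text."""
    width = max(len(k) for k in facts) + 2   # pad the longest category name
    lines = ["Privacy Facts", "-" * (width + 20)]
    lines += [f"{k:<{width}}{v}" for k, v in facts.items()]
    return "\n".join(lines)

print(privacy_label({
    "Data collected": "heart rate, sleep",
    "Shared with third parties": "no",
    "Encryption in transit": "TLS 1.2+",
    "User can delete data": "yes",
}))
```

As with a food label, the value is standardization: the same categories in the same place on every product, so consumers can compare at a glance.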
- Finally, Dr. John Torous of Harvard Medical School and Sarah Lagan of Beth Israel Deaconess Medical Center discussed their work developing an evaluation framework for Health Apps. Dr. Torous started off with the observation that there are “good” apps that can help health care, but also many “dangerous” apps, including those that expose and sell personal health data. Additionally, most apps that fall into the “health & fitness” category (i.e., not medical devices regulated by FDA) are not tightly regulated, and many make unsubstantiated claims about effectiveness. Privacy concerns include the fact that such apps do NOT voluntarily embrace HIPAA standards (i.e., do not claim to be “HIPAA compliant”); one study found that ONLY 50% of such apps disclosed their data security practices, while 80% shared health-related data with third parties. This includes sharing of sensitive information, such as depression, with third parties, including advertisers who can then target and exploit consumers. The presenters pointed out that consumers often look at the “store ratings” when considering which Health App to buy, but such ratings do not necessarily correlate with protecting the consumer’s interests. Their proposed framework for better evaluating Health Apps includes 5 levels of evaluation, one of which covers privacy & security considerations, including: 1) data collected; 2) data storage; 3) personal health information; 4) security measures in place; 5) deleting personal data; and 6) privacy policy. The presenters proposed over 100 objective questions that should go into the equation when properly evaluating a health app, including the origin of the Health App. For example, does the App originate from the government? A for-profit company or developer? A trusted healthcare organization? An academic institution? Privacy and security questions would include: Does the Health App have a privacy policy? Does the App declare data use and purpose? Does the App report security measures in place? Is PHI shared? Is de-identified data shared? Can the consumer opt out of data collection? Can the consumer delete her/his data? Does the App claim it meets HIPAA’s standards? Does the App use third-party vendors, like Google Analytics? You can review their study here: Actionable App Evaluation: Objective Standards to Guide Assessment and Implementation of Digital Health Interventions.
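Because the framework’s questions are objective yes/no items, they lend themselves to a simple scorecard. The sketch below uses a handful of the privacy and security questions listed above; the equal weighting and the idea of reducing answers to a single percentage are my own assumptions, not part of the presenters’ framework.

```python
# Illustrative scorecard over a few of the privacy & security questions listed
# in the post. Equal weighting and percentage scoring are invented assumptions.
PRIVACY_QUESTIONS = [
    "Does the app have a privacy policy?",
    "Does the app declare data use and purpose?",
    "Does the app report security measures in place?",
    "Can the consumer opt out of data collection?",
    "Can the consumer delete her/his data?",
]

def privacy_score(answers: dict) -> float:
    """Fraction of listed questions answered 'yes'; an unanswered question counts as 'no'."""
    return sum(answers.get(q, False) for q in PRIVACY_QUESTIONS) / len(PRIVACY_QUESTIONS)

score = privacy_score({
    PRIVACY_QUESTIONS[0]: True,   # has a privacy policy
    PRIVACY_QUESTIONS[1]: True,   # declares data use and purpose
    PRIVACY_QUESTIONS[4]: True,   # consumer can delete data
})
print(f"privacy score: {score:.0%}")
```

A real evaluation along the lines the presenters describe would span all 5 levels and 100+ questions, but the mechanics would be similar: objective answers in, comparable assessment out, rather than relying on store ratings.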
If you missed PrivacyCon and want to review more of the content that was covered during that event, the FTC is posting copies of the presentations, transcripts and video on its website here.
_____________
Subscribe HERE to Legal HIE’s compliance library to gain access to sample policies, documents, and tools for compliance with the Info Blocking Rules, HIPAA updates, 42 CFR Part 2 changes, and Breach Notification.