Think Tank Issues Health Apps, Wearables Privacy Guide


By Jimmy H. Koo

Aug. 17 — Companies offering health applications and wearable devices should adopt privacy best practices, the Future of Privacy Forum (FPF) said Aug. 17.

The “Best Practices for Consumer Wearables and Wellness Apps and Devices” guidelines seek to fill gaps in the regulation of health information privacy and security, the Washington-based think tank said. Those regulatory gaps were identified in a July 19 Department of Health and Human Services (HHS) report, according to FPF.

The HHS report found that U.S. citizens regularly share health information on social media and fitness websites and with internet researchers, under the false assumption that their privacy is protected by federal law (15 PVLR 1525, 7/25/16).

The guidelines underscore that tech companies handling consumer health data should be aware of applicable privacy laws as well as consumer sentiment.

Data Sensitivity Spectrum

The FPF guidelines said that “many wearables collect data that will be unlikely to be covered by specific sectoral protections,” such as the Health Insurance Portability and Accountability Act, the Children's Online Privacy Protection Rule and the Fair Credit Reporting Act. The guidelines seek to add protections for data that may not be covered by specific legislation and to provide guidance in areas where general privacy statutes apply, FPF said.

“Some data collected from wearables may be relatively trivial, but other data can be highly sensitive,” FPF Policy Counsel Kelsey Finch said in an Aug. 17 statement.

The guidelines suggested companies provide consumers with choices about the sharing and use of their data, support interoperability with international privacy frameworks and “elevate data norms around research, privacy and security.”

European and other foreign privacy laws often impose higher protection standards on highly sensitive personal information such as health data, which can include “consumer-generated wellness data,” the report said. Treating all health data the same way “would be a mistake” because there isn't a clear distinction between sensitive health data and non-sensitive lifestyle data, according to the guidelines.

The guidelines suggested a “spectrum” approach, under which privacy protection levels should be “calibrated to the nature and sensitivity of the data.”

To determine where particular data fall on the spectrum, FPF suggested considering various factors, including:

  •  context and purpose for which data are collected and used;
  •  whether the data are inherently medical data;
  •  whether there is a clear and close link between the consumer's health status and the data;
  •  whether the data can be used to predict or measure health risks; and
  •  the existence of safeguards.

The guidelines also suggested placing limits on the transfer of data to certain entities, such as data brokers and information resellers, even with consumer consent.

“By building on best practices that support consumer trust, as well as developing responsible guidelines for appropriate research and other secondary uses of consumer-generated wellness data, we hope to ensure continued innovation and consumer trust within the wearables ecosystem,” the guidelines said.


To contact the reporter on this story: Jimmy H. Koo in Washington

To contact the editors responsible for this story: Donald G. Aplin; Daniel R. Stoller

Copyright © 2016 The Bureau of National Affairs, Inc. All Rights Reserved.