Fitness wearables: Who's tracking whom

March 16, 2016
Fitness wearables are apparently in superb shape when it comes to collecting your health data: heart rate, sleep patterns, steps taken per day, calories burned, weight gain or loss, mile splits, stress levels, location – even sexual activity or how you’re doing in your effort to quit smoking.

But they are in lousy shape when it comes to protecting that data and keeping it private.

And given the number of them in use – more than 13 million were sold in the U.S. in the past two years alone, according to Statista – Internet of Things (IoT) experts and privacy advocates are warning ever more vocally that users need to be aware of how vulnerable their health data are, and how those data could be used for identity theft, discrimination and more.

The makers of fitness trackers – the biggest names are Samsung, Pebble, Fitbit, Apple, Jawbone, Nike, Sony, Lenovo and LG – generally stress their commitment to privacy, and say they do not “sell” the data they collect.

But, as numerous experts note, selling is not the same as sharing or protecting. Theresa Payton, president and CEO of Fortalice and a former White House CIO, said fitness wearables and associated apps “have a track record of poor privacy and security measures.

“The culprit is the innovation life cycle,” she said. “There is tremendous pressure to get cool and affordable products on the market at a dizzying speed. That means the time to put the devices in the lab and attack them like an adversary is too short or nonexistent.”

The resulting security flaws in hardware or companion apps “often allow someone to track your whereabouts or your patterns,” she said.

While there has so far been no reported catastrophic breach at a major fitness wearable company, Craig Spiezle, executive director of the Online Trust Alliance (OTA), said that “data can be and has been captured off fitness bands easily with $100 and a determined adversary. As more of these devices are amassing data, the risk is increasing.”

And while Eva Velasquez, president and CEO of the Identity Theft Resource Center (ITRC), said she thinks fitness wearable data are “one step down” in privacy value from electronic health records (EHRs) – which “include enough PII (personally identifiable information) to commit ID theft almost immediately” – the trackers still include “sensitive information.”

She said data on heart rate, weight, food log, BMI (body mass index) and exercise are generally not enough on their own to allow identity theft, “but the privacy implications are there. We need to take a much closer look at where it’s valuable and who it’s valuable to.”

Indeed, even if it does not contain specific PII, Spiezle noted that “the data on wearables is unique to an individual and ultra-sensitive in the data types collected.”

Its value, especially when accumulated over time, can be significant, and the picture it paints is more intimate than users may expect. Shared with insurers, it could affect the rates people pay; shared with employers, it could affect job status.

Mother Jones reported in January 2014 that Ira Hunt, then the CIA’s chief technology officer, had said at a data conference in New York City that the agency “likes these things (fitness trackers). What’s really most intriguing is that you can be 100% guaranteed to be identified by simply your gait – how you walk.”

Just last month, Open Effect, along with the Citizen Lab at the Munk School of Global Affairs at the University of Toronto, released a report titled “Every Step You Fake: A Comparative Analysis of Fitness Tracker Privacy and Security.” The researchers studied eight popular fitness trackers and found that all but the Apple Watch “wirelessly emit a persistent unique identifier over Bluetooth. This leakage lets third parties, such as shopping centers or others interested in location-based monitoring, collect and map out people’s movements over time.”
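The mechanics of that leakage are easy to demonstrate. The sketch below is a hypothetical illustration – it is not code from the report – using the open-source Python library bleak to listen passively for Bluetooth LE advertisements. A band that does not randomize its address shows up with the same identifier on every pass, which is exactly what lets an observer link sightings across time and place.

```python
# Minimal sketch of passive Bluetooth LE monitoring of the kind the
# report describes, using the open-source "bleak" library
# (pip install bleak). Illustrative only.
import asyncio
from collections import defaultdict
from datetime import datetime

from bleak import BleakScanner

sightings = defaultdict(list)  # BLE address -> timestamps it was seen


def on_advertisement(device, advertisement_data):
    # Every BLE advertisement carries a device address. If the device
    # does not randomize that address, it is a persistent unique ID
    # that any nearby receiver can log.
    sightings[device.address].append(datetime.now())


async def main():
    scanner = BleakScanner(detection_callback=on_advertisement)
    await scanner.start()
    await asyncio.sleep(60)  # listen passively for one minute
    await scanner.stop()
    for address, times in sorted(sightings.items()):
        print(f"{address}: seen {len(times)} times")


if __name__ == "__main__":
    asyncio.run(main())
```

The defense implied by the report’s Apple Watch finding – and supported natively in Bluetooth LE – is address randomization, which rotates the advertised identifier so that individual sightings cannot be linked together.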

The study also found vulnerabilities that could allow the user or an intruder to manipulate the generated data, falsifying activity levels.
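Defending against that kind of tampering is a standard data-integrity problem. One common mitigation, sketched below with purely illustrative names rather than any vendor’s actual protocol, is to have the device sign each activity record with a per-device secret so the service can detect records that were altered or fabricated:

```python
# Minimal sketch of tamper-evident activity records using an HMAC.
# DEVICE_KEY stands in for a per-device secret provisioned at
# manufacture; all names here are illustrative, not a real vendor API.
import hashlib
import hmac
import json

DEVICE_KEY = b"per-device-secret-provisioned-at-manufacture"


def sign_record(record: dict) -> dict:
    # Serialize deterministically so device and server hash the same bytes.
    payload = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return {"record": record, "hmac": tag}


def verify_record(signed: dict) -> bool:
    payload = json.dumps(signed["record"], sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information through timing
    return hmac.compare_digest(expected, signed["hmac"])


signed = sign_record({"date": "2016-03-16", "steps": 9214})
signed["record"]["steps"] = 250000  # an intruder inflates the count...
print(verify_record(signed))        # ...and verification fails: False
```

A signature alone cannot stop the device’s owner from fabricating readings at the source, but it does prevent an intruder from silently rewriting records after the fact – the other manipulation path the study describes.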

Another privacy problem is one that exists with virtually any connected device: terms of service and privacy policies are long, complex and hard to read – most run to 4,000 words or more. Beyond that, if a user does not check the “agree” box, thereby “consenting” to the policy, the device or app may in many cases offer only limited functionality, or refuse to work at all.


Not surprisingly, experts agree that most users simply check the box without reading the policy. And that puts their information at risk. When the Federal Trade Commission (FTC) studied a dozen health and fitness apps in 2014, it found them collectively disseminating data to 76 third parties. One app alone shared data with 18 other entities.

Also in 2014, Jessica Rich, director of the Bureau of Consumer Protection at the FTC, said data from fitness trackers could end up in the hands of data brokers or other companies, and eventually be used “to market other products and services to (users); make decisions about (their) eligibility for credit, employment, or insurance; and share with yet other companies.”

Even if it is shared voluntarily, with an employer or insurance company, the results could be unwelcome. “At the start, we may look at this as a great way for people to get a break on their medical or life insurance,” Velasquez said. “But what if I gain 20 pounds and my insurer knows I’ve stopped exercising regularly? I could see it first being an option, then becoming compulsory, and then leading to penalties if you’re not meeting a standard.”

She also noted that if users share their fitness data with their health provider, and then the health provider gets breached, “that could be another entry point to their information.”

What to do about these risks has prompted intense discussion, and is the subject of a number of private-sector initiatives.

The OTA formed the IoT Trustworthy Working Group about a year ago, and since then has published an “IoT Trust Framework” – 30 principles for building security and privacy into connected devices. Spiezle also moderated a panel at the recent RSA conference in San Francisco titled “Diffusing the IoT time bomb – Security and privacy trust code of conduct,” which discussed those principles.

The IEEE Center for Secure Design recently released a paper titled “WearFit: Security Design Analysis of a Wearable Fitness Tracker,” which describes a fictitious wearable created to show how developers of fitness trackers can design a device that “addresses each of the top 10 software security design flaws.”

Jacob West, cofounder of the center and lead author of the report, said he doesn’t think fitness trackers have more security vulnerabilities than other consumer devices, “but the challenge for any product company is to determine the right balance between security, functionality, and usability.”

To address that, he said, “we need to expand the focus in security away from simply finding and fixing bugs to include avoiding design flaws as well.”

West also said he thought market pressure could be effective: better-informed consumers who “understand security and privacy and make buying decisions based on that knowledge” could drive improvements in both areas.

Velasquez is a bit skeptical of market wisdom. “The market has failed (to cause product improvements) in the past,” she said. “But it does respond to losing customers.

“The way it needs to be framed is: ‘Today you’re not concerned, but at the pace we’re moving, in five years you will be.’”

Susan Grant, director of consumer protection at the Consumer Federation of America, shares the skepticism. “Bad publicity (about breaches) is helpful, but it’s not enough, because not everything that companies are up to gets exposed,” she said.

Grant called for a stronger “general” privacy law at the federal level, “or at the least, a specific law covering health devices and services that are not already subject to law.”

Spiezle called for consumers to have a true choice about how their data are used. “Non-acceptance (of a privacy policy) should not impact the core functionality of the product,” he said.

But Payton said the reality is that if consumers truly want to protect their privacy when using connected devices, “don’t wait for government or industry to do it for you. The standards are still emerging, and by the time they are adopted, they will be out of date.”

Payton’s advice for doing it yourself starts with knowing what a device exposes by default:

“Many devices are designed to be open and social, and you may be shouting out your personal details and data unwittingly,” she said.

Spiezle suggested it may be time to shame the industry into providing better security. “In meetings last week, a major brand told me the cost to protect the device would be 11 cents, and cited a potential impact on battery life, as reasons not to make a change,” he said. “This is like Ford not being willing to protect the gas tank in the Pinto.”

(www.csoonline.com)

Taylor Armerding
