The 5 worst Big Data privacy risks (and how to guard against them)

08.12.2014
The collection and manipulation of Big Data, as its proponents have been saying for several years now, can result in real-world benefits: Advertisements focused on what you actually want to buy; smart cars that can call for an ambulance if you're in an accident; wearable or implantable devices that can monitor your health and notify your doctor if something is going wrong.

But, it can also lead to big privacy problems. By now it is glaringly obvious that when people generate thousands of data points every day -- where they go, who they communicate with, what they read and write, what they buy, what they eat, what they watch, how much they exercise, how much they sleep and more -- they are vulnerable to exposure in ways unimaginable a generation ago.


It is just as obvious that such detailed information, in the hands of marketers, financial institutions, employers and government, can affect everything from relationships to getting a job, qualifying for a loan or even getting on a plane.

And so far, while there have been multiple expressions of concern from privacy advocates and government, little has been done to update privacy protections for the online, always-connected world.

It has been almost three years since the Obama administration published, in February 2012, what it termed a Consumer Privacy Bill of Rights (CPBR). That document declared that, "the consumer privacy data framework in the U.S. is, in fact, strong ... (but it) lacks two elements: a clear statement of basic privacy principles that apply to the commercial world, and a sustained commitment of all stakeholders to address consumer data privacy issues as they arise from advances in technologies and business models."

And, as Susan Grant, director of consumer privacy at the Consumer Federation of America (CFA), puts it, the CPBR is, "not a bill. It has never been a piece of legislation. We need to have something offered, to talk about -- at least somewhere to start."

Meanwhile, organizations like the CFA and Electronic Privacy Information Center (EPIC), and individual advocates like Rebecca Herold, CEO of The Privacy Professor, have enumerated multiple ways that Big Data analytics can invade the personal privacy of individuals. They include:

1. Discrimination

According to EPIC, in comments last April to the U.S. Office of Science and Technology Policy, "The use of predictive analytics by the public and private sector ... can now be used by the government and companies to make determinations about our ability to fly, to obtain a job, a clearance, or a credit card. The use of our associations in predictive analytics to make decisions that have a negative impact on individuals directly inhibits freedom of association."

Herold, in a post on SecureWorld, noted that while overt discrimination has been illegal for decades, Big Data analytics can make it essentially "automated," and therefore more difficult to detect or prove.

In an interview, Herold said current discrimination law is, "vague, narrowly defined, and from the applications of it I've seen, depends upon very explicit and obvious evidence.

"Big Data analytics provides the ability for discriminatory decisions to be made without the need for that explicit and obvious evidence," she said.

That can affect everything from employment to promotions to fair housing and more.

Edward McNicholas, global co-leader of the Privacy, Data Security, and Information Law Practice at Sidley Austin LLP, said he thinks some of the potential risks of Big Data are overstated, but believes, "the most significant risk is that it is used to conceal discrimination based on illicit criteria, and to justify the disparate impact of decisions on vulnerable populations."
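
To see how that kind of concealment can work, consider a minimal, purely hypothetical sketch in Python -- invented applicants and an invented "high-spend ZIP code" proxy -- of a scoring rule that never references a protected attribute yet still produces a disparate impact. The same few lines also show the flip side discussed later in this article: the data that enables the disparity can also be used to measure it.

    # Hypothetical example: the rule below never looks at the protected attribute,
    # yet it reproduces a group disparity through a correlated proxy (ZIP code).
    # The records and the "high-spend" ZIP list are invented for illustration.
    from collections import defaultdict

    applicants = [
        {"zip": "02139", "group": "A"},
        {"zip": "02139", "group": "A"},
        {"zip": "02139", "group": "B"},
        {"zip": "60629", "group": "B"},
        {"zip": "60629", "group": "B"},
        {"zip": "60629", "group": "A"},
    ]

    HIGH_SPEND_ZIPS = {"02139"}  # proxy that happens to track group A

    def approve(applicant):
        # "Neutral" rule: approve anyone from a high-spend ZIP code.
        return applicant["zip"] in HIGH_SPEND_ZIPS

    # Disparate-impact check: approval rate per group, computed from the same data.
    approved, total = defaultdict(int), defaultdict(int)
    for a in applicants:
        total[a["group"]] += 1
        approved[a["group"]] += approve(a)

    for group in sorted(total):
        print(f"group {group}: {approved[group] / total[group]:.0%} approved")
    # group A: 67% approved
    # group B: 33% approved

Nothing in the rule mentions group membership, which is exactly why, as Herold notes, this form of discrimination is harder to detect or prove than an explicit policy would be.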

2. An embarrassment of breaches

By now -- after catastrophic data breaches at retailers like Target and Home Depot, restaurant chains like P.F. Chang's, online marketplaces like eBay, government agencies, universities and online media companies like AOL, plus the recent hack of Sony that not only put unreleased movies on the web but exposed the personal information of thousands of employees -- public awareness of credit card fraud and identity theft is probably at an all-time high.

But breaches are not the only exposure. There are also numerous reports of Big Data analytics being used to reveal personal details -- for example, a retailer beginning to market baby products to a pregnant woman before she had told others in her family. The same techniques can surface things like sexual orientation or an illness such as cancer.

3. Goodbye anonymity

Herold argues that, absent rules for handling anonymized data files, data sets may be combined "without first determining if any other data items should be removed prior to combining to protect anonymity," in which case "it is possible individuals could be re-identified."

She adds that if data masking is not done effectively, "big data analysis could easily reveal the actual individuals whose data has been masked."
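
Her point is straightforward to demonstrate. The sketch below, with invented records and field names, mirrors the well-documented linkage attacks on "anonymized" data: a de-identified file is joined to a public one on quasi-identifiers such as ZIP code, birth date and sex, and the names come right back.

    # Hypothetical example: names have been stripped from the health records, but
    # the remaining quasi-identifiers (ZIP, birth date, sex) are enough to join
    # against a public file -- say, a voter roll -- and re-identify individuals.
    health_records = [
        {"zip": "02138", "dob": "1961-07-31", "sex": "F", "diagnosis": "hypertension"},
        {"zip": "02139", "dob": "1984-02-12", "sex": "M", "diagnosis": "asthma"},
    ]

    public_records = [
        {"name": "Jane Doe", "zip": "02138", "dob": "1961-07-31", "sex": "F"},
        {"name": "John Roe", "zip": "02139", "dob": "1984-02-12", "sex": "M"},
    ]

    QUASI_IDENTIFIERS = ("zip", "dob", "sex")

    def join_key(record):
        # Join key built only from quasi-identifiers -- no names involved.
        return tuple(record[field] for field in QUASI_IDENTIFIERS)

    names_by_key = {join_key(p): p["name"] for p in public_records}

    for r in health_records:
        name = names_by_key.get(join_key(r))
        if name:
            print(f"{name} -> {r['diagnosis']}")
    # Jane Doe -> hypertension
    # John Roe -> asthma

The guard Herold describes amounts to reviewing those shared columns before data sets are combined -- generalizing, removing or effectively masking them so the join no longer resolves to a single person.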

4. Government exemptions

According to EPIC, "Americans are in more government databases than ever," including that of the FBI, which collects Personally Identifiable Information (PII) including name, any aliases, race, sex, date and place of birth, Social Security number, passport and driver's license numbers, address, telephone numbers, photographs, fingerprints, financial information like bank accounts, employment and business information and more.

Yet, "incredibly, the agency has exempted itself from Privacy Act (of 1974) requirements that the FBI maintain only, 'accurate, relevant, timely and complete' personal records," along with other safeguards of that information required by the Privacy Act, EPIC said.

5. Your data gets brokered

Numerous companies collect and sell, "consumer profiles that are not clearly protected under current legal frameworks," EPIC said.

There is also little or no accountability, and no guarantee that the information is accurate.

"The data files used for big data analysis can often contain inaccurate data about individuals, use data models that are incorrect as they relate to particular individuals, or simply be flawed algorithms," Herold said.

***

Those are not the only risks, and there is no way to eliminate them. But there are ways to limit them. One, according to Joseph Jerome, policy counsel at the Future of Privacy Forum (FPF), is to use Big Data analytics for good -- to expose problems.

"In many respects, Big Data is helping us make better, fairer decisions," he said, noting that an FPF report with the Anti-Defamation League showed that, "Big Data can be a powerful tool to empower users and to fight discrimination. It can be used as a sword or a shield. More data can be used to show where something is being done in a discriminatory way. Traditionally, one of the biggest problems in uncovering discrimination is a lack of data," he said.

Government can also help. There is general agreement among advocates that Congress needs to pass a version of the CPBR, which calls for consumer rights to individual control over personal data, transparency, respect for context, security, access and accuracy, focused collection and accountability.

Among those rights, Jerome said he thinks context is particularly important. "If I buy a book on Amazon, that book probably shouldn't be used to make health decisions about me," he said. "When we've seen companies get in trouble on privacy, it's been because they've used information given to them for one purpose for some other reason altogether."

But, there is also general agreement that, given the contentious atmosphere in Congress, there is little chance of something resembling the CPBR being passed anytime soon.

That doesn't mean consumers are defenseless, however. Donna Wilson, a litigation attorney specializing in privacy and data security at Manatt, Phelps & Phillips, said consumers, "need to take a strong measure of responsibility and consciously decide with whom they will share their information and for what purposes."

That, she said, means consumers need to force themselves to do what almost nobody does: Read privacy policies and terms of service agreements.

"If consumers do not agree with what the business intends to collect, or how it intends to use the information, they can choose not to patronize that business," she said.

Jerome said even if users don't read an entire policy, they should, "still take a moment before clicking 'OK' to consider why and with whom they're sharing their information. A recent study suggested that individuals would give up sensitive information about themselves in exchange for homemade cookies."

McNicholas agrees that consumers, collectively, have some power. "New technologies that include detailed privacy controls, such as those on Facebook, and just-in-time notices, like the Apple requests to use location, should be rewarded by the marketplace," he said.

Herold offers several other individual measures to lower your privacy risks:

(www.csoonline.com)

Taylor Armerding
