Snowden expounds on data ethics and privacy

22.10.2015
Annandale-on-Hudson, N.Y.  -- In the aftermath of Edward Snowden’s controversial leaks that revealed widespread US surveillance and data gathering, researchers, scholars, lawyers, and privacy advocates gathered at the Hannah Arendt Center for Politics and Humanities at Bard College to engage in conversations about privacy and data ethics.

The Internet of Things continues to grow, and big data has the potential to drive big changes in how society defines privacy. Roger Berkowitz, associate professor of Political Science and Human Rights at Bard College, said many consumers are willing to sacrifice their privacy in exchange for the conveniences technology affords them.

Oft heard in debates over why privacy matters is the argument "I have nothing to hide." Whether indifferent or simply unconcerned, some people take no umbrage at government surveillance or data collection.

Privacy advocates, though, are concerned about big data whether it is collected through government surveillance or user devices. Former NSA contractor Edward Snowden, the keynote speaker at the “Surveillance and the Private Life: Why Privacy Matters” conference, argued, “Privacy isn’t about something to hide. It’s about something to lose.”


Snowden was one of many privacy advocates who discussed the value of privacy in a world where people are willing to expose their lives for public consumption. He spoke at the event via satellite, as he continues to live in Russia. The difference with social media services like Twitter, Snowden said, “is about being able to share selectively. We decide. We pick the ideas to share.”

What’s missing from the discussions about surveillance and data collection is this point of consent, he said. “It’s about voluntary participation. The right to privacy does not obligate us to live completely private lives. You’re not obligated to cut yourself off from society,” said Snowden.  

Ashkan Soltani, who currently serves as the chief technologist for the FTC, said, “I hate the word privacy. I don’t think it describes the technology issues we are facing.  It’s more about information.  Someone has some information about me. Sometimes I know they have it and sometimes I don’t.  When I don’t know how it is used is what creates the harm.”


Where information used to be ephemeral, Soltani said, “Now we have websites that take all of this ephemeral behavior and use it in ways we don’t agree with. The ephemeral nature of information is now gone, and along with it comes the risk of information being used in ways we don’t desire.”

Whether walking on a public street, using a work-issued device, or shopping at the mall, people are being tracked, especially through their mobile devices.

Soltani said, “When you walk around major malls or coffee chains, they employ a technology called mobile retail tracking that monitors cell phones. There are records of your visits to stores just by virtue of your phone being on.”

As a result, data is scattered everywhere, which leaves that information more vulnerable to risk. Yes, there are privacy-enhancing technologies available to users, but developers have found savvy countermeasures: certain sites simply refuse access to users trying to protect their privacy.

Soltani said, “Even if I value my privacy enough to install a $10 app to block tracking, corporations will always have much more incentives.”


By default, corporations have more money, said Soltani. “So, you install an ad blocker. What happens when a site says, ‘We won’t show you the content because you have an ad blocker’? That’s the next level. It’s a constant test to see how much we value our privacy.”

As technology collects more information, said Kate Crawford, principal researcher, Microsoft Research New York City, there needs to be more discussion about data ethics.

In reference to the new Hello Barbie, the first interactive doll able to hold conversations with children, and Jibo, the first social robot for the home, Crawford said, “What is different about this is these are for a mass market. More than this, it is their ability to aggregate, analyze, and search this data over time.”

Though only a Hollywood film, Her dramatizes the fear of human beings developing intensely intimate relationships with technology. “The issue of trust gets complicated,” Crawford said. “We begin to connect with them and trust them, and they harvest our most intimate information.”

As technology proliferates in the form of wearables and interconnected devices, a growing number of smart cities are being established around the world. “What sense of ethics or democracy should accompany this?” Crawford asked.

In the same way that changes in society gave birth to ethics in journalism and other fields, so too will the advent of technology and its impact on society forge a path for data ethics.

Crawford argued, “If knowledge and power are always defined as having more data, then we have effectively agreed to a big data phenomenon, and this exceeds privacy rights.”

We need a far broader shift that changes the conversations about privacy to conversations about ethics, said Crawford, who also advocates for data ethics courses as part of computer science and engineering degree programs.

Robert Litt, second general counsel of the Office of the Director of National Intelligence, said the intelligence community is made up of people who care about their privacy as well. They have families and personal lives that they too do not want invaded by the watchful eye of government surveillance.

“I don’t think anyone would disagree on the value of privacy, but it’s never been an absolute. The principal goal of the government is to protect its people from hostile powers,” said Litt, who also recognized that the current problem for the national intelligence community is that “we are in a zero-tolerance environment.”

Perhaps that is true, and if Crawford is correct in arguing for discussions about ethics in data collection, then the dialogue needs to be open and free-flowing on both sides. To get out of the emotional rut surrounding surveillance technology, society needs to move beyond the negative focus on the awful things the intelligence community can do.

Litt said that rather than asking, “How can we protect ourselves against the NSA?” people need to be asking, “How can technology give us assurances that powers aren’t being abused?”

The answer to that question will come from updating and implementing ethical codes of conduct that protect privacy and security without compromising one for the other.

(www.csoonline.com)

Kacy Zurkus