Hacked Opinions: The legalities of hacking – Joan Pepin

28.10.2015
Joan Pepin, from Sumo Logic, talks about hacking regulation and legislation with CSO in a series of topical discussions with industry leaders and experts.

Hacked Opinions is an ongoing series of Q&As with industry leaders and experts on a number of topics that impact the security community. The first set of discussions focused on disclosure and how pending regulation could impact it. Now, this second set of discussions will examine security research, security legislation, and the difficult decision of taking researchers to court.

CSO encourages everyone to take part in the Hacked Opinions series. If you would like to participate, email Steve Ragan with your answers to the questions presented in this Q&A. The deadline is October 31, 2015. In addition, feel free to suggest topics for future consideration.

What do you think is the biggest misconception lawmakers have when it comes to cybersecurity?

Joan Pepin, VP of security and CISO, Sumo Logic (JP): I believe the biggest misconception is pretty fundamental. Most lawmakers are undereducated about how technology and the Internet work. They don’t check their own email, or even know how to use it, let alone understand the fundamentals of cybersecurity.

Unfortunately, it’s these same individuals who are tasked with developing and passing laws that impact cybersecurity, so I believe there’s a great need to educate lawmakers in order to develop appropriate legislation.

Making it illegal to reverse engineer something does not make it more secure. That’s the main issue, but it’s not exactly my point: the DMCA is something that would be ludicrous under other circumstances.

For instance, if Toyota sued you for painting your Prius with tiger stripes because that violated their copyright, we would all laugh, but that is exactly what the DMCA lets tech companies do. A product that you bought and paid for is not really yours if it has microchips or software in it -- and I don’t think they get that.

There are lots of other things that demonstrate the scientific illiteracy of lawmakers. From climate-change denial, to the claim that a woman’s body can “shut down” pregnancy in the event of a “legitimate rape,” to “the Internet is a series of tubes,” to pretty much every law ever passed regarding cryptography -- all are shining examples of a fundamental lack of understanding.

What advice would you give to lawmakers considering legislation that would impact security research or development?

JP: There’s this idea that people who research security are criminals, but in fact, most of them are doing so for the opposite reason. They want to better understand the ins and outs of cybersecurity so they can improve security postures, not exploit them. Unfortunately, many researchers are being scrutinized by the legal system under ill-conceived laws, which prevent them from doing their jobs.

Security research is no different than research that is conducted around new drugs, the environment or the economy – all of which is critical to educating lawmakers as they develop legislation. My advice to lawmakers would be to shift their perception of what the purpose of security research is and work with organizations to create legislation that helps support this purpose.

If you could add one line to existing or pending legislation, with a focus on research, hacking, or another related security topic, what would it be?

JP: I would add the line, “take with a giant grain of salt” or “this is just a suggestion” to security legislation to ensure the protection of everyday citizens who are uncovering vulnerabilities. Take the example of Sony, which pursued legal action against kids who had run Linux on their PlayStations.

The hacker community was not happy, and rightfully so. Again -- am I not allowed to turn my lawnmower into a go-kart? Nobody would sue anyone for that. But it’s a slippery slope from here to there.

Now, given what you've said, why is this one line so important to you?

JP: From a civil law perspective, it’s wrong to inhibit people who are trying to do the right thing by bringing vulnerabilities to light. The legal system has utterly failed us in too many ways -- since the 1980s, the Computer Fraud and Abuse Act has been used to put curious teenagers in jail, and I don’t see how we can allow this to continue.

Once a consumer has purchased a device (e.g., a computer or a cell phone), it is up to them how they use it. If Congress tried to pass a law that prohibited people from repainting their cars because it violated the carmaker’s copyright, we wouldn’t allow it. But with a cell phone, we think it’s a fine idea?

Do you think a company should resort to legal threats or intimidation to prevent a researcher from giving a talk or publishing their work? Why, or why not?

JP: Absolutely not. We must protect freedom of speech and allow researchers to openly identify and discuss threats and vulnerabilities.

Providing an outlet (such as a published paper or speech) for open dialogue is the only way we can keep the hackers on our side (as in, the U.S.) and protect us against the growing threat of politically driven cyber attacks.

What types of data (attack data, threat intelligence, etc.) should organizations be sharing with the government? What should the government be sharing with the rest of us?

JP: Organizations should share information they are legally obligated to share, and/or information that may be pertinent to some sort of a crime. Likewise, the government should do the same.

I would caveat that this data should only be exchanged if, in fact, government entities are going to take action based on the information -- otherwise, it’s a waste of my time, budget, and staff resources (which are already limited). If government agencies have indicators of compromise (e.g., “criminal group A is using this tool on your network”), not only should they share this information, but they should also uphold their obligation to track down the criminals and pursue appropriate legal action.

(www.csoonline.com)

Steve Ragan
