Hacked Opinions: The legalities of hacking – Richard Ford

29.10.2015
Richard Ford, from Raytheon|Websense, talks about hacking regulation and legislation with CSO in a series of topical discussions with industry leaders and experts.

Hacked Opinions is an ongoing series of Q&As with industry leaders and experts on a number of topics that impact the security community. The first set of discussions focused on disclosure and how pending regulation could impact it. Now, this second set of discussions will examine security research, security legislation, and the difficult decision of taking researchers to court.

CSO encourages everyone to take part in the Hacked Opinions series. If you would like to participate, email Steve Ragan with your answers to the questions presented in this Q&A. The deadline is October 31, 2015. In addition, feel free to suggest topics for future consideration.

What do you think is the biggest misconception lawmakers have when it comes to cybersecurity?

Richard Ford, Principal Engineering Fellow, Raytheon|Websense (RF): Cybersecurity is a very complex area, and laws aren't going to fix the problem, but they can help.

I think the biggest problem is the misconception that "tough" legislation will make much of a difference. In fact, "tough new laws" will probably not help much, but they could hurt legitimate and useful research and behaviors, and that would be bad!

What advice would you give to lawmakers considering legislation that would impact security research or development?

RF: Don't do it! The industry as a whole has been pretty good at self-regulation, and I think that process is working. If I find a serious vulnerability in a well-known program, I'm going to practice responsible disclosure of that. I certainly don't want a law that prevents me from looking at the products my customers use, or that forces me by law to take a particular course of action that may not be right for that particular case.

The basic tenets of not harming others and being responsible for my actions are already enshrined in law; specific "must do" actions remove my ability to use judgment. I guess what I'm saying is I'd ask lawmakers to allow us discretion and common sense. Every threat is different, and technology changes so fast that highly specific laws aimed at today's situations and today's technologies are, at best, a waste of time.

If you could add one line to existing or pending legislation, with a focus on research, hacking, or another related security topic, what would it be?

RF: Ha! One line in a multi-page bill? That's a hard question to answer; hard enough that I suspect it's impossible. I think, in fact, there is no good answer, because we're talking about a framework here that encompasses aspects of national security, corporate well-being, human rights, and fraud.

You have to look at the whole system. We're at very early days in thinking about how to legislate aspects of cyberspace - it's a bit like the very early UK road traffic acts that attempted to legislate how horseless carriages should operate; laws that don't really look much like the laws we have today.

Now, given what you've said, why is this one line so important to you?

RF: Well, I answered a different question above, but I am passionate about it. You're going to make big strides here by looking at the big picture and then drilling down to the details, not vice versa.

Do you think a company should resort to legal threats or intimidation to prevent a researcher from giving a talk or publishing their work? Why, or why not?

RF: Intimidation is always wrong. Legal threats? Well, as a researcher, I'm responsible for my actions. If those actions are ill-considered or reckless, I should face some censure. What we need here, though, are better guidelines and a framework that allows free speech while still advancing the broader task of improving security overall.

In previous positions, I've been threatened by folks over research I've done, and there were no easy standards against which I could compare my actions to make sure I was in the right. Being at the tip of the spear has its drawbacks - there's sometimes nobody else to look to for precedent.

What types of data (attack data, threat intelligence, etc.) should organizations be sharing with the government? What should the government be sharing with the rest of us?

RF: Threats and attacks - both ways! As a defender, I need to know what attacks are out there: their prevalence, the attackers' motivation, and the mechanism. What's good is that some of that sharing is already happening, both between governments and defenders, and between companies you might assume were competitors.

(www.csoonline.com)

Steve Ragan
