Hacked Opinions: The legalities of hacking – Scot Terban

07.10.2015
Researcher Scot Terban, known to many online simply as Dr. Krypt3ia, talks about hacking regulation and legislation with CSO in a series of topical discussions with industry leaders and experts.

Hacked Opinions is an ongoing series of Q&As with industry leaders and experts on a number of topics that impact the security community. The first set of discussions focused on disclosure and how pending regulation could impact it. Now, this second set of discussions will examine security research, security legislation, and the difficult decision of taking researchers to court.

CSO encourages everyone to take part in the Hacked Opinions series. If you would like to participate, email Steve Ragan with your answers to the questions presented in this Q&A. The deadline is October 31, 2015. In addition, feel free to suggest topics for future consideration.

What do you think is the biggest misconception lawmakers have when it comes to cybersecurity?

Scot Terban (ST): “CYBER.” This is the worst word, with the worst connotations, within the milieu of information security. Cyber does not exist, and it only gives people ideas of flying through crystal palaces like in the movie Hackers.

They have no other conception of what the problems are, even on a non-technical level. Look at these people in Congress today. Watch a hearing on something to do with information security and you will see people who have no concept of the basic principles of how computers work. Now think about how they are the ones making the laws concerning aspects of hacking/security they only know about through the word “Cyber.” It is maddening.

Let me give you another example. In 1984, Congress saw or heard of the movie “WarGames.” They did not understand the technology, but then created the Computer Fraud and Abuse Act. [The CFAA is an] outdated law that is still in use and leveraged against would-be hackers or security researchers today, with punitive actions for things that should not even be considered hacks.

Along this line of thinking, I would also cite the conviction and incarceration of Weev [Andrew Auernheimer] for enumerating a page by increasing a number at the end of a URL. The lawmakers do not understand the technology; this is the biggest problem.
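For context on how trivial the act in question was: it amounted to requesting a public URL and then incrementing a numeric identifier in it. The short Python sketch below illustrates the general idea only; the host, parameter name, and ID range are hypothetical placeholders, not the actual endpoint from the case.

```python
# Minimal sketch of "enumeration by incrementing a number in a URL".
# The URL pattern and ID range are hypothetical, for illustration only.
import urllib.request

BASE_URL = "https://example.com/account?id={}"  # hypothetical URL pattern

for account_id in range(1000, 1010):            # walk a small range of IDs
    url = BASE_URL.format(account_id)
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            body = resp.read()
            print(account_id, resp.status, len(body))
    except Exception as exc:                    # unreachable or denied IDs are just logged
        print(account_id, "error:", exc)
```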

What advice would you give to lawmakers considering legislation that would impact security research or development?

ST: I would recommend that they sit with and learn from not only security luminaries (I hate that term) but also their staffers, who really are helping to research and write these bills that become laws. These staffers are younger and likely understand the technology much better than the sitting senator or congressman.

If they had some comprehension of how things work, and why, perhaps they could define what is right and wrong and then apply the law to that. Instead, we have people who are ill-equipped to understand what is happening and scared into a corner by the tales of terrorism and hacker extortion we see in the news daily.

Learn. Ask questions. Then learn more before you just spew out poor legislation. Oh, and certainly don’t just be beltway security bandit toadies either! You have to understand the motivations of the companies out there lobbying you on these issues.

If you could add one line to existing or pending legislation, with a focus on research, hacking, or other related security topics, what would it be?

ST: If said researcher made a good-faith effort at disclosure with the company, and can prove it, they are shielded from any and all legal prosecution, civil or otherwise.

Now, given what you've said, why is this one line so important to you?

ST: Companies rule the world today. How many talks have been cancelled? How many GeoHots are there in the world who have been harangued by legal attacks because a company felt threatened or embarrassed, because it failed to listen to the researcher? Work with us; don’t hide from your flaws and then knee-jerk prosecute.

Do you think a company should resort to legal threats or intimidation to prevent a researcher from giving a talk or publishing their work? Why, or why not?

ST: No. Never. If anything, perhaps they should ask the researcher to work with them and delay a talk until they have worked the problem out in tandem and remediated it (i.e., re-coded or patched).

I think it would be equitable if a company asked me not to write or speak about something until after the fix was in. Unfortunately, this is not how it usually happens, and we have seen more than a few talks cancelled under threats or worse; one remembers what happened to Dmitry Sklyarov at DEFCON in 2001.

What types of data (attack data, threat intelligence, etc.) should organizations be sharing with the government? What should the government be sharing with the rest of us?

ST: To start with, I believe that threat intelligence is a rubric that needs a lot of work before it becomes actionable for companies.

We need to get what threat intelligence really is, and how to use it, out of the way first. It will take work on the part of the companies to create programs to collect, analyze, and use that data. This is work that we're not seeing done today.

So when you say threat intelligence, in my world you mean threat feeds of data from SIEM/IDS and the like. Add to this certain data on malware C&Cs, and some facts as to how the malware works, and you have data you can work with to create threat intelligence, post analysis. I won’t go into that any further here, but I think that kind of lays out the problem.
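To make that post-analysis step concrete, here is a minimal sketch of one common form it takes: cross-referencing a feed of known C&C indicators against outbound connection records exported from a SIEM. The file names and column layout (src_host, dest) are assumptions for illustration, not any particular product's format.

```python
# Rough sketch: flag internal hosts that contacted known C&C infrastructure
# by joining an indicator feed against a SIEM connection-log export.
# File names and CSV columns are hypothetical, not a specific vendor's format.
import csv
from collections import defaultdict

def load_indicators(path):
    """Load known C&C IPs/domains, one per line, ignoring blank lines and comments."""
    with open(path) as fh:
        return {line.strip() for line in fh if line.strip() and not line.startswith("#")}

def match_logs(log_path, indicators):
    """Return {internal_host: [suspicious destinations]} from a CSV SIEM export."""
    hits = defaultdict(list)
    with open(log_path, newline="") as fh:
        for row in csv.DictReader(fh):           # expects columns: src_host, dest
            if row["dest"] in indicators:
                hits[row["src_host"]].append(row["dest"])
    return hits

if __name__ == "__main__":
    iocs = load_indicators("cc_indicators.txt")  # hypothetical indicator feed
    for host, dests in match_logs("siem_export.csv", iocs).items():
        print(f"{host} contacted {len(dests)} known C&C destination(s): {dests}")
```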

Now, when talking about sharing that data, the model is already out there in the form of the DC3. This is the military/government defense industrial base group that shares intelligence internally between those companies doing business within the DOD space. It is a good thing, and it has gotten much better over the last few years. That data is mostly actionable to a team that can analyze it and apply it to their environment. This does not really exist outside of the DC3, though, and is all secret squirrel.

What I hear being proposed by [President Obama] and others is that we create more sharing on a corporate basis between corporate entities (either within their silo or altogether) on the attacks that are being seen by them. Sharing that data properly between companies, like the DC3 model, would be good. Sharing that data with the government too? Well, that kind of scares me. Why?

Well, let’s just take a peek at OPM, shall we? Yeah, I am not so enamored with the idea of sharing my SIEM and other feeds, as well as analysis, with the government, because so far they have shown that they are unable to really digest and properly apply any kind of security data within regular government channels.

On the opposite side of this argument, I wonder just how many companies are going to be willing to really pony up to join such an intelligence-sharing organization at all. It would mean that someone like Target would have to really be doing the work that they certainly weren’t doing until recently, post their schadenfreude hack experience, right?

Can you imagine that Sony would do the same? I think this is a hard sell to most [organizations] because they would have to put in money and time, and with that, trained [full-time employees] with experience.

As to the question of what the government should be sharing with us...

Say, how many cleared individuals are there commonly in a regular corporation anyway? The fact of the matter is, they won’t share much if it is classified. So it’s kind of a moot point. I think the data flow would be more into them, and they would use it for their own purposes.

(www.csoonline.com)

Steve Ragan
