Hacked Opinions: The legalities of hacking – Morey Haber

05.10.2015
Morey Haber, from BeyondTrust, talks about hacking regulation and legislation with CSO in a series of topical discussions with industry leaders and experts.

Hacked Opinions is an ongoing series of Q&As with industry leaders and experts on a number of topics that impact the security community. The first set of discussions focused on disclosure and how pending regulation could impact it. Now, this second set of discussions will examine security research, security legislation, and the difficult decision of taking researchers to court.

CSO encourages everyone to take part in the Hacked Opinions series. If you would like to participate, email Steve Ragan with your answers to the questions presented in this Q&A. The deadline is October 31, 2015. In addition, feel free to suggest topics for future consideration.

What do you think is the biggest misconception lawmakers have when it comes to cybersecurity?

Morey Haber (MH): I think the biggest misconception about any of the pending cybersecurity legislation is the actual identification of a crime and ultimately the enforcement of the legislation.

I think lawmakers are out of touch regarding how much surveillance and monitoring is needed to determine whether someone is conducting vulnerability research, actually committing a cybercrime, writing an exploit, or even ethically testing security as part of their job function.

Cybercrimes are not like brick-and-mortar crimes or physical theft; other cybersecurity tools are needed to identify when they occur and who committed them. For example, in 2013 and 2014 almost all DUI cases in the State of Florida were dismissed because the state did not have access to the source code used in breathalyzers.

Attorneys successfully argued that without knowing how the system worked, a black box could not be used to convict an individual without proof of its accuracy, algorithms, and mechanics.

In the same vein, how would a state prosecute a cybercriminal without the details of the tools used to identify that individual?

I doubt many companies would turn over their source code to the state. These types of loopholes represent the misconception in writing a cybersecurity law and enforcing it all the way from identification through successful prosecution.

What advice would you give to lawmakers considering legislation that would impact security research or development?

MH: While the intent of these laws is to make “cyber” more secure, they have almost no applicability outside of the U.S. In fact, if we do not allow cybersecurity research itself, we are putting ourselves at a disadvantage, because hostile nations would not hesitate to use it against us.

I don’t believe lawmakers should focus on the research itself, but rather on laws governing acceptable public disclosure and time to remediation for companies producing software when a flaw is found.

If you could add one line to existing or pending legislation, with a focus on research, hacking, or another related security topic, what would it be?

MH: The public disclosure of any vulnerability without the notification to the developer or managing entity for that software would be considered a criminal act.

Now, given what you've said, why is this one line so important to you?

MH: The risks introduced to businesses and government could be mitigated with proper disclosure and good ethics.

I am very much for bug bounties and other methods for identifying vulnerabilities, but it is insensitive and dangerous to blast a flaw to everyone with no time to react or patch.

This is especially true when the vendor or open source community has not even had a chance to comment on and fix the vulnerability in the first place. Hence the inherent problem with zero-day vulnerabilities that stem from irresponsible disclosure.

Do you think a company should resort to legal threats or intimidation to prevent a researcher from giving a talk or publishing their work? Why, or why not?

MH: Only if proper, ethical disclosure has been violated should a company resort to these methods. If the research is generic, and does not cite a specific vendor or proprietary technology that has not been remediated, there should be no hindrance to researchers discussing their work.

Conversely, it should never be acceptable for a researcher to attempt to blackmail or extort money from a company based on their research.

What types of data (attack data, threat intelligence, etc.) should organizations be sharing with the government? What should the government be sharing with the rest of us?

MH: How big of a can of worms would you like to open?

Within any business, it is up to the company to notify law enforcement if things such as shoplifting or grand larceny have occurred.

The same is true for cybersecurity. Based on the type and volume of data, thresholds should be set for when an organization is obligated to tell law enforcement. As for the government, just like FBI Most Wanted posters, any patterned or known types of active attacks should be shared with businesses and security vendors to identify the next time a crime is committed, in the hopes of bringing the threat to closure.

(www.csoonline.com)

Steve Ragan