Hacked Opinions: The legalities of hacking – Pat Clawson

06.11.2015
Pat Clawson, from Blancco Technology Group, talks about hacking regulation and legislation with CSO in a series of topical discussions with industry leaders and experts.

Hacked Opinions is an ongoing series of Q&As with industry leaders and experts on a number of topics that impact the security community. The first set of discussions focused on disclosure and how pending regulation could impact it. This week CSO is posting the final submissions for the second set of discussions examining security research, security legislation, and the difficult decision of taking researchers to court.

CSO encourages everyone to take part in the Hacked Opinions series. If you have thoughts or suggestions for the third series of Hacked Opinions topics, or want to be included as a participant, feel free to email Steve Ragan directly.

What do you think is the biggest misconception lawmakers have when it comes to cybersecurity?

Pat Clawson, CEO, Blancco Technology Group (PC): I think lawmakers have good intentions for the most part when it comes to cybersecurity.

The General Data Protection Regulation coming out of Europe is a good example. It’s focused on protecting citizens’ data privacy and it creates greater accountability among businesses and advertisers who are actively collecting, tracking, storing and sharing customer data.

But in some instances, lawmakers get it wrong. That has a lot to do with the fact that they’re not the ones in the trenches inside businesses, and they don’t necessarily understand the ins and outs of data management, security practices and the varying types and causes of security threats. In an attempt to show they’re taking cybersecurity seriously, they draft bills and acts like the Cybersecurity Information Sharing Act, which passed [last month].

The bill spells out that any broadly defined “cybersecurity threat” information gathered can be shared “notwithstanding any other provision of law.” That’s a pretty vague statement and makes it possible for the government to conduct intrusive surveillance practices under the guise of security protections. Tightening the grip on data security doesn’t automatically require spying. There are better ways to bring about change.

What advice would you give to lawmakers considering legislation that would impact security research or development?

PC: I would tell lawmakers not to draft legislation in a vacuum. I would recommend bringing in folks from different sides of the data security coin. Invite experts from data security software/hardware vendors, technologists and engineers, data scientists, research organizations and analysts into the same room as the legislators.

Each of these groups is working in some way or another with data and knows first-hand how and where data is being created, managed, collected, stored and shared inside and outside organizations. By making them a part of the legislation discussion, lawmakers can avoid being seen as the ‘bad guys’, but most importantly, they can actually create the kind of change that needs to happen in cybersecurity.

If you could add one line to existing or pending legislation, with a focus on research, hacking, or other related security topics, what would it be?

PC: I don’t think it’s as simple as adding one line to existing or pending legislation – that won’t solve anything. The goal should be to understand the cyber attack landscape for a nation as a whole.

Unlike Russia and China, where all critical infrastructure is state-owned, we’re living in a capitalist society and a large portion of critical infrastructure is held by corporations, not by the government. China and Russia, for example, have the ability to understand attacks across all of their critical infrastructure. But we do not. This is a long-term risk.

Now, given what you've said, why is this one line so important to you?

PC: In the long run, we need to be able to correlate attack data as a whole nation in order to understand the threats posed to us by other nations. When we have the data, we can do something about it.

Do you think a company should resort to legal threats or intimidation to prevent a researcher from giving a talk or publishing their work? Why, or why not?

PC: I definitely don’t feel companies should threaten a researcher who might have found a security flaw in their organization. Most often, those who resort to threats and intimidation do so out of fear and insecurity. Any company that does that is just hurting itself and highlighting its own inability to learn from mistakes.

Rather than try to quiet them, why not invite them for a one-on-one discussion? At least then the company can actually get to the bottom of why the security breach occurred, where the gaps are in its internal systems, and even ask the researcher for suggestions on how to prevent it from happening in the future.

What types of data (attack data, threat intelligence, etc.) should organizations be sharing with the government? What should the government be sharing with the rest of us?

PC: I would say organizations should be sharing attack data with the government. This would be information related to the types of attacks that have occurred in the past, the causes and sources of the attacks if they’ve been investigated and when they have occurred. I’d also say both organizations and the government should share their own respective threat intelligence with one another. It can’t be a one-way street.

As far as what the government should be sharing with the general public, that can be tricky. The public definitely has a right to know if and when there are severe threat warnings to nation states as that impacts their own safety. I think this is where the government needs to do more than just tell people a risk is there. They need to then offer up useful tips and advice on how to protect their own data from being compromised in future attacks.

(www.csoonline.com)

Steve Ragan
