Hacked Opinions: Vulnerability disclosure – Sam Curry

28.07.2015
Arbor Networks' Sam Curry talks about disclosure, bounty programs, and vulnerability marketing with CSO, in the first of a series of topical discussions with industry leaders and experts.

Hacked Opinions is an ongoing series of Q&As with industry leaders and experts on a number of topics that impact the security community. The first set of discussions focuses on disclosure and how pending regulation could impact it. In addition, we asked about marketed vulnerabilities such as Heartbleed, and whether bounty programs make sense.

CSO encourages everyone to take part in the Hacked Opinions series. If you would like to participate, email Steve Ragan with your answers to the questions presented in this Q&A, or feel free to suggest topics for future consideration.

Where do you stand: Full Disclosure, Responsible Disclosure, or somewhere in the middle?

Sam Curry, Chief Technology and Security Officer, Arbor Networks (SC):

Responsible disclosure is the right way to go.

Many vendors still have "old school" and destructive attitudes toward security research, which is shameful in this day and age. Companies have to grow up, embrace the wider community, and give it incentives to help make their products more robust. No quality lab or internal testing is going to be as good as a wide, passionate community of independent researchers. Leveraging it makes sense.

Disclosing without engaging vendors can lead to unintended victims and real damage to innocents. Responsible disclosure or a more moderate path of disclosing and entering a dialog about when to go public (using industry best practices) is the correct approach for all parties.

It's not just a compromise among all partners for least damage: it's actually the best path for all partners.

If a researcher chooses to follow responsible/coordinated disclosure and the vendor goes silent - or CERT stops responding to them - is Full Disclosure proper at this point? If not, why not?

SC: If a researcher does quality work responsibly and tries several venues to be heard by a vendor, and that vendor goes silent, full disclosure may be warranted.

I hesitate to give a blanket license since there could be extenuating circumstances or the vendor may not know how to respond.

What's really needed is a dialog, even with the most recalcitrant and obstinate vendors. The decision should be based on protecting potential victims. Researchers also need to be careful not to "leak" information accidentally since that can cause as much damage as blind, full disclosure.

Bug Bounty programs are becoming more common, but sometimes the reward being offered is far less than the perceived value of the bug/exploit. What do you think can be done to make it worth the researcher's time and effort to work with a vendor directly?

SC: People do things for financial, moral, and social reasons. Don't judge the value of the work simply by the economic value of the reward.

It's not about a $1,000 reward or a $10,000 reward; it's bragging rights, working on something that matters, protecting the public, contributing to research, proof of skills, developing new techniques, being published, and much more. It is in the best interests of companies to embrace security as a lifecycle and to hold close those who do research. Embrace the process and the output.

The companies that reward more generously may provide the opportunity to build a career as a white hat, especially as more companies begin to see the benefits. While the truly prolific researchers out there who build skill sets, tools, and experience can make a living doing this, let's not lose sight of the fact that being a researcher is a reward in itself.

Furthermore, the pros aren't necessarily where the best quality research is done or the best reputations are made.

Do you think vulnerability disclosures with a clear marketing campaign and PR process, such as Heartbleed, POODLE, or Shellshock, have value?

SC: I wouldn't presume to know the PR or marketing behind recent disclosures. However, what matters is the intent and the results.

If someone sought to do good but was inexperienced or naive and caused damage...that can be forgiven, sparingly. If they set out to do harm, that's reprehensible. Some disclosures are righting wrongs, teaching lessons to recalcitrants or protecting people.

Others are self-aggrandizing, done for show, building brands with a lot of collateral damage. I would like to see more "marketing" investments in the former and a lot less in the latter.

If the proposed changes pass, how do you think Wassenaar will impact the disclosure process? Will it kill full disclosure with proof-of-concept code, or move researchers away from the public entirely, preventing serious issues from seeing the light of day? Or, perhaps, could it see a boom in responsible disclosure out of fear of being on the wrong side of the law?

SC: I'm not a lawyer, but the current language from the Bureau of Industry and Security is alarming.

Responsible vulnerability research does not equate to aiding and abetting the creation of attacks and hacks, and this topic is being debated publicly right now around Wassenaar.

Incidentally, threat and exploit research and tool creation happen and will happen no matter what we allow to cross borders among white hats. Encouragingly, the proposed rule change is open to comment right now, implying that the Bureau is seeking feedback on the language.

One thing I love about my industry is that we aren't wallflowers: we will debate and fight for what's right. Now is the time to engage in this debate and help shape how it will be interpreted and enforced. We must not meekly allow any law or rule to be created that would harm our ability to defend ourselves.

(www.csoonline.com)

Steve Ragan
