Hacked Opinions: Vulnerability disclosure – Morey Haber

29.06.2015
BeyondTrust's Morey Haber talks about disclosure, bounty programs, and vulnerability marketing with CSO, in the first of a series of topical discussions with industry leaders and experts.

Hacked Opinions is an ongoing series of Q&As with industry leaders and experts on a number of topics that impact the security community. The first set of discussions focuses on disclosure and how pending regulation could impact it. In addition, we asked about marketed vulnerabilities such as Heartbleed, and whether bounty programs make sense.

CSO encourages everyone to take part in the Hacked Opinions series. If you would like to participate, email Steve Ragan with your answers to the questions presented in this Q&A, or feel free to suggest topics for future consideration.

Where do you stand: Full Disclosure, Responsible Disclosure, or somewhere in the middle?

Morey Haber (MH), Vice President of Technology, BeyondTrust:

I believe in responsible disclosure. The notification of vulnerabilities to the public should only occur after a mitigation (patch or permanent configuration change) is available from the vendor or open source project. The reason is simple: to minimize the risk to the end user and business until an acceptable solution is found. The history of full disclosure has shown us that exploits will appear well before anyone can defend themselves, creating unnecessary financial loss (time, goods, and outages) when no solution is available. It is kind of like complaining to your boss about a problem but having no solutions to mitigate the issue. Always have a plan to solve a problem when you escalate an issue. Never show up empty handed.

If a researcher chooses to follow responsible / coordinated disclosure and the vendor goes silent -- or CERT stops responding to them -- is Full Disclosure proper at this point? If not, why not?

MH: No. Full disclosure is never an option. As a researcher, persistence and patience are very important. These are traits you must accept as part of the work you are doing and a risk based on your findings. If no one replies, move on to the next project for all the reasons I listed above. Organizations can go dark for many reasons: the complexity of the issue to resolve, a lack of talent and understanding to mitigate the risk, all the way through to pure incompetence.

Full disclosure is not an option for any of them, and the work a researcher does is like that of an artist. Not everyone will appreciate your work, even if it is a masterpiece in vulnerability research or a simple splatter of paint potentially worth millions. A researcher must accept that the work they do may fall on deaf ears or be worth cash.

Bug Bounty programs are becoming more common, but sometimes the reward being offered is far less than the perceived value of the bug / exploit. What do you think can be done to make it worth the researcher's time and effort to work with a vendor directly?

MH: Morals. Earning a Bug Bounty through an honest day's work as a researcher is far more ethical than selling the findings of a bug / exploit for misuse. It is the same problem as downloading movies or music. People do it all day but would not walk into a local store and steal a DVD.

The Internet of Things has made cybercrime a much more tolerated crime and selling an exploit much more acceptable. As a society we need to raise awareness that both are equally crimes, even though only one has a physical aspect to it.

To that end, researchers are doing very important work but must be legally and ethically responsible with their findings. To shortcut this problem, I would encourage vendors and researchers to work together through established companies like Veracode (services) or even Bugcrowd. The latter represents a relatively new way to identify these problems and properly handle disclosure and mitigation.

Do you think vulnerability disclosures with a clear marketing campaign and PR process, such as Heartbleed, POODLE, or Shellshock, have value?

MH: Yes, but only after Responsible Disclosure. Raising awareness is key at all levels of business / management and society. Only then do we understand the urgency to mitigate the risk and prioritize resources to close the vulnerability from potential exploitation.

If we did not have these media campaigns, potentially only engineers, auditors, and a few other teams would understand the risk, and their voices alone may not be loud enough to solve the problem.

If the proposed changes pass, how do you think Wassenaar will impact the disclosure process? Will it kill full disclosure with proof-of-concept code, or move researchers away from the public entirely, preventing serious issues from seeing the light of day? Or, perhaps, could it see a boom in responsible disclosure out of fear of being on the wrong side of the law?

MH: If Wassenaar passes with the current changes, I think you will see a split in research and public disclosure.

I believe the criminal activity around underground exploits will blossom, because people who are doing the work secretly can make money from their research and have no real public forum to share their results responsibly. The other half will be employed by companies to conduct the research themselves or via services.

For a researcher working as a consultant, performing full disclosure could become a serious crime, so they will find other avenues to obtain a capital gain for their work. Basically, any time you make something illegal, the camp splits and you find people on both sides. Think Prohibition.

(www.csoonline.com)

Steve Ragan
