The disclosure debate is raging once again, and it's even seeing some discussion on the SCADA mailing lists.  This was stirred up by the No More Free Bugs "campaign" announced at CanSecWest by Miller, Sotirov, and Dai Zovi — accomplished researchers whose names should at least sound familiar if you try to stay current on security research.

So why the stir — what's changed?  Not a whole lot, but economists may well have some interesting things to study in the future.  It's a sticky situation.  The time it takes to create a reliable exploit for a complex vulnerability is considerable, so it's understandable that a researcher would want to be compensated for that time.  But in most cases the vendor didn't ask the researcher to look at their software, so they don't feel obligated to pay — and may well feel they're owed the details, since it's their software.  And it's nearly impossible to seek compensation, even in the form of recognition on an advisory, without it feeling like you're running a protection racket.  So clearly there's a need for change of some sort to make sure that research continues, software improves, and end users stay safe.

I'm interested to see how this will affect mainstream vs. niche software (like SCADA systems).  As vulnerabilities become more difficult to turn into reliable exploits in software like Windows/Apache/IIS/etc., do researchers turn their attention to more obscure software and hope for similar payoffs, or do they keep going after the big targets in hopes of fewer but larger payoffs?  The potential damage from vulnerabilities in control systems is huge, but would vendors be willing to pay for the information?  Will bug bounty programs like Mozilla's become the rule rather than the exception?

This campaign alone won't change things completely.  Some researchers will sit on a mountain of 0day, some will drop everything they find on full disclosure lists, some will approach the vendor (some of those through a CERT), and others will sell to a third party.  Each of these approaches is accepted by some part of the security community in one way or another, but the one I'm most apprehensive about is the last.

I see a slippery slope, and it's a long way down.  As markets grow, regulation usually isn't far behind, which isn't necessarily a bad thing.  But I fear the possibility of having to work under a licensed vulnerability broker, and the thought of auditing software and exploits falling under the same kind of gray, selectively enforced laws that locksmith tools are subject to in many places today.
Details about our own disclosure policy can be found here.