During BlackHat I proctored a session on “Optimizing the Security Researcher and CSO Relationship.” From the title and outline, most of us assumed this presentation would get us away from the “responsible disclosure” quagmire by focusing on the views of the customer. Most of the audience were IT practitioners, and most were interested in how research findings might help the end customer, rather than handing them another mess to clean up while exploit code runs rampant. And just as importantly, which threats are hype and which are serious.
Unfortunately this was not to be. The panel got (once again) mired in the ethical disclosure debate, with vendors and researchers staunchly entrenched in their positions. Irreconcilable differences: we get that. But speaking with a handful of audience members after the presentation, I can say they were a little ticked off. They asked repeatedly how any of this helps the customers, and got flippant answers to the effect of “we get them boxes/patches as fast as we can”.
Our contributing analyst Gunnar Peterson offered a wonderful parallel that describes this situation: The Market for Lemons. It’s an analysis of how uncertainty over quality changes a market. In a nutshell, the theory states that a vendor has a distinct advantage as they have knowledge and understanding of their product that the average consumer is incapable of discovering. The asymmetry of available information means consumers cannot judge good from bad, or high risk from low. The seller is incentivized to pass off low quality items as high quality (with premium pricing), and customers lose faith and consider all goods low quality, impacting the market in several negative ways. Sound familiar?
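To see why the market degrades, here is a minimal back-of-the-envelope version of Akerlof’s model, with illustrative numbers of my own (not Gunnar’s): suppose a fraction q of products are good and worth V_H to a buyer, while the rest are lemons worth only V_L. A buyer who cannot tell them apart will pay at most the expected value:

\[
p \;=\; q\,V_H + (1-q)\,V_L, \qquad \text{e.g. } p = 0.5 \times 100 + 0.5 \times 20 = 60.
\]

If building and supporting a genuinely good product costs more than 60, honest sellers exit or cut corners, q falls, the price buyers are willing to pay falls with it, and the market slides toward all lemons.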
How does this apply to security? Think about anti-virus products for a moment and tell me this isn’t a market for lemons. The AV vendors dance on the tables talking about how they catch all this bad stuff, and thanks to NSS Labs yet another test shows they all suck. Consider product upgrade cycles, where customers lag years behind the vendor’s latest release or patch for fear of getting a shiny new lemon. Low-function security products, just like low-quality products in general, cause IT to spend more time managing, patching, reworking, and fixing clunkers. So a lot of companies are justifiably a bit gun-shy about upgrading to the latest & greatest version.
We know it’s in the best interest of the vendors to downplay the severity of the issues and keep their users calm (jailbreak.me, anyone?). But they have significant data that would help the customers with their patching, workarounds, and operational security as these events transpire. It’s about time someone started looking at vulnerability disclosures from the end user perspective. Maybe some enterprising attorney general should stir the pot? Or maybe threatened legislation could get the vendor community off their collective asses? You know the deal – sometimes the threat of legislation is enough to get forward movement.
Is it time for security Lemon Laws? What do you think? Discuss in the comments.
5 Replies to “FireStarter: Market for Lemons”
@ds:
“I don
This is the crux of my argument that Responsible Disclosure is necessary, and that not doing it is not only irresponsible but negligent. The end user is the one who often suffers the brunt of the problems when software is poorly and/or insecurely written. We security professionals bemoan the fact that it causes more work for us, but we aren’t the ones suffering financial loss or severe headaches from having our digital lives turned upside down.
The vendors have to step up to the plate and start doing a better job of releasing secure code, and researchers have to be more patient about releasing the vulns they find.
This tracks with some of my current thinking. Perhaps we can expand the metaphor further and jump to “Unsafe at Any Speed”? Until there are significant penalties for bad software, nothing will change. I’m not a fan of the current privatized vulnerability market because it has led to this specific disclosure stalemate. I think there is a very strong case to be made for federal oversight here, as this is a National Security issue on more than one level. I’d like to see vulnerability researchers compensated with tax incentives, or bounties administered through public funds. In this model a researcher who found an MS 0-day that affected the entire planet might not have to pay federal income taxes for the rest of his life.
I guess this could be read both ways… more insight, such as would be gained from researchers, could help shift the balance of information toward the consumer, but it could also confirm the conclusion that a product was low quality.
I don’t know of any related research showing that consumer information improves consumer outcomes, though that would be interesting to see. Does anyone know whether the “security seal” programs actually improve users’ perceptions? And do those perceptions materialize in greater adoption? That would also be interesting.
I don’t think we need something like lemon laws for two reasons:
1) The provable cost to the consumer of buying a bad product is nominal, and not likely to get any attention. The costs of a security product failing are too hard to quantify into actual numbers, so I am not considering them.
2) Corporations that buy the really expensive security products have far more leverage to conduct pre-purchase evaluations, to put non-performance clauses into their contracts, and to readily evaluate ongoing product suitability. The fact that many don’t is a separate issue that won’t in any case be fixed by the law.
Ross Anderson has written about this in probably the most-cited paper on the subject: http://www.cl.cam.ac.uk/~rja14/Papers/econ.pdf
He also has an extensive list of related references on his home page.