The good news about being in security is that you don’t have to look far for criticism of your work. Most of the time it’s constructive, so engaging with the security community makes your work markedly better. That’s why we live by the Totally Transparent Research process: it makes our work better.
But when our pals at Verizon clogged up my Twitter timeline this morning with their annual DBIR masterpiece (you can also check out our guidance on the DBIR), they dragged my attention back to a post by Jericho from Attrition: “Threat Intelligence”, not always that intelligent, prompted by Symantec’s most recent security trends report.
Jericho summed up the value of security trend reports as only he can, and explained why folks rarely challenge them.
The reason? Security companies, professionals, and journalists are complacent. They are happy to get the numbers that help them. For some, it sells copy. For others, it gets security budget. Since it helps them, their motivation to question or challenge the data goes away. They never realize that their “threat intelligence” source is stale and serving up bad data.
It’s not in the machine’s best interest to question the data. That’s why most folks (besides me, I guess) don’t poke at the vendor-sponsored survey data or other similar nonsense put forth as gospel in the security business. Anything that helps sell security is good, right?
Well, no. Decisions based on faulty data tend to be faulty decisions. So Jericho presents a number of inconsistencies between Symantec’s vulnerability data and the OSVDB dataset he contributes to. It’s pretty compelling stuff.
But we shouldn’t minimize either the effort involved in building these reports or the value they provide, as long as the data is reasonably accurate. We’re security people. We question everything, so it’s only reasonable to question the data you use to make the case for your existence.
Photo credit: “Question” originally uploaded by ACU Library
One Reply to “Question everything, including the data”
Unless you have insider knowledge that we do not know about, the report consists of 2012 data collected from 19 diverse sources; pages 9 and 62 list the sources. Taking Verizon at their word that this is 2012 data, even a small sample takes time to compile, analyze, and write up. Giving VZ adequate time for internal reviews, a 3.5-month lead time sounds reasonable for a snapshot in time.
Whenever someone bases an analysis on a snapshot of limited data sources, there will be questions about its accuracy. It is up to the reader to determine whether the sample is representative of the big picture. In fact, Verizon itself acknowledges possible sample bias in its explanation on page 10.
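To make the sample-bias point concrete, here is a minimal, purely illustrative Python sketch. All of the numbers are invented for the example, not drawn from the DBIR or any real dataset; it just shows how a collector who oversamples one incident category reaches a different conclusion than an unbiased sample would:

```python
import random

# Hypothetical illustration of sample bias: every number here is made up.
# Imagine a "true" population of 10,000 breaches, 30% involving web-app attacks.
random.seed(42)
population = ["web-app"] * 3000 + ["other"] * 7000

# An unbiased random sample tracks the true population rate closely.
unbiased = random.sample(population, 500)
print("unbiased web-app rate:", unbiased.count("web-app") / len(unbiased))

# A biased collector (say, contributors who mostly investigate web-app
# incidents) picks web-app breaches three times as often as others.
weights = [3 if b == "web-app" else 1 for b in population]
biased = random.choices(population, weights=weights, k=500)
print("biased web-app rate:", biased.count("web-app") / len(biased))
```

The biased sample reports a web-app rate well above the true 30%, which is exactly why a report built from a non-random set of contributors needs the kind of caveat Verizon provides.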
If we can trust Verizon to a reasonable degree, there is no indication that the data is stale or just bad. As with any study or analysis, it is appropriate to question the conclusions based on the facts. It is unreasonable to dismiss the report wholesale without supporting the claim that the data is stale and bad. That would be bad analysis, and it would not be acceptable to my clients!