Last week, a friend loaned me his copy of Emergency, by Neil Strauss, and I couldn’t put it down.

It’s a non-fiction book about the author’s slow transformation from wussy city dweller to full-on survival and disaster expert. And I mean full on; we’re talking everything from normal disaster preparedness to extensive training in weapons, wilderness, and urban survival, developing escape routes from his home to other countries, planting food and fuel caches, and gaining dual citizenship… “just in case”. There’s even a bit with a goat, but not in the humorous/deviant way.

I’ve never considered myself a survivalist, although I’ve had extensive survival training and experience as part of my various activities. When you tend to run towards disasters, or engage in high-risk sports, it’s only prudent to get a little extra training and keep a few spare bags of gear and supplies around.

After I got back from helping out with the Hurricane Katrina response, I went through a bit of a freak-out moment when I realized all my disaster plans were no longer effective. When I was single in Boulder I didn’t really have to worry much – as a member of the local response infrastructure, I wouldn’t be sitting at home if anything serious happened. I kept the usual 3-day supply of food and water, not that I expected to need it (since I’d be in the field), and my camping/rescue gear would take care of the rest. I lived well above the flood line, and only had to grab a few personal items and papers in case of a fire.

By the time I deployed to Katrina I was living in a different state with a wife (well, almost, at the time), pets, and an extended family who weren’t prepared themselves. The biggest change of all was that I was no longer part of the local response infrastructure – losing access to all of the resources I’d grown used to having available. The only agency I still worked for was based hundreds of miles away in Denver. Oops.

Needless to say, the complexities of planning for a family with children, pets, and in-laws are far different from holing up in the mountains for a few days. Seriously, do you know how hard it is to plan on bugging out with cats who don’t get along? (Yes, of course I’m taking them – I care more about my cats than I do about most people.) I still don’t feel fully prepared, although the range of disasters we face in Phoenix is smaller than in Colorado. I fully recognize the odds of me ever needing any of my disaster plans are slim to none, but I’ve been involved in far too many real situations to think the effort and costs aren’t worth it.

Nearly all of the corporate disaster planning I’ve seen is absolute garbage (IT or otherwise). The drills are scripted, the plans are fatally flawed, and the people running them are idiots who took a two-day class and have no practical experience. If you have a plan that hasn’t been practiced under realistic conditions on a regular basis, there’s no chance it will work. Oh – and most of the H1N1 advice out there is rubbish. Just tell sick people to stay home at the first sign of a fever, and don’t count it against their vacation hours.

Anyway, I highly recommend the book. It’s an amusing read with a good storyline and excellent information. And a goat.

With that, it’s time for our weekly update:

-Rich

Webcasts, Podcasts, Outside Writing, and Conferences

  • Rich and Adrian presented at the Data Protection Decisions in DC. Rich had a full schedule with “Understanding and Selecting a DLP Solution”, “Understanding and Selecting a Database Activity Monitoring Solution”, and “Pragmatic Data Security”. Adrian got to lounge around after presenting “Truth, Lies and Fiction about Encryption”. We’ll be doing another one in Atlanta on November 19th.
  • Rich was quoted in SC Magazine on the acquisition of Vericept by Trustwave.

Don’t forget that you can subscribe to the Friday Summary via email.

Favorite Securosis Posts

Other Securosis Posts

Project Quant Posts

Favorite Outside Posts

Top News and Posts

Blog Comment of the Week

This week’s best comment comes from Dave in response to Format and Data Preserving Encryption:

OK, first let me start out by admitting I am almost entirely wrong 😉

Now we have that out of the way… I was correct in asserting that the resulting reduced cypher is no more secure than a native cypher of that blocksize, but failed to add that the native cypher must have the same keyspace size as the cypher being reduced – for this reason, DES was a bad example (but 128-bit DESX, which is XOR(k1, DES(k2, XOR(k1, Plaintext))), would be a good one).

I was in error, however, in asserting that, in practical terms, this in any way matters – in fact, the size of the keyspace (not the blocksize) is the overwhelming factor, due to the physical size of another space: the space of all possible mappings that could result from the key schedule.

It is easy to determine that, for a keyspace of n(ks) bits, there are 2^n(ks) possible keys.

Correspondingly, for an input (block size) space of n(bs) bits, there are 2^n(bs) possible inputs (and hence 2^n(bs) outputs).

By implication, therefore, there are 2^n(bs) FACTORIAL possible mappings – meaning the mappings produced by the key schedule are very sparse indeed within that space; a reasonable reading of even a fairly small DPE block space (for example the CC personal security code, which is three decimal digits) would be 1000 discrete values; even without expanding to the more reasonable 1024 values (10 bits), we are talking 1000! entries in the mapping space, far, far more than any sane keyspace.

As the blockspace shrinks, therefore, the number of false positives (not true positives) in the keyspace will increase; for a given input/output known plaintext pair, one key in every ‘n’ (where n is the size of the blockspace, 1000 in this example) will be a valid match, statistically. However, as the number of keys which match a given sample block in the keyspace increases, the likelihood of collisions in the mapping space does not increase noticeably – this is obviously not true if someone were to attempt a two-bit mapping or something, but I will cover that in a second.

What is required, then, is an estimate of how many plaintext/cyphertext pairs will be required to uniquely identify a key as valid in the keyspace; as the block size goes down or the key size goes up, this value goes up (so for single DES, where the keyspace is smaller than the blocksize space, never mind the mapping space, the value is very slightly over 1 – there is a small possibility, via the birthday paradox, that two keys will generate the same mapping). For a 128-bit keyed, 64-bit cypher, this is going to be a bit over two, and so forth.

Effectively, therefore, until the required number of pairs approaches the size of the block itself, you are still going to have to search almost the entire keyspace (well, 50% of it on average) to find the valid key. In situations where this is not true (where the keyspace is so large compared to the input space that the number of keys which produce the same mapping is significant), the required number of pairs is equal or near equal to the input blockspace – at which point, you don’t NEED a key, you already have a comprehensive lookup table!

So, I failed to think this through sufficiently, and was wrong. At least it was a learning experience. 😉
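Dave’s counting argument is easy to sanity-check numerically. Here’s a short Python sketch using his example numbers (a 1000-value block space, such as a three-digit security code, and an assumed 128-bit key – the variable names are mine, not his):

```python
import math

# Dave's example: a 3-digit personal security code gives a block space
# of 1000 discrete values; assume a 128-bit key (e.g. 128-bit DESX).
block_values = 1000
key_bits = 128

# There are 1000! possible permutations (mappings) of the block space.
# math.log2 accepts arbitrarily large Python ints, so this is exact enough.
mapping_bits = math.log2(math.factorial(block_values))
print(f"mapping space: ~2^{mapping_bits:.0f} vs keyspace: 2^{key_bits}")
# The key schedule can reach at most 2^128 of those ~2^8530 mappings,
# so the reachable mappings are indeed "very sparse".

# For one known plaintext/ciphertext pair, roughly 1 key in every 1000
# matches by chance, leaving about 2^128 / 1000 false-positive keys.
# Each additional pair cuts the survivors by another factor of ~1000,
# so the number of pairs needed to pin down the key is about:
pairs_needed = math.ceil(key_bits / math.log2(block_values))
print(f"pairs needed to identify the key: ~{pairs_needed}")
```

The same formula reproduces Dave’s other figure: for a 128-bit key over a 64-bit block, 128/64 gives the “a bit over two” pairs he mentions.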
