A couple of weeks ago I was sitting on the edge of the hotel bed in Boulder, Colorado, watching the immaculate television. A US-made 30” CRT television in “standard definition”. That’s cathode ray tube, for those who don’t remember, and ‘standard’ is the marketing term for ‘low’. This thing was freaking horrible, yet it was perfect. The color was correct. And while the contrast ratio was not great, it was not terrible either. Then it dawned on me that the problem was not the picture, as this is the quality we used to get from televisions. Viewing an old set, operating exactly the way it always had, I knew the problem was me. High def carries so much more information, but the experience of watching the game is the same now as it was then. It hit me just how much our brains were filling in missing information, and how little we minded this sort of performance 10 years ago, because it was the best available. We did not really see the names on the backs of football jerseys during those Sunday games; we just thought we did. Heck, we probably did not often make out the numbers either, but somehow we knew who was who. We knew where our favorite players were on the field, and the red streak at the bottom of the screen pounding a blue-colored blob must be number 42. Our brains filled in and sharpened the picture for us.
Rich and I had been discussing experience bias, recency bias, and cognitive dissonance during our trip to Denver. We were talking about our recent survey and how to interpret the numbers without falling into bias traps. It was an interesting discussion of how people detect patterns, but like many of our conversations it devolved into how political and religious convictions can cloud judgment. It was not until I was sitting there, watching television in the hotel, that I realized how much our prior experiences and knowledge shape perception, derived value, and interpreted results. Mostly for the good, but unquestionably some bad.
Rich also sent me a link to a Michael Shermer video just after that, in which Shermer discusses patterns and self-deception. You can watch the video and say “sure, I see patterns, and sometimes what I see is not there”, but I don’t think videos like this demonstrate how pervasive this built-in feature is, and how it applies to every situation we find ourselves in.
The television example of this phenomenon was more shocking than some others that have popped into my head since. I have been investing in and listening to high-end audio products such as headphones for years. But I never think about the illusion of a ‘soundstage’ right in front of me; I just think of it as being there. I know the guitar player is on the right edge of the stage, and the drummer is in the back, slightly to the left. I can clearly hear the singer when she turns her head to look at fellow band members during the song. None of that is really in front of me, but there is something in the bits of the digital facsimile on my hard drive that lets my brain recognize all these things, placing the scene right there in front of me.
I guess the hard part is recognizing when and how it alters our perception.
On to the Summary:
Webcasts, Podcasts, Outside Writing, and Conferences
- Rich quoted in “Apple in a bind over its DNS patch”.
- Adrian’s Dark Reading post on SIEM ain’t DAM.
- Rich and Martin on Network Security Podcast #206.
Favorite Securosis Posts
- Rich: Pricing Cyber-Policies. As we used to say at Gartner, all a ‘cybersecurity’ policy buys you is a seat at the arbitration table.
- Mike Rothman: The Cancer within Evidence Based Research Methodologies. We all need to share data more frequently and effectively. This is why.
- Adrian Lane: FireStarter: an Encrypted Value Is Not a Token! Bummer.
Other Securosis Posts
- Tokenization: Token Servers.
- Incite 7/20/2010: Visiting Day.
- Tokenization: The Tokens.
- Comments on Visa’s Tokenization Best Practices.
- Friday Summary: July 15, 2010.
Favorite Outside Posts
- Rich: Successful Evidence-Based Risk Management: The Value of a Great CSIRT. I realize I did an entire blog post based on this, but it really is a must-read by Alex Hutton. We’re basically a bunch of blind mice building two-Lego-high walls until we start gathering, and sharing, information on which of our security initiatives really work and when.
- Mike Rothman: Understanding the advanced persistent threat. Bejtlich’s piece on APT in SearchSecurity is a good overview of the term, and how it’s gotten fsked by security marketing.
- Adrian Lane: Security rule No. 1: Assume you’re hacked.
Project Quant Posts
- NSO Quant: Monitor Process Revisited.
- NSO Quant: Monitoring Health Maintenance Subprocesses.
- NSO Quant: Validate and Escalate Subprocesses.
- NSO Quant: Analyze Subprocess.
- NSO Quant: Collect and Store Subprocesses.
- NSO Quant: Define Policies Subprocess.
- NSO Quant: Enumerate and Scope Subprocesses.
Research Reports and Presentations
- White Paper: Endpoint Security Fundamentals.
- Understanding and Selecting a Database Encryption or Tokenization Solution.
- Low Hanging Fruit: Quick Wins with Data Loss Prevention.
- Report: Database Assessment.
- Database Audit Events.
- XML Security Overview Presentation.
- Project Quant Survey Results and Analysis.
- Project Quant Metrics Model Report.
Top News and Posts
- Researchers: Authentication crack could affect millions.
- SCADA System’s Hard-Coded Password Circulated Online for Years.
- Microsoft Launches ‘Coordinated’ Vulnerability Disclosure Program.
- GSM Cracking Software Released.
- How Mass SQL Injection Attacks Became an Epidemic.
- Harsh Words for Professional Infosec Certification.
- Google re-ups the disclosure debate. A new policy – 60 days to fix critical bugs or they disclose. I wonder if anyone asked the end users what they want?
- Adobe reader enabling protected mode. This is a very major development… if it works. Also curious to see what they do for Macs.
- Oracle to release 59 critical patches in security update. Is it just me, or do they have more security patches than bug fixes nowadays?
- Connecticut AG reaches agreement with Health Net over data breach.
Blog Comment of the Week
Remember, for every comment selected, Securosis makes a $25 donation to Hackers for Charity. This week’s best comment goes to Jay Jacobs, in response to FireStarter: an Encrypted Value Is Not a Token.
@Adrian – I must be missing the point, my apologies; perhaps I’m just approaching this from too much of a cryptonerd perspective. Though I’d like to think I’m not being overly theoretical.
To extend your example: for any merchant that wants to gain access to the de-tokenized content, we will need to make a de-tokenization interface available to them. They will have the ability to get at the credit card/PAN of every token they have. From the crypto side, if releasing keys to merchants is unacceptable, require that merchants return ciphertext to be decrypted so the key is not shared… What’s the difference between those two?
Let’s say my cryptosystem leverages a networked HSM. Clients connect and authenticate, send in an account number, and get back ciphertext. In order to reverse that operation, a client would have to connect and authenticate, send in ciphertext, and receive back an account number. Is it not safe to assume that the ciphertext can be passed around safely? Why should systems that only deal in that ciphertext be in scope for PCI when an equivalent token is considered out of scope?
Conversely, how do clients authenticate into a tokenization system? Because the security of the tokens (from an attackers perspective) is basically shifted to that authentication method. What if it’s a password stored next to the tokens? What if it’s mutual SSL authentication using asymmetric keys? Are we just back to needing good key management and access control?
My whole point is that, from my viewpoint, encrypting data is getting a bad rap when the problem is poorly implemented security controls. I don’t see any reason to believe that we can’t have poorly implemented tokenization systems.
If we can’t control access into a cryptosystem, I don’t see why we’d do any better controlling access to a token system. With PCI DSS saying tokenization is “better”, my guess is we’ll see a whole bunch of mediocre token systems that will eventually lead us to realize that hey, we can build just as craptastic tokenization systems as we have cryptosystems.
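To make the distinction Jay is probing concrete: a token is a random surrogate with no mathematical path back to the PAN, while ciphertext can be reversed by anyone holding the key. Here is a minimal sketch of a token vault, purely illustrative (the `TokenVault` class and `tok_` format are hypothetical, not any vendor's implementation):

```python
import secrets

# Hypothetical, minimal token-vault sketch: a token is a random
# surrogate with NO mathematical relationship to the PAN, so
# "de-tokenizing" is a vault lookup, not a decryption. Contrast with
# ciphertext, where anyone holding the key can reverse the value.
class TokenVault:
    def __init__(self):
        # token -> PAN mapping; in a real system this store is itself
        # encrypted and heavily access-controlled
        self._vault = {}

    def tokenize(self, pan: str) -> str:
        # Preserve the last four digits, as many token formats do
        token = "tok_" + secrets.token_hex(8) + "_" + pan[-4:]
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Jay's point lands here: access control on this single call
        # is the entire security model of the token server
        return self._vault[token]

vault = TokenVault()
t = vault.tokenize("4111111111111111")
assert vault.detokenize(t) == "4111111111111111"  # lookup, not decryption
assert t.endswith("1111")  # format-preserving tail only
```

Which is exactly why his closing point bites: the token itself is inert, but a badly guarded `detokenize` interface leaves you no better off than a badly guarded decryption key.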