Friday Summary: October 28, 2011
By Adrian Lane
I really enjoyed Marco Arment’s I finally cracked it post, both because he captured the essence of Apple TV here and now, and because his views on media – as a consumer – are exactly in line with mine. Calling DVRs “a bad hack” is spot-on. I went through this process 7 years ago when I got rid of television. I could not accept a 5-minute American Idol segment in the middle of the 30-minute Fox ‘news’ broadcast. Nor the other 200 channels of crap surrounding the three channels I wanted. At the time people thought I was nuts, but now I run into people (okay – only a handful) who have pulled the plug on the broadcast media of cable and satellite. Most people are still frustrated with me when they say “Hey, did you see SuperJunk this weekend?” and I say “No, I don’t get television.” They mutter something like ‘Luddite’ and wander off. Don’t get me wrong, I have a television. A very nice one in fact, but I have been calling it a ‘monitor’ for the last few years because it’s not attached to broadcast media. But not getting broadcast television does not make me a Luddite – quite the contrary, I am waiting for the future.
I am waiting for the day when I can get the rest of the content I want just as I get streaming Netflix today. And it’s not just the content, but the user experience as well. I don’t want to be boxed into some bizarre set of rules the content owners think I should follow. I don’t want half-baked DRM systems or advertising thrust at me – and believe me, this is what many of the other streaming boxes are trying to do. I don’t want to interact with a content provider because I am not interested – it was a bad idea proven foul a long time ago.
Just let me watch what I want to watch when I want to watch it. Not so hard.
But I wanted to comment on Marco’s point about Apple and their ability to be disruptive. My guess is that Apple TV will go fully a la carte: show by show, game by game, movie by movie. But the major difference is we would get first-run content, not just stuff from 2004. Somebody told me the other day that HBO stands for “Hey, Beastmaster’s On!”, which is how some of the streaming services and many of the movie channels feel. SOS/DD. The long tail of the legacy television market. The major gap in today’s streaming is first-run programming. All I really want that I don’t have today is the Daily Show and… the National Football League (cue Monday Night Football soundtrack).
And that’s the point where Mr. Arment’s analysis and mine diverge – the NFL. I agree that whatever Apple offers will likely be disruptive because the technology will simplify how we watch, rather than tiptoeing around legacy businesses and perverse contracts. But today there is only one game in town: the NFL. That’s why all those people pay $60 (in many cases it’s closer to $120) a month – to watch football. You placate kids with DVDs; you subscribe to cable for football! Just about every man I know, and 30% of the women, want to watch their NFL home team on Sunday. It’s the last remaining reason people still pay for cable or satellite in this economy. Make no mistake – the NFL is the 600 lb. gorilla of television. They currently hold sway over every cable and satellite network in the US. And the NFL makes a ridiculous amount of money because networks must pay princely sums for NFL games to be in the market. Which is why the distributors are so persnickety about not having NFL games on the Internet. Why else would they twist the arm of the federal government to shut down a guy relaying NFL games onto the Internet? (Thanks a ton for that one, you a-holes – metropolitan areas broadcast over-the-air for free but it’s illegal to stream? WTF?)
Nobody broadcasts live games over the Internet!?! Why not?!? The NFL could do it directly – they are already set up with “Game Pass” and “Game Rewind” – but likely can’t because fat network contracts prohibit it. Someone would need to spend the $$$ to get Internet distribution rights. Someone should, because there is huge demand, but there are only a handful of firms which could ante up a billion dollars to compete with DirecTV. But when this finally happens it will be seriously disruptive. Cable boxes will be (gleefully) dumped. Satellite providers will actually have competition, forcing them to alter their contracts and rates, and go back to delivering a quality picture. ISPs will be pressured to actually deliver the bandwidth they claim to be selling. Consumers will get what they want at lower cost and with greater convenience. Networks will scramble to license the rest of their content to any streaming service provider they can, increasing content availability and pushing prices lower. If Apple wants to be disruptive, they will stream NFL games over the Internet on demand. If they can get rights to broadcast NFL for a reasonable price, they win. The company that gets the NFL for streaming wins. If Apple doesn’t, bet that Amazon will.
On to the Summary:
Webcasts, Podcasts, Outside Writing, and Conferences
- Rich quoted on SaaS security services.
- Adrian quoted in SearchSOA.
- Compliance Holds Up Los Angeles Google Apps Deployment. Mike plays master of the obvious. Ask the auditor before you commit to something that might be blocked by compliance. Duh!
Favorite Securosis Posts
- Adrian Lane: A Kick-Ass Cloud Database Security Automation Example. And most IaaS cloud providers have the hooks to do most of this today. You can even script the removal of base database utilities you don’t want. Granted, you still have to set permissions on data and users, but the infrastructure can be fully secured when it’s instantiated – to the best of your ability.
- Mike Rothman: A Kick-Ass Cloud Database Security Automation Example. The future is here. Today. Kind of. And it’s not called SkyNet. Yet.
- Rich: Next Generation != (Always) Better. This is such a damn pet peeve of mine now. If you are a vendor and say this in a briefing, you should know I will tune out.
Other Securosis Posts
- Applied Network Security Analysis: Collection and Analysis = A Fighting Chance.
- Incite 10/26/2011: The Curious Case of Flat Stanley.
- New Series: Understanding and Selecting a Database Activity Monitoring Solution 2.0.
- Friday Summary: October 21, 2011.
Favorite Outside Posts
- Dave Lewis: Who Else Was Hit by the RSA Attackers?
- Mike Rothman: Assurance of Assessments. Gunnar kills it in this post. Scary stuff. But delusion is a cultural norm in some places. And those places tend to go away over time. Blame Darwin for that.
- Adrian Lane: Will Siri Change the Rules of the Search Game? Not security related, but a great article by our own Rich.
- Rich: NSA helping big banks fight attacks. Well, the article title says “hackers” but that’s just silly. Still, this is the best article I’ve seen on the trickle of information sharing that’s starting to come out.
Project Quant Posts
- DB Quant: Index.
- NSO Quant: Index of Posts.
- NSO Quant: Health Metrics–Device Health.
- NSO Quant: Manage Metrics–Monitor Issues/Tune IDS/IPS.
- NSO Quant: Manage Metrics–Deploy and Audit/Validate.
- NSO Quant: Manage Metrics–Process Change Request and Test/Approve.
- NSO Quant: Manage Metrics–Signature Management.
- NSO Quant: Manage Metrics–Document Policies & Rules.
- NSO Quant: Manage Metrics–Define/Update Policies and Rules.
- NSO Quant: Manage Metrics–Policy Review.
Research Reports and Presentations
- Fact-Based Network Security: Metrics and the Pursuit of Prioritization.
- Tokenization vs. Encryption: Options for Compliance.
- Security Benchmarking: Going Beyond Metrics.
- Understanding and Selecting a File Activity Monitoring Solution.
- Database Activity Monitoring: Software vs. Appliance.
- React Faster and Better: New Approaches for Advanced Incident Response.
- Measuring and Optimizing Database Security Operations (DBQuant).
- Network Security in the Age of Any Computing.
Top News and Posts
- Prism Plans to Recycle Security Camera Images. The Internet. It remembers everything. Forever.
- Congress to release report on hacked satellites. China implicated, but given the low degree of difficulty it could have been anyone.
- Feds to Blacklist Piracy Sites Under House Proposal. It’s not only the wrong idea, it seems built for abuse.
- EFF’s report on HTTPS security.
- How to bypass the iPad password (sorta).
- SecureWorks analysts doubt Duqu and Stuxnet related.
Blog Comment of the Week
This topic came up at the PCI EMEA meeting during the discussion session on P2PE and Tokenization – it was openly discussed and the answers pointed directly back to good key management practices such as segregation, the use of HSMs, and so forth. It’s not always vendors talking about this, as you mention. Indeed, FPE came up too – brought up by the PCI SSC panel consisting of the Advisory/Management Board – clearly a topic of interest given its wide adoption already – and the panel made it clear that it was acceptable if there was the right level of cryptographic backing – proofs, standards tracks, etc.
So, to clarify – encrypted data can be taken out of scope of PCI. In the use case of P2PE (Point to Point Encryption), it’s absolutely clear that encrypted data can be out of scope when the implementation adheres to the P2PE requirements or a QSA evaluates the environment as such – this is to get the data from all those endpoints: secure devices capturing card/chip data and encrypting it there and then for transmission up to the acquirer. There is very specific guidance in the current P2PE document on what can be de-scoped. Strong cryptography in the absence of keys is irreversible in any practical way. That is a fact and clearly stated.
Of course, P2PE should not be confused with Tokenization – though it often is – they are entirely complementary. P2PE is about sending pre-authorization cardholder data or live payment data to an acquirer. Tokenization is useful after authorization – batch files, transaction logs for settlement, and post-sales processes like chargebacks. Keep in mind that with an on-premises tokenization solution, there will still be a need to detokenize prior to any PAN-driven process – and it’s absolutely critical – for the merchant’s sake, to get their money from transactions they’ve now tokenized – that the token processes have the highest integrity and reliability. If those tokens go missing, are corrupted, or purged accidentally, it might have a serious financial impact if the batch settlement files can’t execute.
FPE really comes into its own for P2PE in the merchant’s case – when the path from the point of capture to the destination is complicated by legacy systems expecting a PAN, SAD, or Track data, and where system change impact needs to be minimal to none. Then, once the transaction data has been securely transferred to the acquirer without the merchant’s ability to decrypt in any shape or form, and a token returned in response – you have the ideal scope reduction scenario: the merchant is not managing a token database or vault/code book, responsibility for key management rests with the acquirer, and the exposure of PAN data is minimized.
With regards to your decision tree on scope for the tokenization use case – I think you’ve left out an important use case: when the token – or transaction IDs, for that matter, used as the surrogate to the PAN for post-authorization uses – is cryptographically generated by a third party where the keys are managed entirely independently – e.g., in an HSM at an acquirer – then this use case falls into the PCI FAQ on encrypted data and scope – FAQ 10359, which parallels the P2PE guidance.
With respect to the statement “The supporting encryption and key management systems are accessible – meaning PAN data is available to authorized users, so FPE cannot remove systems from the audit scope”, it’s not quite clear what you mean. Contrast this statement with a very common scenario comparable to an acquirer issuing tokens after a P2PE transaction – PIN debit transactions and transaction response codes. Here we have encrypted data going up, derived data coming back. Do merchants have access to PIN decryption keys, for instance, because they accept PIN debit at the POS? No. Do merchants have direct access to the systems which create the derived transaction response codes post-authorization for reversals from the acquirer? No. Perhaps a badly implemented on-premises solution that’s not even close to PCI compliant could create an exposure like this, but a Tokenization approach of the same quality would be equally at risk of “accessibility”.
Of course, scope reduction is a major driver for PCI technology investment – but eliminating it is not always possible. This is one of the reasons we provide a choice – randomly generated tokens disassociated from the PAN in every way, or Format Preserving Encryption – from one platform. The decision on approach can be suited to risk, cost, scope, implementation – on premise, acquirer etc and so on.
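To make the contrast concrete, here is a toy sketch of the two approaches, assuming nothing beyond standard Python: a random token has no mathematical relationship to the PAN, so reversing it requires a vault lookup, while a format-preserving transform keeps the 16-digit shape and is reversible with the key alone. The `toy_fpe` function is a keyed digit-substitution stand-in for illustration only – real FPE uses a vetted mode such as NIST FF1 – and all names here are hypothetical:

```python
import secrets
import hmac
import hashlib

# Random tokenization: the token is unrelated to the PAN; a vault maps token -> PAN.
vault = {}

def tokenize(pan: str) -> str:
    token = "".join(secrets.choice("0123456789") for _ in range(len(pan)))
    vault[token] = pan          # detokenization requires this lookup table
    return token

def detokenize(token: str) -> str:
    return vault[token]         # no key can recover the PAN without the vault

# Toy "format-preserving" keyed transform (NOT real FPE such as NIST FF1):
# each digit is shifted by a keystream byte derived from the key and its
# position, so the output keeps the all-digit format and is reversible
# with the key alone -- no vault needed.
def toy_fpe(pan: str, key: bytes, decrypt: bool = False) -> str:
    out = []
    for i, d in enumerate(pan):
        k = hmac.new(key, i.to_bytes(4, "big"), hashlib.sha256).digest()[0] % 10
        shift = -k if decrypt else k
        out.append(str((int(d) + shift) % 10))
    return "".join(out)

pan = "4111111111111111"
key = b"acquirer-managed-key"   # in practice this would live in an HSM at the acquirer

t = tokenize(pan)
assert detokenize(t) == pan     # reversal only via the vault

ct = toy_fpe(pan, key)
assert len(ct) == len(pan) and ct.isdigit()    # format preserved
assert toy_fpe(ct, key, decrypt=True) == pan   # reversible with the key
```

The trade-off the comment describes falls out of the sketch: the random token pushes all risk into vault integrity and availability (lose the vault, lose settlement), while the format-preserving approach pushes it into key management, which is why keeping those keys with the acquirer is what takes the merchant out of scope.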
I do agree with Stephen’s post – taking an “out of scope, out of mind” posture is financially attractive, but there’s more data in a merchant’s IT systems than just cardholder data – and most larger merchants already know this and do care – for their customers’ sake and brand reputation – and are taking steps which go beyond where PCI leaves off to protect data, and to ensure their systems aren’t scuppered by hackers walking in the front door of a de-scoped environment. De-scoped does not mean secure.
Disclaimer: I work for a firm that provides FPE, P2PE and Random token technology.