Securosis Research

Friday Summary: August 20, 2010

Before I get into the Summary, I want to lead with some pretty big news: the Liquidmatrix team of Dave Lewis and James Arlen has joined Securosis as Contributing Analysts! By the time you read this Rich’s announcement should already be live, but what the heck – we are happy enough to cover it here as well. Over and above what Rich mentioned, this means we will continue to expand our coverage areas. It also means that our research goes through a more rigorous shredding process before launch. Actually, it’s the egos that get peer shredding – the research just gets better. And on a personal note I am very happy about this as well, as a long-time reader of the Liquidmatrix blog, and having seen both Dave and James present at conferences over the years. They should bring great perspective and ‘Incite’ to the blog. Cheers, guys!

I love talking to digital hardware designers for computers. Data is either a one or a zero and there is nothing in between. No ambiguity. It’s like a religion that, to most of them, bits are bits. Which is true until it’s not. What I mean is that there is a lot more information than simple ones and zeros. Where the bits come from, the accuracy of the bits, and when the bits arrive are just as important to their value. If you have ever had a timer chip go bad on a circuit, you understand that sequence and timing make a huge difference to the meaning of bits. If you have ever tried to collect entropy from circuits for a pseudo-random number generator, you saw noise and spurious data from the transistors. Weird little ‘behavioral’ patterns or distortions in circuits, or bad assumptions about data, provide clues for breaking supposedly secure systems, so while the hardware designers don’t always get this, hackers do.

But security is not my real topic today – actually, it’s music. I was surprised to learn that audio engineers get this concept of digititis. In spades! I witnessed this recently with Digital to Analog Converters (DACs). I spend a lot of my free time playing music and fiddling with stereo equipment. I have been listening to computer-based audio systems, and was pleasantly surprised to learn that some of the new DACs reassemble digital audio files and actually make them sound like music. Not that hard, thin, sterile substitute. It turns out that jitter – incorrect timing skew down as low as the picosecond level – causes music to sound like, well, an Excel spreadsheet. Reassembling the bits with exactly the right timing restores much of the essence of music to digital reproduction. The human ear and brain make an amazing combination for detecting tiny amounts of jitter. Or changes in sound by substituting copper for silver cabling. Heck, we seem to be able to tell the difference between analog and digital rectifiers in stereo equipment power supplies. It’s very interesting how the resurgence of interest in analog is refining our understanding of the digital realm, and in the process making music playback a whole lot better. The convenience of digital playback was never enough to convince me to invest in a serious digital HiFi front end, but it’s getting to the point that it sounds really good and beats most vinyl playback. I am looking at DAC options to stream from a Mac Mini as my primary music system.

Finally, no news on Nugget Two, the sequel. Rich has been mum on details even to us, but we figure arrival should be about two weeks away. On to the Summary:

Webcasts, Podcasts, Outside Writing, and Conferences

  • Intel to acquire McAfee in $7.7 billion deal. Mike quoted as being baffled. Which is not a surprise…
  • Adrian’s Dark Reading post on Database Threat Modeling And Strip Poker.
  • Good interview with Mike Rothman on Infosec Resources.
  • Sending a Mac away. There are many things to expunge before you let a Mac (or any other computer) out of your personal possession. This list feels long but it’s short compared to taking a laptop to China…

Favorite Securosis Posts

  • Mike Rothman: Acquisition Doesn’t Mean Commoditization.
  • David Mortman: Tokenization: Selection Criteria.
  • Adrian Lane: Since I am a contrarian I can’t go with David Mortman’s Acquisition Doesn’t Mean Commoditization, so I’ll pick Rich’s Sour Grapes Incite snippet. Not a whole post, but dead on the money!
  • Rich Mogull: Acquisition Doesn’t Mean Commoditization.
  • Gunnar Peterson: HP (Finally) Acquires Fortify.

Other Securosis Posts

  • Liquidmatrix + Securosis: Dave Lewis and James Arlen Join Securosis as Contributing Analysts.
  • Data Encryption for PCI 101: Introduction.
  • Another Take on McAfee/Intel.
  • McAfee: A (Secure) Chip on Intel’s Block.
  • Acquisition Doesn’t Mean Commoditization.
  • Incite 8/18/2010: Smokey and the Speed Gun.
  • Tokenization: Selection Criteria.

Favorite Outside Posts

  • Mike Rothman: Career Advice Tuesday – “How Did You Find Your Mentor”. Hopefully Mike and Lee didn’t find a mentor on FriendFinder. But seriously, everyone needs mentors to help them get to the next level.
  • David Mortman: Cloud Computing & Polycentric Risk Tolerances.
  • Adrian Lane: Quality analysis by Andy Jaquith on Horseless Carriage Vendor Buys Buggy-Whips.
  • Rich Mogull: Young will have to change names to escape ‘cyber past’ warns Google’s Eric Schmidt. Honest assessment and totally untrustworthy all at once.
  • Gunnar Peterson: Not a post, but consider this: $4.125B. That’s the average price of acquiring a security company this week.

Project Quant Posts

  • NSO Quant: Manage IDS/IPS – Audit/Validate.
  • NSO Quant: Manage IDS/IPS – Deploy.
  • NSO Quant: Manage IDS/IPS – Test and Approve.
  • NSO Quant: Manage IDS/IPS – Process Change Request.
  • NSO Quant: Manage IDS/IPS – Signature Management.

Research Reports and Presentations

  • White Paper: Endpoint Security Fundamentals.
  • Understanding and Selecting a Database Encryption or Tokenization Solution.
  • Low Hanging Fruit: Quick Wins with Data Loss Prevention.
  • Report: Database Assessment.

Top News and Posts

  • Something about some hardware company that bought some other security company. Supposedly big news.
  • And some other hardware company buying another security company. Supposed to change the industry.
  • Kinda cool feature for detecting


McAfee: A (Secure) Chip on Intel’s Block

Ah, the best laid plans. I had my task list all planned out for today and was diving in when my pal Adrian pinged me in our internal chat room about Intel buying McAfee for $7.68 billion. Crap, evidently my alarm didn’t go off and I’m stuck in some Hunter S. Thompson surreal situation where security and chips and clean rooms and men in bunny suits are all around me. But apparently I’m not dreaming. As the press release says, “Inside Intel, the company has elevated the priority of security to be on par with its strategic focus areas in energy-efficient performance and Internet connectivity.” Listen, I’ll be the first to say I’m not that smart, certainly not smart enough to gamble $7.68 billion of my investors’ money on what looks like a square peg in a round hole. But let’s not jump to conclusions, OK? First things first: Dave DeWalt and his management team have created a tremendous amount of value for McAfee shareholders over the last five years. When DeWalt came in McAfee was reeling from a stock option scandal, poor execution, and a weak strategy. And now they’ve pulled off the biggest coup of them all, selling Intel a new pillar that it’s not clear they need for a 60% premium. That’s one expensive pillar. Let’s take a step back. McAfee was the largest stand-alone security play out there. They had pretty much all the pieces of the puzzle, had invested a significant amount in research, and seemed to have a defensible strategy moving forward. Sure, it seemed their business was leveling off and DeWalt had already picked the low hanging fruit. But why would they sell now, and why to Intel? Yeah, I’m scratching my head too. If we go back to the press release, Intel CEO Paul Otellini explains a bit, “In the past, energy-efficient performance and connectivity have defined computing requirements. Looking forward, security will join those as a third pillar of what people demand from all computing experiences.” So basically they believe that security is critical to any and every computing experience. You know, I actually believe that. We’ve been saying for a long time that security isn’t really a business, it’s something that has to be woven into the fabric of everything in IT and computing. Obviously Intel has the breadth and balance sheet to make that happen, starting from the chips and moving up. But does McAfee have the goods to get Intel there? That’s where I’m coming up short. AV is not something that really works any more. So how do you build that into a chip, and what does it get you? I know McAfee does a lot more than just AV, but when you think about silicon it’s got to be about detecting something bad and doing it quickly and pervasively. A lot of the future is in cloud-based security intelligence (things like reputation and the like), and I guess that would be a play with Intel’s Connectivity business if they build reputation checking into the chipsets. Maybe. I guess McAfee has also been working on embedded solutions (especially for mobile), but that stuff is a long way off. And at a 60% premium, a long way off is the wrong answer. For a go-to-market model and strategy there is very little synergy. Intel doesn’t sell much direct to consumers or businesses, so it’s not like they can just pump McAfee products into their existing channels and justify a 60% premium. That’s why I have a hard time with this deal. This is about stuff that will (maybe) happen in 7-10 years. 
You don’t make strategic decisions based purely on what Wall Street wants – you need to be able to sell the story to everyone – especially investors. I don’t get it. On the conference call they are flapping their lips about consumers and mobile devices and how Intel has done software deals before (yeah, Wind River is a household name for consumers and small business). Their most relevant software deal was LANDesk. Intel bought them with pomp and circumstance during their last round of diversification, and it was a train wreck. They had no path to market and struggled until they spun it out a while back. It’s not clear to me how this is different, especially when a lot of the stuff related to security within silicon could have been done with partnerships and smaller tuck-in acquisitions. Mostly their position is that we need tightly integrated hardware and software, and that McAfee gives Intel the opportunity to sell security software every time they sell silicon. Yeah, the PC makers don’t have any options to sell security software now, do they? In our internal discussion, Rich raised a number of issues with cloud computing, where trusted boot and trusted hardware are critical to the integrity of the entire architecture. And he also wrote a companion post to expand on those thoughts. We get to the same place for different reasons. But I still think Intel could have made a less audacious move (actually a number of them) that entailed far less risk than buying McAfee. Tactically, what does this mean for the industry? Well, clearly HP and IBM are the losers here. We do believe security is intrinsic to big IT, so HP & IBM need broader security strategies and capabilities. McAfee was a logical play for either to drive a broad security platform through a global, huge, highly trusted distribution channel (that already sells to the same customers, unlike Intel’s). We’ve all been hearing rumors about McAfee getting acquired for a while, so I’m sure both IBM and HP took long hard looks at McAfee. But they probably couldn’t justify a 60% premium. McAfee customers are fine – for the time being. McAfee will run standalone for the foreseeable future, though you have to wonder about McAfee’s ability to be as acquisitive and nimble as they’ve been. But there is always a focus issue during integration, and there will be the inevitable brain drain.


Another Take on McAfee/Intel

A few moments ago Mike posted his take on the McAfee/Intel acquisition, and for the most part I agree with him. “For the most part” is my nice way of saying I think Mike nailed the surface but missed some of the depths. Despite what they try to teach you in business school (not that I went to one), acquisitions, even among Very Big Companies, don’t always make sense. Often they are as much about emotion and groupthink as logic. Looking at Intel and McAfee I can see a way this deal makes sense, but I see some obstacles to making this work, and suspect they will materially reduce the value Intel can realize from this acquisition. Intel wants to acquire McAfee for three primary reasons:

  • The name: Yes, they could have bought some dinky startup or even a mid-sized firm for a fraction of what they paid for McAfee, but no one would know who they were. Within the security world there are a handful or two of household names; but when you span government, business, and consumers the only names are the guys that sell the most cardboard boxes at Costco and Wal-Mart: Symantec and McAfee. If they want to market themselves as having a secure platform to the widest audience possible, only those two names bring instant recognition and trust. It doesn’t even matter what the product does. Trust me, RSA wouldn’t have gotten nearly the valuation they did in the EMC deal if it weren’t for the brand name and its penetration among enterprise buyers. And keep in mind that the US federal government basically only runs McAfee and Symantec on endpoints… which is, I suspect, another important factor. If you want to break into the soda game and have the cash, you buy Coke or Pepsi – not Shasta.
  • Virtualization and cloud computing: There are some very significant long term issues with assuring the security of the hardware/software interface in cloud computing. Q: How can you secure and monitor a hypervisor with other software running on the same hardware? A: You can’t. How do you know your VM is even booting within a trusted environment? Intel has been working on these problems for years and announced partnerships years ago with McAfee, Symantec, and other security vendors. Now Intel can sell their chips and boards with a McAfee logo on them – but customers were always going to get the tools, so it’s not clear the deal really provides value here.
  • Mobile computing: Meaning mobile phones, not laptops. There are billions more of these devices in the world than general purpose computers, and opportunities to embed more security into the platforms.

Now here’s why I don’t think Intel will ever see the full value they hope for:

  • Symantec, EMC/RSA, and other security vendors will fight this tooth and nail. They need assurances that they will have the same access to platforms from the biggest chipmaker on the planet. A lot of tech lawyers are about to get new BMWs. Maybe even a Tesla or two in eco-conscious states.
  • If they have to keep the platform open to competitors (and they will), then bundling is limited and will be closely monitored by the competition and governments – this isn’t only a U.S. issue.
  • On the mobile side, as Andrew Jaquith explained so well, Apple/RIM/Microsoft control the platform and the security, not chipmakers. McAfee will still be the third party on those platforms, selling software, but consumers won’t be looking for the little logo on the phone if they either think it’s secure, it comes with a yellow logo, or they know they can install whatever they want later.
There’s one final angle I’m not as sure about – systems management. Maybe Intel really does want to get into the software game and increase revenue. Certainly McAfee ePolicy Orchestrator is capable of growing past security and into general management. The “green PC” language in their release and call hints in that direction, but I’m just not sure how much of a factor it is. The major value in this deal is that Intel just branded themselves a security company across all market segments – consumer, government, and corporate. But in terms of increasing sales or grabbing full control over platform security (which would enable them to charge a premium), I don’t think this will work out. The good news is that while I don’t think Intel will see the returns they want, I also don’t think this will hurt customers. Much of the integration was in process already (as it is with other McAfee competitors), and McAfee will probably otherwise run independently. Unlike a small vendor, they are big enough and differentiated enough from the rest of Intel to survive. Probably.


Data Encryption for PCI 101: Introduction

Rich and I are kicking off a short series called “Data Encryption 101: A Pragmatic Approach for PCI Compliance”. As the name implies, our goal is to provide actionable advice for PCI compliance as it relates to encrypted data storage. We write a lot about PCI because we get plenty of end-user questions on the subject. Every PCI research project we produce talks specifically about the need to protect credit cards, but we have never before dug into the details of how. This really hit home during the tokenization series – even when you are trying to get rid of credit cards you still need to encrypt data in the token server, but choosing the best way to employ encryption varies depending upon the user’s environment and application processing needs. It’s not like we can point a merchant to the PCI specification and say “Do that”. There is no practical advice in the Data Security Standard for protecting PAN data, and I think some of the acceptable ‘approaches’ are, honestly, a waste of time and effort. PCI says you need to render stored Primary Account Number (at a minimum) unreadable. That’s clear. The specification points to a number of methods they feel are appropriate (hashing, encryption, truncation), emphasizes the need for “strong” cryptography, and raises some operational issues with key storage and disk/database encryption. And that’s where things fall apart – the technology, deployment models, and supporting systems offer hundreds of variations, and many of them are inappropriate for any given situation. These nuggets of information are little more than reference points in a game of “connect the dots”, without an orderly sequence or a good understanding of the picture you are supposedly drawing. Here are some specific ambiguities and misdirections in the PCI standard:

  • Hashing: Hashing is not encryption, and not a great way to protect credit cards. Sure, hashed values can be fairly secure and they are allowed by the PCI DSS specification, but they don’t solve a business problem. Why would you hash rather than encrypting? If you need access to credit card data badly enough to store it in the first place, hashing is a non-starter because you cannot get the original data back. If you don’t need the original numbers at all, replace them with encrypted or random numbers. If you are going to the trouble of storing the credit card number you will want encryption – it is reversible, resistant to dictionary attacks, and more secure.
  • Strong Cryptography: Have you ever seen a vendor advertise weak cryptography? I didn’t think so. Vendors tout strong crypto, and the PCI specification mentions it for a reason: once upon a time there was an issue with vendors developing “custom” obfuscation techniques that were easily broken, or totally screwing up the implementation of otherwise effective ciphers. This problem is exceptionally rare today. The PCI mention of strong cryptography is simply a red herring. Vendors will happily discuss their sooper-strong crypto and how they provide compliant algorithms, but this is a distraction from the selection process. You should not be spending more than a few minutes worrying about the relative strength of encryption ciphers, or the merits of 128 vs. 256 bit keys. PCI provides a list of approved ciphers, and the commercial vendors have done a good job with their implementations. The details are irrelevant to end users.
  • Disk Encryption: The PCI specification mentions disk encryption in a matter-of-fact way that implies it’s an acceptable implementation for concealing stored PAN data. There are several forms of “disk encryption”, just as there are several forms of “database encryption”. Some variants work well for securing media, but offer no meaningful increase in data security for PCI purposes. Encrypted SAN/NAS is one example of disk encryption that is wholly unsuitable, as requests from the OS and applications automatically receive unencrypted data. Sure, the data is protected in case someone attempts to cart off your storage array, but that’s not what you need to protect against.
  • Key Management: There is a lot of confusion around key management: how do you verify keys are properly stored? What does it mean that decryption keys should not be tied to accounts, especially since keys are commonly embedded within applications? What are the tradeoffs of central key management? These are principal business concerns that get no coverage in the specification, but they are critical to the selection process for security and cost containment.

Most compliance regulations must balance description against prescription for controls, in order to tell people clearly what they need to do without telling them how it must be done. Standards should describe what needs to be accomplished without being so specific that they forbid effective technologies and methods. The PCI Data Security Standard is not particularly successful at striking this balance, so our goal for this series is to cut through some of these confusing issues, making specific recommendations for what technologies are effective and how you should approach the decision-making process. Unlike most of our Understanding and Selecting series on security topics, this will be a short series of posts, very focused on meeting PCI’s data storage requirement. In our next post we will create a strategic outline for securing stored payment data and discuss suitable encryption tools that address common customer use cases. We’ll follow up with a discussion of key management and supporting infrastructure considerations, then finally a list of criteria to consider when evaluating and purchasing data encryption solutions.
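To make the hashing and encryption points above concrete, here is a minimal Python sketch – not taken from the PCI standard or any vendor, the card number is a well-known test value and the key handling is deliberately simplified. It shows why an unsalted hash of a PAN is thin protection: the first six and last four digits are routinely stored or known in the clear, leaving only about a million candidates to brute force. Encryption, by contrast, is reversible only for whoever holds the key.

```python
import hashlib
import os
from itertools import product

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

PAN = "4012888888881881"  # well-known Visa test number

# --- "Protection" by unsalted hashing --------------------------------------
stored_hash = hashlib.sha256(PAN.encode()).hexdigest()
bin_prefix, last4 = PAN[:6], PAN[-4:]   # commonly kept in the clear anyway

def brute_force(stored_hash: str, bin_prefix: str, last4: str) -> str:
    """Recover the PAN by trying every possible 6-digit middle section."""
    for middle in product("0123456789", repeat=6):
        candidate = bin_prefix + "".join(middle) + last4
        if hashlib.sha256(candidate.encode()).hexdigest() == stored_hash:
            return candidate
    raise ValueError("not found")

print("Recovered from hash:", brute_force(stored_hash, bin_prefix, last4))

# --- Protection by encryption ----------------------------------------------
key = AESGCM.generate_key(bit_length=256)   # in practice, held by a key manager
nonce = os.urandom(12)                      # unique per encryption
ciphertext = AESGCM(key).encrypt(nonce, PAN.encode(), None)

# Reversible for the key holder; brute-forcing a 256-bit key is not an option.
print("Decrypted with key:", AESGCM(key).decrypt(nonce, ciphertext, None).decode())
```

Salting the hash slows the brute force down, but it doesn’t change the business problem: you still cannot get the original number back when you need it, which is the point made above.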


Liquidmatrix + Securosis: Dave Lewis and James Arlen Join Securosis as Contributing Analysts

In our ongoing quest for world domination, we are excited to announce our formal partnership with our friends over at Liquidmatrix. Beginning immediately Dave Lewis (@gattaca) and James Arlen (@myrcurial) are joining the staff as Contributing Analysts. Dave and James will be contributing to the Securosis blog and taking part in some of our research and analysis projects. If you want to ask them questions or just say “Hi,” aside from their normal emails you can now reach them at dlewis and jarlen at securosis.com. Within the next few days we will also start providing the Liquidmatrix Security Briefing through the Securosis RSS feed and email distribution list (for those of you on our Daily Digest list). We will just be providing the Briefing – Dave, James, and their other contributors will continue to blog on other issues at the Liquidmatrix site (http://www.liquidmatrix.org/blog/). But you’ll also start seeing new content from them here at Securosis as they participate in our research projects. We’re biased but we think this is a great partnership. Aside from gaining two more really smart guys with a lot of security experience, this also increases our ability to keep all of you up to date on the latest security news. I’d call it a “win-win”, but I think they’ll figure out soon enough that Securosis is the one gaining the most here. (Don’t worry, per SOP we locked them into oppressive ironclad contracts). Dave and James now join David Mortman and Gunnar Peterson in our Contributing Analyst program. Which means Mike, Adrian, and I are officially outnumbered and a bit nervous.


Incite 8/18/2010: Smokey and the Speed Gun

Whatever happened to the human touch? And personal service? Those seem to be hallmarks of days gone by. It’s too bad. Since I don’t like people, I tend not to develop relationships with my bankers or pharmacists or clergy – or pretty much anyone, come to think of it. But I guess a lot of other people did, and they likely miss that person-to-person interaction. Why do I bring this up? On my journey to the Northern regions earlier this summer, I passed through Washington DC on the way to the beach in Delaware. I hardly even remember that section of the journey, but evidently I left a bit of an impression – with an automated speed trap. Yes, it was a good day when I opened my mail and saw a nice little letter from the DC Government requesting $150 for violating their speed laws. The picture below is how they explain the technology.

I remember the good old days when if you got caught speeding, you knew it. You had the horror of the flashing lights in your rear view mirror. There was the thought exercise of figuring out what story would perhaps provide a warning and not a ticket. The indignity of sitting on the side of the road as the officer did whatever officers do for 20 minutes. Maybe making sure you aren’t a convicted felon, driving in a stolen vehicle, or sexting with someone. There was none of that. Just an Internet site requesting my money. And that’s the reality of the situation. The way I understand it, speeding laws got enacted for safety purposes, right? It’s dangerous to go 120 mph on a highway (ask Tyreke Evans). But this has nothing to do with safety. This is a shakedown, pure and simple. DC may as well just put a toll booth on the 14th Street bridge and collect $150 from everyone who crosses. Of course, I consulted the Google to figure out whether I could beat the citation – hoping for a precedent that the tickets don’t hold up under scrutiny. Could I claim I wasn’t driving the car, or raise vague uncertainties about the technology? Not so much. There were a few examples, but none were applicable to my situation. The faceless RoboCop got me.

I’m glad these machines weren’t around when I was a kid. Can you imagine how much fun Smokey and the Bandit would have been if Buford T. Justice used one of these automated speed traps? The Bandit would have gotten his cargo to the destination with nary a car chase. The biggest impact would have been a few traffic citations waiting in his mailbox when he returned. I suspect that wouldn’t have gotten many folks to the theaters. – Mike.

Photo credits: “Police Department budget cutbacks?” originally uploaded by Brent Moore

Recent Securosis Posts

Last week we welcomed Gunnar Peterson as a Contributing Analyst and we are stoked. But we aren’t done yet, so keep an eye on the blog and Twitter toward the end of the week for more fun. Suffice it to say we’ll need to increase our beer budget for the next Securosis all-hands meeting.

  • HP (Finally) Acquires Fortify
  • Gunnar Peterson Joins Securosis As a Contributing Analyst
  • Identity and Access Management Commoditization: A Talk of Two Cities
  • Friday Summary: August 13, 2010
  • Tokenization Series:
    Tokenization: Use Cases, Part 1
    Tokenization: Use Cases, Part 2
    Tokenization: Use Cases, Part 3
    Tokenization: Selection Criteria
  • Various NSO Quant posts:
    Manage Firewall Process Revisited
    Manage IDS/IPS Process Map (Updated)
    Manage IDS/IPS – Policy Review
    Manage IDS/IPS – Define/Update Policies & Rules
    Manage IDS/IPS – Document Policies & Rules
    Manage IDS/IPS – Signature Management

Incite 4 U

  • No Control… – Shrdlu once again hits the nail right on the head with her post on Span of Control. We talking heads do have a nasty habit of assuming that logic prevails in organizations and that business people will make rational decisions (like not authorizing the off-shore partner to have full access to all intellectual property) and give us the resources we need to do our jobs. Ha! Clearly that isn’t the case, and obviously not having control over the systems we are supposed to protect makes things a wee bit harder. I also love her perspectives on Jericho and GRC. Amen, sister! We need to remember security is as much about persuading peers to do the right thing as it is about the technical aspects. If you’ve got no control, it’s time to start breaking out those Dale Carnegie books again. – MR
  • Sour Grapes? – I’d like you to think back to your preschool art class. Remember how sometimes the teacher would pick a few of the best pieces to hang on the class wall or for your preschool art show? Back in the days when it was legal to have “losers”? Ask yourself: were you the kid who was a little disappointed but happy for your classmate? Or did you sulk a bit but get over it? Or were you the little jerk who would kick the winners in the shins and try to steal their Twinkies? We’ve seen a fair few sour grape blog posts and press releases from competitors after acquisitions, but Veracode’s CEO might need a time out. I have a lot of friends over there, but this isn’t the way to show that you’re next in line for success. If you’re ever in that position, you’ll look a lot better being gracious and congratulatory rather than bitter and snarky. – RM
  • Cutting Compliance Corners – Security’s already been cut to the bone and anything that can be done must be within a compliance context. But it’s inevitable that as things remain tight, especially for small business, they’ll finally realize that compliance doesn’t really help them sell more stuff. Or spend less money doing what they already do. So it’s logical that many SMB organizations would start trying to reduce compliance costs,


Acquisition Doesn’t Mean Commoditization

There has been plenty of discussion of what HP’s recent acquisition of Fortify means in terms of commoditization and consolidation in the market. The reality is that most acquisitions by large vendors are about covering perceived holes in their product line. In other words this is really just the market acknowledging the legitimacy of the product or feature set. Don’t get me wrong – legitimization is very important, but it doesn’t necessarily mean either consolidation or commoditization, though they both indicate some level of legitimization. Commoditization is actually at odds with consolidation. Like legitimization, they are both important aspects of the product/market maturity curve. Consolidation is when the number of vendors in a market radically decreases due to acquisitions by larger vendors (HP, IBM, McAfee, Symantec – you get the idea) or straight failures causing companies to shut down. Consolidation – especially the acquisition type – indicates that the product space is beginning to be legitimized in the eyes of customers. At the other end of the legitimization/maturity curve we have commoditization. This is where the market has completely legitimized the product space, and in fact there is little to no innovation going on there. Essentially all the products have become morally equivalent, and as far as customers are concerned there is little or no compelling technical reason to choose one vendor over another. At that point it comes down to cost: which vendor will provide the product at the lowest capital and operational costs? De-consolidation is also correlated with commoditization. One key indicator of commoditization is an increase in the number of vendors. A great example of this is desktops, laptops, and servers. They are pretty much all the same and it’s really a question of which nameplate is on the front. In the security space, you can see this clearly with firewalls/routers for small offices & homes (“SOHO”), and we are starting to see it with AV as well. As for HP buying Fortify, it’s neither consolidation nor commoditization. The market hasn’t shifted in either direction enough for those. It is, however, legitimization of code auditing tools as a product category.


HP (Finally) Acquires Fortify

One of the great things about Twitter and iChat is their ability to fuel the rumor mill. The back-office chatter for the last couple months, both within and outside Securosis, has been about rumors of HP buying Fortify Software. So we weren’t surprised when HP announced this morning that they are acquiring Fortify Software for an “undisclosed sum.” Well, not publicly disclosed anyway. In our best KGB voice, “Ve have vays of making dem talk.” And talk they did. If you are not up to speed on Fortify, the core of their offering is “white box” application testing software. This basically means they automate several aspects of code scanning. But their business model is built on both products and services for secure software development processes as a whole – not only to help detect defects, but also helping modify processes to prevent poor coding practices, with tool integration to track development. Recently they have announced products for cloud deployments (who hasn’t?), with their Fortify360 and Fortify on Demand products designed to address potential weaknesses in network addressing and platform trust. New businesses aside, the white box testing products and services account for the bulk of their revenue. Fortify was one of the early players in this market, and focused on the high end of the large enterprise market. This means Fortify was subject to the vagaries of large value enterprise sales cycles, which tend to make revenues somewhat lumpy and unpredictable, and we heard sales were down a bit over the last couple quarters. Of course we can’t publicly substantiate this for a private company, but we believe it. To be clear, this is not an indicator of product quality issues or lack of a viable market – variations in Fortify’s numbers have more to do with their sales process than the market’s perceived value for white box testing or their products. Gary McGraw’s timely post on the Software Security Market reinforces this, and is a fair indication of the growing need for security testing software and services. Regardless of individual vendor numbers (which are less than precise), the market as a whole is trending upwards, but probably not at the rate we’d all like to see given the critical importance of developing secure software. The criticisms I most often hear about Fortify focus on their pricing and recommended development methodology – completely geared towards large enterprises, they introduce unneeded complexity for normal organizations. From an analyst perspective my criticisms of Fortify have also been that their enterprise focus made their offerings a non-starter for mid-market companies, which develop many web applications and have an even more pressing need for white box testing. Fortify’s recommended processes and methodologies may appeal to enterprises, but their maturity model and development lifecycles just don’t resonate outside the Fortune 500. The analysts who will not be named have placed Fortify’s product offering far in the lead for both innovation and effectiveness, but in my experience Fortify faces stiffer competition than those analysts would have you believe. Depending on market segment and the problem to be solved, there are equally compelling alternative products. But that’s all much less relevant under HP’s stewardship. 
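For readers who haven’t used this class of tool, here is a toy illustration of what “white box” (static) analysis means in practice – examining source code without running it. This sketch uses Python’s standard ast module and a single hardcoded rule; it is in no way Fortify’s engine or rule set, just a way to show the idea of flagging a defect (SQL assembled by string concatenation) from code structure alone.

```python
import ast

SOURCE = '''
def get_user(db, user_id):
    return db.execute("SELECT * FROM users WHERE id = " + user_id)
'''

class SqlConcatFinder(ast.NodeVisitor):
    """Flag .execute() calls whose query argument is built by string concatenation."""

    def visit_Call(self, node: ast.Call) -> None:
        is_execute = isinstance(node.func, ast.Attribute) and node.func.attr == "execute"
        if is_execute and any(
            isinstance(arg, ast.BinOp) and isinstance(arg.op, ast.Add)
            for arg in node.args
        ):
            print(f"line {node.lineno}: query built by concatenation - possible SQL injection")
        self.generic_visit(node)

SqlConcatFinder().visit(ast.parse(SOURCE))
```

Commercial products obviously go much further – data flow and taint tracking across files, framework awareness, remediation guidance – which is where the process and integration work described above comes in.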
Over the past few years HP has made significant investments to build a full suite of application security solutions, and now has the ability to package the needed application scanning pieces along with the rest of the tools and product integration features that enterprise clients demand. Fortify’s static analysis, assessment, and processes are far more compelling coupled with HP’s black box and back office testing, problem tracking, and application delivery (Mercury). And HP’s sales force is in a much better position to close the large enterprises where Fortify’s product excels. Yes, that means Fortify is a very good fit for HP, further solidifying its secure code strategy. So what does this mean to existing Fortify customers? In the short term I don’t think there will be many changes to the product. The “Hybrid 2.0” vision spelled out in February 2010 is a good indicator that for the first couple quarters the security product suites will merge without significant functionality changes. The changes will show up as necessary to compete with IBM and its recent acquisition of Ounce Labs – tighter integration with problem tracking systems and some features tuned for IBM development platforms. This means that the pricing model will be cleaned up, and aggressive discounts will be provided. This will also introduce some short-term disruptions to service and training as responsibilities are shuffled. But both IBM and HP will remain focused on large enterprise clients, which is good for those customers who demand a fully-integrated process-driven software testing suite. It’s natural to mesh the security testing features into existing QA and development tools, with IBM and HP uniquely positioned to take advantage of their existing platforms. Their push to dominate the high end of the market leaves huge opportunities for the entire mid-market, which has been prolific in its adoption of web application technologies. The good news is there is plenty of room for Veracode, Coverity, Klocwork, and Parasoft to gear their products to these customers and increase sales. The bad news is that if they don’t already have dynamic testing capabilities, they will need to add them quickly, continue to innovate their way out of HP and IBM’s shadow, and address platform support and ease-of-use issues that remain hurdles for the mid-market. You just cannot get very far if your software requires significant investment in professional services to be effective. As far as acquisition price goes, the rumor mill had the purchase price anywhere from $200 million on the low end to $270 million on the high end. With Fortify’s revenue widely thought to be in the $35-$50M range, that’s a pretty healthy multiple, especially in a buyer’s market. Despite the volatility of Fortify’s revenues, an established presence in enterprise sales makes a strong case that a higher multiple is warranted. Moreover, the sales teams were already collaborating heavily, which likely


Tokenization: Selection Criteria

To wrap up our Understanding and Selecting a Tokenization Solution series, we now focus on the selection criteria. If you are looking at tokenization we can assume you want to reduce the exposure of sensitive data while saving some money by reducing security requirements across your IT operation. While we don’t want to oversimplify the complexity of tokenization, the selection process itself is fairly straightforward. Ultimately there are just a handful of questions you need to address: Does this meet my business requirements? Is it better to use an in-house application or choose a service provider? Which applications need token services, and how hard will they be to set up?

For some of you the selection process is super easy. If you are a small firm dealing with PCI compliance, choose an outsourced token service through your payment processor. It’s likely they already offer the service, and if not they will soon. And the systems you use will probably be easy to match up with external services, especially since you had to buy from the service provider – so they are at least compatible with and approved for its infrastructure. Most small firms simply do not possess the resources and expertise in-house to set up, secure, and manage a token server. Even with the expertise available, choosing a vendor-supplied option is cheaper and removes most of the liability from your end. Using a service from your payment processor is actually a great option for any company that already fully outsources payment systems to its processor, although this tends to be less common for larger organizations. The rest of you have some work to do. Here is our recommended process:

  • Determine Business Requirements: The single biggest consideration is the business problem to resolve. The appropriateness of a solution is predicated on its ability to address your security or compliance requirements. Today this is generally PCI compliance, so fortunately most tokenization servers are designed with PCI in mind. For other data such as medical information, Social Security Numbers, and other forms of PII, there is more variation in vendor support.
  • Map and Fingerprint Your Systems: Identify the systems that store sensitive data – including platform, database, and application configurations – and assess which contain data that needs to be replaced with tokens.
  • Determine Application/System Requirements: Now that you know which platforms you need to support, it’s time to determine your specific integration requirements. This is mostly about your database platform, what languages your application is written in, how you authenticate users, and how distributed your application and data centers are.
  • Define Token Requirements: Look at how data is used by your application and determine whether single-use or multi-use tokens are preferred or required. Can the tokens be formatted to meet the business use defined above? If clear-text access is required in a distributed environment, are encrypted format-preserving tokens suitable?
  • Evaluate Options: At this point you should know your business requirements, understand your particular system and application integration requirements, and have a grasp of your token requirements. This is enough to start evaluating the different options on the market, including services vs. in-house deployment.

It’s all fairly straightforward, and the important part is to determine your business requirements ahead of time, rather than allowing a vendor to steer you toward their particular technology.
Since you will be making changes to applications and databases it only makes sense to have a good understanding of your integration requirements before letting the first salesperson in the door. There are a number of additional secondary considerations for token server selection:

  • Authentication: How will the token server integrate with your identity and access management systems? This is a consideration for external token services as well, but especially important for in-house token databases, as the real PAN data is present. You need to carefully control which users can make token requests and which can request clear text credit card or other information. Make sure your access control systems will integrate with your selection.
  • Security of the Token Server: What features and functions does the token server offer for encryption of its data store, monitoring transactions, securing communications, and request verification? On the other hand, what security functions does the vendor assume you will provide?
  • Scalability: How can you grow the token service with demand?
  • Key Management: Are the encryption and key management services embedded within the token server, or do they depend on external key management services? For tokens based upon encryption of sensitive data, examine how keys are used and managed.
  • Performance: In payment processing, speed has a direct impact on customer and merchant satisfaction. Does the token server offer sufficient performance for responding to new token requests? Does it handle expected and unlikely-but-possible peak loads?
  • Failover: Payment processing applications are intolerant of token server outages. In-house token server failover capabilities require careful review, as do service provider SLAs – be sure to dig into anything you don’t understand. If your organization cannot tolerate downtime, ensure that the service or system you choose accommodates your requirements.
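To ground the token requirements discussion above, here is a minimal Python sketch of one common pattern: a multi-use, format-preserving random token that keeps the PAN’s length and last four digits so downstream reports and customer service screens keep working. This is an illustration of the concept only, not any vendor’s design; the dictionaries stand in for the token vault, which in a real token server is encrypted, access-controlled, and replicated to meet the failover requirements noted above.

```python
import secrets

vault = {}      # token -> PAN (stand-in for the encrypted token data store)
issued = {}     # PAN -> token, so a multi-use token is reused for repeat customers

def tokenize(pan: str) -> str:
    """Return a 16-digit, format-preserving, multi-use token for a PAN."""
    if pan in issued:
        return issued[pan]
    while True:
        middle = "".join(secrets.choice("0123456789") for _ in range(11))
        token = "9" + middle + pan[-4:]   # leading 9 marks it as a token in this sketch
        if token not in vault:            # guarantee uniqueness
            break
    vault[token] = pan
    issued[pan] = token
    return token

def detokenize(token: str) -> str:
    """Only a small, authenticated set of callers should ever reach this path."""
    return vault[token]

t = tokenize("4012888888881881")          # well-known test card number
print(t, detokenize(t) == "4012888888881881")
```

A single-use scheme would skip the issued lookup and mint a fresh token per transaction, at the cost of a much larger vault – exactly the sort of tradeoff the “Define Token Requirements” step is meant to surface.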


Gunnar Peterson Joins Securosis As a Contributing Analyst

We are ridiculously excited to announce that Gunnar Peterson is the newest member of Securosis, joining us as a Contributing Analyst. For those who don’t remember, our Contributor program is our way of getting to work with extremely awesome people without asking them to quit their day jobs (contributors are full members of the team and covered under our existing contracts/NDAs, but aren’t full time). Gunnar joins David Mortman and officially doubles our Contributing Analyst team. Gunnar’s primary coverage areas are identity and access management, large enterprise applications, and application development. Plus anything else he wants, because he’s wicked smart. Gunnar can be reached at gpeterson at securosis.com on top of his existing emails/Skype/etc. And now for the formal bio: Gunnar Peterson is a Managing Principal at Arctec Group. He is focused on distributed systems security for large mission critical financial, financial exchange, healthcare, manufacturer, and insurance systems, as well as emerging start ups. Mr. Peterson is an internationally recognized software security expert, frequently published, an Associate Editor for IEEE Security & Privacy Journal on Building Security In, a contributor to the SEI and DHS Build Security In portal on software security, a Visiting Scientist at Carnegie Mellon Software Engineering Institute, and an in-demand speaker at security conferences. He maintains a popular information security blog at http://1raindrop.typepad.com.


Totally Transparent Research is the embodiment of how we work at Securosis. It’s our core operating philosophy, our research policy, and a specific process. We initially developed it to help maintain objectivity while producing licensed research, but its benefits extend to all aspects of our business.

Going beyond Open Source Research, and a far cry from the traditional syndicated research model, we think it’s the best way to produce independent, objective, quality research.

Here’s how it works:

  • Content is developed ‘live’ on the blog. Primary research is generally released in pieces, as a series of posts, so we can digest and integrate feedback, making the end results much stronger than traditional “ivory tower” research.
  • Comments are enabled for posts. All comments are kept except for spam, personal insults of a clearly inflammatory nature, and completely off-topic content that distracts from the discussion. We welcome comments critical of the work, even if somewhat insulting to the authors. Really.
  • Anyone can comment, and no registration is required. Vendors or consultants with a relevant product or offering must properly identify themselves. While their comments won’t be deleted, the writer/moderator will “call out”, identify, and possibly ridicule vendors who fail to do so.
  • Vendors considering licensing the content are welcome to provide feedback, but it must be posted in the comments - just like everyone else. There is no back channel influence on the research findings or posts.
    Analysts must reply to comments and defend the research position, or agree to modify the content.
  • At the end of the post series, the analyst compiles the posts into a paper, presentation, or other delivery vehicle. Public comments and input factor into the research, where appropriate.
  • If the research is distributed as a paper, significant commenters/contributors are acknowledged in the opening of the report. If they did not post their real names, handles used for comments are listed. Commenters do not retain any rights to the report, but their contributions will be recognized.
  • All primary research will be released under a Creative Commons license. The current license is Non-Commercial, Attribution. The analyst, at their discretion, may add a Derivative Works or Share Alike condition.
  • Securosis primary research does not discuss specific vendors or specific products/offerings, unless used to provide context, contrast, or to make a point (which is very, very rare).
    Although quotes from published primary research (and published primary research only) may be used in press releases, said quotes may never mention a specific vendor, even if the vendor is mentioned in the source report. Securosis must approve any quote to appear in any vendor marketing collateral.
  • Final primary research will be posted on the blog with open comments.
  • Research will be updated periodically to reflect market realities, based on the discretion of the primary analyst. Updated research will be dated and given a version number.
    For research that cannot be developed using this model, such as complex principles or models that are unsuited for a series of blog posts, the content will be chunked up and posted at or before release of the paper to solicit public feedback, and provide an open venue for comments and criticisms.
  • In rare cases Securosis may write papers outside of the primary research agenda, but only if the end result can be non-biased and valuable to the user community to supplement industry-wide efforts or advances. A “Radically Transparent Research” process will be followed in developing these papers, where absolutely all materials are public at all stages of development, including communications (email, call notes).
    Only the free primary research released on our site can be licensed. We will not accept licensing fees on research we charge users to access.
  • All licensed research will be clearly labeled with the licensees. No licensed research will be released without indicating the sources of licensing fees. Again, there will be no back channel influence. We’re open and transparent about our revenue sources.

In essence, we develop all of our research out in the open, and not only seek public comments, but keep those comments indefinitely as a record of the research creation process. If you believe we are biased or not doing our homework, you can call us out on it and it will be there in the record. Our philosophy involves cracking open the research process, and using our readers to eliminate bias and enhance the quality of the work.

On the back end, here’s how we handle this approach with licensees:

  • Licensees may propose paper topics. The topic may be accepted if it is consistent with the Securosis research agenda and goals, but only if it can be covered without bias and will be valuable to the end user community.
  • Analysts produce research according to their own research agendas, and may offer licensing under the same objectivity requirements.
  • The potential licensee will be provided an outline of our research positions and the potential research product so they can determine if it is likely to meet their objectives.
  • Once the licensee agrees, development of the primary research content begins, following the Totally Transparent Research process as outlined above. At this point, there is no money exchanged.
  • Upon completion of the paper, the licensee will receive a release candidate to determine whether the final result still meets their needs.
  • If the content does not meet their needs, the licensee is not required to pay, and the research will be released without licensing or with alternate licensees.
  • Licensees may host and reuse the content for the length of the license (typically one year). This includes placing the content behind a registration process, posting on white paper networks, or translation into other languages. The research will always be hosted at Securosis for free without registration.

Here is the language we currently place in our research project agreements:

Content will be created independently of LICENSEE with no obligations for payment. Once content is complete, LICENSEE will have a 3 day review period to determine if the content meets corporate objectives. If the content is unsuitable, LICENSEE will not be obligated for any payment and Securosis is free to distribute the whitepaper without branding or with alternate licensees, and will not complete any associated webcasts for the declining LICENSEE. Content licensing, webcasts and payment are contingent on the content being acceptable to LICENSEE. This maintains objectivity while limiting the risk to LICENSEE. Securosis maintains all rights to the content and to include Securosis branding in addition to any licensee branding.

Even this process itself is open to criticism. If you have questions or comments, you can email us or comment on the blog.