Thursday, June 04, 2009

Hackers 1, Marketing 0

By Rich

You ever watch a movie or TV show where you already know the ending, but you keep watching in suspense to find out how it actually happens?

That’s how I felt when I read this:

Break Into My Email Account and Win $10,000

StrongWebmail.com is offering $10,000 to the first person that breaks into our CEO’s email account…and to make things easier, we’re giving you his username and password.

No surprise, it only took a few days for this story to break:

On Thursday, a group of security researchers claimed to have won the contest, which challenged hackers to break into the Web mail account of StrongWebmail CEO Darren Berkovitz and report back details from his June 26 calendar entry.

The hackers, led by Secure Science Chief Scientist Lance James and security researchers Aviv Raff and Mike Bailey, provided details from Berkovitz’s calendar to IDG News Service. In an interview, Berkovitz confirmed those details were from his account.

Reading deeper, they say it was a cross-site scripting attack.

However, Berkovitz could not confirm that the hackers had actually won the prize. He said he would need to check to confirm that the hackers had abided by the contest rules, adding, “if someone did it, we’ll kind of put our heads down.”

Silly rules- this is good enough for me.


(Thanks to @jeremiahg for the pointer).

–Rich

Introduction To Database Encryption - The Reboot!

By Adrian Lane

Updated June 4th to reflect terminology change.

This is the Re-Introduction to our Database Encryption series. Why are we re-introducing this series? I’m glad you asked. The more we worked on the separation of duties and key management sections, the more dissatisfied we became. Rich and I got some really good feedback from vendors and end users, and we felt we were missing the mark with this series. It’s not just that the stuff I drafted when I was sick completely lacked clarity of thought; there are three specific reasons we were unhappy. The advice we were giving was not particularly pragmatic, the terminology we thought worked didn’t, and we were doing a poor job of aligning end-user goals with available options. So yeah, this is an apology to our audience, as the series was not up to our expectations and we failed to achieve some of our own Totally Transparent Research concepts. But we’re ‘fessing up to the problem and starting from scratch.

So we want to fix these things in two ways. First, we want to change some of the terminology we have been using to describe database encryption. Using ‘media encryption’ and ‘separation of duties’ is confusing the issues, and we want to differentiate between the threat we are trying to protect against and what is being encrypted. And since we are talking to IT, developers, DBAs, and other audiences, we want to reduce confusion as much as possible. Second, we will create a simple guide for people to select a database encryption strategy that addresses their goals. Basically we are going to outline a decision tree of user requirements and map those to the available database encryption choices. Rich and I think that will help end users both clarify their goals and determine the correct implementation strategy.

In our original introduction we provided a clear idea of where we wanted to go with this series, but we did adopt our own terminology in order to better encapsulate the database encryption options vendors provide. We chose “Encryption for Separation of Duties” and “Encryption for Media Protection”. This is a bit of an oversimplification, and mapped to the threat rather than to the feature. Plus, if you asked your RDBMS vendor for ‘media encryption’, they would not know what the heck you were talking about. We are going to change the terminology back to the following:

  1. Database Transparent/External Encryption: Encryption of the entire database. This is provided by native encryption functions within the database. The goal is to prevent exposure of information due to loss of the physical media. This can also be done through drive or OS/file system encryption, although they lack some of the protections of native database encryption. The encryption is invisible to the application and does not require alterations to the code or schema.

  2. Data User Encryption: Encrypting specific columns, tables, or even data elements in the database. The classic example is credit card numbers. The goal is to provide protection against inadvertent disclosure, or to enforce separation of duties. How this is accomplished will depend upon how key management and (internal or external) encryption services are utilized; it will affect the way the application uses the database, but it provides more granular access control (see the sketch below).
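As a rough illustration of why this second option touches the application, here is a minimal sketch of field-level encryption performed in application code before the value ever reaches the database column. The Python `cryptography` package and the in-process key are simplifying assumptions for the example; in a real deployment the key would come from an external key manager, not sit next to the data.

```python
# pip install cryptography  -- illustrative only; any symmetric cipher with real key management works
from cryptography.fernet import Fernet

# Assumption: in production this key is fetched from a key management service, never hard-coded.
key = Fernet.generate_key()
cipher = Fernet(key)

def encrypt_card_number(card_number: str) -> bytes:
    """Encrypt a credit card number before it is written to its database column."""
    return cipher.encrypt(card_number.encode())

def decrypt_card_number(ciphertext: bytes) -> str:
    """Decrypt only for users or services entitled to see the cleartext value."""
    return cipher.decrypt(ciphertext).decode()

stored_value = encrypt_card_number("4111111111111111")  # this ciphertext is what lands in the table
print(decrypt_card_number(stored_value))                # reading it back requires the key, not just SELECT rights
```

The point of the sketch is simply that the application, not the database, now decides who can read the column, which is why this option brings code changes and key management along with the extra granularity.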

While we’re confident we’ve described the two options accurately, we’re not convinced the specific terms “database encryption” and “data encryption” are necessarily the best, so please suggest any better options.

Blanket encryption of all database content for media protection is much easier than encrypting specific columns & tables for separation of duties, but it doesn’t offer the same security benefits. Knowing which to choose will depend upon three things:

  • What do you want to protect?
  • What do you want to protect it from?
  • What application changes and management tasks will you tolerate?

Thus, the first thing we need to decide when looking at database encryption is what we are trying to protect and why. If we’re just going after the ‘PCI checkbox’ or are worried about losing data from swapping out hard drives, someone stealing the files off the server, or misplacing backup tapes, then database encryption (for media protection) is our answer. If the goal is to protect data in the event of compromised accounts, rogue DBAs, or inadvertent disclosure, then things get a lot more complicated. We will go into the details of ‘why’ and ‘how’ in a future post, as well as the issues of application alterations, after we have introduced the decision tree overview. If you have any comments, good, bad, or indifferent, please share. As always, we want the discussion to be as open as possible.
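To make that decision concrete, here is a minimal sketch, under our own simplified assumptions, of how the three questions above might map onto the two options. It is a thinking aid, not output from any vendor tool and not the full decision tree we will build out later.

```python
from dataclasses import dataclass

@dataclass
class EncryptionGoals:
    protect_against_media_loss: bool   # lost drives, stolen files, misplaced backup tapes
    protect_against_insiders: bool     # compromised accounts, rogue DBAs, inadvertent disclosure
    tolerate_app_changes: bool         # willing to alter code/schema and manage keys

def choose_strategy(goals: EncryptionGoals) -> str:
    """Map the three questions from the post onto the two database encryption options."""
    if goals.protect_against_insiders:
        if goals.tolerate_app_changes:
            return "Data (user) encryption: encrypt specific columns/tables with separate key access."
        return ("Data (user) encryption is indicated, but expect application changes and "
                "external key management before it really enforces separation of duties.")
    if goals.protect_against_media_loss:
        return "Transparent/external encryption: encrypt the whole database or its underlying files."
    return "Encryption may not be the right control here; revisit access controls first."

# The 'PCI checkbox' / lost backup tape case from the post:
print(choose_strategy(EncryptionGoals(True, False, False)))
```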

–Adrian Lane

Wednesday, June 03, 2009

Five Ways Apple Can Improve Their Security Program

By Rich

This is an article I’ve been thinking about for a long time. Sure, we security folks seem to love to bash Apple, but I thought it would be interesting to take a more constructive approach.

From the TidBITS article:

With the impending release of the next versions of both Mac OS X and the iPhone operating system, it seems a good time to evaluate how Apple could improve their security program. Rather than focusing on narrow issues of specific vulnerabilities or incidents, or offering mere criticism, I humbly present a few suggestions on how Apple can become a leader in consumer computing security over the long haul.

The short version of the suggestions is:

  • Appoint and empower a CSO
  • Adopt a secure software development program
  • Establish a security response team
  • Manage vulnerabilities in included third party software
  • Complete the implementation of anti-exploitation technologies

–Rich

Join the Open Patch Management Survey—Project Quant

By Rich

Are you tired of all those BS vendor surveys designed to sell products, which never even show you the raw data?

Yeah, us too.

Today we’re taking the next big step for Project Quant by launching an open survey on patch management. Our goal here is to gain an understanding of what people are really doing with regards to patch management, to better align the metrics model with real practices.

We’re doing something different with this survey. All the results will be made public. We don’t mean just the summary results, but the raw data (minus any private or identifiable information that could reveal the source person or organization). Once we hit 100 responses we will release the data in spreadsheet formats. Then, either every week or for every 100 additional responses, we will release updated data. We don’t plan on closing this for quite some time, but as with most surveys we expect an initial rush of responses and want to get the data out there quickly. As with all our material, the results will be licensed under Creative Commons.

We will, of course, provide our own analysis, but we think it’s important for everyone to be able to evaluate the results for themselves.

All questions are optional, but the more you complete the more accurate the results will be. In two spots we ask if you are open for a direct interview, which we will start scheduling right away. Please spread the word far and wide, since the more responses we collect, the more useful the results.

If you fill out the survey as a result of reading this post please use SECUROSISBLOG as the registration code (helps us figure out what channels are working best). If you came to this post via twitter, use TWITTER as the reg code. This won’t affect the results, but we think it might be interesting to track how people found the survey, and which social media channels are more effective.

Thanks for participating, and click here to fill it out.

(This is a SurveyMonkey survey, so we can’t disable the JavaScript like we do for everything here on the main site. Sorry).

–Rich

Boaz Nails It- The Encryption Dilemma

By Rich

Boaz Gelbord wrote a thoughtful response (as did Mike Andrews) to my post earlier this week on the state of web application and data security. In it was one key tidbit on encryption:


The truth is that you just don’t mitigate that much risk by encrypting files at rest in a reasonably secure environment. Of course if a random account or service is compromised on a server, having those database files encrypted would sure come in handy. But for your database or file folder encryption to actually save you from anything, some other control needs to fail.

I wouldn’t say this is always true, but it’s generally true. In fact, this situation was the inspiration behind the Three Laws of Data Encryption I wrote a few years ago. The thing is, access controls work really freaking well, and the only reason to use encryption instead of them is if the data is moving, or you need to somehow restrict the data with greater granularity than is possible with access controls. For most systems, this is to protect data from administrators, since you can manage everyone else with access controls.

Also keep in mind that many current data encryption systems tie directly to the user’s authentication, and thus are just as prone to compromised user accounts as are access controls.

Again, not true in all cases, but true in many. The first step in encryption is to know what threat you are protecting against, and whether other controls would be just as effective. Seriously, we toss encryption around as the answer all the time, without knowing what the question is.

(My favorite question/answer? Me: Why are you encrypting? Them: To protect against hackers. Me: Um. Cool. You have a bathroom anywhere?)

–Rich

Piracy Fighting Dog FUD

By Adrian Lane

OK, I have to call Bull$%} on this: Anti-piracy pup sniffs out 35,000 illegal DVDs. A piracy fighting dog. Really. From Yahoo! News:

The black Labrador helped enforcement officials who carried out raids last week in southern Johor state which neighbours Singapore, the Motion Picture Association (MPA) said in a statement. Paddy was given to Malaysia by the MPA to help close down piracy syndicates who churn out vast quantities of illegal DVDs. The dog is specially trained to detect chemicals in the discs.

So the dog can detect chemicals used in DVDs. Call me a cynic, but I suspect that ‘Paddy’ cannot tell the difference between Best Buy, an adult video store, and an underground DVD warehouse. So unless someone has figured out how to install laser diodes and detection software onto a Labrador, it’s not happening. Of course, when they do, the pirates will be forced to escalate the confrontation with the unstoppable “Fuzzy, bouncy, piracy tennis ball of mayhem”. Seriously, this is an illustration of the huge difference between marketing security and actual security. It looks to me like someone is trying to create the MPA version of Sexual Harassment Panda, and it’s just wrong!

–Adrian Lane

Tuesday, June 02, 2009

Macworld Security Article Up- The Truth About Apple Security

By Rich

Right when the Macalope was sending along his take on the recent ComputerWorld editorial calling for the FTC to investigate Apple, Macworld asked me to write a more somber take. Here’s an excerpt:

On May 26, Macworld republished a controversial Computerworld article by Ira Winkler suggesting that Apple is “grossly negligent” when it comes to security, and should be investigated by the Federal Trade Commission for false advertising. The author was motivated to write this piece based on Apple’s recent failure to patch a known Java security flaw that was fixed on other platforms nearly six months ago. While the article raises some legitimate issues, it’s filled with hyperbole, inaccurate interpretations, and reaches the wrong conclusions. Here’s what you really need to know about the Java situation, Mac security in general, and the important lesson on how we control Apple’s approach to security.

The real failure of this, and many other, calls for Mac security is that they fail to accurately identify those who are really responsible for Apple’s current security situation. It isn’t security researchers, malicious attackers, or even Apple itself, but Apple’s customers. Apple is an incredibly successful company because it produces products that people purchase. We still buy MacBooks despite the lack of a matte screen, for example. And until we tell Apple that security will affect our buying decisions, there’s little motivation for the company to change direction. Think of it from Apple’s perspective—Macs may be inherently less secure, but they are safer than the competition in the real world, and users aren’t reducing what they spend on Apple because of security problems. There is reasonable coverage of Mac security issues in the mainstream press (Mr. Winkler’s claim to the contrary), but without demonstrable losses it has yet to affect consumer behavior.

Don’t worry- I rip into Apple for their totally irresponsible handling of the Java flaw, but there really isn’t much motivation for Apple to make any major changes to how they handle things, as bad as they often are.

–Rich

How Market Forces Can Fix PCI

By Rich

It’s no secret that I haven’t always been the biggest fan of PCI (the Payment Card Industry Data Security Standard). I believe that rather than blowing massive amounts of cash trying to lock down an inherently insecure system, we should look at building a more fundamentally secure way of performing payment transactions. Not that I think anything is ever perfectly secure, but there is a heck of a lot of room for progress, and our current focus has absolutely no chance of doing more than slightly staving off the inevitable. It’s like a turtle trying to outrun the truck that’s about to crush it- the turtle might buy itself an extra microsecond or two, but the outcome won’t change.


That said, I’ve also (fairly recently, and due in no small part to debates with Martin McKeay) come to believe that as flawed as PCI is, it’s also the single biggest driver of security improvements throughout the industry. It’s the primary force helping security pros gain access to executive management; even more so than any other regulation out there. And while I’d really like to see us focus more on a secure transaction ecosystem, until that wonderful day I recognize we need to live with what we have, and use it to the best of our ability.

Rather than merely adding new controls to the PCI standard, I think the best way to do this is to fix some of the inherent problems with how the program is currently set up. If you’ve ever been involved with other types of auditing, PCI looks totally bass ackwards. The program itself isn’t designed to produce the most effective results. Here are a few changes that I think could make material improvements to PCI, possibly doubling the number of microseconds we have until we’re a steaming mass of roadkill:

  1. Eliminate conflicts of interest by forbidding assessors from offering security tools/services to their clients: This is one of the single biggest flaws in PCI- assessors may also offer security services to their clients. This is a massive conflict of interest that’s illegal in financial audits due to all the problems it creates. It will royally piss off a few of the companies involved, but this change has to happen.
  2. Decertify QSAs (assessors) that certify non-compliant companies: Although the PCI council has a few people on their probation list, despite failure after failure we haven’t seen anyone penalized. Without these penalties, and a few sacrificial lambs, QSAs are not accountable for their actions.
  3. Eliminate the practice of “shopping” for the cheapest/easiest QSA: Right now, if a company isn’t happy with their PCI assessment results, they can fire their QSA and go someplace else. Let’s steal a rule from the financial auditing playbook (not that that system is perfect) and force companies to disclose when they change assessors, and why.

There are other actions we could take, such as publicly disclosing all certified companies on a quarterly basis or requiring unscheduled spot-checks, but I don’t think we necessarily need those if we introduce more accountability into the system, and reduce conflicts of interest.

Another very interesting development comes via @treyford on twitter, pointing to a blog post by James DeLuccia and an article over at Wired. It seems that the auditor for CardSystems (you know, one of the first big breaches from back in 2005) is being sued for improperly certifying the company. If the courts step in and establish a negligence standard for security audits, the world might suddenly become mighty interesting. But as interesting (and hopefully motivating, at least to the auditors/assessors) as this is, I think we should focus on what we can do with PCI today to allow market forces to drive security and program improvements.

–Rich

Monday, June 01, 2009

The State of Web Application and Data Security—Mid 2009

By Rich

One of the more difficult aspects of the analyst gig is sorting through all the information you get, and isolating out any inherent biases. The kinds of inquiries we get from clients can all too easily skew our perceptions of the industry, since people tend to come to us for specific reasons, and those reasons don’t necessarily represent the mean of the industry. Aside from all the vendor updates (and customer references), our end user conversations usually involve helping someone with a specific problem – ranging from vendor selection, to basic technology education, to strategy development/problem solving. People call us when they need help, not when things are running well, so it’s all too easy to assume a particular technology is being used more widely than it really is, or a problem is bigger or smaller than it really is, because everyone calling us is asking about it. Countering this takes a lot of outreach to find out what people are really doing even when they aren’t calling us.

Over the past few weeks I’ve had a series of opportunities to work with end users outside the context of normal inbound inquiries, and it’s been fairly enlightening. These included direct client calls, executive roundtables such as one I participated in recently with IANS (with a mix from Fortune 50 to mid-size enterprises), and some outreach on our part. They reinforced some of what we’ve been thinking, while breaking other assumptions. I thought it would be good to compile these together into a “state of the industry” summary. Since I spend most of my time focused on web application and data security, I’ll only cover those areas:


When it comes to web application and data security, if there isn’t a compliance requirement, there isn’t budget – Nearly all of the security professionals we’ve spoken with recognize the importance of web application and data security, but they consistently tell us that unless there is a compliance requirement it’s very difficult for them to get budget. That’s not to say it’s impossible, but non-compliance projects (however important) are way down the priority list in most organizations. In a room of a dozen high-level security managers of (mostly) large enterprises, they all reinforced that compliance drove nearly all of their new projects, and there was little support for non-compliance-related web application or data security initiatives. I doubt this surprises any of you.

“Compliance” may mean more than compliance – Activities that are positioned as helping with compliance, even if they aren’t a direct requirement, are more likely to gain funding. This is especially true for projects that could reduce compliance costs. They will have a longer approval cycle, often 9 months or so, compared to the 3-6 months for directly-required compliance activities. Initiatives directly tied to limiting potential data breach notifications are the most cited driver. Two technology examples are full disk encryption and portable device control.

PCI is the single biggest compliance driver for web application and data security – I may not be thrilled with PCI, but it’s driving more web application and data security improvements than anything else.

The term Data Loss Prevention has lost meaning – I discussed this in a post last week. Even those who have gone through a DLP tool selection process often use the term to encompass more than the narrow definition we prefer.

It’s easier to get resources to do some things manually than to buy a tool – Although tools would be much more efficient and effective for some projects, in terms of costs and results, manual projects using existing resources are easier to get approval for. As one manager put it, “I already have the bodies, and I won’t get any more money for new tools.” The most common example cited was content discovery (we’ll talk more about this a few points down).

Most people use DLP for network (primarily email) monitoring, not content discovery or endpoint protection – Even though we tend to think discovery offers equal or greater value, most organizations with DLP use it for network monitoring.

Interest in content discovery, especially DLP-based, is high, but resources are hard to get for discovery projects – Most security managers I talk with are very interested in content discovery, but they are less educated on the options and don’t have the resources. They tell me that finding the data is the easy part – getting resources to do anything about it is the limiting factor.

The Web Application Firewall (WAF) market and Security Source Code Tools markets are nearly equal in size, with more clients on WAFs, and more money spent on source code tools per client – While it’s hard to fully quantify, we think the source code tools cost more per implementation, but WAFs are in slightly wider use.

WAFs are a quicker hit for PCI compliance – Most organizations deploying WAFs do so for PCI compliance, and they’re seen as a quicker fix than secure source code projects.

Most WAF deployments are out of band, and false positives are a major problem for default deployments – Customers are installing WAFs for compliance, but are generally unable to deploy them inline (initially) due to the tuning requirements.

Full drive encryption is mature, and well deployed in the early mainstream – Full drive encryption, while not perfect, is deployable in even large enterprises. It’s now considered a level-setting best practice in financial services, and usage is growing in healthcare and insurance. Other asset recovery options, such as remote data destruction and phone home applications, are now seen as little more than snake oil. As one CISO told us, “I don’t care about the laptop, we just encrypt it and don’t worry about it when it goes missing”.

File and folder encryption is not in wide use – Very few organizations are performing any wide scale file/folder encryption, outside of some targeted encryption of PII for compliance requirements.

Database encryption is hard, and not widely used – Most organizations are dissatisfied with database encryption options, and do not deploy it widely. Within a large organization there is likely some DB encryption, with preference given to file/folder/media protection over column level encryption, but most organizations prefer to avoid it. Performance and key management are cited as the primary obstacles, even when using native tools. Current versions of database encryption (primarily native encryption) do perform better than older versions, but key management is still unsatisfactory. Large encryption projects, when initiated, take an average of 12-18 months.

Large enterprises prefer application-level encryption of credit card numbers, and tokenization – When it comes to credit card numbers, security managers prefer to encrypt them at the application level, or to consolidate the numbers into a central source, using representative “tokens” throughout the rest of the application stack. These projects take a minimum of 12-18 months, similar to database encryption projects (the two are often tied together, with encryption used in the source database).
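For readers unfamiliar with the tokenization approach described above, here is a minimal sketch of the idea: the real card number lives only in a central vault, and the rest of the application stack stores an opaque token. Everything here (the in-memory vault, the token format) is a simplified assumption for illustration, not a description of any particular product.

```python
import secrets

class TokenVault:
    """Toy central vault: maps opaque tokens to real card numbers (in-memory for illustration only)."""
    def __init__(self):
        self._store = {}

    def tokenize(self, card_number: str) -> str:
        # Keeping the last four digits visible in the token is a common convention for display purposes.
        token = f"tok_{secrets.token_hex(8)}_{card_number[-4:]}"
        self._store[token] = card_number
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault (a tightly controlled service) can reverse the mapping.
        return self._store[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(token)                    # the order/CRM/analytics databases store only this token
print(vault.detokenize(token))  # settlement, which is entitled to the real number, asks the vault
```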

Email encryption and DRM tend to be workgroup-specific deployments – Email encryption and DRM use is scattered throughout the industry, but is still generally limited to workgroup-level projects due to the complexity of management, or lack of demand/compliance from users.

Database Activity Monitoring usage continues to grow slowly, mostly for compliance, but not quickly enough to save lagging vendors – Many DAM deployments are still tied to SOX auditing, and it’s not as widely used for other data security initiatives. Performance is reasonable when you can use endpoint agents, which some DBAs still resist. Network monitoring is not seen as effective, but may still be used when local monitoring isn’t an option. Network requirements, depending on the tool, may also inhibit deployments.

My main takeaway is that security managers know what they need to do to protect information assets, but they lack the time, resources, and management support for many initiatives. There is also broad dissatisfaction with security tools and vendors in general, in large part due to poor expectation setting during the sales process, and deliberately confusing marketing. It’s not that the tools don’t work, but that they’re never quite as easy as promised.

It’s an interesting dilemma, since there is clear and broad recognition that data security (and by extension, web application security) is likely our most pressing overall issue in terms of security, but due to a variety of factors (many of which we covered in our Business Justification for Data Security paper), the resources just aren’t there to really tackle it head-on.

–Rich

Friday, May 29, 2009

Database Security Mass-Market Update and Friday Summary - May 29, 2009

By Adrian Lane

I ran across a lot of little tidbits in the world of database security this week, so I figured I would share this for the Friday Summary:

Idera has been making a lot of noise this week with seemingly two dozen TechTarget ‘KnowledgeAlerts’ hitting my inbox. Yes, they are still around, but it’s hard to consider them a database security vendor. Customers mostly know them as a DB tools vendor, but they do additionally offer backup encryption, a form of activity monitoring, and what I call “permission mapping” solutions. Not a comprehensive security suite, but handy tools. They really only support the SQL Server platform, but they do in fact offer security products, so bad on me for thinking they were dead and buried. I may not hear about them very often, but the one or two customers I hear from seem to be happy, and that’s what counts. And it’s a challenge to put security tools into the hands of DBAs and non-security personnel and make them happy.

And speaking of “I thought they were dead”, NGS Software entered into a partnership with Secerno recently. NGS has always been incredibly database security savvy but product-deficient, focusing more on their professional services capabilities than on product development. It shows. Secerno is a small DAM firm with a novel approach to detecting anomalous queries. I would like to see them able to compete on an even footing to demonstrate what they can do, as they need more proof points and customer successes to prove how this technology performs in the real world. To do that they are going to need to offer the assessment capability or they will get relegated to the sidelines as a ‘feature’ and not a database security solution. Secerno is too small and probably does not want to sink the time and money required to develop a meaningful body of assessment policies, so being able to leverage the NGS team and their products will help with preventative security measures. Ideally Secerno will put an updated face on the ‘Squirrel’, and leverage the expanded body of policies, but better to have the capability for now and improve later. I have said it before and I will say it again: any customer needs to have assessment to baseline database configurations, and monitoring to enforce policy and detect threats. The compliance buyers demand it, and that’s your buying center in this market. I am eager to see what this UK tag team can do.

LogLogic announced their database security intentions a little while back, but shipped their Database Security Manager this week. This is not a scruffy startup entering the database security arena, but a successful and polished firm with an established customer base. Granted, we have seen similar attempts botched, but this is the addition of a more complementary technology with a much better understanding of the customer buying requirements. LogLogic is touting the ability to perform privileged user monitoring, and that this is fully integrated with their existing audit log collection and analysis. But everyone they will be competing with has something similar, so that’s not very interesting. What is significant to me is a log management vendor providing the near-real-time monitoring and event blocking capabilities that need to be present to take a security product seriously. Additionally, it is done in a way that will address console and privileged users, which is necessary for compliance. The speed of the integration implies that the product architecture is conducive to both, and if you have ever tried implementing a solution of this type you understand that it is difficult because the two functions offer diametrically opposed technical challenges in data storage and processing. Keep in mind that they just acquired Exaprotect to accomplish similar goals for SEM, so I expect we will see that integration happen soon as well. Now let’s see if their customers find it compelling.

Thanks to one of our readers for the heads-up on this one: The Netezza Corporation Investor relations transcript. Interesting details coming out of their end-of-quarter investor call. Turns out that the $3M acquisition price I quoted was slightly off, and the real total was slightly higher at $3.1 million. Given Netezza’s nominal head-count increase since January 1, 2009 (9 people), it looks as if they kept just a handful of the Tizor staff. What shocked me is that they are being credited with 23 customers – less than half the number of customers I thought they had. I am not sure what their average deal size was, but I am willing to bet it was sub-$200k, so revenues must have been very small. This deal was better for their investors than I realized.

Lumigent continues to thrive as the contra-database-security platform. While I find most things GRC to be little more than marketing doublespeak, Lumigent has done a good job at locating and mining their ‘AppGRC’ niche. It’s not my intention to marginalize what they provide because there is customer need, there has been for some time, and the platform is suitable for the task. It is interesting that none of their (former?) competitors had success with that marketing angle and reverted to security and compliance messages, but Lumigent is making it work. The segment needs to move up from generic database security to business policy analysis and enforcement, but the ‘what’ and how to get there are not always clear. I confess I think it funny that for most of their articles such as this one, I could substitute “database security” for ‘AppGRC’ and they would still work. Does the need to move beyond reliance on DBA scripts to a more comprehensive assessment and audit platform with separation of duties sound like DB security? You bet it does. It goes to show that messaging & positioning is an art form. So bravo on the re-branding, appropriate new partnerships and intense focus they have on GRC buyers in the back-office application space.

And now for the week in review:

Webcasts, Podcasts, Outside Writing, and Conferences

Favorite Securosis Posts

Favorite Outside Posts

Top News and Posts

Blog Comment of the Week

This week’s best comment was by Daniel in response to the Macalope’s The Government Must Save Our Children from Apple! guest post:

Would a better analogy be a small town advertising that the chance of your house getting broken into is far lower if you move to small town X than if you live in big city Y? Yes, big city Y has far better policing, and spends far more money on anti-burglary patrols, and yes, the night police force in small town X was just caught sleeping on the job, and yes, there isn’t even a local burglar alarm monitoring company in small town X–but despite the far higher security in big city Y, one is still far less likely to experience a break-in in small town X, and this isn’t an irresponsible claim on the part of those encouraging people to move to small town X?

–Adrian Lane

Thursday, May 28, 2009

Sarbanes-Oxley Is Here to Stay

By Adrian Lane

This is an off-topic post. It has a bit to do with Compliance, but nothing to do with Security, so read no further if you are offended by such things.

I am surprised that we have not been reading more about the off balance sheet ‘assets’ that were brought to light last week. In a nutshell, over $900 billion in ‘assets’, spread across the 19 largest US banks, was not part of the normal 10K/10Q reporting, and the SEC is telling banks they need to be brought back onto the balance sheets. This is an issue because these ‘assets’ are mostly composed of real estate and credit card debt owed to the banks.

The change could result in about $900 billion in assets being brought onto the balance sheets of the 19 largest U.S. banks, according to federal regulators. The information was provided by Citigroup Inc., JPMorgan Chase & Co. and 17 other institutions during the government’s recent “stress tests,” which were designed to determine which banks would need more capital if the economy worsened. … In general, companies transfer assets from balance sheets to special purpose entities to insulate themselves from risk or to finance a large project.

Given the accelerating rate at which credit card debt is going bad, and the fact that real estate values in states like Arizona have dropped as much as 70% since 2006, it’s likely we are looking at the majority of these ‘assets’ simply vanishing. Across the board, 12% of all homeowners are behind in payments or in foreclosure, and the remaining assets are worth far less than they were originally. It was ironic that I ran across an article about the need to repeal the Sarbanes-Oxley Act of 2002 on the very morning I saw this news item. There has been a methodical drumbeat for several years now about the need to repeal SOX, saying it makes it harder to fill out company boards of directors, going so far as to claim the reversal could help stimulate the economy. Of course corporate executives never liked SOX, as there were additional costs associated with keeping accurate records, and it’s hard to balance the perception of financial performance with the potential for jail time as a consequence of rule violations. The scandals at Worldcom, Enron, Tyco, and others prompted this regulation to ensure accuracy and completeness in financial reporting, which might enable us to avoid another similar fiasco. But we find ourselves in the same place we did in 2001, where many companies are in worse financial shape than was readily apparent – many of the same firms requesting more money from the government (taxpayer) to stay afloat.

Section 302 was specifically about controls and procedures to ensure that financial statements are accurate, and it looks to me like moving hundreds of billions of dollars in high risk real estate & credit card loans “off balance sheet” would violate the spirit of the act. I would have thought that given the current economic situation, and with the motivating events for Sarbanes-Oxley still in recent memory, there would be greater outcry, but maybe people are just worried about keeping the roofs over their heads. But the call will come for additional regulation and controls over financial systems as more banks fail. Clearly there needs to be refinement and augmentation to the PCAOB guidelines on several accounting practices, but to what degree will not be determined for a long time.

Will this mean new business for vendors who collect data and enforce policies in and around SOX? Nope. Instead it will underscore the core value that they cannot provide. Security and Compliance vendors who offer help with SOX policy enforcement cannot analyze a balance sheet. While there were a couple notable examples where internal auditors monitored accounting and database systems to show fraud, this is not a skill you can bottle up for sale. Collection of the raw data and simple policy enforcement can be provided, but there is no way any product vendors could have assisted in detecting the shuffling of balance sheet assets. Still, I bet we will see it in someone’s marketing collateral come RSA 2010!

–Adrian Lane

The Government Must Save Our Children from Apple!

By Macalope

Editor’s Note: This morning I awoke in my well-secured hotel room to find a sticky note on my laptop that said, “The Securosis site is now under my control. Do not attempt to remove me or you will suffer my wrath. Best regards, The Macalope.”

ComputerWorld has published an interesting opinion piece from Ira Winkler entitled “Man selling book writes incendiary Mac troll bait”.

Oh, wait, that’s not the title! Ha-ha! That would be silly! What with it being so overly frank.

No, the title is “It’s time for the FTC to investigate Mac security”.

You might be confused about the clumsy phrasing because the FTC, of course, doesn’t investigate computer security; it investigates the veracity of advertising claims. What Winkler believes the FTC should investigate is whether Apple is violating trade laws by claiming in its commercials that Macs are less affected by viruses than Windows.

Apple gives people the false impression that they don’t have to worry about security if they use a Mac.

Really? The ads don’t say Macs are invulnerable. They say that Macs don’t have the same problem with exploits that Windows has. And it’s been the Macalope’s experience that people get that. The switchers he’s come into contact with seem to know exactly the score: more people use Windows so malicious coders have, to date, almost exclusively targeted Windows.

Some people – many of them security professionals like Winkler – find this simple fact unfair. Sadly, life isn’t fair.

Well, “sadly” for Windows users. Not so much for Mac users. We’re kind of enjoying it.

And perhaps because the company is invested in fostering that impression, Apple is grossly negligent in fixing problems. The proof-of-concept code in this case is proof that Apple has not provided a fix for a vulnerability that was identified six months ago. There is no excuse for that.

On this point, the Macalope and Winkler are in agreement. There is no excuse for that. The horny one thinks the company has been too lax on implementing a serious security policy and was one of many Mac bloggers to take the company to task for laughing off shipping infected iPods. He’s hopeful the recent hire of security architect Ivan Krstic signals a new era for the company.

But let’s get back to Winkler’s call for an FTC investigation. Because that’s funnier.

The current Mac commercials specifically imply that Windows PCs are vulnerable to viruses and Macs are not.

Actually, no. What they say is that Windows PCs are plagued by viruses and Macs are not.

I can’t disagree that PCs are frequent victims of viruses and other attacks…

Ah, so we agree!

…but so are Macs.

Oops, no we don’t.

The Macalope would really love to have seen a citation here because it would have been hilarious.

In fact, the first viruses targeted Macs.

So “frequent” in terms of the Mac here is more on a geologic time scale. Got it.

Apple itself recommended in December 2008 that users buy antivirus software. It quickly recanted that statement, though, presumably for marketing purposes.

OK, let’s set the story straight here because Winkler’s version reads like something from alt.microsoft.fanfic.net. The document in question was a minor technical note created in June of 2007 that got updated in December. The company did not “recant” the statement, it pulled the note after it got picked up by the BBC, the Washington Post and CNet as some kind of shocking double-faced technology industry scandal.

By the way, did you know that Apple also markets Macs as easier to use, yet continues to sell books on how to use Macs in its stores? It’s true! But if it’s so easy to use, why all the books, Apple? Why? All? The? Books?

A ZDNet summary of 2007 vulnerabilities showed that there were five times more vulnerabilities for Mac OS than for all types of Windows PC operating systems.

No citation, but the Macalope knows what he’s talking about. He’s talking about this summary by George Ou. George loved to drag these stats out because they always made Apple look worse than Microsoft. But he neglected to mention the many problems with this comparison, most importantly that Secunia, the source of the data, expressly counseled against using it to compare the relative security of the products listed because they’re tracked differently.

But buy Winkler’s book! The Macalope’s sure the rigor of the research in them is better than in this piece!

How can Apple get away with this blatant disregard for security?

How can Computerworld get away with printing unsourced accusations that were debunked a year and a half ago?

Its advertising claims seem comparable to an automobile manufacturer implying that its cars are completely safe and its competitors’ cars are death traps, when we all know that all cars are inherently unsafe.

That’s a really lousy analogy. But to work with it, it’s not that Apple’s saying its car is safer, it’s saying the roads in Macland are safer. Get out of that heavy city traffic and into the countryside.

The mainstream press really doesn’t cover Mac vulnerabilities…

The real mainstream press doesn’t cover vulnerabilities for any operating system. It covers attacks (even lame Mac attacks). The technology press, on the other hand, loves to cover Mac vulnerabilities, despite Winkler’s claim to the contrary, even though exploits of those vulnerabilities have never amounted to much.

When I made a TV appearance to talk about the Conficker worm, I mentioned that there were five new Mac vulnerabilities announced the day before. Several people e-mailed the station to say that I was lying, since they had never heard of Macs having any problems. (By the way, the technical press isn’t much better in covering Mac vulnerabilities.)

So, let’s get this straight. Winkler gets on TV and talks up Mac vulnerabilities in a segment about a Windows attack. But because he got five mean emails, the story we’re supposed to get is about how the coverage is all pro-Apple? Were the five emails from TV news anchors or something?

And just to be clear, it is not that Apple’s software has security vulnerabilities that is the problem; all commercial software does. The problem is that Apple is grossly misleading people to believe otherwise.

Wow, there is an awful lot of loose talk about how badly Apple is misleading the public with its wild claims. It’s somewhat surprising that Winkler doesn’t get around to actually quoting any of those very dangerous claims that the FTC should immediately investigate.

The Macalope thought about going back and pulling the quotes from the commercials and showing how all they actually do is say the Mac simply doesn’t have the virus problems Windows does (true!), but then he thought, hey, Winkler’s the one making the accusations. Why shouldn’t he be forced to back them up?

But buy Winkler’s book! The Macalope’s sure it’s awesome.

Winkler’s right that all commercial software has vulnerabilities. And Vista actually better implements technologies designed to make writing exploits harder. He’s also right that there’s been much to criticize Apple about over security. But the mildly honest parts of Winkler’s piece conflate vulnerabilities and exploits in an effort to make the Mac look worse and the dishonest parts are just utter fabrications (e.g. Macs are “frequently” hit by viruses).

An FTC investigation? That’s just standing on the diving board and jumping up and down yelling “Look at me! Look at me! Hey, everyone, look what I can do!”

If Winkler had a serious argument about there needing to be an FTC investigation, he would have linked to the FTC’s guidelines for the substance of advertising claims and contrasted them with quotes from Apple’s ads. But he didn’t do that.

Because he doesn’t have a serious argument to make.

But buy his book!

This post thanks to www.macalope.com

–Macalope

Wednesday, May 27, 2009

Acquisitions and Strategy

By Adrian Lane

There have been a couple of acquisitions in the last two weeks that I wanted to comment on: one by Oracle and one by McAfee. But between a minor case of food poisoning and, shortly after, a major case of influenza, pretty much everything I wanted to do in the last 12 days, blogging included, was halted. I am feeling better and trying to catch up on the stuff I wanted to talk about. At face value, neither of the acquisitions I want to mention is all that interesting. In the big picture, the investments do spotlight product strategy, so I want to comment on that. But before I do, I wanted to make some comments about how I go about assessing the value of an acquisition. I always try to understand the basic value proposition to the acquiring company, as well as other contributing factors. There is always a set of reasons why company A acquires company B, but understanding these reasons is much harder than you might expect. The goals of the buyer and the seller are not always clear. The market strategy and self-perception of each firm come into play when considering what they buy, why they bought it, and how much they were willing to pay. The most common motivators are as follows:

Strategic: You want to get into a new market and it is either cheaper or faster to acquire a company that is already in that segment rather than organically develop and sell your own product. Basically this is paving the road for a strategic vision. Buying the major pieces to get into a new market or new growth opportunities in existing markets. No surprises here.

Tactical: Filling in competitive gaps. A tactical effort to fill in a piece of the puzzle that your existing customers really need, or complete a product portfolio to address competitive deficiencies within your product. For example, having network DLP was fine up until a point, and then endpoint became a de facto requirement. We saw this with email security vendors who had killer email security platforms, but were still getting hammered in the market for not having complete web security offerings as well.

Neither is surprising, but there are many more than these basic two reasons. And this is where things can get weird. Other motivating factors that make the deal go forward may not always be entirely clear. A couple that come to mind:

Accretive Acquisition: Buying a solid company to foster your revenue growth curve. Clear value from the buyer’s perspective, but not so clear why profitable companies are willing to sell themselves for 2-4 times revenue when investor hopes, dreams, and aspirations are often much more than that. You have to view this from the seller’s side to make sense of it. There are many small, profitable companies out there in the $15-35M range, with no hope of going public because their market is too small and their revenue growth curve is too shallow. But the investors are pushing for an IPO that will take years, or possibly never happen. So what is your exit strategy? Which firms decide they want the early exit vs. betting their fortunes on a brighter future? You would think that in difficult economic times it is often based upon the stability of their revenue in the next couple of quarters. More often it comes down to which crazy CEOs still swear their firm is at the cusp of greatness for a multi-billion-dollar-a-year market and can convince their boards, vs. pragmatists who are ready to move on. I am already aware of a number of mid-sized companies and investment firms trying to tell “the wheat from the chaff” and target viable candidates, and a handful of pragmatic CEOs willing to look for their next challenge. Look for a lot more of these acquisitions in the next 12 months.

Leveraged/Platform Enabler: Not quite strategic, not quite tactical, but a product or feature that multiple products can leverage. For example a web application server, a policy management engine, or a reporting engine may not be a core product offering, but could provide a depth of service that makes all your other products perform better. And better still, where a small firm could not achieve profitability, a large company might realize value across their larger customer base/product suite far in excess of the acquisition price.

Good Tech, Bad Company: These firms are pretty easy to spot in this economy. The technology is good and the market is viable, but the company that produces the technology sucks. Wrong sales model, bad positioning, bad leadership decisions, or whatever – they simply cannot execute. I also call this “bargain bin” shopping because this is one of the ways mid-sized and larger firms can get cutting edge technology at firesale prices, and cash shortfalls force vendors to sell quickly! Still, it’s not always easy to distinguish the “over-sold bad tech” or “overfunded and poorly managed bad technology” firms from the “good tech, bad management” gems you are after. We have seen a few of these in the last 12 months, and we will see more in the coming 12 months as investors balk and lose confidence.

The Hedge: This is where you want to get into a billion dollar market, but you cannot afford to buy one of the leaders, or your competitors have already bought all of them. What do you do? You practice the art of fighting without fighting: you buy any other player that is a long way from being the front-runner and market that solution like crazy! Sure, you’re not the leader in the category, but it’s good enough not to lose sales, and you paid a fraction of the price. It may even give you time to build a suitable product if you want to, but more often than not, you ride the positive perception train till it runs off the rails. Sellers know this game as well, and you will often see firms not wait around, but rather raise the white flag/sales banner when the market is scaling up and their revenues are not.

The Panic Buy: This is when there is a “hot” new market that may be viewed as “disruptive” to your business. In reality, it’s nothing more than the day’s passing fashion, but buying is imperative due to either delusion or investor prodding. You pay too much and you never generate enough revenue to cover the purchase price, but hey, maybe you’ll sell it for pennies on the dollar a few years later to recoup some of your loss.

Body shop: You need engineers with particular background or skills and buying an engineering heavy company is cheaper than recruiting employees away from another company. The technology is irrelevant. In today’s economy this is rare, but it’s common in hot markets.

Competitive Blocking: Buying a company to prevent a competitor from getting it. It keeps them away from customers or competitive technology which, even if it does not make you better, at least does not work against you in sales situations. Sometimes the company is so cheap that its customers or reseller relationships alone make it worth buying. Who knows, they may even have some salvageable pieces of technology as well.

Ego: Just because they can.

There are more, but that’s enough for now. What started me thinking about all of this was when McAfee acquired Solidcore Systems a little over a week ago. I was looking at Solidcore’s web site and was unable to determine if it was the food poisoning making me gag or their PCI and database marketing claims. It’s “Locked Down” and “Dynamic” all at the same time! Regardless, there’s value in the ability to verify an application set for diverse platforms, especially in virtual or mobile computing environments. Sure, it’s a checkbox for most compliance efforts, but I doubt that is the motivation behind the purchase. This looks like McAfee making an early bet on one vision of “cloud” and virtualization security. They will leverage this across multiple enterprise security and compliance products and multiple value propositions. When you boil it down, the core value is “whitelisting” an appropriate set of applications that run in any given device or environment. Will it provide real value for baselining virtual environments? Who knows, but I doubt it. I suspect that it will be an interesting way to get a handle on mobile device application sets and provide a greater degree of security in that mercurial environment. And at $33M plus incentives, this is an inexpensive investment in the ability to tell customers McAfee offers cloud security!

Two weeks ago, Oracle acquired Virtual Iron. Virtual Iron produces a server virtualization product, but they are known for their management tools/capabilities, which are platform neutral. I went to an event they sponsored in San Francisco a couple years ago and the product offerings they demoed appeared competent. Still, they were a small fish in a very large VMware/Xen/Microsoft pond. Virtual Iron does not offer security products, but they do offer systems management, change management, and control for virtual environments. Think about EMC and their systems management vision of security and compliance and you can see the tie-in. Type of acquisition? Tactical, with a little “Good Tech, Bad Management” thrown in. I had commented in a recent post that Oracle’s Sun acquisition supported a long-term growth strategy. This acquisition fills in many of the gaps for the virtualization offering, and helps Oracle with one of the biggest gripes I hear from people using virtualization technologies: the lack of management tools. Not sure what they paid for this, but I am willing to bet that it is at or below the $65M in investment. So the investors got their money back and Oracle accelerated realization of their data center management dreams. Seems like a win-win.

–Adrian Lane

The CIS Consensus Metrics and Project Quant

By Rich

Just before release, the Center for Internet Security sent us a preview copy of the CIS Consensus Metrics. I’m a longtime fan of the Center, and, once I heard they were starting on this project, was looking forward to the results.

Overall I think they did a solid job on a difficult problem. Is it perfect? Is it complete? No, but it’s a heck of a good start. There are a few things that stand out:

  • They do a great job of interconnecting different metrics, and showing you how you can leverage a single collected data attribute across multiple higher-level metrics. For example, a single “technology” (product/version) is used in multiple places, for multiple metrics. It’s clear they’ve designed this to support a high degree of automation across multiple workflows, supporting technologies, and operational teams.
  • I like how they break out data attributes from security metrics. Attributes are the feeder data sets we use to create the security metrics. I’ve seen other systems that intermix the data with the metrics, creating confusion.
  • Their selected metrics are a reasonable starting point for characterizing a security program. They don’t cover everything, but that makes it more likely you can collect them in the first place. They make it clear this is a start, with more metrics coming down the road.
  • The metrics are broken out by business function – this version covering incident management, vulnerability management, patch management, application security, configuration management, and financial.
  • The metric descriptions are clear and concise, and show the logic behind them. This makes it easy to build your own moving forward.

There are a few things that could also be improved:

  • The data attributes are exhaustive. Without automated tool support, they will be very difficult to collect.
  • The document suggests prioritization, but doesn’t provide any guidance. A companion paper would be nice.

This isn’t a mind-bending document, and we’ve seen many of these metrics before, but not usually organized together, freely available, well documented, or from a respected third party. I highly recommend you go get a copy.

Now on to the CIS Consensus Metrics and Project Quant

I’ve had some people asking me if Quant is dead thanks to the CIS metrics. While there’s the tiniest bit of overlap, the two projects have different goals, and are totally complementary. The CIS metrics are focused on providing an overview for an entire security program, while Quant is focused on building a detailed operational metrics model for patch management. In terms of value, Quant should provide:

  1. Detailed costs associated with each step of a patch management process, and a model to predict costs associated with operational changes.
  2. Measurements of operational efficiency at each step of patch management to identify bottlenecks/inefficiencies and improve the process.
  3. Overall efficiency metrics for the entire patch management process.

CIS and Quant overlap for the last goal, but not for the first two. If anything, Quant will be able to feed the CIS metrics. The CIS metrics for patch management include:

  • Patch Policy Compliance
  • Patch Management Coverage
  • Mean Time to Patch

I highly suspect all of these will appear in Quant, but we plan on digging into much greater depth to help the operational folks directly measure and optimize their processes.
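As a rough, hypothetical illustration of how those three program-level metrics might be computed from raw patch records, consider the sketch below. The field names, records, and policy window are invented for the example; they are not taken from the CIS document or from Project Quant.

```python
from datetime import date
from statistics import mean

# Hypothetical patch records: (host, patch_id, released, applied), with applied=None if still missing.
records = [
    ("web01", "PATCH-A", date(2009, 6, 9), date(2009, 6, 12)),
    ("web02", "PATCH-A", date(2009, 6, 9), date(2009, 6, 20)),
    ("db01",  "PATCH-A", date(2009, 6, 9), None),
]
policy_window_days = 10                                # example internal policy: patch within 10 days of release
hosts_in_scope = {"web01", "web02", "db01", "db02"}    # db02 is not yet covered by the patching process

applied = [r for r in records if r[3] is not None]

# Mean Time to Patch: average days from release to application, over patches actually applied.
mean_time_to_patch = mean((r[3] - r[2]).days for r in applied)

# Patch Policy Compliance: share of required patches applied within the policy window (missing = non-compliant).
patch_policy_compliance = sum(
    1 for r in applied if (r[3] - r[2]).days <= policy_window_days
) / len(records)

# Patch Management Coverage: share of in-scope hosts that the patch process actually touches.
patch_mgmt_coverage = len({r[0] for r in records}) / len(hosts_in_scope)

print(f"Mean time to patch: {mean_time_to_patch:.1f} days")
print(f"Patch policy compliance: {patch_policy_compliance:.0%}")
print(f"Patch management coverage: {patch_mgmt_coverage:.0%}")
```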

–Rich

Tuesday, May 26, 2009

Smile!

By Adrian Lane

Normally, when a company buys software that does not work, the IT staff gets in trouble, and the company tries to get its money back, purchases different software, or takes some other type of corrective action. When a state or local government buys software that does not work, what do they do? Attempt to alter human behavior, of course! Taking a page from the TSA playbook, the departments of motor vehicles in four states have adopted a ‘No Smiles’ policy when taking photos. Why? Because their facial recognition software don’t work none too good:

“Neutral facial expressions” are required at departments of motor vehicles (DMVs) in Arkansas, Indiana, Nevada and Virginia. That means you can’t smile, or smile very much. Other states may follow … The serious poses are urged by DMVs that have installed high-tech software that compares a new license photo with others that have already been shot. When a new photo seems to match an existing one, the software sends alarms that someone may be trying to assume another driver’s identity.”

Great idea! Hassle people getting their driver’s licenses by telling them they cannot smile because a piece of software the DMV bought sucks so bad at facial recognition it cannot tell one person from another. I know those pimply-faced teenagers can be awfully tricky, but really, did they need to spend the money to catch a couple dozen kids a year? Did someone get embarrassed because they issued a kid a driver’s license with the name “McLovin”? Was the DHS grant money burning a hole in their pockets? Seriously, fess up that you bought busted software and move on. There are database cross-reference checks that should be able to easily spot identity re-use. Besides, kids will figure out how to trick the software far more easily than someone with half a brain. Admitting failure is the first step to recovery.

–Adrian Lane