Securosis

Research

Friday Summary – August 7, 2009

My apologies for getting the Friday Summary out late this week. Needless to say, I’m still catching up from the insanity of Black Hat and DefCon (the workload, not an extended hangover or anything). We’d like to thank our friends Ryan and Dennis at Threatpost for co-sponsoring this year’s Disaster Recovery Breakfast. We had about 115 people show up and socialize over the course of 3 hours. This is something we definitely plan on continuing at future events. The evening parties are fun, but I’ve noticed most of them (at all conferences) are at swanky clubs with the music blasted higher than concert levels. Sure, that might be fun if I wasn’t married and the gender ratio were more balanced, but it isn’t overly conducive to networking and conversation.

This is also a big week for us because we announced our intern and Contributing Analyst programs. There are a lot of smart people out there we want to work with who we can’t (yet) afford to hire full time, and we’re hoping this will help us resolve that while engaging more with the community. Based on the early applications, it’s going to be hard to narrow the field down to the 1-2 people we are looking for this round. Interestingly enough, we also saw applicants from some unexpected sources (including some from other countries), and we’re working on some ideas to pull more people in using more creative methods. If you are interested, we plan on taking resumes for another week or so and will then start the interview process.

If you missed it, we finally released the complete Project Quant Version 1.0 Report and Survey Results. This has been a heck of a lot of work, and we really need your feedback to revise the model and improve it.

Finally, I’m sad to say we had to turn on comment moderation a couple weeks ago, and I’m not sure when we’ll be able to turn it off. The spambots are pretty advanced these days, and we were getting 1-3 a day that blasted through our other defenses.
Since we’ve disabled HTML in posts I don’t mind the occasional spam entry appearing as a comment on a post, but I don’t like how they get blasted via email to anyone who has previously commented on the post. The choice was moderation or disabling email, and I went with moderation. We will still approve any posts that aren’t spam, even if they are critical of us or our work.

And now for the week in review:

Webcasts, Podcasts, Outside Writing, and Conferences
Rich Mogull and Lisa Phifer wrote the article “Encrypt it or Else”.
Adrian was quoted in “Identity Theft”, on the Massachusetts Data Protection Law, by Alexander B. Howard.
Rich was quoted in a Dark Reading article on database security.
Rich was quoted in a Computerworld article on IAM in cloud computing.
Next week, Rich will be presenting in a webinar on the SANS Consensus Audit Guidelines.

Favorite Securosis Posts
Rich: Size Doesn’t Matter.
Adrian: Data Labeling is Not the Same as DRM/ERM. Don’t forget to read down to my comment at the end.

Other Securosis Posts
The Network Security Podcast, Episode 161
McAfee Acquires MX Logic
Mini Black Hat/Defcon 17 recap
The Securosis Intern and Contributing Analyst Programs

Project Quant Posts
Project Quant Version 1.0 Report and Survey Results
Project Quant: Partial Draft Report

Favorite Outside Posts
Adrian: How could it be anything other than “Hey hey, I Wanna Be A Security Rockstar” by Chris ‘Funkadelic’ Hoff. It’s like he was there, man!
Rich: Jack Daniel is starting to post some of the Security B-Sides content. I really wish I could have been there, but since I work the event, I wasn’t able to leave Black Hat. The good news is they’ll be doing this in San Francisco around RSA, and I plan on being there.

Top News and Posts
Get ready for Badge Hacking!
RSnake and Inferno release two new browser hacks.
First prosecution for allegedly stealing a domain name.
You know, Twitter being under attack is one of those events that brings security to the forefront of the general public’s consciousness, in many ways better than some obscure data breach.
Feds concerned with having their RFIDs scanned, and pictures taken, at DefCon. There is nothing at all to prevent anyone from doing this on the street, and it’s a good reminder of RFID issues.
Fake ATM at DefCon. I wonder if the bad guys knew 8,000 raving paranoids would be circling that ATM?
Melissa Hathaway steps down as cybersecurity head. I almost don’t know how to react – the turnover for that job is ridiculous, and I hope someone in charge gets a clue. The Guerilla CISO has a great post on this.
Adobe has a very serious problem. It is one of the biggest targets, and consistently rates as one of the worst patching experiences. They respond far too slowly to security issues, and this is one of the best vectors for attack. I no longer use or allow Adobe Reader on any of my systems, and minimize my use of Flash thanks to NoScript.

Blog Comment of the Week
This week’s best comment comes from Bernhard in response to the Project Quant: Create and Test Deployment Package post:

I guess I’m mostly relying on the vendor’s packaging, be it opatch, yum, or msi. So, I’m mostly not repackaging things, and the tool to apply the patch is also very much set. In my experience it is pretty hard to sort out which patches/patchsets to install. This includes the very important subtask of figuring out the order in which patches need to be applied. Having said that, proper QA (before rollout), change management (including approval), and production verification (after rollout) are of course must-haves.


McAfee Acquires MX Logic

During the week of Black Hat/Defcon, McAfee acquired MX Logic for about $140M plus incentives, adding additional email security and web filtering services to their product line. I had kind of forgotten about McAfee and email security, and not just because of the conferences. Seriously, they were almost an afterthought in this space. Despite their anti-virus being widely used in mail security products, and their vast customer base, their own email & web products have not been dominant. Because they’re one of the biggest security firms in the industry it’s difficult to discount their presence, but honestly, I thought McAfee would have made an acquisition last year because their email security offering was seriously lacking. In the same vein, MX Logic is not the first name that comes to mind with email security either, but not because of product quality issues – they simply focus on reselling through managed service providers and have not gotten the same degree of attention as many of the other vendors. So what’s good about this? Going back to my post on acquisitions and strategy, this purchase is strategic in that it solidifies and modernizes McAfee’s position in email and web filtering SaaS capabilities, but it also opens up new relationships with the MSPs. The acquisition gives McAfee a more enticing SaaS offering to complement their appliances, and should more naturally bundle with other web services and content filtering, reducing head-to-head competitive issues. The more I think about it, the more it looks like the managed service provider relationships are a big piece of the puzzle. McAfee just added 1,800 new channel partners, who tend to hold sway over their customers’ buying decisions, and has the opportunity to leverage those relationships into new accounts.
And unlike Tumbleweed, which was purchased for a similar amount ($143M) despite falling revenues and no recognizable SaaS offering, this appears to be a much more compelling purchase that fits on several different levels. I estimated McAfee’s revenue attributable to email security was in the $55M range for 2008, which was a guess on my part because I have trouble deciphering balance sheets, but it was backed up by another analyst, as well as a former McAfee employee who said I was in the ballpark. If we add another $30M to $35M (optimistically) of revenue to that total, it puts McAfee a lot closer to the leaders in the space in terms of revenue and functionality. We can hypothesize about whether Websense or Proofpoint would have made a better choice, as both offer what I consider more mature and higher-quality products, but their higher revenue and larger installed bases would have cost significantly more, while overlapping more with what McAfee already has in place. This acquisition accomplishes some of the same goals for less money. All in all, this is a good deal for existing McAfee customers, fills in a big missing piece of their SaaS puzzle, and I am betting will help foster revenue growth in excess of the purchase price.


Mini Black Hat/Defcon 17 recap

At Black Hat/Defcon, Rich and I are always convinced we are going to be completely hacked if we use any connection anywhere in Las Vegas. Heck, I am pretty sure someone was fuzzing my BlackBerry even though I had Bluetooth, WiFi, and every other function locked down. It’s too freakin’ dangerous, and as we were too busy to get back to the hotel for the EVDO card, neither Rich nor I posted anything last week during the conference. So it’s time for a mini BH/Defcon recap. As always, Bruce Schneier gave a thought-provoking presentation on how the brain conceptualizes security, and Dan Kaminsky clearly did a monstrous amount of research for his presentation on certificate issuance and trust. Given my suspicion that my phone might have been hacked, I probably should have attended more of the presentations on mobile security. But when it comes down to it, I’m glad I went over and saw “Clobbering the Cloud” by the team at Sensepost. I thought their presentation was the best all week, as it went over some very basic and practical attacks against Amazon EC2, both the system itself and its trust relationships. Those of you who were in the room for the first 15 minutes and then left missed the best part, where Haroon Meer demonstrated how to put a rogue machine up and escalate its popularity. They went over many different ways to identify vulnerabilities, fake out the payment system, escalate visibility/popularity, and abuse the identity tokens tied to the virtual machines. In the latter case, it looks like you could use this exploit to run machines without getting charged, or possibly copy someone else’s machine and run it as a fake version. I think I am going to start reading their blog on a more regular basis. Honorable mention would have to be Rsnake and Jabra’s presentation on how browsers leak data. A lot of the examples are leaks I assumed were possible, but it is nonetheless shocking to see your worst fears regarding browser privacy demonstrated right in front of your eyes.
Detecting if your browser is in a VM, and if so, which one.
Reverse engineering Tor traffic.
Using leaked data to compromise your online account(s) and leave landmines waiting for your return.
Following that up with a more targeted attack.

It shows not only specific exploits, but also how, when bundled together, they comprise a very powerful way to completely hack someone. I felt bad because there were only 45 or so people in the hall, as I guess the Matasano team was supposed to present but canceled at the last minute. Anyway, if they post the presentation on the Black Hat site, watch it. This should dispel any illusions you had about your privacy and, should someone have interest in compromising your computer, your security. Last year I thought it really rocked, but this year I was a little disappointed in some of the presentations I saw at Defcon. The mobile hacking presentations had some interesting content, and I laughed my ass off with the Def Jam 2 Security Fail panel (Rsnake, Mycurial, Dave Mortman, Larry Pesce, Dave Maynor, Rich Mogull, and Proxy-Squirrel). Other than that, the content was kind of flat. I will assume a lot of the great presentations were the ones I did not select … or were on the second day … or maybe I was hung over. Who knows. I might have seen a couple more if I could have moved around the hallways, but human gridlock and the Defcon Goon who did his Howie Long impersonation on me prevented that from happening. I am going to stick around for both days next year. All in all I had a great time. I got to catch up with 50+ friends, and meet people whose blogs I have been reading for a long time, like Dave Lewis and Paul Asadoorian. How cool is that?! Oh, and I hate graffiti, but I have to give it up for whoever wrote ‘Epic Fail’ on Charo’s picture in the garage elevator at the Riviera. I laughed halfway to the airport.


Friday Summary – July 24, 2009

“Hi, my name is Adrian, and, uh … I am a technologist” … Yep. I am. I like technology. Addicted to it, in fact. I am on ‘Hack A Day’ almost once a day. I want to go buy a PC and over-clock it, and I don’t even use PCs any more. I can get distracted by an interesting new technology or tool faster than a kid at Toys R Us. I have had a heck of a time finishing the database encryption paper, as I have this horrible habit of dropping right down into the weeds. Let’s look at a code sample! What does the API look like? What algorithms can I choose from? How fast is the response in key creation? Can I force a synch across key servers manually, or is that purely a scheduled job? How much of the API does each of the database vendors support? Yippee! Down the rabbit hole I go … Then Rich slaps me upside the head and I get back to strategy and use cases. Focus on the customer problem. The strategy behind deployment is far more important to the IT and security management audiences than subtleties of implementation, and that should be the case. All of the smaller items are interesting, and may be indicators of the quality of the product, but they are not good indicators of a product’s suitability to meet a customer’s needs. I’ll head to the technologists anonymous meeting next week, just as soon as I wrap up the recommendations section of this paper.

But the character flaw remains. In college, studying software, I was not confident I really understood how computers worked until I went down into the weeds, or in this case, into the hardware. Once I designed and built a processor, I understood how all the pieces fit together and was far more confident in making software design trade-offs. It’s why I find articles like this analysis of the iPhone 3GS design so informative, as it shows how all of the pieces are designed and work together, and now I know why certain applications perform the way they do, and why some features kill battery life.
I just gotta know how all the pieces fit together! I think Rich has his addiction under control. He volunteers to do a presentation at Defcon/Black Hat each year, and after a few weeks of frenzied soldering, gets it out of his system. Then he’s good for the remainder of the year. I think that is what he is doing right now: bread board and soldering iron out, making some device perform in a way nature probably did not intend it to. Last year it was a lamp that hacked your home network. God only knows what he is doing to the vacuum cleaner this year!

A couple notes: We are having to manually approve most comments due to the flood of comment spam. If you don’t see your comment, don’t fret; we will usually open it up within the hour. We are also looking for some intern help here at Securosis. There is a long list of dubious qualities we are looking for. Basically we need some help with admin and site work, and in exchange we will teach you the analyst game and get you involved with writing and other projects. And since our office is more or less virtual, it really does not matter where you live. And if you can write well enough, you can help me finish this damned paper and write the occasional blog post or two. We are going to look at this seriously after Black Hat, but not before, so get in contact with us next month if you are interested. We’re also thinking we might do this in a social media/community kind of way, and have some cool ideas on making this more than the usual slave labor internship.

As both Rich and I will be at Black Hat/Defcon next week, there will not be a Friday Summary, but we will return to our regularly scheduled programming on the 7th of August. We will be blogging live, and I assume we’ll even get a couple of podcasts in. Hope to see you at BH and the Disaster Recovery Breakfast at Cafe Lago!

Hey, I geek out more than once a year! I use microcontrollers in my friggen Halloween decorations for Pete’s sake!
-Rich

And now for the week in review:

Webcasts, Podcasts, Outside Writing, and Conferences
Rich and Martin in Episode 159 of the Network Security Podcast.
Rich wrote an article on iPhone 3GS security over at TidBITS.

Favorite Securosis Posts
Rich: Adrian’s post on the FTC’s Red Flag rules.
Adrian: Amazon’s SimpleDB looks like it is going to be a very solid, handy development tool.

Other Securosis Posts
Electron Fraud, Central American Style

Project Quant Posts
Project Quant: Partial Draft Report

Favorite Outside Posts
Adrian: Jack Daniel’s pragmatic view on risk and security.
Rich: Techdulla with a short post that makes a very good point. I have a friend in exactly the same situation. Their CIO has no idea what’s going on, but spends a lot of time speaking at vendor conferences.

Top News and Posts
Get ready for Badge Hacking!
RSnake and Inferno release two new browser hacks.
I want to be a cyber-warrior, I want to live a life of dang-er-ior, or something like that.
A great interview with our friend Stepto on gaming safety.
The Pwnie award nominations are up.
The dhcpclient vulnerability is very serious, and you shouldn’t read this post.
There is another serious unpatched Adobe Flash/PDF vulnerability.
George Hulme with some sanity checking on malware numbers.
Medical breach reports flooding California.

Blog Comment of the Week
This week’s best comment comes from Bernhard in response to the Project Quant: Create and Test Deployment Package post:

I guess I’m mostly relying on the vendor’s packaging, be it opatch, yum, or msi. So, I’m mostly not repackaging things, and the tool to apply the patch is also very much set.


Amazon’s SimpleDB

I have always felt the punctuated equilibrium of database technology is really slow, with long periods between the popularity of simple relational ‘desktop’ databases (Access, Paradox, DBIII+, etc) and ‘enterprise’ platforms (DB2, Oracle, SQL Server, etc). But for the first time in my career, I am beginning to believe we are seeing a genuine movement away from relational database technology altogether. I don’t really study trends of relational database management platforms like I did a decade or so ago, so perhaps I have been slightly ignorant of the progression, but I am somewhat surprised by the rapidity with which programmers and product developers are moving away from relational DB platforms and going to simple indexed flat files for data storage. Application developers need data storage and persistence as much as ever, but it seems simpler is better. Yes, they still use tables, and they may use indices, but complex relational schemata, foreign keys, stored procedures, normalization, and triggers seem to be unwanted and irrelevant. Advanced relational technologies are being ignored, especially by web application developers, both because they want to manage the functions within the application they know (as opposed to the database they don’t), and because it makes for a cleaner design and implementation of the application. What has surprised me is the adoption of indexed flat files for data storage in lieu of any relational engine at all. Flat files offer a lot of flexibility, they can deal with bulk data insertions very quickly, and depending upon how they are implemented may offer extraordinary query response. It’s not like ISAM and other variants ever went away, as they remain popular in everything from mainframes to control systems. We moved from basic flat files to relational platforms because they offered more efficient storage, but that requirement is long dead. 
We have stuck with relational platforms because they offered data integrity and transactional consistency lacking in the simple data storage platforms, as well as excellent lookup speed on reasonably static data sets, plus the big advantage of pre-compiled, execution-ready stored procedure code. However, when the primary requirement is quick collection and scanning of bulk data, you don’t really care about those features so much. This is one of the reasons why many security product vendors moved to indexed flat files for data storage, as they offer faster uploads, dynamic structure, and correlation capabilities, but that is a discussion for another post. I have been doing some research into ‘cloud’ service & security technologies of late, and a few months ago I was reminded of Amazon Web Services’ offering, Amazon SimpleDB. It’s a database, but in the classic sense: what databases were like prior to the relational model we have been using for the last 25 years. Basically it is a flat file, with each entry having attached name/value attribute pairs. Sounds simple because it is. It’s a bucket to dump data in, and you have the flexibility to introduce as much or as little virtual structure into it as you care to. It has a query interface, with the same query language constructs that most SQL dialects offer. It appears to have been quietly launched in 2007, and I am guessing it was built by Amazon to solve their own internal data storage needs. In May of this year they augmented the query engine to support comparison operators such as ‘contains’, and several features for managing result sets. At this point, the product seems to have reached a state where it offers enough functionality to support most web application developers. You will be giving up a lot of (undesired?) functionality, but if you just want a simple bucket to dump data into with complete flexibility, this is a logical option.
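The item/attribute model is easier to picture with a sketch. Below is a toy in-memory model of that data shape (schema-less items as bags of name/value pairs, with an equality-only query); this is purely illustrative and is not Amazon's actual API or Select syntax:

```python
# Toy model of SimpleDB's data shape: a flat collection of items,
# each a bag of name/value attribute pairs with no fixed schema.
# Illustrative sketch only -- NOT Amazon's API.

class ToyDomain:
    def __init__(self):
        self.items = {}  # item name -> dict of attributes

    def put(self, name, **attrs):
        # Items need not share attributes; structure is per-item.
        self.items.setdefault(name, {}).update(attrs)

    def select(self, **where):
        # Minimal equality-only filter, loosely in the spirit of
        # "select * from domain where attr = 'value'".
        return [n for n, a in self.items.items()
                if all(a.get(k) == v for k, v in where.items())]

db = ToyDomain()
db.put("log-001", source="web01", severity="high")
db.put("log-002", source="web02", severity="low", note="no schema needed")
print(db.select(severity="high"))  # → ['log-001']
```

The point of the sketch is the flexibility: the second item carries an attribute the first one lacks, and nothing complains, which is exactly the trade-off against a fixed relational schema.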
I am a believer that ‘cheaper, faster, easier’ always wins. Amazon’s SimpleDB fits that model. It’s feasible that this technology could snatch away the low end of the database market that is not interested in relational functions.


Electron Fraud, Central American Style

When I was a kid, the catchphrase “Computers don’t lie” was very common, implying that machines were unbiased and accurate, in order to engender faith in the results they produced. Maybe that’s why I am in security – because I found the concept to be very strange. Machines, and certainly computers, do pretty much exactly what we tell them to do, and implicit trust is misguided. As their inner workings are rarely transparent, they are perfectly suited to hiding all sorts of shenanigans, especially when under the control of power-hungry despots. It is being reported that Honduran law enforcement has seized a number of computers that contain certified results for an election that never took place. It appears that former President Manuel Zelaya attempted to rig the vote on constitutional reform, and might have succeeded if he had not been booted prior to the vote. I cannot vouch for the quality of the translated versions, but here is an excerpt: The National Direction of Criminal Investigation confiscated computers in the Presidential House in which were registered the supposed results of the referendum on the reform of the Constitution that was planned by former President Manuel Zelaya on last June 28, the day that he was ousted. “This group of some 45 computers, by the appearance that present, they would be used for the launch of the supposed final results of the quarter ballot box”, he explained. The computers belonged to the project ‘Learns’ of the Honduran Counsel of Science and Technology directed towards rural schools. All of the computers had been lettered with the name of the department for the one that would transmit the information accompanied by a document with the headline: “Leaf of test”, that contained all the data of the centers of voting.
From the translated articles, it’s not clear to me if these computers were going to be used in the polling places and would submit the pre-loaded results, or if they were going to mimic the on-site computers and upload fraudulent data. You can do pretty much anything you want when you have full access to the computer. Had this effort been followed through, it would have been difficult to detect, and the results would have been considered legitimate unless proven otherwise.


FTC Requirements for Customer Data

There was an article in Sunday’s Arizona Republic regarding the Federal Trade Commission’s requirements for any company handling sensitive customer information. Technically this law went into effect back in January 2008, but it was not enforced due to lack of awareness. Now that the FTC has completed its education and awareness program, and enforcement begins August 1st of this year, it’s time to begin discussing these guidelines. This means that any business that collects, stores, or uses sensitive customer data needs a plan to protect data use and storage. The FTC requirements are presented in two broad categories. The first part spells out what companies can do to detect and spot fraud associated with identity theft. The Red Flags Rule spells out the four required components:

Document specific ‘red flags’ that indicate fraud for your type of business.
Document how your organization will go about detecting those indicators.
Develop guidelines on how to respond when they are encountered.
Periodically review the process and indicators for effectiveness and changes to business processes.

The second part is about protecting personal information and safeguarding customer data. It’s pretty straightforward: know what you have, keep only what you need, protect it, periodically dispose of data you don’t need, and have a plan in case of breach. And, of course, document these points so the FTC knows you are in compliance. None of this is really ground-breaking, but it is a solid generalized approach that will at least get businesses thinking about the problem. It’s also broadly applied to all companies, which is a big change from what we have today. After reviewing the overall program, there are several things I like about the way the FTC has handled this effort. It was smart to cover not just data theft, but how to spot fraudulent activity as part of normal business operations.
I like that the recommendations are flexible, and the FTC did not mandate products or process, only that you document. I like the fact that they were pretty clear on who this applies to and who it does not. I like the way that reducing the amount of sensitive data retained is shown as a natural way to simplify requirements for many companies. Finally, providing simple educational materials, such as this simplified training video, is a great way to get companies jump-started, and gives them some material to train their own people. Most organizations are going to be besieged by vendors with products that ‘solve’ this problem, and to them I can only say ‘Caveat emptor’. What I am most interested in is the fraud detection side, both what the red flags are for various business verticals, and how and where they are detected. I say that for several reasons, but specifically because the people who know how to detect fraud within the organization are going to have a hard time putting it into a checklist and training others. For example, most accountants I know still use Microsoft Excel to detect fraud on balance sheets! Basically they import the balance sheet and run a bunch of macros to see if there is anything ‘hinky’ going on. There is no science to it, but practical experience tells them when something is wrong. Hopefully we will see people share their experiences and checklists with the community at large. I think this is a good basic step forward to protect customers and make companies aware of their custodial responsibility to protect customer data.
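For illustration only (this is my example, not part of the FTC guidance, and not what those accountants' macros actually do): one classic way to put a 'hinky numbers' check into code is a first-digit Benford's law comparison, since naturally occurring financial figures lead with 1 about 30% of the time while fabricated figures tend toward a flatter distribution:

```python
# Illustrative red-flag heuristic: compare observed leading-digit
# frequencies against the Benford's law expectation log10(1 + 1/d).
import math
from collections import Counter

def first_digit_skew(amounts):
    """Return {digit: (observed share, Benford expected share)}."""
    digits = [int(str(abs(a))[0]) for a in amounts if a]
    if not digits:
        return {}
    counts = Counter(digits)
    n = len(digits)
    return {d: (counts.get(d, 0) / n, math.log10(1 + 1 / d))
            for d in range(1, 10)}

skew = first_digit_skew([1200, 1350, 1810, 2400, 970, 1100, 3300, 1500])
print(round(skew[1][1], 3))  # Benford expected share of leading 1s ≈ 0.301
```

A real check would need far more data and judgment than this; the point is only that some 'hinky' intuition can be reduced to a repeatable test others can run.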


Friday Summary – July 17, 2009

I apologize to those of you reading this on Saturday morning – with the stress of completing some major projects before Black Hat, I forgot that to push the Summary out Friday morning, we have to finish it off Thursday night. So much for the best laid plans and all. The good news is that we have a lot going on at Black Hat. Adrian and I will both be there, and we’re running another Disaster Recovery Breakfast, this time with our friends over at Threatpost. I’m moderating the VC panel at Black Hat on Wednesday, and will be on the Defcon Security Jam 2: The Fails Keep on Coming panel. This is, by far, my favorite panel. Mostly because of the on-stage beverages provided. Since I goon for the events (that means work), Adrian will be handling most of our professional meetings for those of you who are calling to set them up. To be honest, Black Hat really isn’t the best place for these unless you catch us the first day (for reasons you can probably figure out yourself). This is the one conference a year when we try and spend as much of our time as possible in talks absorbing information. There is some excellent research on this year’s agenda, and if you have the opportunity to go I highly recommend it. I think it’s critical for any security professional to keep at least half an eye on what’s going on over on the offensive side. Without understanding where the threats are shifting, we’ll always be behind the game.

I’ve been overly addicted to the Tour de France for the past two weeks, and it’s fascinating to watch the tactical responsiveness of the more experienced riders as they intuitively assess, dismiss, or respond to the threats around them. While the riders don’t always make large moves, the best of them sense what might happen around the next turn and position themselves to take full advantage of any opportunities, or head off attacks (yes, they’re called attacks) before they pose a risk.
Not to over-extend another sports analogy, but by learning what’s happening on the offensive side, we can better position ourselves to head off threats before they overly impact our organizations. And seriously, it’s a great race this year with all sorts of drama, so I highly recommend you catch it. Especially starting next Tuesday, when they really hit the mountains and start splitting up the pack.

-Rich

And now for the week in review:

Webcasts, Podcasts, Outside Writing, and Conferences
Martin interviews Steve Ocepek on this week’s Network Security Podcast (plus we cover a few major news stories).
Rich is quoted in a Dark Reading article on implementing least privilege.
Rich is quoted alongside former Gartner co-worker Jeff Wheatman on database privileges over at Channel Insider.
John Sawyer refers to our Database Activity Monitoring paper in another Dark Reading article.

Favorite Securosis Posts
Rich: Adrian’s Technology vs. Practicality really hit home. I miss liking stuff.
Adrian: Database Encryption, Part 6: Use Cases. Someone has already told us privately that one of the use cases exactly described their needs, and they are off and implementing.

Other Securosis Posts
Oracle Critical Patch Update, July 2009
Microsoft Patched; Firefox’s Turn
Second Unpatched Microsoft Flaw Being Exploited
Subscribe to the Friday Summary Mailing List
Pure Extortion

Project Quant Posts
We’re getting near the end of phase 1, and here’s the work in progress:
Project Quant: Partial Draft Report

Favorite Outside Posts
Adrian: Amrit Williams’ North Korea, Cyber Scapegoat of the World. The graphic is priceless!
Rich: David and Alex over at the New School preview their Black Hat talk.

Top News and Posts
Critical JavaScript Vulnerability in Firefox 3.5.
Microsoft Windows and Internet Explorer security issues patched.
Oracle CPU for July 2009.
Goldman Trading Code Leaked.
Mike Andrews has a nice analysis on Google Web “OS”.
Twitter Hack makes headlines.
LexisNexis breached by the mob?
  • Vulnerability scanning the clouds.
  • State Department worker sentenced for snooping passports.
  • Casino sign failure (pretty amusing).
  • PayPal reports security blog to the FBI for a phishing screenshot.
  • A school sues a bank over theft from a hacked computer. This is a tough one: the school was hacked and proper credentials were stolen, but according to their contract those transfers shouldn’t have been allowed, even from the authenticated system/account.
  • Nmap 5 released – Ed’s review.

Blog Comment of the Week

This week’s best comment comes from SmithWill in response to Technology vs. Practicality:

Be weary of the CTO/car fanatic. Over-built engines=over instrumented, expensive networks. But they’re smoking fast!


Oracle Critical Patch Update, July 2009

If you have read my overviews of Oracle database patches long enough, you are probably aware of my bias against the CVSS scoring system. It’s a yardstick for measuring the relative risk of a vulnerability, but it’s a generic measure, and a confusing one at that. You have to start somewhere, but it’s just a single indicator, and you do need to take the time to understand how the threats apply (or don’t) to your environment. In cases where I have had a complete understanding of the nature of a database threat, and felt the urgency was great enough to disrupt patching cycles to rush the fix into production, CVSS has only jibed with my opinion around 60% of the time. This is because access conditions typically push the score down, and most developers have preconceived notions about how a vulnerability would be exploited. They fail to understand how attackers turn all of your assumptions upside down, and are far more creative in finding avenues of exploit than developers anticipate. CVSS scores reflect this overconfidence.

Oracle announced the July 2009 Critical Patch Update Advisory today. There are three fairly serious database security fixes, and two more serious issues in Oracle Secure Backup. The problem with this advisory (for me, anyway) is that none of my contacts know the specifics behind CVE-2009-1020, CVE-2009-1019, or CVE-2009-1963. Further, NIST, CERT, and MITRE have not published any details at this time. The best information I have seen is in Eric Maurice’s blog post, but it’s little more than the security advisory itself. Most of us are in the dark on these, so meaningful analysis is really not possible at this time. Still, remotely exploitable vulnerabilities that bypass authentication are very high on my list of things to patch immediately.
And compromise of the TNS service in the foundation layer, which two of the three database vulnerabilities appear to involve, gives an attacker both a method of probing for available databases and a way to exploit trust relationships between peer databases. I hate to make the recommendation without a more complete understanding of the attack vectors, but I have to recommend that you patch now.
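As an aside on why access conditions drag scores down: the CVSS v2 base score multiplies an impact term by an exploitability term built from access vector, access complexity, and authentication. Here is a minimal sketch of that formula, using the constants from the public CVSS v2 specification; the two example metric sets are illustrative and are not taken from the Oracle CVEs above.

```python
# Sketch of the CVSS v2 base score equation, with the metric weights
# published in the CVSS v2 specification.
def cvss2_base(av, ac, au, c, i, a):
    impact = 10.41 * (1 - (1 - c) * (1 - i) * (1 - a))
    exploitability = 20 * av * ac * au
    f_impact = 0 if impact == 0 else 1.176
    return round(((0.6 * impact) + (0.4 * exploitability) - 1.5) * f_impact, 1)

# Complete C/I/A compromise, network access, low complexity, no authentication:
remote = cvss2_base(av=1.0, ac=0.71, au=0.704, c=0.660, i=0.660, a=0.660)

# Identical impact, but local access, high complexity, single authentication:
local = cvss2_base(av=0.395, ac=0.35, au=0.56, c=0.660, i=0.660, a=0.660)

print(remote, local)  # 10.0 vs. 6.0
```

Both hypothetical flaws do exactly the same damage once exploited, but the access conditions alone knock four points off the score – which is how a serious vulnerability can look merely “medium” on paper if you assume attackers won’t find an easier path in.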


Technology vs. Practicality

I am kind of a car nut, and have been since I was little, when my dad took me to my first auto race at the age of four (it was a Can-Am race at Laguna Seca – amazing!). I tend to get emotionally attached to my vehicles. I buy them based upon how they perform, how they look, and how they drive. I am fascinated by the technology of everything from tires to turbos. I am a tinkerer, and I do weird things like change bushings that don’t need to be changed, rebuild a perfectly good motor, or tweak engine management computer settings just because I can make them better. I have heavily modified every vehicle I have ever owned except the current one. I acknowledge it’s not rational, but I like cars, and this has been a hobby now for many years.

My wife is the opposite. She drives a truck. For her, it’s a tool she uses to get her job done. Like a drill press or a skill saw, it’s just a mechanical device on a depreciation curve. Any minute of attention it requires beyond filling the tank with gasoline is one too many. It’s stock except for the simple modifications I made to it, and it’s fabulously maintained – both facts she is willfully unaware of. Don’t get me wrong, she really likes her truck because it’s comfortable, with good air and plenty of power, but that’s it. After all, it’s just a vehicle.

As a CTO, I was very much in the former camp when it came to security and technology. I love technology, and I get very excited about the possibilities of how we might use new products, and the philosophical advantages new developments may bring. It’s common, and I think that is why so many CTOs become evangelists. But things are different as an analyst. I have been working with Rich for a little over a year now, and it dawned on me how much my opinion on technology has changed, and how differently I now approach discussing technology with others.
We had a conference call with an email security vendor a couple weeks ago, and they have some really cool new technology that I think will make their products better. But I kept my mouth shut about how cool I think it is because, as an analyst, that’s not really the point. I kept my mouth shut because most of their customers are not going to care. They are not going to care because they don’t want to spend a minute more considering email security and anti-spam than they have to. They want to set policies and forget about it. They are willing to spend a couple hours a month remediating missing email, or investigating complaints of misuse, but that’s it. It’s a tool used to get their job done, and they completely lack any emotional attachment their vendor might have. Cool technology is irrelevant. It has been one of my challenges in this role to subordinate enthusiasm to practicality, and what is possible to what is actually needed.


Totally Transparent Research is the embodiment of how we work at Securosis. It’s our core operating philosophy, our research policy, and a specific process. We initially developed it to help maintain objectivity while producing licensed research, but its benefits extend to all aspects of our business.

Going beyond Open Source Research, and a far cry from the traditional syndicated research model, we think it’s the best way to produce independent, objective, quality research.

Here’s how it works:

  • Content is developed ‘live’ on the blog. Primary research is generally released in pieces, as a series of posts, so we can digest and integrate feedback, making the end results much stronger than traditional “ivory tower” research.
  • Comments are enabled for posts. All comments are kept except for spam, personal insults of a clearly inflammatory nature, and completely off-topic content that distracts from the discussion. We welcome comments critical of the work, even if somewhat insulting to the authors. Really.
  • Anyone can comment, and no registration is required. Vendors or consultants with a relevant product or offering must properly identify themselves. While their comments won’t be deleted, the writer/moderator will “call out”, identify, and possibly ridicule vendors who fail to do so.
  • Vendors considering licensing the content are welcome to provide feedback, but it must be posted in the comments - just like everyone else. There is no back channel influence on the research findings or posts.
    Analysts must reply to comments and defend the research position, or agree to modify the content.
  • At the end of the post series, the analyst compiles the posts into a paper, presentation, or other delivery vehicle. Public comments/input factor into the research, where appropriate.
  • If the research is distributed as a paper, significant commenters/contributors are acknowledged in the opening of the report. If they did not post their real names, handles used for comments are listed. Commenters do not retain any rights to the report, but their contributions will be recognized.
  • All primary research will be released under a Creative Commons license. The current license is Non-Commercial, Attribution. The analyst, at their discretion, may add a Derivative Works or Share Alike condition.
  • Securosis primary research does not discuss specific vendors or specific products/offerings, unless used to provide context, contrast or to make a point (which is very very rare).
    Although quotes from published primary research (and published primary research only) may be used in press releases, said quotes may never mention a specific vendor, even if the vendor is mentioned in the source report. Securosis must approve any quote to appear in any vendor marketing collateral.
  • Final primary research will be posted on the blog with open comments.
  • Research will be updated periodically to reflect market realities, based on the discretion of the primary analyst. Updated research will be dated and given a version number.
    For research that cannot be developed using this model, such as complex principles or models that are unsuited for a series of blog posts, the content will be chunked up and posted at or before release of the paper to solicit public feedback, and provide an open venue for comments and criticisms.
  • In rare cases Securosis may write papers outside of the primary research agenda, but only if the end result can be non-biased and valuable to the user community to supplement industry-wide efforts or advances. A “Radically Transparent Research” process will be followed in developing these papers, where absolutely all materials are public at all stages of development, including communications (email, call notes).
    Only the free primary research released on our site can be licensed. We will not accept licensing fees on research we charge users to access.
  • All licensed research will be clearly labeled with the licensees. No licensed research will be released without indicating the sources of licensing fees. Again, there will be no back channel influence. We’re open and transparent about our revenue sources.

In essence, we develop all of our research out in the open, and not only seek public comments, but keep those comments indefinitely as a record of the research creation process. If you believe we are biased or not doing our homework, you can call us out on it and it will be there in the record. Our philosophy involves cracking open the research process, and using our readers to eliminate bias and enhance the quality of the work.

On the back end, here’s how we handle this approach with licensees:

  • Licensees may propose paper topics. The topic may be accepted if it is consistent with the Securosis research agenda and goals, but only if it can be covered without bias and will be valuable to the end user community.
  • Analysts produce research according to their own research agendas, and may offer licensing under the same objectivity requirements.
  • The potential licensee will be provided an outline of our research positions and the potential research product so they can determine if it is likely to meet their objectives.
  • Once the licensee agrees, development of the primary research content begins, following the Totally Transparent Research process as outlined above. At this point, there is no money exchanged.
  • Upon completion of the paper, the licensee will receive a release candidate to determine whether the final result still meets their needs.
  • If the content does not meet their needs, the licensee is not required to pay, and the research will be released without licensing or with alternate licensees.
  • Licensees may host and reuse the content for the length of the license (typically one year). This includes placing the content behind a registration process, posting on white paper networks, or translation into other languages. The research will always be hosted at Securosis for free without registration.

Here is the language we currently place in our research project agreements:

Content will be created independently of LICENSEE with no obligations for payment. Once content is complete, LICENSEE will have a 3 day review period to determine if the content meets corporate objectives. If the content is unsuitable, LICENSEE will not be obligated for any payment and Securosis is free to distribute the whitepaper without branding or with alternate licensees, and will not complete any associated webcasts for the declining LICENSEE. Content licensing, webcasts and payment are contingent on the content being acceptable to LICENSEE. This maintains objectivity while limiting the risk to LICENSEE. Securosis maintains all rights to the content and to include Securosis branding in addition to any licensee branding.

Even this process itself is open to criticism. If you have questions or comments, you can email us or comment on the blog.