Today I read two very different posts on what to look for when hiring, and how to get started in the security field. Each clearly reflects the author’s experiences, and since I get asked both sides of this question a lot, I thought I’d toss my two cents in.
First we have Shrdlu’s post over at Layer 8 on Bootstrapping the Next Generation. She discusses the problem of bringing new people into a field that requires a fairly large knowledge base to be effective.
Then over at Errata Security, Marisa focuses more on how to get a job through the internship path (with a dollop of self-promotion). As one of our industry’s younger recruits, who successfully built her own internship, she comes from exactly the opposite angle.
My advice tends to walk a line slightly in the middle of the two, and varies depending on where in security you want to go.
When someone asks me how to get started in security I tend to offer two recommendations:
- Start with a background as a systems and network administrator… probably starting with the lowly help desk. This is how I got started (yes, I’m thus biased), and I think these experiences build a strong foundation that spans most of the tasks you’ll later deal with. Most importantly, they build experience on how the real world works – even more so than starting as a developer. You are forced to see how systems and applications are really used, learn how users interact with technology, and understand the tradeoffs in keeping things running on a day to day basis. I think even developers should spend some time on the help desk or cleaning up systems – while I was only a mediocre developer from a programming standpoint, I became damn good at understanding user interfaces and workflows from the few years I spent teaching people how to unhide their Start menus and organize those Windows 3.1 folders.
- Read a crapload of action thriller and spy novels, watch a ton of the same kinds of movies, and express your inner paranoid. This is all about building a security mindset, and it is just as important as any technical skills. It’s easy to say “never assume”, but very hard to put it into practice (and to be prepared for the social consequences). You are building a balanced portfolio of paranoia, cynicism, and skepticism. Go do some police ride-alongs, become an EMT, train in a hard martial art, join the military, or do whatever you need to build some awareness. If you were the kid who liked to break into school or plan your escape routes for when the commies (or yankees) showed up, you’re perfect for the industry. You need to love security.
The best security professionals combine their technical skills, a security mindset, and an ability to communicate (Marisa emphasized public speaking skills) with a wrapper of pragmatism and an understanding of how to balance the real world sacrifices inherent to security.
These are the kinds of people I look for when hiring (not that I do much of that anymore). I don’t care about a CISSP, but want someone who has worked with users and understands technology from actual experience rather than a library shelf, or a pile of certificates.
In terms of entry-level tracks, we are part of a complex profession and thus need to specialize. Even security generalists now need to have at least one deep focus area. I see the general tracks as:
- Operational Security – The CISO track. Someone responsible for general security in the organization. Usually comes from the systems or network track, although systems integration is another option.
- Secure Coder – Someone who either programs security software, or is responsible for helping secure general (non-security-specific) code. Needs a programmer’s background, but I’d also suggest some more direct user interaction if they’re used to coding in a closet with pizzas slipped under the door at irregular intervals.
- Security Assessor (or Pen Tester) – Should ideally come out of the coder or operations track. I know a lot of people are jumping right into pen testing, but the best assessors I know have practical experience on the operational side of IT. That provides much better context for interpreting results and communicating with clients. The vulnerability researcher or penetration tester who speaks in absolutes has probably spent very little time on the defensive or operational side of security.
You’ll notice I skipped a couple of options – like the security architect. If you’re a security architect and you didn’t come from a programming or operational background, you likely suck at your job. I also didn’t break out security management – mostly since I hate managers who never worked for a living. To be a manager, start at the bottom and work your way up. In any case, if you’re ready for either of those roles you’re past these beginner’s steps, and if you want to get there, this is how to begin.
To wrap this up, when hiring look for someone with experience outside security and mentor them through if they have the right mindset. Yes, this means it’s hard to start directly in security, but I’m okay with that. It only takes a couple years in a foundational role to gain the experience, and if you have a security mindset you’ll be contributing to security no matter your operational role. So if you want to work in security, develop the mindset and jump on every security opportunity that pops up. As either a manager or recruit, also understand the different focus of each career track.
Finally, in terms of certifications, focus on the ‘low-level’ technical ones, often from outside security. A CISSP doesn’t teach you a security mindset, and as Shrdlu said it’s insane that something that is supposed to take 5 years of operational experience is a baseline for hiring – and we all know it’s easy to skirt the 5-year rule anyway.
I’m sure some of you have more to add to this one…
Posted at Tuesday 6th April 2010 3:07 pm
(1) Comments •
By Mike Rothman
Speaking as a “master of the obvious,” it’s worth mentioning the importance of having a correct mindset heading into the new year. Odds are you’ve just gotten back from the holiday and that sinking “beaten down” feeling is setting in. Wow, that didn’t take long.
So I figured I’d do a quick reminder of the universal truisms that we know and love, but which still make us crazy. Let’s just cover a few:
There is no 100% security
I know, I know – you already know that. But the point here is that your management forgets. So it’s always a good thing to remind them as early and often as you can. Even worse, there are folks (we’ll get to them later) who tell your senior people (usually over a round of golf or a bourbon in some mahogany-laden club) that it is possible to secure your stuff.
You must fight propaganda with fact. You must point out data breaches, not to be Chicken Little, but to manage expectations. It can (and does) happen to everyone. Make sure the senior folks know that.
Compliance is a means to an end
There is a lot of angst right now (especially from one of my favorite people, Josh Corman) about the reality that compliance drives most of what we do. Deal with it, Josh. Deal with it, everyone. It is what it is. You aren’t going to change it, so you’d better figure out how to prosper in this kind of reality.
What to do? Use compliance to your advantage. Any new (or updated) regulation comes with some level of budget flexibility. Use that money to buy stuff you really need. So what if you need to spend some time writing reports with your new widget to keep the auditor happy. Without compliance, you wouldn’t have your new toy.
Don’t forget the fundamentals
Listen, most of us have serious security kung fu. Your organization probably tasks folks like you to fix hard problems and deflect attackers from a lot of soft tissue, and leaves the perimeter and endpoints to the snot-nosed kid with his shiny new Norwich paper. That’s OK, but only if you periodically make sure things function correctly.
Maybe that means running Core against your stuff every month. Maybe it means revisiting that change control process to make sure that open port (which that developer just had to have) doesn’t allow the masses into your shorts.
If you are nailed by an innovative attack, shame on them. Hopefully your incident response plan holds up. If you are nailed by some stupid configuration or fundamental mistake, shame on you.
Widgets will not make you secure
Keep in mind the driving force for any vendor is to sell you something. The best security practitioners I know drive their projects – they don’t let vendors drive them. They have a plan and they get products and/or services to execute on that plan.
That doesn’t mean reps won’t try to convince you their widget needs to be part of your plan. Believe me, I’ve spent many a day in sales training helping reps to learn how to drive the sales process. I’ve developed hundreds of presentations designed to create a catalyst for a buyer to write a check. The best reps try to help you, as long as that involves making the payment on their 735i.
And even worse, as a reformed marketing guy, I’m here to say a lot of vendors will resort to bravado in order to convince you of something you know not to be true. Like that a product will make you secure. Sometimes you see something so objectionable to the security person in you, it makes you sick.
Let’s take the end of this post from LogLogic as an example. For some context, their post mostly evaluates the recent Verizon DBIR supplement.
What does LogLogic predict for 2010? Regardless of whether, all, some, or none, of Verizon’s predictions come true, networks will still be left vulnerable, applications will be un-patched, user error will causes breaches in protocol, and criminals will successfully knock down walls.
But not on a LogLogic protected infrastructure.
We can prevent, capture and prove compliance for whatever 2010 throws at your systems.
LogLogic customers are predicting a stress free, safe 2010.
Wow. Best case, this is irresponsible marketing. Worst case, this is clearly someone who doesn’t understand how this business works. I won’t judge (too much) because I don’t know the author, but still. This is the kind of stuff that makes me question who is running the store over there.
Repeat after me: A widget will not make me secure. Neither will two widgets or a partridge in a pear tree.
So welcome to 2010. Seems a lot like 2009 and pretty much every other year of the last decade. Get your head screwed on correctly. The bad guys attack. The auditors audit. And your management squeezes your budget.
Posted at Thursday 7th January 2010 5:44 pm
This is part 2 of our series on skepticism in security. You can read part 1 here.
Being a bit of a science geek, over the past year or so I’ve become addicted to The Skeptics’ Guide to the Universe podcast, which is now the only one I never miss. It’s the Skeptics’ Guide that first really exposed me to the scientific skeptical movement, which is well aligned with what we do in security.
We turn back to Wikipedia for a definition of scientific skepticism:
Scientific skepticism or rational skepticism (also spelled scepticism), sometimes referred to as skeptical inquiry, is a scientific or practical, epistemological position in which one questions the veracity of claims lacking empirical evidence.
Scientific skepticism utilizes critical thinking and inductive reasoning while attempting to oppose claims made which lack suitable evidential basis.
Characteristics: Like a scientist, a scientific skeptic attempts to evaluate claims based on verifiability and falsifiability rather than accepting claims on faith, anecdotes, or relying on unfalsifiable categories. Skeptics often focus their criticism on claims they consider to be implausible, dubious or clearly contradictory to generally accepted science. This distinguishes the scientific skeptic from the professional scientist, who often concentrates their inquiry on verifying or falsifying hypotheses created by those within their particular field of science.
The skeptical movement has expanded well beyond merely debunking fraudsters (such as that Airborne garbage or cell phone radiation absorbers) into the general promotion of science education, science advocacy, and the use of the scientific method in the exploration of knowledge. Skeptics battle the misuse of scientific theories and statistics, and it’s this aspect I consider essential to the practice of security.
In the security industry we never lack for theories or statistics, but very few of them are based on sound scientific principles, and often they cannot withstand scientific scrutiny. For example, the historic claim that 70% of security attacks were from the “insider threat” never had any rigorous backing. That claim was a munged up “fact” based on the free headline from a severely flawed survey (the CSI/FBI report), and an informal statement from one of my former coworkers made years earlier. It seems every day I see some new numbers about how many systems are infected with malware, how many dollars are lost due to the latest cybercrime (or people browsing ESPN during lunch), and so on.
I believe that the appropriate application of skepticism is essential in the practice of security, but we are also in the position of often having to make critical decisions without the amount of data we’d like. Rather than saying we should only make decisions based on sound science, I’m calling for more application of scientific principles in security, and increased recognition of doubt when evaluating information. Let’s recognize the difference between guesses, educated guesses, facts, and outright garbage.
For example – the disclosure debate. I’m not claiming I have the answers, and I’m not saying we should put everything on hold until we get the answers, but all sides do need to recognize we have no effective evidentiary basis for defining general disclosure policies. We have personal experience and anecdote, but no sound way to measure the potential impact of full disclosure vs. responsible disclosure vs. no disclosure.
Another example is the Annualized Loss Expectancy (ALE) model. The ALE model takes losses from a single event and multiplies that times the annual rate of occurrence, to give ‘the probable annual loss’. Works great for defined assets with predictable loss rates, such as lost laptops and physical theft (e.g., retail shrinkage). Nearly worthless in information security. Why? Because we rarely know the value of an asset, or the annual rate of occurrence. Thus we multiply a guess by a guess to produce a wild-assed guess. In scientific terms neither input value has precision or accuracy, and thus any result is essentially meaningless.
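The arithmetic behind ALE is trivial, which is part of the problem. Here is a minimal sketch of the point above – all dollar figures and rates are invented for illustration, not real data:

```python
# Annualized Loss Expectancy: ALE = SLE x ARO
# SLE = single loss expectancy (cost of one event)
# ARO = annual rate of occurrence (events per year)

def ale(single_loss_expectancy: float, annual_rate_of_occurrence: float) -> float:
    """Expected annual loss for one class of event."""
    return single_loss_expectancy * annual_rate_of_occurrence

# Works reasonably for defined assets with predictable loss rates,
# e.g. lost laptops (figures assumed for illustration):
laptop_cost = 2_500          # replacement + rebuild cost per laptop
laptops_lost_per_year = 12   # known from asset tracking
print(ale(laptop_cost, laptops_lost_per_year))  # 30000

# For most infosec events, both inputs are guesses,
# so the output is a guess times a guess:
breach_cost_guess = 500_000  # nobody really knows the asset value
breach_rate_guess = 0.3      # nor the rate of occurrence
print(ale(breach_cost_guess, breach_rate_guess))
```

The laptop case works because both inputs are measurable; the breach case produces a precise-looking number from two unmeasured inputs, which is exactly the false precision the ALE critique is about.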
Skepticism is an important element of how we think about security because it helps us make decisions on what we know, while providing the intellectual freedom to change those decisions as what we know evolves. We don’t get as hung up on sticking with past decisions merely to continue to validate our belief system.
In short, let’s apply more science and formal skepticism to security. Let’s recognize that just because we have to make decisions from uncertain evidence, we aren’t magically turning guesses and beliefs into theories or facts. And when we’re presented with theories, facts, and numbers, let’s apply scientific principles and see which ones hold up.
Posted at Tuesday 23rd June 2009 7:02 pm
Note: This is the first part of a two part series on skepticism in security; click here for part 2.
Securosis: A mental disorder characterized by paranoia, cynicism, and the strange compulsion to defend random objects.
For years I’ve been joking about how important cynicism is to be an effective security professional (and analyst). I’ve always considered it a core principle of the security mindset, but recently I’ve been thinking a lot more about skepticism than cynicism.
My dictionary defines a cynic as:
- a person who believes that people are motivated purely by self-interest rather than acting for honorable or unselfish reasons: some cynics thought that the controversy was all a publicity stunt.
- a person who questions whether something will happen or whether it is worthwhile: the cynics were silenced when the factory opened.
- (Cynic) a member of a school of ancient Greek philosophers founded by Antisthenes, marked by an ostentatious contempt for ease and pleasure. The movement flourished in the 3rd century BC and revived in the 1st century AD.
Cynicism is all about distrust and disillusionment; and let’s face it, those are pretty important in the security industry. As cynics we always focus on an individual’s (or organization’s) motivation. We can’t afford a trusting nature, since that’s the fastest route to failure in our business. Back in my physical security days I learned the hard way that while I’d love to trust more people, the odds are they would abuse that trust for self-interest, at my expense. Cynicism is the ‘default deny’ of social interaction.
Skepticism, although closely related to cynicism, is less focused on individuals, and more focused on knowledge. My dictionary defines a skeptic as:
- a person inclined to question or doubt all accepted opinions.
- a person who doubts the truth of Christianity and other religions; an atheist or agnostic.
- (Philosophy) an ancient or modern philosopher who denies the possibility of knowledge, or even rational belief, in some sphere.
But to really define skepticism in modern society, we need to move past the dictionary into current usage. Wikipedia does a nice job with its expanded definition:
- an attitude of doubt or a disposition to incredulity either in general or toward a particular object;
- the doctrine that true knowledge or knowledge in a particular area is uncertain; or
- the method of suspended judgment, systematic doubt, or criticism that is characteristic of skeptics (Merriam-Webster).
Which brings us to the philosophical application of skepticism:
In philosophy, skepticism refers more specifically to any one of several propositions. These include propositions about:
- an inquiry,
- a method of obtaining knowledge through systematic doubt and continual testing,
- the arbitrariness, relativity, or subjectivity of moral values,
- the limitations of knowledge,
- a method of intellectual caution and suspended judgment.
In other words, cynicism is about how we approach people, while skepticism is about how we approach knowledge. For a security professional, both are important, but I’m realizing it’s becoming ever more essential to challenge our internal beliefs and dogmas, rather than focusing on distrust of individuals. I consider skepticism harder than cynicism, because it forces us to challenge our own internal beliefs on a regular basis.
In part 2 of this series I’ll talk about the role of skepticism in security.
Posted at Tuesday 23rd June 2009 7:00 pm
Had another one of those real world experiences today that was just begging for a blog post. A couple hours ago I was driving down the highway on my way to my physical therapy appointment when I saw a rollover car accident on the side of the road near an on-ramp. There were a bunch of bystanders, but the first police officer was just pulling up and there was no fire or ambulance in sight.
There is some good news and some bad news if you get in an accident by that particular on-ramp. The good news is it’s right down the road from the Mayo clinic. The bad news is a lot of doctors drive on and off that ramp at any given moment.
Very few of them work in the emergency department.
I first became an EMT in 1990, went on to become a full-time paramedic, and have dabbled in everything from ski patrol to mountain rescue to HAZMAT over the years. I’m still an EMT, although not doing much with it since moving to Phoenix. If I’d knocked someone up when I got certified they’d be getting ready for college right about now. That is, to be honest, a little scary.
Since no responders were on scene yet I identified myself and asked if they needed help. The other bystanders, including the first doctor, stepped back (she was calling the patient’s parents). The patient was looking okay, but not great, crying and complaining about neck and head pain. She did not remember the accident.
The next bit went like this:
DAD (Dumb Ass Doctor): Here, let’s put this under her head [holding rolled-up jacket]
Me: Sir, we don’t want to do that.
DAD: I’m a doctor. It’s fine, she was walking around [Note, most patients who scramble out of their overturned car through the missing windshield wander around a little bit until someone sits them down.]
Me: What kind of doctor?
DAD: Anesthesiologist. Trauma anesthesiologist. It’s fine. [Note, that means he puts trauma patients to sleep in an operating room so a surgeon can fix them.]
Me: Sir, we have a patient complaining of head and neck pain with a loss of consciousness; you do NOT want to manipulate her head.
DAD: I’m a doctor [inserts pillow, as patient cries out from the pain]. Gee [other doc’s name] don’t you remember that emergency training and the chain of command?
Me: You’re the doctor, can you gossip with your friends and stand over there now while I make sure she can still move?
For the record, I’ve never met an ER doctor in the world that will clear a patient’s c-spine in the field with that mechanism (a rollover) and pain on touch and movement. I would never pretend to be able to anesthetize a patient, but this bozo, like many doctors, thinks he’s fully capable of directing field treatment completely outside his experience.
Here’s the thing: as professionals we train hard at becoming experts in a particular domain. This doesn’t make us experts in adjacent domains. For example, I may be a security expert, but despite some broad knowledge I’ve specialized in certain areas, like information-centric security. If, for example, you needed me to read your IDS logs or deploy your UTM I’d send you to someone with practical network security knowledge.
When doing risk assessments or practical, on-the-ground security, make sure you engage the right domain experts before you break something. You may have kung-fu, but that doesn’t mean you aren’t a total freaking idiot.
Posted at Monday 28th April 2008 6:14 pm