They say it takes 10,000 hours of practice at a task to become an expert. This isn’t idle supposition, but something that’s been studied scientifically – if you believe in that sort of thing. (I’d like to provide a reference, but I’m in the process of becoming an expert at sitting in an Economy Class seat without wireless).

10,000 hours translates, roughly, to practicing something for 40 hours a week for around 5 years. Having racked up that many hours in a couple of different fields, my personal experience tells me (if you believe that sort of thing) that the 10K threshold only opens the first gate on a long path to mastery.

I can’t remember exactly what year I became an analyst, but I think it was right around a decade ago. This would put me well past that first gate, but still with a lot of room to learn and grow in front of me. That’s assuming you consider analysis a skill – I see it as more a mashup of certain fundamental skills, combined with deep knowledge of and experience with the topic you focus on.

Some analysts think the fundamental tools of analysis apply anywhere, and it’s only a matter of picking up a few basics on any particular topic. You can recognize these folks, as they bounce from coverage area to coverage area, without a real passion for or dedication to any primary focus. While I do think a good analyst can apply the mechanics to multiple domains, being handy with a wrench doesn’t make a plumber a skilled car mechanic. You have to dig deep and rip apart the fundamentals to truly contribute to a field.

In a bit of cognitive bias, I’m fascinated by the mechanics of analysis. Like medicine or carpentry, it’s not a field you can learn from a book or class – you really need to apprentice somewhere.

One of the critical skills is the ability to change your position when presented with contradictory yet accurate evidence. Dogma is the antithesis of good analysis. Unfortunately I’d say over 90% of analysts take religious positions, and spend more time trying to make the world fit their intellectual models than fixing their models to fit the world. When you are in a profession where you’re graded on “thought leadership”, it’s all too easy to interpret that as “say something controversial to get attention and plant a flag”.

Admitting you were wrong – not merely misinterpreted – is hard. I sure as hell don’t like it, and my natural reaction is usually to double down on my position like anyone else. I don’t always pull my head out of my ass, but I really do try to admit when I get something wrong. Weirdly, a certain fraction of the population interprets that as a fault. Either I’m an idiot for saying something wrong in the first place, or unreliable for changing my mind – even in the face of conflicting evidence.

The easiest way to tell whether an analyst sucks is to see how they react when the facts show them to be wrong – or whether they use facts to back up their positions at all. I don’t claim to always get it right – I’m as human as everyone else, and often feel an emotional urge to defend my turf. This is a skill that takes constant practice – it’s handy for everyone, but critical for anyone who sells knowledge for a living.

And I believe it takes a heck of a lot more than 10,000 hours to master. I’m at double that and not even close.

On to the Summary:

Webcasts, Podcasts, Outside Writing, and Conferences

Favorite Securosis Posts

Other Securosis Posts

Favorite Outside Posts

Project Quant Posts

Research Reports and Presentations

Top News and Posts

Sorry, I didn’t have WiFi on my flights today and got home late, so I couldn’t compile a good list of stories. It doesn’t help that I’ve been slammed all week and haven’t read as much as usual. I suspect someone disclosed something, someone got hacked, and someone else tried to cover something up. Does that cover it? Oh – and there was a privacy violation by Google/Facebook/some social media service.