Being a "Great Coder" and 10000 Hours
There really is a meme here, I think, and it relates to ego.
Having said that, I definitely do think there is truth in massive variations between programmers, though I personally think that comes from a combination of a small prerequisite talent (the ability to code at all, which surprisingly few people have; simply a genetically determined brain configuration) and mostly attitude plus hard work.
For example, I took part in the recent Stanford AI class. Though it wasn't about programming specifically, I was utterly disheartened when I heard they sent out those 'send me your CVs, you clever people!' emails to the top 1,000 students. It made me feel like a lot of people were taking part in order to join a 'look how much cleverer I am than you' pissing match.
That kind of thing ruins the collaborative 'learning for the joy of learning' side and tends to turn the whole thing into something nastily elitist. I really wish they hadn't done that. (N.B. I did fine on the course - 98.7% - so this isn't sour grapes.)
The biggest problem with anything like this is the idea that 'here is some test of inherent intelligence - I am far better than you, so you are inherently unable to do this thing.' That is just the biggest barrier to actually trying to do something: if you think you inherently suck, or at least are simply mediocre, your motivation to do that thing is severely reduced.
or perhaps I'm just ranting/projecting here :)
There is such a difference in coding output. Having had perhaps 50 programmers work for me over the years (and being one myself), the top guys do perhaps 10x the output of the merely "very good" guys - and near-infinitely more than the mediocre ones, who on tough projects actually suck up more time than they contribute.
The good guys also come in and contribute right off the bat. Like Christophe Balestra, who is now co-president of Naughty Dog. When he arrived on Jak 2 he was pounding out real working stuff the first or second day. By the end of the game (one year later) it was clear he was so kick-ass that we promoted him over like 15 other guys to be co-lead with me on Jak 3. And he continues to kick ass to this day. I just cite him as one example, but I had the pleasure of working with around half a dozen other totally awesome guys too. Still, the "good" guys will take a system and do a great job with it over weeks. The great guys will knock it out in like 24-48 hours.
Lots of articles on this kind of stuff at my site too:
I've found that a difference between the "top" guys and the "very good" guys is that the very good guys are smart enough to do great work, but just aren't as into it. I know many guys who were mathematicians, physicists, or near-professional violinists and fell into programming because they couldn't make money from their other interest. These guys are often really good, but you can tell they wish they were doing something else. That said, the worst dudes are the ones who are really into it, but are really bad.
For the poor performers there may be nothing you can do. But for the merely "very good," are there practices that Balestra could teach them?
For example, one thing I've seen that can increase productivity by a factor of ten is good debugging skills - which are generally teachable. Another is to get people working on things they're excited about. Mentally checking out is another area where I've seen strong people lose time.
Good debugging is key, and as anyone who has ever worked with me will note, I'm a fantastic debugger (in no small part because I'm cold, rational, and rarely get upset). I keep meaning to write up a post for my blog with "Andy's rulez of debugging." They are really very simple, but very effective.
Like: "don't assume" and "divide and conquer" (they do require a bit of explanation)
Please do write that post, I for one would love to read it.
I wonder if you have much to add to http://www.amazon.com/Debugging-Indispensable-Software-Hardw... (which I really like).
Rule 5: "Quit thinking and look" is (I think) equivalent to your "don't assume".
Rule 6: "Divide and Conquer".
I'll have to check that out. Although my advice will be free :-) But I'm sure that many, many other good debuggers have developed the same basic techniques independently. Still, the vast majority of programmers could use some improvement in this area. "Quit thinking and look" is exactly what I mean by "don't assume." People tend to get wrapped up in their own view of things and forget that empiricism really wins the day. There is often even fundamental denial, as in "what bug? I haven't seen it." Clearly if someone saw it, unless they were hitting the crack pipe, it's real.
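"Divide and conquer" in debugging usually amounts to binary-searching the space where the bug can hide: halve it, test, repeat. A minimal sketch of the idea, in the spirit of what `git bisect` automates (the function and version list here are invented for illustration, not from the thread):

```python
def first_bad(versions, is_bad):
    """Binary-search for the first version where is_bad(v) is True.

    Assumes all good versions precede all bad ones - the same
    assumption git bisect makes about a commit history.
    """
    lo, hi = 0, len(versions) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if is_bad(versions[mid]):
            hi = mid        # bug is at mid or earlier
        else:
            lo = mid + 1    # bug was introduced after mid
    return versions[lo]

# With 10 versions, this needs only ~4 probes instead of 10.
print(first_bad(list(range(10)), lambda v: v >= 7))
```

The same halving works on code paths, not just versions: comment out half the suspect code, re-test, and recurse into whichever half still shows the bug.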
I often find that looking hard at the code and adding the appropriate tests and asserts can be better than immediately pulling out the debugger. Asserts are continually testing your assumptions, whereas in a debugging session you only get to see them once.
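A tiny sketch of that tradeoff (the function and example are my own, not from the thread): the asserts below re-check the stated assumptions on every single call, while a debugger would only show you the state of one particular run.

```python
def merge_sorted(a, b):
    """Merge two already-sorted lists into one sorted list."""
    # Preconditions made explicit: both inputs must be sorted.
    assert all(a[i] <= a[i + 1] for i in range(len(a) - 1)), "a is not sorted"
    assert all(b[i] <= b[i + 1] for i in range(len(b) - 1)), "b is not sorted"

    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:])
    out.extend(b[j:])

    # Postcondition: nothing was dropped or duplicated.
    assert len(out) == len(a) + len(b)
    return out

print(merge_sorted([1, 3], [2, 4]))  # → [1, 2, 3, 4]
```

If a caller ever passes an unsorted list, the assert names the violated assumption immediately, at the point of the mistake, rather than leaving you to discover garbage output three layers downstream.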
Debugging is almost always about uncovering wrong assumptions, but I don't see how you can do without assumptions. Every line of code assumes certain state of the program that it operates on.
You can't have no assumptions. But half the time I help someone debug something they begin with, "it can't be in this part of the code" which is often unfounded. Now if you PROVE that it isn't, that's a different matter.
The best coders I know (including agavin, who comments below) are all insanely hard workers. There are a few other correlates, but these seem secondary to me:
- willing and able to rapidly learn new tools (especially languages, debuggers, build/test infrastructures, and profilers)
- understand software at many levels (so-called "full stack" programmers)
- more interested in producing a working system than in technical details
- self-confident enough to seek experts and extract information from them on areas of ignorance
- have a strong aesthetic sense of code
Definitely "being really smart" or "having a Ph.D" hasn't been a correlate in my experience; if anything, I've seen these to be negatively correlated with code production and quality.
And distribution of coding ability definitely seems to follow a normal curve; a handful of coders I know are 6 sigma out. There are many more two-sigma outliers, and tons in the middle, as you'd expect.
Definitely "being really smart" or "having a Ph.D" hasn't been a correlate in my experience; if anything, I've seen these to be negatively correlated with code production and quality.
Unfortunately this is largely the case. It's in many ways similar to why many on HN don't want to do Java enterprise LOB apps: it seems like painful drudge work. For a lot of really smart people who did their PhD, the work it takes to build a production web app is painful drudge work. They'll happily build the prototype that proves the concept, and then they're done. Everything else is painful drudge work - a solved problem ("I can reduce what's left to what Facebook did. QED.")
Um.. most PhDs in the experimental sciences involve lots of drudge work and 14-hour days, and the people who make it are generally very hard workers. This may not be true in theory or engineering.
I have a PhD in CS. There's a lot of "drudge work" in the design and execution of experiments. I just spent four days designing, executing and analyzing experiments to see if our model was accurate and my implementation was correct. (Both are, which is nice.) But those experiments will never be published - experiments like them will, but not those. Those experiments are what I call "guiding experiments" - they're not rigorous enough to convince others that what we did works, but it's enough to convince me we're doing the right thing, and give me the confidence that things will work out as expected when we do the full, rigorous experiments.
So, yes, research has its own grunt work. And some of my code will make it into production.
> The best coders I know (including agavin, who comments below) are all insanely hard workers
I'd +1 this if I could. This is the instinct I've had for a while, and probably also the key as to why it is so rare.
I suspect a key part of it is learning how to work effectively, and avoiding burnout in the process, neither of which are trivial to achieve.
Thanks for the comment, extremely interesting!
To me, insanely hard work always connotes working 80 hours/wk and not having (or spending time with) a SO, kids, or other hobbies besides the job.
I don't know if that's what you mean, but that kind of life is not one I would go for, even assuming I am capable of being more than one or two sigmas good.
I don't think that's true. I consider myself an ok coder, but I'm lazy. I work in bursts, but even a good day is likely to be less than six hours of actual work. Bad days are near 0. If I had the work ethic of the parent commenter, I could get a hell of a lot done in a reasonable amount of time.
Some of that laziness is because I'm not crazy about what I'm working on. Being truly honest with myself, though, it may be an internal defect that will prevent me from becoming a great coder until I get over it.
Anyway, that lack-of-laziness is what I think the parent was attempting to describe, not 80-hour weeks.
It partly depends on "stage of life" as well. Most of the outlier coders I know are 40+ now and don't work 80 hours weeks; nor do they ignore their families, etc. However, all did at one time in their lives, and all still have a strong work ethic. Work ethic often goes along with long hours, but beyond a certain level of experience, I see excellent producers working smarter and not just harder, to use the cliche. It could be that it takes the proverbial 10k hours to get to that level of experience; I don't know.
I definitely don't mean to imply working 80+ hour weeks. Perhaps the parent does, but I personally don't think it's healthy, nor do I think it's conducive to avoiding burnout, let alone happiness. You can work hard in a 40-hour week.
"- understand software at many levels (so-called "full stack" programmers)"
People should start using "multi-stack" to describe those jobs where PHBs decide they know best which technologies to use and, not surprisingly, they're all incompatible and different from last month's technologies.
simply a genetically determined brain configuration
I sincerely doubt this. I think some people like it more in the beginning, so they get the positive reinforcement very early. But I think what tends to happen is that people convince themselves ahead of time, "This is hard, I'll never be able to do it," which becomes self-reinforcing.
You know, that was my first reaction to the email too (actually my first reaction was petty jealousy that I didn't get that email).
When I thought about it some more, though: that might have been a really nice break for someone who deserved it, and the type of person who took that class and was in the top 1,000 is probably the type of person who would make the most of a nice break.
Takeaway: the more I thought about it, the less I had a problem with it.
Oh, I'd be being dishonest if I said there wasn't a petty ego blow on my part too, esp. since there were a few answers I just didn't check properly. Part of it is that I was simply frustrated that it had become that sort of thing; I had a perhaps romantic view of this kind of course. Certainly don't mean to take away from those who scored so well :)
Human nature. We are all victims of "ego" once in a while in our lives.
Ego is what brings us down from the top of cockiness mountain as well. The fall makes us humble (and hopefully we learn something from it).
At the end of the day, I believe humans should learn to live with each other rather than shove one another.
I think there's a meme in the term “greatness”. Perhaps a better formulation of the rule is the more traditional one: that it takes 10,000 hours to reach your peak at whatever it is you're practicing.
It's not automatic (you still have to do the right things), and you can be great way before that (especially relatively speaking).
A little off topic but somewhat relevant: When someone asked Larry Wall why there aren't any Perl certifications to classify the experts, he replied "I'm not going to tell people whether they're certified or not. My approach to language design has always been that people should learn just enough of the languages to get their jobs done. They shouldn't have to learn the whole language to begin with. But with certification, you have to be learning the whole language."
In my own experience, this is especially true if you're self-employed / startup guy. Am I a great PHP coder? Not by a long shot. Does it in any way affect our user-experience or how much money we make? Again, not by a long shot.
Isn't that the blub paradox? What if there was an unknown-to-you language feature that you were implementing by hand, or worse, if you were avoiding an end-user feature entirely because it would be too much work and so you're not even considering it? How would you know if it could affect user experience or revenue if you don't know about it?
I've certainly used a language for a while, and then read the book on it, and said "oooh!", and then gone back and fixed all my old code. I could have saved a lot of time if I had known about that feature in the first place. (The extreme case are the posts on thedailywtf.com of people who don't know about loops, but there are much higher-level examples, too.)
That's not to say you can't run a perfectly usable and profitable startup knowing just a little bit of PHP (and as Larry says, more power to you!), but I don't know how one could claim that it has no effect.
In the real world this isn't the case. First of all, when you're working on projects, you see that there are some things you do over and over (like CRUD). In my opinion, at least, it is okay to build your first site using mysql_real_escape_string-like functions as long as, at the end of the day, it gets the job done. My priority as a small website owner is always getting new signups, and as far as I'm concerned, my lead does not care how I'm inserting his email into the database. Had I spent my time learning proper MVC design instead, I might have lost three precious months of leads, which does indeed make a difference to my bank account.
Second of all, nowadays you're never working alone (even when, in fact, you are working alone). What I mean is that every programmer nowadays uses Google and StackOverflow to do a lot of his work. So when you're doing something WTF-ish like using mysql_real_escape_string, once in a while you check how other programmers are doing it, because maybe it's taking you too long or you're just stuck. And then you learn a thing or two about how to do it more easily. Still, it doesn't mean you have to become a great programmer in the language first; you just need to hone the things you need most for your business.
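For what it's worth, the thing that StackOverflow search usually teaches you is to stop escaping by hand and let the driver do it via parameterized queries. A hedged sketch in Python, with sqlite3 standing in for MySQL and the table and input invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE leads (email TEXT)")

# Hostile input that hand-escaping might mishandle.
email = "bob@example.com'; DROP TABLE leads; --"

# The ? placeholder hands quoting to the driver, so the hostile
# string is stored as plain data rather than executed as SQL.
conn.execute("INSERT INTO leads (email) VALUES (?)", (email,))

rows = conn.execute("SELECT email FROM leads").fetchall()
print(rows[0][0])  # the string comes back intact; the table is unharmed
```

The nice part for a time-pressed founder is that this is actually less code than careful manual escaping, so "get the job done fast" and "do it safely" stop being in tension.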
It's important to also continue striving for self-improvement. If you hit a level where you've stopped wanting to learn more and started wanting to simply compare yourself to others, of course you'll look good by comparison - but you're not getting any better.
I started programming using Python. When I started, I wanted to understand the "feel" of Python programming, and that was all my mind could comprehend. Now that I have a decent smattering of Python knowledge, I realize I don't have a real grasp of transfer protocols, and I should probably learn a lower-level language like C as well. I started off not knowing one thing; now I have added two more.
I'm still a better programmer, but now I realize I'm clueless about even more stuff than I was before.
This is a very good point. I thought I knew a lot about programming/CS until I decided to get my M.S. Now, the more I learn, the more I realize I don't know.
It is actually a little defeating in the sense that I have had to accept that I will probably never know as much as I would like to.
Identifying gaps in your knowledge is half the battle.
Learning and getting better isn't limited to eliminating known unknowns, it includes becoming aware of unknown unknowns.
Once you can google a problem, there is a very high chance you'll be able to solve it.
Skills get rusty. What you learn today you will forget tomorrow unless you use it in some way.
This is so true, and manifests so sharply/painfully when I go to refresh my very nascent skills in Clojure. I've learned the same material so many times. It's a bummer.
When you don't know much about a given subject, you usually don't know how much you really don't know, and most people at that stage actually have the impression that they know a lot. It's a kind of catch-22 :-)
People would rate you "great" but you'll be able to say "How would you know?". At which point, the better you get, the worse you'll know you are. Anybody who rates themselves as "great" is probably on the uphill side of the learning curve.
This attitude annoys me greatly. I understand Dunning-Kruger and all, but I still don't buy the idea that skill is inversely proportional to confidence. I think it's simplistic to assume that confidence directly implies lack of skill and vice versa.
I mean clearly no one likes an ego-maniac who acts like a know it all. But at the same time, sometimes people are right when they say good things about themselves. Even if they act completely narcissistic.
In fact, I would say that maturing has done nothing but increase my confidence in my skill level. I think that I can say that I'm a good programmer and that I hope to be a great programmer some day without being cocky or egotistical. It doesn't mean that I'm always right or that I don't make mistakes. But at the end of the day, I have confidence in my abilities.
I think that "confidence" bifurcates once you gain experience. There's confidence in your ability, and there's confidence in your accomplishments. The former tends to decline, because you realize there's so much more out there that you could be doing to but just don't have the time or mental capacity for. The latter tends to increase as you rack up projects and see people use them.
I thought I was hot shit when I entered college, and I had the test scores - but no tangible accomplishments - to prove it. And then I tried to convert those test scores into tangible accomplishments, and found that maybe I wasn't as hot shit as I thought I was. I think I'm significantly dumber now than I was as a 19-year-old hotshot fresh out of high school. I can see all the alternative ways of doing things, and all the mistakes I made, and all the mistakes I'm still making. And looking ahead of me, I see all this complexity and all these challenges for the things I want to do, and I didn't see that when I was a wide-eyed kid, and it makes me feel pretty stupid.
But looking behind me, I've done some pretty cool stuff. FictionAlley.org. Write Yourself a Scheme in 48 Hours. Ported Arc to JavaScript. 2 products for somebody else's startup that never went anywhere, and a startup of my own that also never went anywhere. Wonder Wheel & Search Options. The websearch visual redesign of 2010. Google's Authorship program. Google's first canvas-based homepage doodle. The [let it snow] easter egg.
And I think about how I just made tens, perhaps hundreds of millions of people happy last weekend, and it feels pretty good. So even though I don't know anything, I must be doing something right.
I think it's one thing to be confident about being great relative to most people, and another to learn just how great the distance still is between you and "the best." I've definitely had this experience while learning things - as I move into say, the top 5-10%, I feel pretty good. And I'm confident putting myself there. But then I see just how far I have to go to get from 5% to 1%, or 1% to 0.1%, and it's the kind of thing that wouldn't have been obvious at all until I got to 5% because I'd have no idea what that difference even meant, and how hard the gap is to bridge.
The problem:
As mentioned in the email, people tend to believe that being great is about knowing a lot of facts and stuff from memory - having information others don't. In fact, intelligence and knowledge only act as catalysts on the path to success. They are not success, and don't lead to success in themselves. Until you understand this, you will keep wondering why you are not succeeding while some guy you consider mediocre is winning.
What matters in the real world is productivity: the ability to discover things quickly, learn quickly, understand and properly state problems, search for solutions quickly, and then use the best tools at your disposal to build things in as little time as possible, with acceptable quality for the problem domain. Intelligence and knowledge of facts at most hasten your rate of success, nothing more, nothing less. Practice does help - but practice in the right areas.
If you believe reading algorithms and data structures textbooks and searching for puzzles online will make you a good programmer, I'm not so sure. It may prepare you for interviews; it may even get you a job at a big web giant. It may make you look super intelligent in front of a panel or your team. But in terms of producing software to solve business problems, those facts from memory - and even their practice - at most serve as a catalyst, not a crucial ingredient of success.
Apart from practicing writing programs, learning APIs, best practices, tools, techniques, and your other usual day-to-day programming tasks, you also need to practice being a better team player; you need to learn design, customer interaction skills, and how to gather requirements.
You need to learn how to manage resources - time, money, and people - effectively. You need to learn effective ways of running software teams. The list goes on endlessly.
It's no longer "can write awesome code" == "success". There are a gazillion parameters that will decide your success, and programming is just one of them.
Here is the shocker: you don't really have 1,000 years of life to give 10 years to each. And even if you did, you would get bored giving 10 years to each, and you would keep forgetting what you did decades back.
So, just be more productive and iterate your work endlessly. Find flaws and fix them. Do it in iterations. You will be taken care of.
"So, just be more productive and iterate your work endlessly. Find flaws and fix them. Do it in iterations. You will be taken care of."
I think, somewhere in there, you have to learn how to make sure you're taken care of, even if it's just learning how/when to ask for things you deserve for your efforts. Every workplace isn't a strict meritocracy.
"If you believe reading algorithms and data structures textbooks and searching for puzzles online will make you a good programmer, I'm not so sure. It may prepare you for interviews; it may even get you a job at a big web giant. It may make you look super intelligent in front of a panel or your team. But in terms of producing software to solve business problems, those facts from memory - and even their practice - at most serve as a catalyst, not a crucial ingredient of success."
I call this the difference between being functionally and theoretically great. After finishing my undergraduate degree, it dawned on me that my education had given me theoretical ability. Outside the classroom, application is what yields functional ability.
"So, just be more productive and iterate your work endlessly. Find flaws and fix them. Do it in iterations. You will be taken care of."
Absolutely agree here too, refinement I think is a big factor to success.
Great post. It reminds me of a recent interview I had. After we discussed my resume and a myriad of topics - tech, software, time management, people - the interviewer gave me a programming problem to solve in order to get a second interview. I asked about gotcha questions, and whether I would need to code with someone staring over my shoulder; he said that none of his employees would ever code that way while working there, so what would that test?
It makes complete sense. So what if someone can answer how many ping pong balls are in a school bus, if they can't produce or get along with the team they will not be a good hire (unless of course the job is figuring out how many ping pong balls are in a school bus).
This sounds a lot like Joel's Duct Tape Programmer: http://www.joelonsoftware.com/items/2009/09/23.html
Revisiting your old code is like reuniting with an old friend, and also your friend is now disfigured. Despite this evidence that I've improved so much in 10+ years of writing code, my self-assessment has stayed constant: very good, close to great. (This must be how Zeno's arrow feels.) Tim Daly's post provides some welcome clarity. I'm happy to still have a ways to go. "The lyf so short, the craft so long to lerne."
Well look on the bright side, in Chaucer's time the "lyf" was shorter. Chaucer himself died at 57.
That might just add to the pressure, though, if you think about it too much.
It is a sobering thought that when Mozart was my age, he had been dead for five years. —Tom Lehrer
Whenever I read some of my old code, I am always surprised that although it was more simplistic than what I write now, it has a certain elegance and efficiency that I have lost over time. Almost as if, in an effort to become a better programmer, I now waste too much time trying to be "fancy".
Wouldn't it be awesome if one of those "looking for a technical co-founder" types would say they were looking for a programmer "who knows how bad they are"?
Not everyone who thinks they're bad at something is wrong.
but everyone who thinks they're a "ninja" is wrong.
The dictionary basically defines "ninja" as someone who is skilled in stealthy movement and camouflage. Interpreted by me to mean someone who can sneak in and do the job without being seen.
Given the innate ability to do the job of programming from anywhere, at any time, I am of the opinion that many programmers can meet the description of ninja. I know in my career, I have worked closely with several people whom I have never actually met.
I do think the usage of ninja is silly, myself. But I do understand that it is more descriptive than just telecommuter, which still seems to imply some physical contact with others.
Curiously I think you missed the more important point of the post. The literate programming insight is clearly more valuable as a "take-away" idea.
Literate programming has the potential to allow a program to "live". Certainly it allows a program to outlive the authors. Most programs no longer have the original authors available. Look at the thousands of dead programs on sites like Sourceforge. Or consider the number of commercial programs that are no longer maintained by the authors.
Your program should be written to pass the "Hawaii test". That is, you give a new hire your literate source code, send them on a fully paid trip to Hawaii for two weeks. When they come back they should be able to maintain and modify the program as well as the original authors.
Literate programming is not documentation. It is a form of communication. You need to motivate code you introduce. You need a good story line. You need to get it past an editor-in-chief.
See "Lisp in Small Pieces" for a great literate program example.
Tim Daly daly@axiom-developer.org
The better he got, the worse he knew he was.
The inverse Dunning–Kruger effect: http://en.wikipedia.org/wiki/Dunning–Kruger_effect
Why do you say it is the inverse and not the regular Dunning-Kruger effect?
Surely silvestrov miswrote and meant "converse", not "inverse".
And it's because the classic statement of the Dunning-Kruger effect is that the unskilled have an unrealistically high estimation of their own abilities. It doesn't talk about what the skilled think.
Actually, it does say. People assume that they're closer to average than they are, whether they're above or below the true average. The Wikipedia link above even says "the highly skilled underrate their own abilities, suffering from illusory inferiority."
Inverse is correct.
D-K effect is "If a person is unskilled, then they have high estimation of their abilities."
Converse is: "If a person has a high estimation of their abilities, then they are unskilled."
Inverse is: "If a person is skilled, then they would have a low estimation of their abilities." <-- this matches silv's assertion
That has much to do with culture, though, see note [9].
'Great' doesn't mean anything if not a relative value. So saying you are great just means saying you are one of the best. That's all that counts.. who cares if you're not as good as the mythical uber-coder?
Sure, once you reach the other side of the bell curve, you realize you don't know anything, compared to everything there is to know, but that doesn't make you anything less than great.
I liked the comment about literate programming. When I need to write a new set of related functions (or classes, if I am using an OOL), a great way to start is by writing function stubs and then writing the internal comments, resisting the urge to write the code immediately. Anyway, this works better for me than TDD.
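A sketch of that workflow in Python (the report-generation example is invented, not from the comment): the stubs and internal comments pin down the design and the data flow before any logic exists.

```python
# Stub-first: every function compiles and is callable, but each one
# raises until its commented plan is turned into code.

def load_records(path):
    """Read raw records from `path`."""
    # open the file, one JSON object per line;
    # skip blank lines, fail loudly on malformed ones
    raise NotImplementedError

def summarize(records):
    """Reduce records to per-user totals."""
    # group by user id;
    # sum the 'amount' field within each group
    raise NotImplementedError

def render(summary):
    """Format the summary as a plain-text table."""
    # one row per user; pad columns to the widest value
    raise NotImplementedError

def report(path):
    """Top-level pipeline - already final, even though the parts aren't."""
    return render(summarize(load_records(path)))
```

The payoff is that the shape of the program is reviewable (and arguable-about) while changing it still costs nothing, which is much the same benefit the literate-programming comment is after.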
One of the things everyone seems to forget about the whole 10,000 hours thing is that the research says it's 10,000 hours of mindful practice. That's a lot harder than 10,000 hours of practice...
Same reason I don't type properly. I've clocked tens of thousands of hours of typing, but next to none of deliberate practice.
I of course agree with the point the author makes in this post. However, 10,000 hours is also a meme that Malcolm Gladwell uses to sell his books. I have read articles that dispute his assertion and show that he will often fudge data to make a point.
Still, 10,000 hours is a really good metaphor and should be understood as such. It is a lot like the left-brain/right-brain thing: it sounds really good, and people pick it up because it sounds good.
The 10,000 hours predates Gladwell. He just wrote a book based on existing research. I haven't seen anything that actually disproves the assertion (again, it's not Gladwell's). Do you have references?
The problem is that it isn't just 10,000 hours. It's 10,000 hours of "mindful practice".
That amounts to a huge, ill-defined fudge factor that allows you to discount any counterexamples.
So it's not clear that 10,000 hours is a falsifiable statement.
"So it's not clear that 10,000 hours is a falsifiable statement."
It's not as tightly defined as we'd like, but I think it's still fairly close to falsifiable. For example, I think the Dan Plan is a good experiment: he has a pro golf coach and is clearly doing deliberate practice. It's only one data point, but if we get several people like Dan who tend to fall on one side or the other, then I think we will have strengthened or weakened the theory significantly.
True, though it has been argued to be untrue. There are examples - Werner Heisenberg is one:
http://michaelnielsen.org/blog/malcolm-gladwell%E2%80%99s-ne...
of people making genuinely great discoveries without needing 10,000 hours, and there are other examples of people doing 10,000 hours of deliberate practice without achieving results. I actually found a critique (can't really find it now) which convinced me that he fudges data to support his claims and sell his books: the author examined the references behind Gladwell's claims, found many of them not quite supportive, and found this to be a pattern across most of his books.
His books are an entertaining read, well written and all - just not reliable scientific information.
There is discussion on Quora where some of the criticism is voiced as well: http://www.quora.com/Malcolm-Gladwell-author/What-are-some-c...
If I understand correctly, the Dunning-Kruger effect means I should just accept I am god-like????