We analyzed thousands of interviews on everything from language to code style
From blog.interviewing.io:

> Furthermore, no matter what, poor technical ability seems highly correlated with poor communication ability – regardless of language, it's relatively rare for candidates to perform well technically but not effectively communicate what they're doing (or vice versa), largely (and fortunately) debunking the myth of the incoherent, fast-talking, awkward engineer.
My interpretation of this is that interviewees who can communicate clearly about code (whether they wrote it or not) correlate with high technical ability. Does this suggest that rather than having the interviewee write code on the spot, one could give them some new code they've never seen before and ask them to reason about it aloud for 30 minutes, then gauge their technical ability based on their ability to communicate clearly about the code?
In other words, could you replace live-coding with "here's some code, tell me about it"?
That's actually how I conduct interviews. It's been immensely successful at weeding out candidates within five minutes, although most interviews run 45-60 minutes.
Basically, give them fewer than ten lines of code and ask them what it does, where a couple of bugs are, what they would name the function, etc. Then we talk about how to improve it. I'd say less than 20% of the people I interview pass. It's amazing how few people can find a bug and communicate it.
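To give a concrete (hypothetical) example of the kind of snippet this exercise might use - my own sketch, not the commenter's actual question - something this size with a couple of planted bugs works:

```python
# A deliberately flawed snippet to hand a candidate.
# Intended behavior: return the average of the positive numbers in nums.
def avg_positive(nums):
    total = 0
    count = 0
    for n in nums:
        if n > 0:
            total += n
        count += 1  # bug 1: counts every element, not just the positives
    return total / count  # bug 2: divides by zero on an empty list

# One possible fix, for the "how would you improve it?" discussion.
def avg_positive_fixed(nums):
    positives = [n for n in nums if n > 0]
    return sum(positives) / len(positives) if positives else 0.0
```

A candidate who can say "this averages over all elements instead of just the positives, and it crashes on an empty list" within a few minutes has already told you a lot.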
Half the people don't even tell me what they are thinking. And no matter how many times I try to work with them, act like their buddy, or whatever, they just kind of shut down. They think in their head, don't work through the problem at all.
"Half the people don't even tell me what they are thinking."
Maybe they are introverted or simply need to think before they talk. Around 50% of people are introverted (fewer in the USA). Many people are like that and simultaneously quite skilled. Moreover, some environments punish errors, so people who worked or studied there tend to be conditioned to think before talking.
"And no matter how many times I try to work with them, act like their buddy, or w.e. they just kind of shut down."
You are not buddies; you are the interviewer, about to decide whether they get hired. Many people shutting down might mean that they are not comfortable juggling the "buddy" social role and the "serious job interview" social expectations simultaneously.
"You are not buddies; you are the interviewer, about to decide whether they get hired. Many people shutting down might mean that they are not comfortable juggling the 'buddy' social role and the 'serious job interview' social expectations simultaneously."
Very good point.
Right -- you don't need the "act like a buddy" part for this technique to work.
I'm not sure it's introversion. I'm an introvert, but when I get excited about a topic, I can be as loud as the next person. As long as I feel I have something to contribute, that is. I can rant for two minutes straight and then suddenly and unexpectedly shut up when I've made my point. People are sometimes surprised and ask why I suddenly stopped talking. I'll answer: "I've made my point."... :3
So my bet is on error avoidance.
I'm pretty similar, but solving a bug in an interview question is hardly something I'd get excited about.
I agree. Still, the point I was trying to get across is this: while introverts do not strive to take center stage for its own sake, they are perfectly capable of doing so for any good reason. Introversion is not shyness.
Being an introvert isn't an excuse for poor communication. Regardless of its cause, poor communication is bad in a work environment, and it makes sense for interviews to select against it.
This is significant, and can be more so depending on the company and environment.
I've had my own team members tell me they aren't engaged, don't speak up in meetings, etc. because they are an introvert. Like it's a condition we must accommodate. Bullshit.
Communication is a skill. We value clear communication in our company. If you join our company, you may not be an expert in communication, but you will grow that skill.
An interview is a different scenario, but still -- you need to be able to communicate with me. If you can't, we will pass on you as a candidate.
"don't speak up in meetings"
People should speak in meetings when they have something to say, not just for the sake of speaking - that just wastes everybody's time.
The more important point is that if 80% of people shut down while they are talking to someone, maybe it has more to do with that someone than with 80% of the population being uncommunicative.
We're not quite that naive and clumsy. I'm the first one to make sure we keep discussion on-point and actively shut down the monologuers on our team.
So let's clarify -- "speak up in a meeting" refers to the times when a senior person doesn't provide input that would have given better direction, identified issues earlier in the process, etc. If you're a senior engineer, I need your input to ensure we're doing the right thing(s).
The forum of a group meeting isn't always in the wheelhouse of someone who freezes up when speaking in front of others. It's understandable -- I used to be one of those people. It can be out of your comfort zone, and doing so feels super-risky as well as plain frightening. It is still just a skill to be learned, no different than building muscles through exercise.
In our group, we talk a lot about trust and support. Everything we do is in the spirit of making each other stronger. Everyone has strengths and weaknesses -- share your strengths with the group, and build your weaknesses from the strength of others. It sounds pie-in-the-sky, but as the head of my group, I make sure we take this seriously.
So we don't have 80% of our group shutting down -- nobody talks over the top of anyone else; we don't allow it. We need input from our best people, and sometimes that involves speaking up. If you're on my team and that notion makes you nervous, I understand. But as I've always explained to my team, we're going to help you build skills, and communicating is one of them.
If you want diversity, you try to accommodate different personality types. Otherwise you end up with a monoculture.
There is value in silence and limited communication.
Communication is a behavior and act, not a personality type. The suggestion here is that it's ok for someone who doesn't communicate well to be accommodated in the name of diversity. The notion of "poor communicator" as a personality type doesn't jibe with me.
Replace "communication" with "data modeling" or "project management" or "reading". There are plenty who don't necessarily possess those skills.
Should we expect that others shouldn't develop skills in those areas (assuming they are important for their role) in the name of "diversity"?
While I don't fully agree with the parent of your reply, I think they're right about communication being a skill. I am an introvert myself, but I've learned not to let that get in the way of me communicating with my colleagues or friends.
Not speaking up in meetings doesn't mean they don't have this skill though. Some people (like me) don't like to add on-the-spot opinions on something, but prefer to go and mull it over, look into it etc before providing feedback. Doing it on the spot often leads to a lot of wasted time and overly long meetings. Meetings and other synchronous communications are, in my opinion, very wasteful and expensive.
Thinking in your head before talking is not poor communication, though. Not being able to explain yourself is; never talking is; and shouting out every half-baked idea while others concentrate is.
The fact that someone takes time to put ideas together before opening their mouth is not a bad sign. The bad sign would be if they didn't share the result once they were done thinking.
This is interviewing 101 though: establish a report with the interviewer if at all possible. If the interviewer tries to be a buddy that is in your favor to go along.
rapport
TIL Rapoport's rule, https://en.wikipedia.org/wiki/Rapoport%27s_rule
Oh I do love the interwebs.
I'm an introvert, hate social gatherings, etc. Yet I've found the ability to think aloud extremely helpful. Sometimes just vocalizing my thoughts makes the thinking clearer, almost as if hearing myself makes my brain think over every thought once again. In a collaborative situation (such as pairing) this is even more helpful.
I believe it does exactly this. The rise of "rubber duck" debugging is a tool to extract this sort of thing, in fact.
Generally to get candidates who are like this to open up, I simply assert:
"Tell me what you are thinking about right now"
"What is jumping out at you for you to be stalling?"
"Did you see something?" --> "Why?" --> "You seem to be quiet and deep in thought, what are you thinking about?"
etc...
Yeah, that's great if you're NT and all that, but if you're interviewing someone who is anxious about the situation, this might only add to the pressure, creating a meltdown that makes it impossible for them to think.
In a normal work environment, where he or she is left to think freely, the same candidate might excel.
I'm sorry, but this is an interview. Its entire point is to allow communication.
If the candidate would like some quiet time to think about the solution, I expect them to respond with "I think I'm forming a solution, just give me a couple of minutes to think about it before I present my case".
If they can't even do that, then I'm sorry, but they are no good to anyone. You can be the smartest and best developer in the world, but if you're completely incapable of representing yourself and your ideas, of having a dialog about your work, you are effectively worthless as an employee.
Some companies might have a place for that special someone, who you can lock in a room for a month and he will later emerge with an amazing new piece of code that will solve your problems, shielded from all the problems of the outside world, but companies where that is possible are very rare.
> Some companies might have a place for that special someone, who you can lock in a room for a month and he will later emerge with an amazing new piece of code that will solve your problems, shielded from all the problems of the outside world, but companies where that is possible are very rare.
Also, being able to develop like that is very rare because in the real world, requirements change or are vague and need to be clarified, or your software has to interoperate or integrate with other software. There is simply no way to avoid regular communication and still produce something that works how it needs to. Problems where you can shut yourself away for a long period of time certainly exist, but in the grand scheme of software jobs, they're few and far between.
And on a normal workday in normal surroundings that person might do just fine. An interview is a totally abnormal, high pressure situation.
The point is that with this kind of make or break style of candidate vetting you're probably leaving a lot of great talent by the wayside.
Thanks, yes, that's exactly my point.
Unfortunately interviews are brief. The point of this isn't to add pressure; it's to relieve it. Too long a silence in such a short period introduces pressure. By letting them use me as a sounding board, we can create a connection that opens dialogue and allows ideas to flow.
The whole point is to make them comfortable and that takes the highest priority. Being able to read the individual and react to them is key, not everyone is the same, hence why there are a multitude of responses, and the ones given are just a starting point.
Ah, I read you now, thanks. I may have reflected on one interview that went particularly badly, being somewhat reminded of it. But with the extra information, that is a great approach if I understand you correctly - especially the silence adding pressure part, which is absolutely true.
I.e., if the candidate is stuck at something, even something seemingly trivial, it might pay to go back or have a chat around it, even guide them a bit, to get them loosened up and feeling safe, which should then yield better responses afterward.
You can bikeshed interviewing processes that leave out this or that potential candidate forever. And any candidate who bombs in any kind of interview could turn out to be the next Einstein.
There is no one-size-fits-all interview technique; if there were, it would be used everywhere. Good interviewing techniques are about playing the numbers, not about leaving no programmer behind.
That sounds awful, you are actively sabotaging someone by ruining their concentration while they're trying to solve a bug in unfamiliar code under time pressure. This is like having a manager requesting status updates every 10 minutes when you're trying to fix a production failure.
Even if I solve the problem I'm walking out of that interview with a very negative opinion of your company.
If you completely shut down during the interview and don't communicate, you're probably getting walked out from the interview. If you cannot or will not communicate, then the interviewer is unable to evaluate you.
I also don't think he was advocating nagging either, simply asking questions to get the candidate talking. If you "go dark" for 10 minutes, you're pretty much definitely stuck. Talking to the interviewer might get you unstuck. Staring at the whiteboard quietly probably will not.
Yep, totally not nagging, giving them room to think and then jumping in to get them talking is definitely what you want to do here.
This also isn't your day-to-day job, so keep in mind that these interviews are generally held to an hour. You can't really give them 30 minutes for a single problem - that wouldn't give you much insight into the person.
It also heavily depends on the culture of the company. Being able to communicate effectively and act under pressure goes hand in hand with pair programming and code reviews.
To note, I have never had anyone leave an interview feeling bad about the company or our process; the interview is about the individual.
In one interview, I was left in a room alone with coding exercises. They involved frameworks I did not know, and I had internet access. Then they came in and we talked about the solutions.
I still think it was a pretty cool way of interviewing.
You might be over-estimating the complexity of the kinds of problems generally used for this.
You're not bothering people while they're taking an expected amount of time to look at it. You're stopping a several minute silence after handing someone a basic loop and if statement.
For most positions we're talking about, this should not be a major problem. When doing something similar, I found people either looked at me like I was crazy for giving them such a basic problem and, after being reassured there was no catch, answered simply and clearly, or just couldn't do anything. Follow-up questions about the code then went terribly (e.g. a JS contractor asking for £500+/day who didn't know the difference between global and local variables, why you should put var in front of things, etc.).
These things are often good to weed out the incredible number of people who seem to have little to no coding ability at all for jobs where that's an obvious pre-requisite. I've also had people fail to solve a basic problem (roman numeral generator or reader) at home in whatever language they wanted, not even getting vaguely close. One person even sent in their broken version pasted into a word document.
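For reference, the roman numeral generator mentioned above has a straightforward greedy solution - a minimal sketch (in Python here, though the original exercise allowed any language):

```python
# Greedy roman numeral generator, valid for 1..3999.
def to_roman(n):
    values = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
              (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
              (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    out = []
    for value, symbol in values:
        while n >= value:  # take the largest symbol that still fits
            out.append(symbol)
            n -= value
    return "".join(out)
```

It fits in a dozen lines; "not even getting vaguely close" to something like this is exactly the signal the exercise is after.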
If someone can't express himself, most likely that person won't work well in a team.
Some people take a bit of time and familiarity to warm up.
True. I am not sure I want to take the risk, though. I don't want to be a friend or tell jokes, but I want to be able to talk technical stuff.
There is an enormous difference between a person taking a minute to collect their thoughts without speaking and a person being unwilling or unable to communicate. I can't be sure, but it sounds like at least some people in this thread are talking about pushing back against the former as well as the latter.
I am fine if they think about it for minutes or need a while to sort their thoughts out. But at some point I would like to get some response. What else can I go by in an interview? And even if that person can write great code, I am not sure things will work out in the long run if they can't explain themselves.
> Half the people don't even tell me what they are thinking. And no matter how many times I try to work with them, act like their buddy, or whatever, they just kind of shut down. They think in their head, don't work through the problem at all.
As it happens, most of my "aha!" code-related moments come when I'm in the shower or washing the dishes. Very rarely did it happen that I looked at a piece of code in front of my computer and was immediately struck by what that code does or by some hidden bugs in it. It's all sort of mechanical, at least while I'm sitting at my desk. I'd say it would be very hard for people like me to reproduce that "aha!" moment during an interview, in front of people who expect me to have those moments of enlightenment right there and at that precise moment.
You can always find something to talk about in the code, though. You can start by just describing what each line does, in your opinion. Then you can go back and start to group the lines - e.g. "lines 1-3 are the initialization, lines 4-6 are probably intended to flurb the glorb", etc. Yes, it's possible that you can find an interviewee who is great technically but shuts down during an interview. The question is how much time and effort you want to invest digging for that person, especially in light of such studies.
You could look at a more complicated piece of code, let them think about it for a bit, and if they don't have that aha moment you can still discuss it with them. For example, pick a problem someone on your team struggled a bit with, something you wouldn't necessarily expect a candidate to solve during an interview, and discuss solutions with them. I think you could learn a lot about the interviewee this way.
omg, are you me. No ahas at interviews. Good to great ahas at random times. Fuck whiteboarding interviews.
> They think in their head
I'm left wondering why this would be a problem.
I don't know if the following is what the GP meant, but:
I've often had the pain of dealing with people who try to do it all in their head. Sure, one or two are good enough where it's not a problem. The rest of them? They keep getting confused and making mistakes. For things that are not very complex. If they had only written it out or drawn on paper/board, they would not get confused.
It's most obvious when I try explaining some code's algorithm to them. They'll begin with understanding, and then 70% of the way through they'll keep getting confused because they couldn't hold it all in their head.
I've often had to tell these people "Don't try to do it in your head!"
I have sort of the opposite problem with a coworker. This coworker only wants to talk through things and doesn't want to work them out on paper or a white board. The problem is that I can understand/remember almost everything I read, but have almost no ability to comprehend things that are being spoken (I suspect people are talking faster than my ability to think and parse what they are saying). I've talked to him about this to no avail. He just wants to talk things through, and then expects me to be following along and to provide insights when I have no idea what he is talking about. But if we can get it on paper or a whiteboard then I can reason about it quite easily.
I know the type of person. I usually can't follow them either (less about them being fast talkers, more about them being bad at explaining things). I found myself to usually only half-listen to what they say, and letting the other half of my mind to reconstruct the problem independently by itself. Sometimes I fail, and then I need to ask them to repeat what they just said. Other times my mental model is spot on, and I can find their problem quickly.
Try to maneuver him to where there's a whiteboard and write things down as he's talking through them: if he's reading cues, he'll slow down so that your visible note-taking keeps pace.
If that doesn't work, get him a rubber ducky and tell him to go talk to it if he's not willing to compromise on communication styles.
I worked with somebody like that once. He would get massively frustrated because he needed that back and forth to think through ideas and I was giving him nothing. I would get frustrated because I had to digest information before I could say anything about it and he wouldn't stop talking long enough for that to happen.
I had a lot of success with asking him if I could think about the problem and we could schedule a meeting in half an hour or so. He was much happier with that result because I had thought through the problem and was now able to give him the back and forth he needed. I was also much happier because I was no longer expected to come up with insight on problems I had no time to chew on.
I often both request and receive verbal descriptions of problems related to the software I maintain (an engineering simulation and analysis tool). When receiving a problem verbally, my approach is to slow the other person down by challenging any assumptions they may be making. Asking them to clarify allows me to build a mental model of the problem at a sustainable pace, and it also helps them identify any bad assumption they may be making.
When I am asking someone else to be the rubber duck I strive to highlight any assumptions I am making at each step of defining the problem. If I don't do this the other person will usually struggle to keep up.
Can you keep some paper or a whiteboard around?
You don't have to just accept doing your mutual conversation exactly as he likes it.
That is not the opposite problem. Talking through them and working them out on paper are the same thing. They are the opposite of someone doing it in their head.
Not being able to successfully and clearly communicate their thought process about a simple problem one-on-one is a bad signal for being able to communicate clearly about a difficult problem in a group setting or being a useful sounding board for technical ideas.
It's clearly important to be able to communicate about technical problems, but is it really necessary to be able to narrate your thoughts about a technical problem in real time as you're first having them?
In my work experience I don't remember ever having to do that. I'd look at code and investigate a problem independently, then talk to someone about it afterward. I think that should be considered a valid approach in the interview.
It sounded from the example like the code under discussion was quite simple, "ten lines of code". So it's plausible your "look at & investigate independently" is allowed to happen, and should take less than a few minutes, say, and then this interviewer is inviting them to walk through things in a shared manner.
In other words, from my perspective, you're not actually in disagreement with their method. They weren't saying "narrate in real time as you're first having [thoughts]". Practically nobody ever does that, or is expected to, outside practices like mindfulness or therapy. They were saying they ask candidates to understand the code, explain it, and then talk about improving it. OP's problem was with candidates who couldn't understand the code and weren't even willing to talk about the state of their understanding so that the interviewer could help them walk through things further...
Yeah, but if you shoot for elaborate ones, you will get politicians/lawyers, not coders.
I don't have any source, but in my experience mathematical/computer-minded people are more likely to be introverted and talk less than arts people.
>I don't have any source, but in my experience mathematical/computer-minded people are more likely to be introverted and talk less than arts people.
I think you are right about that, but that just means that if they want to be effective members of a team in an organization, they need to work on their communication skills. Ability to communicate is something you can learn to do better.
I know this for a fact. I used to teach public speaking, and students came out far better at it than they came in. The same is true for many other communication skills. See for instance Marshall Rosenberg's book Nonviolent Communication, which is great for interpersonal conflict situations.
In fact, I think a lot of introversion is at least in part due to poor communication skills that could be improved with training and practice.
I studied engineering at an engineering school (EE, ME, CE, etc.). I have contact with many engineers on a daily basis, although I no longer do this work myself, and I have found that engineers can talk very well - actually I have never met one who can't and/or won't (the real problem is getting them to shut up). So in my experience, smart, useful people are always able to communicate. There may be the occasional savant who is awkward, but I have not met them.
As for programmers - same thing. If they are any good they are more than able to talk the talk.
The article explicitly notes, though, that most excellent programmers were also excellent communicators.
Effective communication doesn't necessarily mean you talk a lot or are extroverted. It means that you say things that matter when they matter, and that is a crucial skill to have in almost any work environment.
Having a high signal-to-noise ratio is (obviously) also critical to communicating successfully about code / technical issues. Talking for the sake of talking or waxing poetic about the understood 5% of the problem at the expense of the 95% remainder is a negative signal too.
No. Thinking out loud is not the same as communicating a thought process.
For one thing, you can't get partial credit for ideas/work that only exist in your head. (Similar to why it's a good idea to show your work on homework assignments and tests.)
If they sat there for a bit thinking about the problem and then provided a solution, I'm sure that would be fine. But that's not the situation he was describing.
Because you might need to work with the team at some point to help each other solve a problem.
At least in my case, there's a big difference between
(a) discussing an open problem with a colleague, and
(b) explaining what's going on in my mind as I'm attempting to solve the problem single-handed.
I'm very effective at (a), and it's a skill that I've used many times as a software developer and/or grad student.
But (b) is more typical in psychoanalysis sessions.
May I ask what you feel the difference in the two is?
I often have to explain a problem to my coworker and vice versa. It's also really helpful when pair programming. Just curious about your experience.
There's a weird dynamic in interview questions you rarely, if ever, get in the workplace: the person who actually knows about the problem is staying quiet while the one just starting to reason through it is doing all the talking about it.
There's no back and forth -- you're not actually trying to help them solve anything -- it's someone who already knows the answer judging how much of it I can reason out in half an hour. I can't take 5 minutes to think it through, I can't stop talking for any significant amount of time without being told to narrate what I'm thinking, etc. That's nothing like a workplace situation, where the other person would be expected to hold up half the conversation, providing input and thoughts on the problem, and where I could just tell them I'd get back to them in 15 minutes/this afternoon/tomorrow/etc.
It would be more realistic if the interviewer didn't get to see the question before the interview, either. (That's also why I think design questions work better than coding questions -- the candidate can do something you're not expecting in the design, ask more kinds of questions about the spec, etc., which causes an honest back-and-forth.)
"They think in their head, don't work through the problem at all."
Those are independent clauses. Just because the individual is not talking out loud does not indicate that they aren't working through the problem. Personally, I am a very visual thinker and lay out visual models in my head for problems with high (3-5d) dimensionality that would take longer to draw clearly in 2D space while explaining all the traversals I'm mentally making. I then reduce a solution and explain that, or I apologize for zoning out.
It became obvious to my office mates when I was working through and reducing a problem space, because I would "hang" on the thought. Sometimes I would be stuck in thought for 15-30 seconds, and according to them they could tell because my face loses emotion and my eyes flutter about.
That's an interesting approach, but I wonder if you're screening for people who can speak and code vs just people who can code.
I'm a product manager and I suspect I would do pretty well in this type of interview unless the snippet is especially complicated and/or esoteric.
In that case you might be screening for people who can talk but not actually code or people who can code but might not be the best communicator.
Interestingly enough, if you're optimizing for specific outcomes, you may not need to screen out the latter.
Engineers who struggle to communicate but are productive can be extremely successful given the right environment - including having someone on the team who knows how to work with various personalities and is technical enough.
Just some rambling, but it's worth pointing out given the scarcity of engineering talent. Not everyone is going to be Paul Buchheit, ace product manager and engineer all wrapped into one.
I'm not sure why you're being downvoted, but it's a valid comment. Even if someone disagrees, you clearly communicated a point that adds to the discussion.
Many are taking the article at face value, assuming there are no methodological flaws and that this study has been repeated enough times to invalidate any comments to the contrary. It also doesn't turn this topic into a "case closed" matter; it's too big an unsolved problem for that. They also only analyzed interviews - did they monitor job performance after the interviews, and for how long?
Additionally, the difference between being a "good communicator" and a "good interviewee" is frequently overlooked. People too often rationalize their own interview approach as not being the problem. Most candidates accept that they could improve and perform better in interviews, but a lot of people conducting the interviews don't think their process could do better.
> In that case you might be screening for people who can talk but not actually code or people who can code but might not be the best communicator.
From TFA and literally the thing that started this whole thread:
> Furthermore, no matter what, poor technical ability seems highly correlated with poor communication ability – regardless of language, it’s relatively rare for candidates to perform well technically but not effectively communicate what they’re doing (or vice versa).
Right, but that would also be true - and exactly what you'd expect - if interviewers mainly judged technical competency by ability to communicate. Since the data is self-reported, it's impossible to separate out whether the correlation is causal (poor communicators are poor technically), due to interviewer bias (poor communicators have a harder time convincing interviewers of their technical chops), or due to other factors (being flustered hurts both a candidate's communication and technical skills).
This is after all an advertorial piece of data spelunking not a study.
I'd suggest making it more than 10 lines of code. Similar to kids finding Easter eggs, you want a reasonable number of things out there to talk about. It sounds like you try to get there by keeping it simple, but I would posit that some people's brains may lock up on a single block of code, and having it in context, or having more code there, gives them something else to look at. Our brains are very bad at having insights when focused on the thing we're trying to have an insight about.
I bet you could go farther and make a really bad piece of code and introduce it as such, and then have them talk about just how many things are bad about it.
Just curious, if you've weeded them out at around 5 mins, why does it take 45-60 mins?
Also, could you possibly just do an initial phone/Skype interview to save time in that case?
I've been in this situation. Often we know within a few minutes whether the candidate is a good fit for our team, but for legal reasons we have to stretch out the interview so that the candidate can't make a claim against us. It would be better for both of us if we could cut them loose quickly, but we have to cover our butts because that's the kind of litigious society we live in.
And there is an initial phone interview, but it is done by HR or our boss, neither of whom is technically proficient.
Isn't that why a lot of companies do tech phone screens?
If it's because of anti-discrimination laws, can a technical phone screen be short? You could discriminate based on voice too so I'd think that'd still be an issue.
I think the advantage is that the phone screening is much shorter than an hour, for everyone.
And "you haven't passed this interview" is less suspect than "I ended this interview early"
> It's been immensely successful at weeding out candidates within 5 minutes
Indeed, it's amazing how quickly people reveal themselves in this way.
No whiteboards, no quiz shows, no grilling of any kind needed. Just: "tell me your story".
When you put people under the pressure of not having a job, a time limit, and, worse, the pressure of having someone looking over you, bugging you, and judging you, it can be incredibly hard to think straight or say anything at all.
You can act buddy all you want, but people aren't stupid. They know if they take a few extra seconds too long, you're already thinking "this guy sucks"
Each moment of silence makes the interviewee feel more tense. Instead of staring at them, why not leave the room while they read through the code and collect their thoughts?
You could even have HR do it. When they arrive, HR hands them the code and instructions. Then, they read it for 15 minutes, until you walk in.
I'm absolutely not an "introvert" of any sort, but if you give me something I've never seen and ask me to reason about it, you'd better take a very wide step back, stfu, and wait for 10m - if you're to get any results from my slow brain.
That's great! I really hope interviews like that become more common.
Something I've learned from making Youtube tutorials is that you need to have a deeper knowledge to actually teach something. The point at which I study a topic and have it "click in my head" comes way before I can actually manage to "explain it back clearly". As I'm writing the script for the tutorial I often realize there's a gap in my understanding that I have to go back and work on.
The old saying "If you can't explain it simply, you don't understand it well enough" applies here I think. :)
That has been my experience as well, one level abstracted. Back when I was last in a position where I had to organize teaching employees technical, domain-specific skills, the best way we found for them to learn was for them to teach others. This was accomplished by having one generation of employees train the next and then reviewing the newly trained employees.
Then, for the next round of hiring, advance the employees up the education pyramid: the trainees become trainers, and the previous trainers mentor the new trainers by reviewing the newly trained employees.[1] It was one of the things I think we did really well and that I am really proud of (even though the business went belly-up).
When training is done in this pattern, hopefully a virtuous cycle of education is established throughout your organization.
1: Note that this was at a quickly growing company, with recruits that had little to no relevant actual know-how to do what we did.
I occasionally wonder if something like this would significantly improve our education systems, where part of our schooling includes tutoring / teaching someone from the next year.
We do it in reverse. In the first 2 hour interview, candidates learn our tech from 2 or 3 engineers. We then send them home with a set of coding tasks that are all relevant to what we do: functions or utilities that we would actually need to write, as opposed to puzzles. They choose one of 6 tasks, write ~100 lines of code, then come back for the next 2 hour interview and lead a code review of the code they wrote. This approach allows them to take the same time and care they would if working for us, rather than putting them in the unnatural state of coding at a whiteboard in front of one to n scowling veterans.
Engineers who are capable can clearly explain their coding choices and their style is evident. If they've borrowed too liberally or had someone else write it for them, it becomes apparent pretty quickly because they can't clearly explain the code they supposedly wrote.
This is a fairly new approach for us, but so far the candidates have appreciated it, and the team reports that it gives them a good sense of each candidate's ability.
So, based on the new information presented in the article that people who can talk clearly about code can also write code, are you likely to change your interview practices to omit the take home task and ask the candidate to talk about code that somebody else wrote? It would be faster, easier and according to the data in TFA just as accurate, yes?
Do you have data that shows how fast and easy the take home test approach is so I can compare? I only have anecdotal data.
I do give candidates the option of the take home vs. white boarding. So far (it's admittedly still a pretty small sample) they've all opted for the take home. However, we'll take a look at this approach as well.
> Do you have data that shows how fast and easy the take home test approach is so I can compare? I only have anecdotal data.
I'm not aware of any solid data around take home tests. All I can give is yet more anecdotal data, which is that they take a lot of time on both ends. So much time, in fact, that I've had several interviews where the interviewer never looked at the test code I was required to send in. Whether they work or not is irrelevant if nobody ever looks at the code.
I see. The take home assignment is not a test or exam, per se. It's a code sample that responds to one of 6 coding tasks. We expect something around ~100 lines, typically in C. The candidate leads a code review of their code by the team when they return for the second 2 hour interview. The code review usually takes about 45 minutes; the remaining 1:15 is 1-1 interviews. So we can assure candidates that someone reviews the take home assignment, since they lead that review.
Who doesn't love homework, especially as an adult.
No thanks.
Is it really that bad? An onsite interview takes the full day anyway.
Yes, it is. What profession other than programming asks you to do homework?
This was the style of interview I ran. I usually picked snippets from an open source project. I usually grabbed pull requests that fixed bugs.
Have them delve through it. Reason aloud, ask questions about it. See how they gain context and how they troubleshoot. The biggest downside was that they sometimes needed to reference external projects or documentation. I would often have to search on my work laptop and then let them shoulder-surf the docs.
When I'm nervous, I don't communicate well.
When I'm nervous, I don't code well.
I don't know about other people, but for me personally, when you measure me under a high-stress situation, all you are measuring is just my level of nervousness.
I must ask, did you get a university degree and if so how? Personally I have found interviews to be trivial compared to exams in school, so while I do get quite nervous for some of my math examinations I have never had this experience in interviews.
I have a CS degree from a first-tier school. My experience with exams and interviews is almost the exact opposite of yours.
In exams I don't get as nervous because I know that if I know the material then I will pass, if I don't then I won't. You can't bullshit your way past the professor/TA grading your exam.
But in interviews I know that people who are far better than me have failed[0], and that people less competent than me could get the job. So it's not my coding ability or my experience that's being measured here, it's something else. And I don't know whether I have that something else or not. Maybe I do, maybe I just don't.
[0] https://twitter.com/mxcl/status/608682016205344768?lang=en
I feel exactly the same way. The good thing is that if you keep doing interviews, at some point you run into normal people, and then it's easy.
Last but not least, after you fail an exam you can usually see it with the errors underlined, and often have the teacher explain them to you. If you answer interview questions in a suboptimal way, you get no feedback. You don't know what error you made, if any. Maybe another candidate had simply read about the logic puzzle that was used.
The canned response I got was "We're sorry, but we can't currently recruit you. Please try again in half a year or so, maybe we will have some new opportunities." If you press on, you just get evasive answers.
Hmm, sure, but companies that don't give canned responses are liable to lawsuits. A teacher's explanation can be wrong or misleading. I've also had some teachers refuse to explain the alleged errors in my work, and I've even had to argue with teachers that they are wrong and I'm right.
I suppose if you went to a great school and then to some shitty interviews, you'd like the school more than the interviews, and vice versa. The difference however is that a teacher is only motivated by their altruism to help you, whereas a company has actual financial motivation to find good people during the interview.
You're justifying company behavior, but my post was in response to the one who wondered how could a person pass exams but fail job interviews. I pointed out there are valid reasons that are not the candidate's fault.
I'm in the profession to write and debug computer programs, not debug people who like to keep me in the dark. This is frustrating.
That's A Bingo. That's how false negatives are made.
I used to take "relatively rare" a step further and viewed it as a myth. I'd get into debates with coworkers, arguing that all one needs to do is have a conversation with a candidate to fully judge their abilities; no need for coding tests or the typical questions. Unfortunately, after we brought in a ton of candidates, there were a few people who fell into this bucket. I have no idea how it happened, but they couldn't code themselves out of a wet paper bag when it came time to do so, even though they kicked ass in technical conversations. Sadly it became a bit of an "I told you so" moment where the outliers are made out to be more frequent than they actually are.
I like your idea though.
Had a similar experience with a remote sysadmin. Interviewing, he seemed to understand everything. He explained how he'd setup networks, details on using SSH for bastion hosts, etc. Seemed to really know stuff.
After hiring him, he was unable to even SSH to a box. As in he didn't know how to give us his private key, or how to configure his SSH client.
Maybe one guy takes interviews under other people's names then lets them mess up the work.
OTOH if he had given you his private key, surely he would have failed the sysadm test right there.
That happened to me once, during a phone interview with Google.
My brain simply froze during the live coding test. I looked like an idiot, even though the coding challenge was one I could have normally handled in my sleep.
My guess is that it was due to anxiety about doing well on the interview. But to this day I'm gun-shy regarding live coding tests.
I did that at a Google interview too, on a variation of a question I'd practiced a few days before no less.
It shouldn't take much demonstration to see that they are basically competent, and there shouldn't really be a difference between asking them to loosely state the commands they would issue and the process they would take and watching them do it.
People normally perform better by verbalizing the process than they would by sitting at their computer and performing it, since a lot of the concern in an interview can be whether you're getting your code/commands syntax-perfect or not.
The only way I can really see this going badly would be if the interviewer was only asking generalized questions that someone like a non-coder HN reader would be able to pick up from the zeitgeist without actually having ever implemented. Instead of asking about trends or fads, it's probably better to ask detailed questions about implementation processes.
If they really have internalized all the applicable concepts but can't express them in computer language, well, it shouldn't be hard for them to get up to speed on the syntactic niceties. Make sure you're testing for grokkiness of the core concepts needed, not how well they're following the trends or the news.
Personally I use a very small code test and put the rest of the energy into a detailed conversation.
It's fairly easy to see how this correlation breaks down. It is almost a truism that someone who is good at communicating with a wide range of people is likely to make that wide range of people think they are more technically competent than people who aren't good at communicating with a wide range of people, especially given the virtually random nature of who you will communicate with during any given interview cycle.
But it's quite possible for people who are not naturally able to communicate with a wide range of people to be able to communicate with a smaller subset (say, people they are more comfortable with, or people working on similar problems).
The more interesting question is, could those people who communicate less broadly, in a situation where they do communicate well, perform technically at the same level (or better) as people who are more natural communicators?
Another way to say this is that evaluating interviews is fairly prone to bias towards people who interview well. As someone who is hiring, I'd be much more enthused about these results if they correlated to job performance, not hiring results. It's almost gospel at this point that there are people in software who interview well but do not make good hires, yet the methodology of this article doesn't address this at all.
All told, nothing about this research is persuasive to me on my opinion of "interviews are largely worthless in determining who will be a good hire".
My previous company did this. I have seen three variants over my time: explain what this code does; explain why this code is wrong and how to fix it; explain why this code is inefficient and how to optimize it.
> explain why this code is inefficient and how to optimize it.
That sounds like a trick question. I've done a lot of code optimization. Here are a few generalizations I've drawn:
(a) You rarely know what part of a program is the bottleneck until you profile it on an appropriate workload.
(b) You often don't know why it's slow until you dig into performance metrics like cache-miss ratios, branch-mispredicts, etc. Often at the disassembly level.
But for a short, isolated piece of code I assume you would be able to suggest a handful of plausible inefficiencies and explain how they could be tested for and optimized for.
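For example (a hypothetical snippet of my own, not from anyone's actual interview), here's the kind of inefficiency a candidate could reasonably spot and discuss without profiler data:

```python
# Hypothetical interview snippet: find the items present in both lists.
# The inefficiency is visible by inspection: `x in b` is a linear scan,
# so the loop is O(n*m) overall.
def common_items_slow(a, b):
    result = []
    for x in a:
        if x in b:          # O(len(b)) per lookup
            result.append(x)
    return result

# One fix a candidate might suggest: hash-based lookup, O(n + m) overall.
def common_items_fast(a, b):
    b_set = set(b)          # one-time O(m) build, then O(1) average lookups
    return [x for x in a if x in b_set]
```

The candidate can then explain how they'd confirm the hypothesis: time both versions on a representative input size before declaring victory.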
Well that would be a perfect answer if you ask me.
No, it would not be a perfect answer, and could work against you: it shows you are not willing to work with me or assume some intelligence on my part (until proven otherwise), and thus that you are potentially not going to work well with the rest of the team. I would respond that we have already profiled this code and discovered it is a bottleneck; now answer the question. It won't be hard to overcome the negative impression, but it's a mark against you that you don't want.
Now if you responded in the form "Of course in the real world we don't look at code until after a profiler shows the need. I see X which is suspicious, let me look deeper". In this case the comment is perfect: it shows you trust, but know to verify that trust. It also has the side effect of giving you time to think.
I fail to see why analyzing a problem before solving it somehow translates into "not going to work well with the team". Problem analysis skills are far more valuable than problem solving skills. Anyone can search on Google for a solution to a given problem, but no one on Google can tell you what the actual problem is. If you really think the answer I was responding to is not acceptable, then the question is a lousy question imho.
"It doesn't make sense to hire smart people and tell them what to do; we hire smart people so they can tell us what to do." - S. Jobs
Remember, this is an interview situation; I can't give you a realistic 500,000-line program with a slow hot spot and ask you to fix it: we only have a few minutes (maybe an hour). I have simplified the problem to a single function that is a bottleneck, and I want you to analyze this function so I can see how well you can analyze things.
Refusing to do something because it hasn't been profiled is not analysis, it is refusing to do analysis. You have been given a problem and been asked to analyze it. That refusal makes you not a team player.
Once again, this is an interview. Noting that this scenario (a single function that is known to be too slow for reasons that can be fixed in 10 minutes) never happens in the real world is good - it shows you understand that real world analysis includes elements not included here. Stating you won't do this because you don't have profile data is not being a team player.
I understand what you're saying, but I still fail to see how all of this translates into not being a team player. That really has nothing to do with this at all. If the interviewer still wants a solution based on the facts as they lay on the table, he can always ask again. The first answer is still the best answer in my opinion, because it is much closer to a real-world situation and it shows a deeper contextual understanding of the situation. It also implies experience.
Anyone can repeat someone on StackOverflow who says that function X really is faster than function Y. That's hardly an accomplishment. It's like asking someone to perform a basic non-trivial calculation out of his head -- why bother?
That's the problem -- you're smuggling in non-obvious, hidden assumptions about the framing of the interview[1] and then penalizing someone who doesn't share them, without giving them the chance to isolate what those assumptions were and why they should use those in this context.
So they're getting rejected for a reason other than technical skill.
[1] "Okay, we're going to talk like I've already fixed problems A, B, and C, but not X, Y, and Z, because we obviously don't have time for a full interaction that is open-ended enough to allow for all of those to be an issue, and you're being judged on how quickly you focus on and solve the X/Y/Z."
I think you misconstrued the intent of my original post. I was merely trying to point out that the interview question was (in my experience) based on a very questionable premise.
I wasn't saying anything about how one ought to handle such a situation during an interview.
I totally agree that a response akin to "Your question makes wrong assumptions. Let's move on to the next question." would be completely inappropriate for an interview, or if the same issue arose in a normal work situation.
>No, it would not be a perfect answer, and could work against you: it shows you are not willing to work with me and assume some intelligence on my part (until proven otherwise) and thus you are potentially not going to work well with the rest of the team
I don't know how what was said implies this in anyway.
This is one of the biggest problems with interviews. Every interviewer has some dorky idea that x means y is a terrible developer while another interviewer thinks x means y is a great developer.
"I want an opportunity to learn" - Oh great. We want someone whose interested in learning more and isn't cocky.
"I want an opportunity to learn" - Oh no, that just tells me you don't know much and aren't interested in contributing
Yeah, that's why interviews are nerve-wracking. At their core, they're very subjective. The interviewee doesn't know if the question is supposed to be a trick to see if they say "Well, of course I couldn't optimize it, because I haven't seen the performance data yet!" or if they're supposed to suspend disbelief for purposes of the exercise and say "Well, of course it'd be best to profile, but we could change the way it does X...".
The chance that the interviewer expects some weird thing like that is random, so there's no way to know whether you're getting it "right" or not. Select the wrong answer and you get viewed as an ignoramus or a smartass. Just have to take the gamble you like the most and see if it aligns with the interviewer's preferences.
Once I was asked what kind of software publications I like to read. As a habitual HN user, this should've been the easiest question ever. However, when asked, I guess I was feeling strict that day and said "Well, I don't really read a lot of software-specific stuff, it tends to the more businessey side of things rather than being strictly software related", thinking of highly technical blogs like Lambda the Ultimate. The interviewer said "What about things like Joel on Software?" and I said "Yeah, I've read most of Joel's stuff, but I don't really consider it very 'softwarey'".
I could tell that cost me the interview, but what can you do? Someone else in a strict mood that day may've been delighted by such an answer, happy that I didn't hold delusions and could tell the difference between comedic rants about the business of managing projects and hiring developers and serious, borderline-academic publications that include equations and ponder theoretical dilemmas.
The best thing is just to take the whole process casually, without getting over-committed or holding a grudge. False negatives cost companies much less than false positives, and they're evaluating you in the context of their other applicants, who are also essentially random, so there's no way to really know anything. It's just about whoever seems least risky and most useful, and as an interviewee, there's no real way to know what that means from the POV of the interviewer, so it's hard to optimize your responses.
The real way to win interviews is to recognize them not as technical processes, but social processes. Admitting this makes engineers nervous, but it's important they understand that humans, not compilers, will be performing the evaluation, and they must learn to speak human to get satisfactory results.
I think "find the bug in this code piece" questions are dumb! When you program, you don't get a helpful fairy (or Clippy for you Windows users) telling you the code is wrong and needs fixing. The bug manifests itself in production or when you try to execute the code. Bad output, exceptions or segfaults, slow operation, infinite loop. Then you have a vague idea what to look for, or at least what kind of input causes errors. Furthermore, if you're the author you should know what the code is trying to do, and how. I believe programmers in teams are usually told to fix their bugs, not fix someone else's bug ?
I'd rather see a question like "Please modify this piece of code to add X functionality."
Those are my preferred types of interviews when I'm an interviewee and a technique I've used a few times on the other side of the table. It's so effective and efficient. I don't know why it isn't used more. Maybe it finally will be, now that there's some solid data supporting it.
That's what I do. If I can have a meaningful conversation with someone about stuff he did or stuff I am describing most likely they will do well. If the conversation gets very vague or confusing most likely things won't work out.
Some companies just love fast-talking engineers because they sound like they know what they're doing.
Some engineers spend a lot of effort trying to project a specific image of themselves at the expense of bringing up the morale of others in their team.
On a few occasions, I've seen an engineer tear apart the work of another engineer at a team meeting in order to make themselves look good and more senior in front of their boss - Often, the critique is delivered with such vigour that it sounds rehearsed. I've seen this behaviour in startups mostly.
I've never experienced an interview like this, but it sounds like it would be both challenging and fun - especially if it was done in a language (or pseudo-language) that you'd never seen before (then again, they could lob BF at you or something eso like that!).
Based on other responses to your post, it's heartening to see that others do this kind of interview, so maybe in the future I might encounter it.
The version of this where that code is clearly unsuitable for its stated purpose is my favorite coding interview question.
The unsuitability should be extremely obvious, no "spot the missing semicolon". Maybe it has a severe performance problem, maybe an obvious security flaw, maybe a logic error. The point is to prompt a discussion of better alternative ways to do what the code is trying to do.
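As a sketch of what such a snippet could look like (hypothetical code, not from any actual interview): here the glaring flaw is SQL injection, and the discussion it prompts is parameterized queries.

```python
import sqlite3

# Deliberately unsuitable: `name` is spliced straight into the SQL text,
# so a crafted input can rewrite the query (classic SQL injection).
def find_user_bad(conn, name):
    return conn.execute(
        f"SELECT * FROM users WHERE name = '{name}'"
    ).fetchall()

# The better alternative the discussion should land on: a parameterized
# query, where the driver handles escaping the value safely.
def find_user_better(conn, name):
    return conn.execute(
        "SELECT * FROM users WHERE name = ?", (name,)
    ).fetchall()
```

A candidate who spots the flaw can even demonstrate it: passing `"x' OR '1'='1"` to the bad version returns every row, while the parameterized version just looks for a user literally named that.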
This is also a prominent technique I've used in the hundreds of interviews I've conducted.
I do this frequently, and I use their resume to pick the code.
Someone says "I am a 9/10 in JS"; I give them something that uses the animation frame API, the DOM API, and a few more JavaScript APIs that I personally am not fully comfortable claiming a 9/10 on (and I've been working with JS for over ten years), and see how they decipher what it is doing, what jumps out, and how they would review it / give feedback.
9/10 times they don't actually understand JavaScript.
Overall, the data lines up with my own intuition, but I thought I might throw my own interpretation into the ring.
One of the biggest keys to doing well in technical interviews is to completely separate the problem solving from the coding. The strongest interviewees will discuss the problem and solve it at an abstract level using diagrams. Once satisfied with the solution, they'll code the entire thing, making few mistakes.
I think this is what drives most of those metrics. Strong interviewees submit code later and have a higher chance of it being correct because they take the time to problem-solve upfront. Their thought process seems clearer because there isn't the iteration of "this should work, let me code it, oh no wait, that's wrong, let me erase this now..."
I think this applies to actual software management too. I don't see the people with the most code commits as the ones who get promoted. The ones promoted are usually the ones who really scope out the problem. Once they're sure of a complete solution, they code out their plans at a steady pace.
I think Einstein's the one that said: "If I had an hour to solve a problem I'd spend 55 minutes thinking about the problem and 5 minutes thinking about solutions."
Yes, guys who do have a lot of commits (and those who don't think through the problem long enough) end up with commit messages like:
"Fix", "Real fix", "Fix the fix because of fix"
Then you know you don't want those on your team. It really bites on tight deadlines, when you have to put something into production but the poor dude needs to push that really, really last fix.
> The strongest interviewees will discuss the problem and solve it at an abstract level using diagrams. Once satisfied with the solution, they'll code the entire thing, making few mistakes.
I agree. In general I'm mediocre at coding interviews, but I do best when I have a chance to whiteboard and draw the problem. On the other hand, I do absolutely terribly on phone screens with a shared document and nowhere for the interviewer to see my drawing.
Completely devalues working with prototyping though.
> On average, successful candidates interviewing in Python define 3.29 functions, whereas unsuccessful candidates define 2.71 functions. This finding is statistically significant.
The "average" is too sensitive to outliers and should not be used for such a comparison...
[Edit] Being bored I calculated the Kolmogorov-Smirnov statistic based on the chart. It is between 10%-10.5%. The number of defined funtions seems to be a significant but weak indicator.
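For anyone who wants to reproduce this: the two-sample KS statistic is just the largest vertical gap between the two empirical CDFs, which is easy to compute in plain Python (the samples below are made up for illustration, not the article's data):

```python
# Minimal two-sample Kolmogorov-Smirnov statistic: the maximum absolute
# difference between the empirical CDFs of the two samples.
def ks_statistic(sample_a, sample_b):
    a, b = sorted(sample_a), sorted(sample_b)
    values = sorted(set(a) | set(b))
    max_gap = 0.0
    for v in values:
        # Empirical CDF of each sample evaluated at v.
        cdf_a = sum(1 for x in a if x <= v) / len(a)
        cdf_b = sum(1 for x in b if x <= v) / len(b)
        max_gap = max(max_gap, abs(cdf_a - cdf_b))
    return max_gap

# Identical samples give 0.0; fully separated samples give 1.0.
```

(This is the statistic only; getting a p-value from it requires the KS distribution, e.g. `scipy.stats.ks_2samp`.)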
We actually did the KS test as well, but we omitted the results for narrative clarity. The p-value from our KS test is also < 0.05.
I would recommend also putting the more sophisticated results (like the K-S tests) in the blog post. For people interested in technical interviews and charts, they may be more important than narrative clarity.
Soon to be a clickbait article: "7 simple ways to hack your interviews. #1: Define 10,000 functions. That makes you a good programmer"
4 functions will do. You just need to be above average. 5, just to be safe.
The title is quite clickbaity: "We analyzed thousands of technical interviews on everything from language to code style. Here’s what we found."
What's wrong with this, I think, is that a (journalistic) title should give an ultra-condensed summary of the main point of the article. This title suggests that the authors gathered a lot of data but didn't find much.
(I find myself quite intrigued by clickbaity titles somehow, sorry for that.)
A lot of this article rings true with my experience. And I agree with the comments dinging live coding tests - those are the worst. I can't code effectively unless I'm calm and can concentrate, and these things are almost designed to get you off-balance.
Even more galling when you have a healthy GitHub portfolio that they refuse to even look at in favor of a quiz (this has happened recently).
To note a few things regarding the interview and hiring process (I do a lot of it; about 20 interviews in the past two days): the reason for the quiz, and for not using your portfolio (unless it's extremely outstanding and you are fine doing a live coding questionnaire), is so that we can judge all the candidates equally and fairly.
It gives a common baseline to judge against. Each candidate does the same thing, and we have a good idea of what we are looking for. The rest of it tells us how you think, how you approach your work, and how you organize it, and, best of all, we can compare that to how others do it.
Now, that's not to say everyone uses it this way, but that is what we use it for.
---
Before, as an engineer/manager, I hated doing "live coding" tests when they weren't relevant. For example, doing "algorithms" or "palindrome" or "sliding window dns" or "O(n)" examples when you're hiring for a front-end or a management position screams to me that the people doing the interviewing don't know what they actually want.
Instead quizzes or live coding that are relevant like "tell me how to access all the elements in this particular element and traverse the children to apply some styling" is much more relevant and will show me the thought process, their ability to retain information, and their recall. It also shows communication ability when they get stuck and ask for help or use me as a sounding board.
It's not always about your implementation, but how you handle the situation and communicate.
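A minimal sketch of the kind of answer that traversal question is probing. The plain-object node shape here is my stand-in so the sketch runs outside a browser; in real front-end code you would recurse over `element.children` and assign to `element.style` instead:

```javascript
// Recursively visit an element and every descendant, applying one
// style property. The `{ style, children }` shape is a stand-in for
// real DOM nodes so this is self-contained.
function applyStyle(el, prop, value) {
  el.style[prop] = value;            // style the current element
  for (const child of el.children) {
    applyStyle(child, prop, value);  // then recurse into its children
  }
}
```

In an actual interview answer you would likely also mention `querySelectorAll` as the idiomatic shortcut, with the recursion above as the from-scratch version.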
I agree with you about irrelevant questions, and out of the dozen or so coding quizzes I've done over the past several years, mostly with start up companies, those are unfortunately usually what I got. One company was advertising for a front end role and, sensibly enough, asked front end-related questions - that was the exception.
And the thought that quizzes provide a fair point of comparison comes across to me as putting process ahead of substance. Interviewing isn't meant to be fair to everyone - only one person gets the job, after all - so it's not like handing out cookies and stickers in middle school. It's meant to see, in part, whether the person is capable of generating working code. If you have a person who can provide samples to prove it, requiring an artificial quiz really is a slap in the face to a lot of good candidates.
Well, a good interview process has earlier steps to evaluate the individual's substance. Certainly, if someone approaches an interview with nothing but "here, do this quiz," that's not beneficial.
What's important to us is the person: their ability to communicate and learn, their interests, and then their skill set and whether what we have is a good fit for them and for us.
Focusing on a singular part of the interview without taking a step back and reviewing the whole isn't too beneficial.
A lot of tech interviewers fall back on the college experience of the pop quiz, or on what made them feel smart (which is why standardizing such things across managers is important). Really smart engineers who want to relive the A grades from school will try to be clever and undo such standards.
That all said, the correlation between great whiteboard / coding tricks and them being a successful part of a team isn't so great. It's not their job to code short programs under pressure of being watched. It's a proxy. There are better ways.
"Poor technical ability seems highly correlated with poor communication ability"
Yep. Articulate attention is the name of the game (where "communication ability" sounds a little nebulous).
If you can't organize your thoughts, bring them to the forefront of your attention, name them, you're likely bad at handling abstractions. And abstractions are at the core of "technical ability" -- the ability to name things, find the appropriate abstraction boundaries, chisel structure out of chaos.
Articulate speech is the greatest human invention for a reason.
Testing for that (plus conscientiousness -- can you pay attention to details and get shit done?) during interviews makes perfect sense.
If you filter the interviews to only interviewees who:
- liked the person
- rated the questions 3 or 4 stars
- gave the interviewer 3 or 4 stars for being helpful
Do the trends still hold?
How are those trends compared to only looking at interviews with:
- disliked the person
- rated the questions 1 or 2 stars
- gave the interviewer 1 or 2 stars for being helpful
Judging by this graph in the article, and somewhat counter to the claim in the article: https://plot.ly/~aline_interviewingio/952.png?share_key=Htks...
Looks to me that interview length is correlated with success rate. If your interviewer stops before 60 minutes, there's a bias towards successful interviews. It seems like the interviews that end up being "no"s tend to get hard-stopped right at the 1-hour mark.
I like to ask one question that probes basic analytic ability and a second question that probes programming aptitude. Generally, the first question either takes 3-5 minutes or the whole 45-50. It's usually a problem of the form "write a predicate (Boolean-valued) expression that is true when..." applied to something simple, and it's a basic test of being able to use relations and logical operations to characterize a situation. It's depressing how many great-looking candidates with awesome degrees, resumes, and phone-screen performances get stuck trying to describe how to tell whether two calendar entries (just start/end times) conflict with each other.
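A hedged sketch of one possible answer to the calendar question above. Treating entries as half-open [start, end) intervals is my assumption, not the original interviewer's; under that convention the predicate is a single conjunction:

```javascript
// Two calendar entries conflict exactly when each one starts before
// the other ends. With half-open [start, end) intervals, back-to-back
// meetings (one ends exactly when the next starts) do NOT conflict.
function conflicts(a, b) {
  return a.start < b.end && b.start < a.end;
}
```

The point of the question is precisely this kind of answer: characterizing the situation with relations and logical operations rather than enumerating cases.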
Here's a hint: if you aren't a fresh graduate, avoid companies that make you code in a browser under time pressure.
None of the graphs are loading for me. It says "If the problem persists, open an issue at support.plot.ly" Unfortunately, I have to pay money to file reports...
Looks like this has been fixed.
Same here
One of the language results doesn't make sense. It claims that it matters, significantly, if you solve interview problems in Java when the hiring company is a Java shop, but not when the hiring company is a C++ shop.
But that's reversed. It is in fact fairly difficult for a high-level language programmer to pick up C++, and facility with C++ (or at least C) is a common, accepted goal for C++ hiring shops. A C++ shop that hired candidates without regard for their aptitude at C++ would have real problems.
The data suggests that it is easier for a C++ programmer to make a good impression at a non-C++ shop than at a C++ shop, where they are likely to test you on C++ edge cases.
Interesting that this effect does not show up for Java programmers.
I think this makes sense, because C++ is seen as this difficult beast. Non-C++ shops will be impressed that you know any C++, while a C++ shop will want to dig much deeper and make sure you are sufficiently advanced at the language to do the job. Non-C++ shops won't dig as deeply and won't ask you about in-depth edge-case language features, either because it doesn't seem relevant (you're not going to be using C++ anyway, so you just need to show programming ability, not C++ mastery) or because the interviewers don't know C++ well enough themselves, since they don't use it in their jobs. In a C++ shop, by contrast, they will care about all of these things and have people who use C++ and can ask in-depth questions.
Why isn't it the same in Java? I'm not sure; perhaps it's because Java has fewer gotchas as a language (certainly a lot less undefined behaviour, fewer weird memory-related gotchas, no templates, no multiple inheritance, etc.) and C++ has this "it's a difficult language" prestige which Java doesn't have.
They give P ≈ 0.036 to this claim, in an article that I think makes more than 1/0.036 ≈ 28 claims, so I would suspect that maybe that's just a coincidence in their data?
There's just one problem: this assumes that code challenges are present in all (engineering) interviews.
Graphs have been fixed! Sorry about that, HNers.
There's something awry about the "fewer code execution errors" graph: I count an odd number of columns when they ought to be in pairs.
Clicking on the graph to go to plot.ly and viewing its "data" tab, it looks like there's a blank X value for:

    text  y  x
    0  bucketed_success_rate: -0.02<br>pct: 0.1<br>`Interviewer Would Hire`: False  0.0987654320987654

You might want to think about upping the contrast and/or font weight too. It's nearly unreadable for me and I have pretty good vision.
Edit - it looks way better if I disable Open Sans. It might just be a font issue with Chrome or Windows.
Order has been restored!
These people are not particularly good at interpreting statistics.
>These people are not particularly good at interpreting statistics.
Maybe that is a tad too harsh, but surely the use of "big difference" and "significant" doesn't seem justified by the actual data:
>On average, successful interviews had final interview code that was on average 2045 characters long, whereas unsuccessful ones were, on average, 1760 characters long. That’s a big difference! This finding is statistically significant and probably not very surprising.
An average of 1760 vs. an average of 2045 sits around a midpoint of roughly 1900 characters, so that is about 1900 ± 7%, and the difference is so small that almost anything could cause it.
To gain or lose 200 characters, merely naming variables a, b, c, etc. vs. FirstUserChoice or DefaultArrayIndexingField (you know what I mean) would be enough.
Same goes for:
>On average, successful candidates’ code ran successfully (didn’t result in errors) 64% of the time, whereas unsuccessful candidates’ attempts to compile code ran successfully 60% of the time, and this difference was indeed significant.
As I see it, 60% and 64% as averages are almost exactly the same number and carry very little practical significance. Maybe it is just me missing some sensibility ...
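The back-of-the-envelope comparison above can be sketched as follows; the relative-difference measure (gap over midpoint) is my choice of yardstick, and the inputs are the article's reported means:

```javascript
// Relative difference between two reported averages, measured against
// their midpoint: a quick check of whether a "statistically
// significant" gap is also practically large.
function relativeDiff(a, b) {
  return Math.abs(a - b) / ((a + b) / 2);
}
// relativeDiff(2045, 1760) is about 0.15  (character counts)
// relativeDiff(0.64, 0.60) is about 0.065 (run-success rates)
```

With large samples almost any nonzero gap becomes statistically significant, which is why the gap's size matters separately from its p-value.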
What is the blue bar on top of the page?
pace.js or something similar to it