Triplebyte Engineer Genome Project
When we started Triplebyte, we thought there would be pretty much a straight line from bad to great programmer, and that we'd just have to figure out where to put the cutoff when deciding whether to work with an engineer. The biggest surprise has been just how much disagreement there is among companies about what a "great engineer" actually means.
That's when we realized we were actually working on a mapping problem, and that the first step was figuring out a universal set of criteria that all companies care about. If we could then assign the right weight for each attribute at each company, we could route engineers only to companies where they'll be a strong technical fit.
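The routing idea reduces to a weighted score: each engineer gets a vector of scores on the shared criteria, each company a vector of weights, and candidates are routed to the companies where the weighted fit is highest. A minimal sketch of that shape (the criterion names, scores, and weights below are all invented for illustration, not Triplebyte's actual data):

```python
# Hypothetical criteria; the real set and all numbers are invented here.
CRITERIA = ["algorithms", "professional_coding", "low_level", "back_end_web"]

def fit_score(candidate, company_weights):
    """Weighted sum of a candidate's per-criterion scores."""
    return sum(candidate[c] * company_weights[c] for c in CRITERIA)

def route(candidate, companies, top_n=3):
    """Rank companies by weighted fit and return the best matches."""
    ranked = sorted(companies.items(),
                    key=lambda kv: fit_score(candidate, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_n]]

candidate = {"algorithms": 0.4, "professional_coding": 0.9,
             "low_level": 0.6, "back_end_web": 0.8}
companies = {
    "AlgoCo": {"algorithms": 0.9, "professional_coding": 0.2,
               "low_level": 0.5, "back_end_web": 0.1},
    "WebCo":  {"algorithms": 0.1, "professional_coding": 0.8,
               "low_level": 0.2, "back_end_web": 0.9},
}
print(route(candidate, companies, top_n=1))
```

A strong coder with weak algorithms gets routed to the company that weights coding process over algorithm puzzles, which is exactly the mismatch the post describes avoiding.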
It'd be great to get thoughts on the criteria we chose, and to hear experiences from engineers who have done a lot of technical interviewing.
To make it even more complicated, most companies might not even know what they need. The classic example is algorithms: many companies will say they care about algorithms, but few of them actually need those skills.
"Back-end web understanding" seems oddly specific compared to everything else on that list.
Why that and not something more general which might encompass other kinds of domain-specific knowledge? There are a lot of companies on your list which seem like they might care more about other skills that don't really fit anywhere else. (Experience working with databases for example, for one of the 7 or so database companies.)
I assume the (very broad) criteria listed on the blog are supersets of very specific sub-criteria. Maybe if you could elaborate on what exactly those sub-criteria are, it would make things clearer.
We have specific guidelines and a process we use to measure each one. For example, professional coding is a focus on writing clean code that is well designed at the micro level (good names, good modularity at the function/class level) and well tested. We measure this with a rubric as we watch each engineer code. Low-level understanding is knowledge of how computers work under the hood (bits, bytes, character encoding, operating systems, networking, etc.). Again, we have a rubric (and these topics are covered at multiple points as each engineer goes through the process). Our measurements are not perfect, but companies really do vary widely in how much they care about different areas (understanding how computers work, say, vs. having a great coding process). By mapping this and matching on it we save everyone pain. No one had tried to do this before, and I am pretty excited about it.
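A rubric like that plausibly reduces to item-level marks that get aggregated into one criterion score. The rubric items and scale below are invented to show the shape, not Triplebyte's actual rubric:

```python
# Illustrative only: items and the 1-4 scale are made up for this sketch.
def rubric_score(marks):
    """Average 1-4 marks on rubric items, mapped to a 0-1 criterion score."""
    return (sum(marks.values()) / len(marks) - 1) / 3

professional_coding = rubric_score({
    "good_names": 4,
    "function_modularity": 3,
    "testing": 2,
})
print(round(professional_coding, 2))
```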
I recall part of the Triplebyte screening process was to implement a solution to a whiteboard-type problem under a time constraint. It occurred to me that not all companies would necessarily place heavy weight on solving whiteboard problems in the interview process, yet Triplebyte seemed to filter applicants right out of the gate based on this measurement. I am curious if the genome project will alter this aspect of the screening process?
I went through the Triplebyte process (to the end) and did not ever implement a solution to a whiteboard problem under a time constraint.
I went through the Triplebyte process twice; once on the take-home project track and once on the "normal" track.
Their feedback to me on the take-home project track was "we thought you did a great job; very impressive. But you interview so poorly that we decided not to move forward." I don't see anything to do in response to that other than to keep trying to interview, so I tried to do the "normal" track.
It was entirely time-constrained, supervised whiteboard problems.
If I recall correctly, I was asked to implement a solution to a combinatorial problem within 1 hour.
I still don't understand why I would want to go through the hassle of doing an onsite interview with TripleByte only to have to go through further onsite interviews at the hiring companies?
If TripleByte's onsite interview allowed me to skip the onsite at the hiring company, then I'd be all for it, but it is like it's just a layer of friction.
For the record, I've had zero problems applying to companies by either emailing them or getting contacted by them via LinkedIn, email, etc. I just don't understand what benefit they bring at this moment. Maybe if the job market tightens and they were exclusive providers for companies, then sure, but all the SV companies have teams of recruiters emailing people all day long. As a hiring candidate there's no reason why I would want to go through their onsite.
Finding the right company to join is hard: you have to find which companies are doing interesting things that match your interests, and then narrow down to the ones where you'll be both a technical and cultural fit. Failed interviews are a big time suck, and we see that most people only have the stamina to interview with a few companies; they'll often accept one of the first offers they get rather than optimizing for the companies they're most excited about. We have the data to match you with companies you'll be a strong technical fit for, which saves you from wasting time speaking to companies that don't value your particular engineering skills. The end result is a more efficient job search: more options while speaking with fewer companies.
We also reduce the total amount of time engineers have to spend in technical interviews. Triplebyte candidates skip the technical phone screens, usually at least an hour per company. If you're speaking with at least 3 companies (as everyone working with us is), you've already saved time, since our technical interview is 2.5 hours.
Happy to talk more about this, harj AT triplebyte.
Sorry, I appreciate that you're trying to add value but I still don't see it.
Until you can get the hiring companies to eliminate their onsite interviews and only rely on you to test candidates programming abilities, then there really is no advantage to going through TripleByte. There are only disadvantages.
If I have a bad day and mess up the TripleByte interview, then I'm automatically excluded from a bunch of top companies through TripleByte. But if I apply to each company individually, I get the same chance to perform and a poor performance won't affect the others.
However, if I do well, the only thing I get after doing a TripleByte onsite is skipping a 1 hr phone screen, which, if I'm good, I'll be able to pass anyway. And I'm still relying on you to give me access to the companies, which may or may not be the ones I want.
There just doesn't seem to be any practical advantage to using TripleByte over any other recruiter. I'd in fact skip TripleByte because it requires me to do a lengthy onsite which would require me to take a day off (presumably you don't do this on weekends) on top of the onsites from other companies. And the advantage of skipping phone screens doesn't seem worth it.
We never provide negative info to companies. If you fail our interview, you are free to apply to whatever companies you want on your own.
The practical advantage of going through Triplebyte is that you'll pass more interviews. We match you with companies where you will pass at a higher rate. Because you only have time to interview at a fixed number of companies, this gives you more options, a higher salary (from more competition), and a better fit. We also help candidates negotiate (a big help if you're afraid to do this) and help people with bad resumes (say, self-taught programmers) get in the door at top companies. Traditional recruiters do just the opposite: they filter heavily on resumes.
I used Triplebyte when looking for a job recently. The advantage in it for me is that Triplebyte thought about what I said I was interested in and matched me up to a half dozen interesting small companies that I would never have found any other way.
That service is easily worth thousands of dollars to me (and I also used Hired.com for the same thing, where thousands of dollars would have been paid in lieu of my salary if I had been matched up.) Instead all I had to do was spend two or three hours writing some code and talking on the phone.
If there are other recruiters that competently provide this service then nobody told me about them.
"There just doesn't seem to be any practical advantage to using TripleByte over any other recruiter."
Except, of course, that the companies who work with TripleByte probably trust them a LOT to provide great/qualified candidates. Most companies don't trust recruiters very much at all-- some aren't terribly motivated by anything other than closing as many deals as possible, and the vast majority don't know anything about coding. And many/most of Triplebyte's companies probably don't work with recruiters.
Why do you think failing a tech screen with Triplebyte would exclude you from top companies? You could still apply to those top companies easily, no? When those companies get a promising applicant, they don't pore through Triplebyte's failed-screens data. Even if they wanted to, I presume that information isn't actually available to them.
Cost/benefit-wise, it seems like going with Triplebyte adds credibility and saves time, unless there aren't 2+ YC companies you are excited to apply to.
We've worked with enough engineers to be confident they're finding value and advantages to using Triplebyte. The feedback has included finding interesting companies (especially earlier stage ones) they'd not known about, skipping technical phone screens (even if you know you'll pass, they still suck), getting feedback on the Triplebyte technical interview, having interview scheduling handled by us, getting a high offer rate and help thinking through offers.
Assuming you can get companies to engage with you (i.e. you already have the right resume credentials) you could invest the time in achieving these same goals yourself as you're saying. For people who don't want to make that time investment, we can save time in a way that other recruiters can't by filtering companies using data about your technical skills and skipping those phone screens. The companies we work with don't trust other recruiters to do this screening correctly.
Our interview also doesn't require taking a day off work, it takes 2.5 hours and is done remotely via Google Hangouts.
Technical phone screens don't have to suck if both parties have a good attitude toward them - they should be fun! Just like our jobs should be more or less fun. Can Triplebyte match me only with companies with this perspective?
While hiring is indeed a big problem that can be addressed with a data-driven approach, I'm not sure the approach of "we have data, just trust us" is fair to all parties.
The name Engineer Genome Project is styled after Pandora's Music Genome Project. The difference is that Pandora uses data to provide relevant and immediately verifiable results to the user, such as music in the same genre or by similar artists. In contrast, the Engineer Genome Project uses criteria such as "applied problem solving" and "professional code" that are impossible for a user to interpret intuitively.
Well, the engineers who go through our process are in a good position to verify the effectiveness of the matching. Granted, the bar to reach that point and check the quality is higher than it is for Pandora (readers can't go check right now what their matches would be). But I don't think that's an argument against trying to do a better job matching engineers with companies. This is an important area that's been largely overlooked.
The categories that you mention (applied problem solving and professional code) really are important. Companies differ widely in how much they care about those two things (solving problems in the interview effectively vs. showing clean, well-structured code and good testing process). When an effective but iterative (and sometimes sloppy) programmer interviews at a company that values process highly, the result is wasted time and pain for everyone.
Link to their blog post: http://blog.triplebyte.com/triplebyte-engineer-genome-projec...
Ok, we changed the URL to that from http://techcrunch.com/2016/05/04/technical-recruiting-platfo..., which points to this.
For a second I thought they were going to start sequencing genomes of engineers and start screening via DNA...
That's under the "Long Term" section of our product roadmap.
I think (or hope) that you forgot the "/s" at the end.
I think this writeup is a tad lengthy. It's not until the fifth paragraph that I understand what's even going on.
>Intelligent matching with software is how hiring should work. Failed technical interviews are a big loss for both sides. They cost companies their most valuable resource, engineering time. Applicants lose time they could have spent interviewing with another company that would have been a better fit.
I feel like that should have been the headline for this. For a company that is meant to match people to companies, I think their external communication should be excellent not just good. How can I trust that this company will communicate my strengths and weaknesses in a way other people can understand if it's difficult for me to follow one of their flagship blog posts?
Whenever someone decides to hire someone, all of their criteria are heavily biased towards what kind of skills the candidates possess.
I wonder if someone can come up with a reasonably accurate way to determine how well or easily a candidate can acquire particular skills.
I realize this line of thought might not be popular for most startups who would want someone to get going as soon as they start. But if you're having a tough time hiring a Machine Learning engineer and you get applications from a bunch of smart folks who want to gain experience in Machine Learning, would it be a good idea to give them a shot?
The traditional 'puzzle solving' in interviews was probably geared in this direction, but I'm wondering if there are better ways to gauge this.
This is something we're able to do too by encouraging people to reapply and tracking how much they've improved between technical interviews. It makes sense for companies to do this too but they don't, mostly because it's never any single person's area of focus.
Can you share more information about your findings here?
More specifically, has any company ever been content with the delta in experience/knowledge a candidate might have gained between interviews, enough to hire them? This as opposed to continuing to evaluate the candidate against an absolute benchmark.
Because if not, then this sort of evaluation doesn't really help, does it?
If people reapply wouldn't they know the interview already? Seems to me TB is obsessed with UDP and thinks it is 70% of what engineering is all about.
User Datagram Protocol? What gives you that impression? In any case, we run several versions of the interview to allow people to reapply.
> I wonder if someone can come up with a reasonably accurate way to determine how well or easily can a candidate acquire particular skills.
This is known as an "IQ test".
I understand why many people are skeptical, but you guys are doing something new in a space that has long been ignored. Keep up the good work!
I got rejected by Triplebyte and then hired by Google a while later. I prepared much more for the Google interview though, so I don't know if Triplebyte was at fault.
I have to say, my interest was piqued by your hiring strategy.
It's a shame it's limited to just engineers. I've been looking for a recruiter company like this for data science.
Very interesting. How do I know what I value as an employer?
Edit to my own question....
The 7 genome dimensions look really reasonable. But, hypothetically speaking, I still want it all!
Employers don't actually (in most cases) have a very good grasp of what qualities they select for. It's a function of the engineers doing the interviews and the engineering culture, and most companies are not aware of how much this differs between companies. We model what each company looks for by sending them candidates with specific attributes and reading their feedback (we get honest feedback from companies after interviews, which is pretty rare). We're then able to see how feedback for the same candidate differs between companies.
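One toy way to picture inferring a company's weights from pass/fail feedback on candidates with known attribute scores is a simple online update toward candidates they liked. This is a sketch of the general idea only; the learning rule, criteria, and data are all invented, not Triplebyte's actual model:

```python
# Toy sketch: nudge a company's estimated criterion weights using
# interview feedback. Everything here is invented for illustration.
def update_weights(weights, candidate, passed, lr=0.1):
    """Move weights toward a passed candidate's strengths, away otherwise."""
    sign = 1 if passed else -1
    return {c: weights[c] + sign * lr * candidate[c] for c in weights}

weights = {"algorithms": 0.5, "professional_coding": 0.5}
feedback = [
    ({"algorithms": 0.9, "professional_coding": 0.3}, True),   # passed
    ({"algorithms": 0.2, "professional_coding": 0.8}, False),  # failed
]
for candidate, passed in feedback:
    weights = update_weights(weights, candidate, passed)
print(weights)  # the algorithms weight rises, professional_coding falls
```

After enough honest feedback, the estimated weights would reveal what the company actually selects for, which (per the comment above) can differ from what they say they want.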
Interesting approach. How do you plan to scale this since you also need data points from the actual interviews done by companies?
How does one assess communication skill without humans? A leap forward in NLP?