Who Y Combinator Companies Want

If you’re a programmer interested in joining a YC startup, apply to Triplebyte and we’ll match you with the ones you’d be the best fit for.

Companies disagree significantly about the types of programmers they want to hire. After 6 months doing technical interviews and sending the best engineers to Y Combinator companies (and interviewing the founders and CTOs at the top 25), we’ve analyzed our data. There are broad trends, but also a lot of unpredictability. Key takeaways include:

1. The types of programmers that each company looks for often have little to do with what the company needs or does. Rather, they reflect company culture and the backgrounds of the founders. It’s nearly impossible to judge these preferences from the outside. At most companies, however, non-technical recruiters reject 50% of applicants by pattern matching against these preferences. This is a huge frustration for everyone involved.

2. Across the companies we work with, there are several notable trends. First, companies are more interested in engineers who are motivated by building a great product, and less interested in engineers with purely technical interests. This is at odds with the way the majority of programmers talk about their motivations: there's a glut of programmer interest in Machine Learning and AI. Second, companies dislike programmers with enterprise backgrounds. Our data shows that companies are less likely to hire programmers coming from Java or C# backgrounds.

3. These results show extrapolation from insufficient data on the part of many companies. Talent can be found among programmers of all backgrounds. We're mapping these preferences across all YC companies in more detail, and encouraging companies to consider people they would normally reject. In the meantime, programmers looking for jobs with YC companies may want to focus more on product, and be sure to mention experience outside of Java and C#.

The problem

My co-founders and I have been running a recruiting company (Triplebyte) for the last 6 months. We interview programmers, and help the best ones get jobs at YC companies. We do our interviews without looking at resumes (in order to find great people who look bad on paper), and then see feedback on each engineer from multiple companies. This gives us a unique perspective on who YC companies want to hire.

When we started, we imagined a linear talent scale. We thought that most companies would be competing for the same (top 5%) of applicants, and all we had to do was measure this. One of the first people to pass our process really impressed us. He was a superb, intelligent programmer. He solved hard algorithm problems like they were nothing, and understood JavaScript deeply. We introduced him to a company he was excited about, and sat back to watch him get a job. We were startled when he failed his first interview. The company told us they valued process more than raw ability, and that he hadn't written tests during the interview. He went on to get a bunch of offers from other companies, and one founder told us he was among the best programmers they had ever interviewed.

This lack of agreement is the rule, not the exception. Almost no one passes all their programming interviews. This is true because of randomness in many interview processes (even great people are bad at some things, and an interviewer focusing on this can yield a blocking no), and also because companies look for very different skills. The company that rejected our first candidate ranked testing in the interview above algorithmic ability and JavaScript knowledge.

Mapping these preferences, it was clear, was key to helping engineers find the right startups. If we could route programmers to companies where their skills were valued, everyone would win. To that end, we’ve spent the last two months doing detailed interviews with CTOs and lead recruiters at the top 25 Y Combinator companies. In this blog post I’m going to write about what we learned from talking to these companies and sending them engineers. It’s interesting, and I hope useful for people applying for programming jobs.

Setup

To map the preferences of the top YC companies, we wrote paragraphs describing 9 hypothetical programmers, embodying patterns we’d seen from running 1000+ interviews over the last 6 months. These range from the “Product Programmer” who is more excited about designing a product and talking to users than solving technical challenges (we internally call this the Steve Jobs programmer) to the “Trial and Error Programmer” who programs quickly and is very productive, but takes an ad hoc approach to design. In reality, these profiles are not mutually exclusive (one person can have traits of several).

We then set up meetings with the founders and lead recruiters at the top 25 YC Companies. In the meetings we asked each company to rank the 9 profiles in terms of how excited they were to talk to people with those characteristics.

Results

The grid that follows shows the results[1]. Each row shows the preferences of a single (anonymized) company. Each column is a hypothetical profile. Green squares mean the company wants to interview engineers matching the profile; red squares mean they do not. Empty squares are cases where the founders' opinions were too nuanced to be rounded to interest or lack of interest.


The first thing that jumps out is the lack of agreement. Indeed, no single company was interested (or uninterested) in all 9 profiles, and no profile was liked (or disliked) by more than 80% of companies. The inter-rater reliability of this data (a measure of the agreement among a group of raters) comes out at 0.09[2]. That's fairly close to 0: company preferences are close to unpredictable.

The impact of these preferences on programmers, however, is totally predictable: they fail interviews for opaque reasons. Most companies reject a high percentage of applicants during a recruiter call (or resume screen). Across the 25 companies we interviewed, an average of 47% of applicants were rejected this way (the rate at individual companies ran as high as 80%, and as low as 0%). The recruiters doing this rejecting are non-technical. All they can do is reject candidates who don't match the profile they've been taught to look for. We've seen this again and again when we intro candidates to companies. Some companies don't want to talk to Java programmers. Others don't want academics. Still others only want people conversant in academic CS. We've seen that most engineers only have the stomach for a limited number of interviews, so investing time in the wrong companies carries a high opportunity cost.

I don’t want to be too hard on recruiters. Hiring and interviewing are hard, shortcuts must be taken to keep the team sane, and there are legitimate reasons for a company to enforce a specific engineering culture. But from the point of view of programmers applying for jobs, these company preferences are mercurial. Companies don’t advertise their preferences. People who don’t match simply apply, and are rejected (or often never hear back).

Patterns

There is some agreement among companies, however, and it’s interesting.

1. There's more demand for product-focused programmers than there is for programmers focused on hard technical problems. The "Product Programmer" and "Technical Programmer" profiles are identical, except one is motivated by product design, and the other by solving hard programming problems. There is almost twice as much demand for the product programmer among our companies. And the "Academic Programmer" (hard-problem focused, but without the experience) has half again less demand. This is consistent with what we've seen introducing engineers to companies. Two large YC companies (both with machine learning teams) have told us that they consider interest in ML a negative signal. It's noteworthy that this is almost entirely at odds with the motivations that programmers express to us. We see ten times more engineers interested in Machine Learning and AI than we see interested in user testing or UX.

2. (Almost) everyone dislikes enterprise programmers. We don't agree with this. We've seen a bunch of great Java programmers. But it's what our data shows. The Enterprise Java profile is surpassed in dislikes only by the Academic Programmer. This is in spite of the fact that we explicitly say the Enterprise Programmer is smart and good at their job. In our candidate interview data, this carries over to language choice. Programmers who used Java or C# (when interviewing with us) go on to pass interviews with companies at half the rate of programmers who use Ruby or JavaScript. (The C# pass rate is actually much lower than the Java pass rate, but the C# numbers are not yet significant by themselves.) Tangential facts: programmers who use Vim with us pass interviews with companies at a higher rate than programmers who use Emacs, and programmers on Windows pass at a lower rate than programmers on OS X or Linux.

3. Experience matters massively. Notice that the Rusty Experienced Programmer beats both of the junior programmer profiles, in spite of stronger positive language in the junior profiles. It makes sense that there’s more demand for experienced programmers, but the scale of the difference surprised me. One prominent YC company just does not hire recent college grads. And those that do set a higher bar. Among our first group of applicants, experienced people passed company interviews at a rate 8 times higher than junior people. We’ve since improved that, I’ll note. But experience continues to trump most other factors. Recent college grads who have completed at least one internship pass interviews with companies at twice the rate of college grads who have not done internships (if you’re in university now, definitely do an internship). Experience at a particular set of respected companies carries the most weight. Engineers who have worked at Google, Apple, Facebook, Amazon or Microsoft pass interviews at a 30% higher rate than candidates who have not.

Advice

If you're looking for a job as a programmer, you should pay attention to these results. Product-focused programmers pass more interviews. Correlation is not causation, of course. But company recruiter decisions are driven largely by pattern matching, so there is a strong argument that making yourself look like the candidates companies want will increase your pass rate. You may want to focus more on product when talking to companies (and perhaps focus on companies where you are interested in the product). This is a way to stand out. Similarly, if you're a C# or Java programmer applying to a startup, it may behoove you to use another language in the interview (or at least talk about other languages and platforms with your interviewer). Interestingly, we did talk to two YC companies that love enterprise programmers. Both have founders who come from this background themselves. Reading founders' bios and applying to companies where the CTO shares your background is probably an effective job-search strategy (or you could apply through Triplebyte).

If you run a startup and are struggling to hire, you should pay attention to these results too. Our data clearly shows startups missing strong candidates because of preconceptions about what a good programmer looks like. I think the problem is often extrapolation from limited data. One company we talked to hired two great programmers from PhD programs early on, and now loves academics. Another company had a bad PhD hire, and is now biased against that degree. In most cases, programming skill is orthogonal to everything else. Some companies have legitimate reasons to limit who they hire, but I challenge all founders and hiring managers to ask themselves if they are really in that group. And if you're hiring, I suggest you try to hire from undervalued profiles. There are great PhDs and enterprise C# programmers interested in startups. Show them some love!

Conclusion

YC Startups disagree strikingly about who’s a good engineer. Each company brings a complex mix of domain requirements, biases, and recruiter preferences. Some of these factors make a lot of sense, others less so. But all of them are frustrating for candidates, who have no way to tell what companies want. They waste everyone’s time.

I’m excited about mapping this. Since we started matching candidates based on company preferences (as well as candidate preferences), we’ve seen a significant increase in interview pass rates. And we only just completed the interviews analyzed in the post. I’m excited to see what this data does. Our planned next step is to not only interview founders and recruiters at these companies, but also have the engineers who do the bulk of the actual interviewing provide the same data.

Our goal at Triplebyte is to build a better interview process. We want to help programmers poorly served by standard hiring practices. We’d love to have you apply, even if — or especially if — you come from one of the undervalued groups of programmers mentioned in this article. We’d also love to get your thoughts on this post. Send us an email at founders@triplebyte.com.

Thanks to Jared Friedman, Emmett Shear, Daniel Gackle, Greg Brockman, and Michael Seibel for reading drafts of this.


Footnotes:

1. Astute readers will notice that there are more than 25 rows in the graph. This is because we’ve recently added these questions to our onboarding flow for new companies we work with. If you run a YC Company, you can log into Triplebyte with your company email address, and add this data (we’ll use it to send you more candidates).

2. I calculated this using Fleiss' kappa. This measures the agreement among a number of raters, with -1 being perfect disagreement, 0 being the agreement that would result from random coin tosses, and 1 being perfect agreement.
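For the curious, here's a minimal sketch of the calculation in Python. It assumes each company gave a binary rating (interested / not interested) and that every profile was rated by the same number of companies; the counts in the example are made up for illustration, not our actual data.

  # Fleiss' kappa for a table with one row per subject (profile) and one
  # column per category, each cell counting the raters who chose it.
  def fleiss_kappa(ratings):
      n_subjects = len(ratings)
      n_raters = sum(ratings[0])  # assumed equal for every subject
      total = n_subjects * n_raters

      # Proportion of all ratings that fell in each category.
      p_cat = [sum(row[j] for row in ratings) / total
               for j in range(len(ratings[0]))]

      # Per-subject agreement: fraction of rater pairs that agree.
      def agreement(row):
          return (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))

      p_observed = sum(agreement(row) for row in ratings) / n_subjects
      p_chance = sum(p * p for p in p_cat)
      return (p_observed - p_chance) / (1 - p_chance)

  # Hypothetical data: 9 profiles rated by 25 companies, [interested, not]
  example = [[13, 12], [20, 5], [7, 18], [12, 13], [14, 11],
             [18, 7], [9, 16], [11, 14], [15, 10]]
  print(fleiss_kappa(example))  # values near 0 mean near-chance agreement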

A Taxonomy of Programmers

We’ve been interviewing hundreds of programmers and matching them with YC startups. To help intelligently match programmers with companies, we’ve created a number of hypothetical programmer descriptions. These profiles are drawn from patterns we’ve seen in 1000+ technical interviews over the last 6 months. We’ve had success using these profiles to match engineers with companies. If you have any suggestions for additional profiles, we’d love to hear about them in the comments.

Academic Programmer: Candidate has spent most of their career in academia, programming as part of their Master's/PhD research. They have very high raw intellect and can use it to solve hard programming problems, but their code is idiosyncratic.

Experienced Rusty Programmer: Candidate has a lot of experience, and can talk in depth about different technology stacks and databases, explaining their positives and negatives in fine detail. When programming during an interview, they're a little rusty. They usually get to the right place, but it takes a while.

Trial and Error Programmer: Candidate writes code quickly and cleanly. Their approach seems to involve a lot of trial and error, however. They dive straight into programming problems, and while their approach is a little ad hoc, their speed lets them ultimately solve the problems productively.

Strong Junior Programmer: Candidate is fresh out of college, with some internships and less than a year of full-time work experience. They really impress during a technical interview, have numerous side projects, and have impressive knowledge of computer science and programming in general. They're well above the average junior programmer.

Child Prodigy Programmer: Candidate is very young (e.g. 19 years old) and decided to go straight into work, skipping college. They've been programming since a very young age and are very impressive in their ability to solve hard technical problems. They've also been prolific with side projects and are mature for their age. It's likely they'll found a company of their own in the future.

Product Programmer: Candidate performs well on technical interviews and will have the respect of other engineers. They're not motivated by solving technical problems, however. They want to think about the product, talk to customers, and have input into how product decisions are made.

Technical Programmer: Candidate is the inverse of the Product Programmer. They interview well and communicate clearly. But they aren’t motivated to think about the user experience or product decisions. They want to sink their teeth into hard technical problems.

Practical Programmer: Candidate solves practical programming problems with ease, even quite abstract ones. They aren't comfortable with computer science terminology, though (e.g. data structures, algorithms), and don't have a deep understanding of how computers work. They are strongest in Ruby/Python/JavaScript, not so much in lower-level languages like C.

Enterprise Programmer: Candidate is strong in academic computer science (algorithms, data structures, complexity analysis), has experience, and solves technical problems well. Their work experience is with large enterprise companies (e.g. Dell/Oracle/IBM). They want to join a startup, although they don't have experience taking ownership of projects. They program mostly in Java, using an IDE such as Eclipse.

Note: If you run a YC Company, you can log into Triplebyte with your company email address, and add your preferences (we’ll use it to send you more candidates).

Gaming the H-1B system (for good)

A recent article in the NY Times exposed how flawed the H-1B lottery process is. A handful of giant outsourcing companies flood the system with applications, making it nearly impossible for startups to hire international engineers.

These companies are gaming the system. But there is a way to turn this game against them, by exploiting the Achilles' heel in their plan: the H-1B transfer. Getting an H-1B is tough because, regardless of your personal merits, you're in a lottery with thousands of other candidates. Your choice of employer is limited to those willing to play the lottery. There's no lottery for transferring an H-1B, though. The process is straightforward, with no quota; you just have to find an employer willing to file the paperwork. That gave us an idea.

We're announcing the Triplebyte H-1B transfer program. If you're working on an H-1B at one of these outsourcing companies, apply to Triplebyte and we'll cover all the costs of transferring your H-1B. We'll help you find a startup doing work you're excited about, and we'll walk them through the H-1B transfer process, making it a no-brainer for them. We'll also provide you with an immigration lawyer to answer any questions you have, and we'll cover the cost of that too.

We're going to expand the pool of startups doing H-1B transfers so you have the same choice as anyone else. We recently placed an engineer via an H-1B transfer at a startup that wouldn't have considered doing this without our help. Many founders mistakenly assume that applying for an H-1B and transferring one are synonymous.

Helping great people move here is something that's personally important to us. My life was changed by moving out here to work on my first startup (after a year of trying various approaches to getting a visa). My co-founder Guillaume moved here from France to work at Justin.tv and then founded his own startup, Socialcam. We want to see more talented people coming here to work on building the future, not serving as cheap labor for giant corporations.



Thanks to Theo Negri and Buildzoom for shining a light on this issue in the original story.

Take-home interviews

Today we're announcing our second experiment: take-home projects. We're going to try a new way of assessing programming ability, by having programmers work on a project on their own time instead of coding during an interview. We know there are benefits and drawbacks to this approach; I'll go into more detail on our thinking below.

Anyone who passes our take-home project assessment will get exactly the same service from us as people who do the regular interviews. We'll work hard to find several YC startups they'd be a great fit for, fast track them through the hiring processes, and handle all logistics of flights/accommodations/scheduling.

The Problem

Several weeks ago, we interviewed a recent college grad. He'd done well on our quiz, had great personal projects, and I was excited to talk to him. As soon as the interview started, however, I could tell that something was wrong. I gave him a programming problem, but he could not get started. He'd start to write one thing, mutter that it was a bad place to start, and go back to something else. He switched languages. His breathing accelerated. He started to shake.

Programming interviews are stressful. Fundamentally, the applicant is being judged. They have to understand the question and produce a working solution in limited time, all while explaining everything they are doing, with no time to stop and gather their thoughts. At its worst, it's adversarial.

Some programmers find that this stress pushes them to do their best in interviews. Others find it debilitating. There are programmers with track records of solving hard problems who simply freeze when subjected to the stress of an interview. They babble. They become unable to program.

This does not mean that they are bad programmers[1]. I gave the fellow in our interview a much harder problem to do on his own time. I assumed that he'd never get back to us. The project was a lot of work. Three days later, however, I had a complete solution in my inbox. We got him back on the phone, and he was able to talk in depth about what he had done, about the underlying algorithms, and about the design trade-offs he'd made. The code was clean. He was clearly a skilled programmer.

The Solution

To solve the problem of interview anxiety, we're adding a second track to our interview process at Triplebyte. Applicants, if they choose, will be able to go through our process by completing programming projects on their own time. They'll still do interviews with us, but rather than doing interview problems, they will just talk about the project they already completed. Those who do well will be matched with Y Combinator companies, just like programmers who go through our regular interview.

The project-based track will require a larger time commitment (and we expect lots of people to stick with the standard track for this reason). However, doing a larger project is almost certainly a better measure of actual ability to do a job than a traditional interview is.

Here's how our process works:
  1. When a candidate books a 45-minute interview, they can indicate that they want to do a project.
  2. Three days before the interview, we'll send them a list of projects, and they'll pick one and start to work on it. We expect them to spend about 3 hours on the project (or as long as they want to spend to show us that they're a good programmer).
  3. During the interview, we'll talk about what they've programmed, go over design choices and give feedback.
People who pass the 45-minute interview will go through the same process in the 2-hour final interview. Rather than pick a new project, however, they'll take the same project further, incorporating feedback from the first interview. Those who pass the 2-hour interview will talk to Harj, get introduced to YC companies, and start new jobs!

I'm particularly excited about being able to see iterative improvements to the project between the two interviews (an important part of doing an actual job). It's an experiment, and I have no idea how it will turn out, but giving people the option to do larger projects and avoid stressful interviews just seems like a good idea. In a few months, after we've done a meaningful number of these interviews, I'll write about how their results compare to our other interviews.

1. The stress of interviewing seems to be different from the stress of performing a job. None of the people we've spoken to who do poorly in interviews report problems performing under deadlines at work, or when a website is down and there's pressure to get it back up.

Three hundred programming interviews in thirty days

We launched Triplebyte one month ago, with the goal of improving the way programmers are hired. Too many companies run interviews the way they always have, with resumes, white boards and gut calls. We described our initial ideas about how to do better than this in our manifesto. Well, a little over a month has now passed. In the last 30 days, we've done 300 interviews. We've started to put our ideas into practice, to see what works and what doesn't, and to iterate on our process. In this post, I'm going to talk about what we've learned from the first 300 interviews.

I go into a lot of detail in this post. The key findings are:
  1. Performance on our online programming quiz is a strong predictor of programming interview success
  2. FizzBuzz-style coding problems are less predictive of programming interview success
  3. Interviews where candidates talk about a past programming project are also not very predictive

Process

Our process has four steps:
  1. Online technical screen.
  2. 15-minute phone call discussing a technical project.
  3. 45-minute screen share interview where the candidate writes code.
  4. 2-hour screen share where they do a larger coding project.
Candidates work on their own computers, using their own dev environments and strongest languages. In both of the longer interviews, they pick the problem or project to work on from a short list. We're looking to find strengths, so the idea is that most candidates should be able to pick something they're comfortable with. We keep the list of options short, however, to help standardize evaluation. We want to have a lot of data on each problem.

We're looking for programming process and understanding, not leaps of insight. We do this by offering help with the design/algorithm of each problem (and not penalizing candidates for taking it). We evaluate interviews with a score card. For now we go a little overboard, tracking the time to reach a number of milestones in each problem. We also score understanding, whether the candidate speaks specifically or generally, whether they seem nervous, and a bunch of other things (basically everything we can think of). Most of these, no doubt, are horrible measures of performance. We record them now so that we can figure out which are good measures later.
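To make that concrete, here's a sketch of what one of our score cards amounts to. The field names and the example milestones are illustrative, not our literal internal format.

  from dataclasses import dataclass, field

  @dataclass
  class ScoreCard:
      problem: str
      milestone_minutes: dict = field(default_factory=dict)  # milestone -> minute reached
      understanding: int = 0             # rubric score, 1-5
      speaks_specifically: bool = False  # specifics vs. generalities
      seemed_nervous: bool = False
      notes: str = ""

  # Filling one in during a hypothetical interview:
  card = ScoreCard(problem="game of life")
  card.milestone_minutes["grid prints"] = 10
  card.milestone_minutes["update rule works"] = 30
  card.understanding = 4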

Screening

The first experiment we ran was screening people without looking at resumes. Most job applicants are rejected at the screening stage. The sad truth is that a high percentage of the people applying for any job post on the Internet are bad. To protect the time of their interviewers, companies need a way to filter people early, at the mouth of the hiring funnel. Resumes are the traditional way to do this. However, as Aline Lerner has shown, resumes don't work. Good programmers can't be reliably distinguished from bad ones by looking at their resumes. This is a problem. What the industry needs is a way to screen candidates by looking at their actual ability, not where they went to school or worked in the past[1]. To this end, we tested two screening steps:
  1. A fizzbuzz-like programming assignment. Applicants completed two simple problems. We tracked the time to complete each, and manually graded each on correctness and code quality.
  2. An automated quiz. The questions on the quiz were multiple choice, but involved understanding actual code (e.g., look at a function and select which of several bugs is present; a made-up example follows this list).
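To give a flavor of the quiz, here's a made-up question in its spirit (not an actual question we use):

  # Which bug does this function have?
  def binary_search(xs, target):
      lo, hi = 0, len(xs)
      while lo < hi:
          mid = (lo + hi) // 2
          if xs[mid] < target:
              lo = mid  # suspicious line
          elif xs[mid] > target:
              hi = mid
          else:
              return mid
      return -1

  # (a) lo never advances past mid, so some searches loop forever  <- the bug
  # (b) hi starts one element too far                              <- not present
  # (c) the comparisons are reversed                                <- not present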
We then correlated the results of these two steps with success in our subsequent 45-minute technical interview. The following graph shows the correlations after 300 interviews.

Correlation between screening steps and interview decisions


We can see that the quiz is a strong predictor of success in our interviews! Almost a quarter of interview performance (23%) can be explained by the score on the quiz, and 15% can be explained by quiz completion time (faster is better). Speed and score are themselves only loosely correlated (being accurate means you're only slightly more likely to be fast). This means they can be combined into what we're calling the composite score, which has the strongest correlation of all and explains 29% of interview performance![2]
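For readers who want to see the shape of this analysis, here's a sketch in Python. The data is simulated, and the composite below (equal-weight z-scores of accuracy and speed) is a plausible reconstruction rather than our exact formula.

  import numpy as np

  rng = np.random.default_rng(0)
  n = 300  # one row per candidate (simulated stand-ins for real quiz data)

  skill = rng.normal(size=n)
  quiz_score = skill + rng.normal(size=n)
  quiz_time = -0.5 * skill + rng.normal(size=n)  # skilled people finish faster
  passed = (skill + rng.normal(size=n) > 0).astype(float)

  def zscore(x):
      return (x - x.mean()) / x.std()

  # Hypothetical composite: reward accuracy, penalize slowness.
  composite = zscore(quiz_score) - zscore(quiz_time)

  for name, x in [("score", quiz_score), ("speed", -quiz_time),
                  ("composite", composite)]:
      r = np.corrcoef(x, passed)[0, 1]  # point-biserial correlation
      print(f"{name}: R = {r:.2f}, R^2 = {r*r:.2f}")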

The fizzbuzz-style coding problems, however, did not perform as well. While the confidence intervals are large, the current data shows less correlation with interview results. I was surprised by this. Intuitively, asking people to actually program feels like the better test of ability, especially because our interviews (the measures we're using to evaluate screening effectiveness) are heavily focused on coding. However, the data shows otherwise. The coding problems were also harder for people to finish. We saw twice the drop-off rate on the coding problems that we saw on the quiz.

Talking versus coding

Before launching, we spoke to a number of smart people with experience in technical hiring to collect ideas for our interviews. The one I liked the most was having candidates talk us through a technical project, including looking at source code. This seemed like it'd be the least adversarial, most candidate-friendly approach.

As soon as we started doing them, however, I saw a problem. Almost everyone was passing. Our filter was not filtering. We tried extending the duration of the interviews to probe deeper, and looking at code over Google Hangouts. Still, the pass rate remained too high.

The problem was we weren’t getting enough signal from talking about projects to confidently fail people. So we started following up with interviews where we asked people to write code. Suddenly, a significant percentage of the people who had spoken well about impressive-sounding projects failed, in some cases spectacularly, when given relatively simple programming tasks. Conversely, people who spoke about very trivial sounding projects (or communicated so poorly we had little idea what they had worked on) were among the best at actual programming.

In total we did 90 experience interviews, scoring each across several factors (did the person seem smart, did they understand their project well, were they confident, and was the project impressive). Then we correlated our factors with performance in the 45-minute programming interview. Confidence had essentially zero correlation. Impressiveness, smartness and understanding each had about a 20% correlation. In other words, experience interviews underperformed our automated quiz in predicting success at coding.

Now, talking about past experience in more depth may be meaningful. This is how (I think) I know which of my friends are great programmers. But, we found, 45 minutes is not enough time to make talking about coding a reasonable analog for actually coding.

Interview duration, and interviewer sentiment

A final test we ran was to look at when during the interview we make decisions. Laszlo Bock, VP of People at Google, has written much about how interviewers often make a decision in the first few minutes of an interview, and spend the rest of the time backing it up. I wanted to make sure this was not true for us. To test this, we added a pop-up to our interviewing software, asking us every five minutes during each interview whether the candidate was performing well or poorly. Looking at these sentiments in aggregate, we can tell exactly when during each interview we made the decision.

We found that in 50% of our 45-minute interviews, we "decide" (become positive for someone who ends up passing, or negative for someone who does not pass) in the first 20 minutes. In 20%, however, we do not settle on our final sentiment until the last 5 minutes. In the 2-hour interview, the results are similar. We decide 60% in the first 20 minutes (both positively and negatively), but 10% make it almost to the 2-hour mark. (In that case, unfortunately, it's positives turning to negatives, because we can't afford to send people we're unsure about to companies.)[3]
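Here's a sketch of how a decision point can be read off those five-minute samples: the moment sentiment settles on the final outcome and never flips afterwards. The names are illustrative, not our actual tooling.

  def decision_minute(samples, final_pass):
      # samples: list of (minute, positive_sentiment) pairs, in interview order.
      decided_at = None
      for minute, positive in samples:
          if positive == final_pass:
              if decided_at is None:
                  decided_at = minute  # sentiment first matches the outcome here
          else:
              decided_at = None  # a flip: the decision wasn't final yet
      return decided_at

  # Positive early, a wobble at minute 20, settled from minute 25 on:
  samples = [(5, True), (10, True), (15, True), (20, False),
             (25, True), (30, True), (35, True), (40, True)]
  print(decision_minute(samples, final_pass=True))  # -> 25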

Conclusion

It's been a crazy month. Guillaume, Harj and I have spent nearly all our time in interviews. Sometimes, at 10 PM on a Saturday, after a day of interviewing, I wonder why we started this company. But as I write this blog post, I remember. Hiring decisions are important, and too many companies are content to do what they've always done. In our first 30 days, we've come up with a replacement for resume screens, and shown that it works well. We've found that programming experience interviews (used at a bunch of companies) don't work particularly well. And we've written software to help us measure when and why we make decisions.

For now, we're evaluating all of our experiments against our final round interview decisions. This does create some danger of circular reasoning (perhaps we're just carefully describing our own biases). But we have to start somewhere, and basing our evaluations on how people write actual code seems like a good place. The really exciting point comes when we can re-run all this analysis, basing it on actual job performance, rather than interview results. Doing that is why we started this company.

Next, we want to experiment with giving candidates projects to do on their own time (I'm particularly interested in making this an option, to help with interview anxiety), and interviews where candidates are asked to work with an existing codebase. We're also adding harder questions to the quiz, to see if we can improve its effectiveness. We'd love to hear what you think about these ideas. Email us at founders@triplebyte.com.

Thanks to Emmett Shear, Greg Brockman and Robby Walker for reading drafts of this.

An earlier version of this post confused the correlation coefficient R with R^2, and overstated the correlations. Since this post was published, however, a new version of the quiz has increased the correlation of the composite score to 0.69 (R^2 = 0.47).

1. This is a complex issue. There are good arguments for allowing experienced programmers to skip screening steps, and not have to continually re-prove themselves. At some point, track record should be enough. However, this type of screening can also be done in very bad ways (e.g., only interviewing people who have worked at top companies or come from a few schools). Evaluating experience is something we plan to experiment with, but for now we're focusing on how to directly identify programming ability.

2. It's worth noting the error bars (showing 95% confidence intervals). The true value for each of the correlations in the graph falls in the range shown with 95% confidence. The error bars are large because our sample is small. However, even comparing the bottom of our confidence interval to Aline Lerner's results on resume screening (she found a correlation close to 0) shows our quiz is a far better first step in a hiring funnel than resumes are.

3. We're not perfect, and we certainly reject great people. I always like to mention this when talking about rejections. We know this (and think it's true of all interview processes). We're trying to get better.

Improving the technical hiring process

Guillaume, Ammon and I are excited to announce the launch of our new company, Triplebyte. Our goal is to build a consistent and data-driven process for hiring programmers.

Most companies make up their hiring process as they go along. We certainly did that when hiring at our own startups. This has problems. Resumes are relied on heavily as the first screen, but many great programmers have really bad resumes. Technical interviews are typically run by an interviewer who is unsure which questions to ask or how to evaluate answers. Final hiring decisions are based on gut feeling, which is rarely (i.e. never) measured for accuracy.

This is a manifesto of how we believe technical hiring should work. We want to build a company that specializes in assessing the ability of engineers without relying on the prestige of their resume credentials. Once we've identified great engineers, we're going to help them find great places to work. We'll use the latter to measure how well we're doing at the former.

We're going to do two things differently. First, we'll track decisions as quantitatively as possible. Second, we'll run experiments on our own process. We expect it to change completely over time. Frankly, we'd love to get rid of interviews entirely.

We're starting our first experiment today: blind phone screens. First, we ask a few questions to verify you're a programmer; it's our version of an online FizzBuzz. Once you pass those, we ask you to schedule a 15-minute technical phone call. We only want to talk about one thing: code you've written in the past. That's literally the only thing we'll ask you about. Our hypothesis is that's enough to help good programmers stand out. After that, we'll go deeper into code you've written over a couple of 45-minute technical interviews via screen share.
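For readers who haven't run into it, FizzBuzz is the canonical minimal can-you-actually-program screen, and our online questions play the same role. The classic version:

  # Classic FizzBuzz: print 1..100, substituting words for multiples of 3 and 5.
  for i in range(1, 101):
      if i % 15 == 0:
          print("FizzBuzz")
      elif i % 3 == 0:
          print("Fizz")
      elif i % 5 == 0:
          print("Buzz")
      else:
          print(i)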

Humans are complicated and making decisions about their ability is difficult. We're excited about trying because the potential reward is so large. A better hiring process can significantly reduce bias. It'll open up the opportunity for anyone, from anywhere, to be assessed on their ability. It'll help startups find the programmers they need to build great products. We think this would be a great thing for the world and we're excited to build it!

If you have ideas for other ways we could experiment with our process, or if you think there's a better approach than the one we're taking, we'd love to hear from you. founders@triplebyte.com.