Aline Lerner, CEO of Interviewing.io, has been on every side of the software engineering hiring equation.

She started her career as a software engineer out of MIT, moved into recruiting, started her own recruiting agency, and has now built a practice environment where job candidates can sharpen their interviewing skills.

If you’re curious about how you can get better at interviewing for software engineering jobs, then listen to our conversation to find the answers!

For the full text transcript see below the fold:

Aline Lerner


Max: Welcome all. Max of The Accidental Engineer here. Today we’re joined by Aline Lerner. Welcome!

Aline is CEO of Interviewing.io, a startup here in San Francisco.

Aline, do you mind intro-ing our audience to what exactly you guys do?

Aline: Yeah, I’d be delighted: we’re trying to make engineering hiring less stupid.

Let me put that a little more concisely–the idea is that we think resumes are a terrible way to gauge if somebody can actually code, and they open your hiring decisions to any number of unfairnesses.

Fizz Buzz Cackle

Historically, when people look at resumes they are typically looking for where people went to school or where they worked in the past.

If you haven’t gone to one of five schools or worked at one of ten companies, you’re screwed, essentially.

That doesn’t seem right, especially for a field that we all like to think is meritocratic.

But of course, in my experience it’s anything but.

I guess I should probably say what I’ve done before I start this conversation?

Max: Sure. Yeah, your background.

Aline: It was a bit winding, but it ended up culminating in this Interviewing.io thing that we're doing now.

I’m a software engineer by trade, and then I fell into recruiting, accidentally.

I was working at a small company, and nobody there was doing recruiting. As a result all of us were kind of doing it, and we kept getting interrupted from our real jobs–which was to write code–to look at resumes and do any number of other things.

So I thought, “I can actually take one for the team here and make a spreadsheet.” I got into it and I realized that it’s a very interesting space that’s fraught with a lot of inefficiencies. I thought, “I know how to code, maybe I can make this better.”

I ended up working as a recruiter for a while, both in-house, and then I started my own recruiting firm.

I realized so much of this space is driven by pedigree rather than ability, and that seemed dumb because you can tell if somebody can code, or at least if they can’t code, maybe, really fast.

So when I started Interviewing.io, the motivation was to try to make hiring really about what people can do and do it in a way where companies would actually be incentivized to use that approach over what they do now.

The way it works is we have a platform where anybody can practice technical interviewing.

When I say practice I don’t mean like doing coding challenges on their own time or anything like that–I mean you actually get to practice with another human being. The really nice thing is that it’s completely anonymous.

You are free to mess up without anybody knowing who you are. Interviewers are typically senior software engineers at companies like Google, Facebook, Dropbox, Airbnb, and a bunch of other nice-sounding brands, where they are pretty proficient at giving algorithmic interviews and judging the results.

You do these interviews. You actually get feedback. It’s very, very realistic, and it’s completely free.

Then if you do well in these interviews–you don’t have to, but if you want–you can also use our platform to find a job.

So once you do well in a series of these interviews, you unlock our jobs portal. There you can look at all the companies we work with, which include companies like Lyft, Twitch, Asana, Quora, Evernote, and a number of others. I think we have over 50, and the list is growing very quickly.

Interviewing.io's Jobs Portal

You can say, “I want to interview with this company,” and you press a button and then you have an interview with that company. The best part is that, just like practice, those interviews are on our platform and they’re anonymous.

Rather than having to hope that a recruiter reaches out to you, or applying and hoping you hear back even though you probably won't, because it's like screaming into a black hole, your first interaction with a company here is a guaranteed interview. It's going to happen very soon, and it's going to be with an engineer at that company. If you do well in that interview, you can share your information. Then, typically, the next step is an onsite.

We’ve seen people go from that interview to an offer in a week or less if they’re local. So it fast tracks the process and we hope it makes it a little more pleasant and efficient for both sides. Does that make sense?

Max: That does make sense, especially since I actually have had the opportunity to try out your guys’ service. I did a practice interview, I will do more.

Aline: Did you do well?

Max: You know, it’s hard to say. I actually found out about you guys from interviewing a friend of mine, Paul Carleton, who now works at Stripe, but-

Aline: He did well.

Max: He did do well. He previously came on the show to talk with us about his experience preparing for re-interviewing with Google, where he had worked previously. He was opening his options up to other employers that he was thinking about working for as a software engineer. We’ll include a link to his interview in the show notes, but he mentioned you guys.

He had some really positive things to say about the experience, so I gave it a try.

It was about an hour long. It was entirely audio-only, talking with my interviewer, who gave me a problem of the algorithmic nature.

I think I did okay. I got some helpful, constructive feedback, particularly about what specific topics I should go brush up on.

One of the topics I've seen you guys write about, given the very interesting perspective you have into the numbers on how people do in technical interviews versus how they think they do, is that, correct me if I'm wrong, people really are bad judges of how they do in interviews. Is that correct?

Aline: That’s absolutely right. I should look and see what your feedback was.

Max: Yeah, sure.

Aline: Or what his feedback was about you. Let me give a bit of context about the data we collect and then I can tell you how everybody’s really bad about everything.

Max: Yeah, sure, and some are clearly good–or get better through practice.

Aline: For sure. During each interview, basically we collect everything: all the code that you write. There's audio, and then there's a collaborative coding environment as well, where you can run code and whiteboard, plus a text chat.

After each interview, both sides leave some feedback about the other side. It’s kind of like Airbnb where you stay somewhere and then you review your host. Then the host reviews the guest, and after you both review each other, you see each other’s reviews.

We like this because it peels back the curtain on an activity that’s typically a black box and mysterious and scary. Right?

You generally don’t get feedback as an interviewee. If you did well enough, you get to move forward, but you don’t actually know what you did right or wrong a lot of the time. If you didn’t do well, you get a form letter saying, “Good luck in your future endeavors.” Right?

Max: Yeah.

Aline: We ask things like your interviewer will rate you on your technical ability, your problem solving ability, and communication.

But then you as the candidate also rate the interviewer on how good their questions were, and how engaging they were, and whether you’d want to work with this person.

Then we also ask interviewees how well you think you did. After each interview, we can actually compare perceived and actual performance.

And, yeah, one of the things that was very, very surprising to us is that most people can't tell at all how they did. Of course, there are two sides to the coin. There's imposter syndrome, which many of our users suffer from, myself included.

Then there’s the Dunning–Kruger effect, which is the opposite–where you think you did very well even though you did very poorly. There’s a bit of that too, but by and large, people have no idea how well they’ve done.

I think this is true both in practice and in real interviews.

One of the interesting implications of that for hiring is that we’ve seen that if a person thinks they did poorly–even if they did well–their willingness to work for that company drops off.

So if they think they did badly they’re going to go into this self-flagellation exercise, and they’re going to think, “I’m a piece of shit. I can’t code, and they know it now.”

Or they’ll potentially do another thing where they say, “Oh, this employer is really bad at vetting me. They didn’t pick the kind of interview that let me showcase what I’m best at.”

In either case, that’s really bad for the employer because this is a good candidate.

One thing I always tell people when they're hiring, especially if they're an interviewer, is that if the person did well, tell them right away, if you can, to head off that kind of rationalization exercise they go into to avoid cognitive dissonance. Tell them right away, move them forward, and don't give them the chance to decide that they totally didn't want to work for you anyway.

Max: You have had perspective from every side of the table in this way–from having been an engineer to taking on a recruiting role, to being a freelance recruiter, to starting a company about the recruiting process.

Aline: Yeah, I’m deep in this space. I’m going to die in it!

Max: Besides the tremendous psychological … That’s dark!

Aline: Yeah, it is.

Max: But besides the tremendous psychological games that people are dealing with inside themselves, about their self-confidence and how they can really get better at this interviewing game, there's the fact that recruiting takes a certain amount of time.

I was curious, this is something I’ve seen in other areas of life where the probability of a deal being struck between you and somebody else has a half-life.

The longer it takes to close a deal, the less likely it is to happen. I've seen that happen with recruiting: when counterparties take a long time to respond to different steps of the recruiting process, it's a pretty big leading indicator of whether you're going to get that offer or whether the candidate's going to accept it.

Is there any visibility that you guys have into that? Or do you have personal anecdotes to share about candidates or employers failing to proceed through the recruiting vetting process fast enough?

Aline: Yeah, I think that’s absolutely true, right?

The faster you move, the better off you generally are. There's some causation and some correlation; it's a tangled rat's nest of different things. But what we've seen over and over is that employers that don't follow up immediately, especially with candidates that the market says are high value, the ones that do have that pedigree, are going to lose those candidates because they have so many options.

Another thing I’ve seen–and this is along the same lines, maybe a bit tangential–but there are employers who put candidates through all sorts of hoops. They’ll send you some kind of long coding challenge to do before you even get to talk to a person. We think that’s dumb, generally.

I think that anything where there’s value asymmetry–where you’re expecting the candidate to do something and you’re not giving them something back–is not the best strategy.

Especially in this market, where good engineers are so scarce–or at least because it’s inefficient they’re perceived as very scarce–anything you can do to not only move them quickly through the process but give them value at each step is critical. It’s night and day between the employers that do this well and ones that don’t.

It sucks that a lot of the big companies that smaller companies look to for cues on how to hire well are doing this really, really poorly.

Like, if you apply at Google, it takes for-fucking-ever to actually get through their process. They can afford to do that because they’re Google, and the reason people want to work there isn’t because of their process, it’s because they get to work with really, really smart people at a brand that is on a pedestal in this community. Then smaller companies look at the way Google does things and think that they can get away with it, and they cannot. But-

Max: Yeah, this is a topic that's come up frequently. The advice we share with our job-seeking audience is that when you enter a recruiting funnel with a company, pay attention to how much skin in the game the other player has.

Having early human-to-human contact is a positive indicator that they care about their recruiting process and that you're going to have more of a feedback mechanism with those types of players.

Aline: I think that’s not always true. I’ll disagree with you.

Max: Okay, sure.

Aline: The reason is that it really depends on the relationship between the recruiting org and the engineering org in a given company.

These kinds of things are generally a signal of that relationship. Now, whether that relationship is productive or not may not actually impact your day-to-day once you’re in the org and working as an engineer.

A lot of the time recruiting can be a bit disorganized, and it’s not because the engineering org sucks, or the company sucks.

It’s because maybe the recruiting department is not as good. But there are plenty of companies who have great engineers that have bad recruiting departments.

When I was a recruiter, I always advised my candidates – I mean, I would try to pick companies that did this well, of course – “just because a company doesn’t do a great job of recruiting, doesn’t mean that you wouldn’t be happy there as an engineer.” But it does mean that if you’re coming in as an engineering manager and you’re going to have to be hiring people for your team down the line, that should be a red flag because you’re going to have to be engaging with that same department later on.

Max: I guess in that case the recruiting team might be giving that face-to-face early on interaction with candidates, and they might be failing to provide valuable communication.

Aline: Yep, yep.

Max: I find those early stages for candidates are highly affected by people’s impressions of the employer and themselves, and the nature of questions.

When I mention having early human-to-human contact, one of the things that I’ve done with technical screens and I’ve seen done successfully–but at great expense–is to proctor them, like you guys do. I find the idea of outsourcing that proctoring to a team like you-

Aline: It doesn’t make any sense to me.

Max: Yeah?

Aline: I want to make that very clear, we don’t believe in outsourcing that proctoring either.

In our case, the practice pool is a way for us to pre-vet candidates, although we also think it gives candidates a ton of value to get that practice. We would never encourage a company to forgo any part of their process, right?

It’s just that we think that they should do their technical screen on our platform. If we were hiring for your employer, you would be interviewing our candidates and doing exactly the same thing that you would do in a technical phone screen, except that you would be doing it on our platform, anonymously.

I don’t know if I even mentioned that earlier. That’s actually really important. I guess I did, but the company screens are anonymous as well, and that takes some getting used to. We don’t want you to trust our vetting, we want you to trust our pre-vetting enough to be willing to talk to people, if that distinction makes sense.

So, you would still do the same thing you would normally do in that screen. Then, after that, if you like the candidate they are yours to do with as you please.

Max: Got it. Besides warning people that their self-assessments of how they’re doing in job interviews are highly inaccurate, do you have any other broad, very general advice for job seekers?

Aline: Well, you should use our platform to practice!

But outside of that, one of the other interesting pieces of data that we’ve seen is that from interview to interview, people’s performance varies quite a bit.

And we have a lot of people who have killed it on a few interviews and then done very, very poorly on a few others. I think that a lot of this is because the interview process, at least in most incarnations, is really non-deterministic.

As you said, there’s so many factors that go into whether it’s going to turn out right.

Some of it is just the rapport with the interviewer. Some of it is whether they’re asking you something you know about or not.

Of course, the best interview questions shouldn’t be drawing on specific knowledge, but it’s really hard to come up with good interview questions and not everyone does it well. Sometimes it’s like, “Oh, shit, my brain just didn’t click on that particular thing.”

We’ve seen over and over that the process is just kind of a chaotic mess. One of the reasons that we like what we’re doing is that when we decide whether someone is good at coding, we’re looking at a number of data points in aggregate. The good news is that after a number of interviews, you do start to converge, right? In aggregate, you can make some conclusions.
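The convergence Aline describes can be illustrated with a toy simulation (the numbers here are entirely hypothetical, not Interviewing.io's data): if each interview score is treated as a noisy sample around a candidate's true ability, the running average settles down as interviews accumulate, which is why aggregate data is more trustworthy than any single interview.

```python
import random

random.seed(42)

TRUE_ABILITY = 3.2  # hypothetical "true" score on a 1-4 scale
NOISE = 0.8         # per-interview variability (good days and bad days)

def simulate_scores(n):
    """Model each interview as true ability plus noise, clamped to the 1-4 scale."""
    return [min(4.0, max(1.0, random.gauss(TRUE_ABILITY, NOISE))) for _ in range(n)]

scores = simulate_scores(20)
for n in (1, 5, 20):
    # The more interviews we average over, the closer we get to TRUE_ABILITY.
    avg = sum(scores[:n]) / n
    print(f"after {n:2d} interviews, running average = {avg:.2f}")
```

Any single draw can land well above or below 3.2, which matches the "killed a few, bombed a few" pattern; only the average over several interviews is informative.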

But to turn this into potential advice is like, “Hey if you mess up a few interviews, everybody does. Just keep going and get better.”

That’s easy advice to give but the data does support it and I hope that that makes it a little less cliché and a little more useful, especially for people that are just getting into this field.

What we’ve seen over and over is [candidates] tend to get very discouraged, especially if they’re not used to the format. One thing that I would encourage everybody to think about is the fact that it really is a game, and you’re just flipping coins. Eventually, they’re going to land the way you want, and that’s how it is as long as you’re actually making effort and trying to continue to broaden your knowledge on the topics that you messed up.

Max: Is there a certain level of expertise that you’d suggest you guys are a better suited service for, when it comes to candidates? Like, people in our audience who are job seekers, maybe they haven’t had an engineering job yet, would they be good candidates for trying out Interviewing.io?

Aline: Generally, our platform's a bit better for engineers like you, people who are experienced.

The reason for that is that we are not yet equipped to be an educational service, so you just jump into the deep end. Once you get on our platform, you’re going to be paired with a senior engineer and they’re going to start asking you stuff. If you’ve never done a technical interview, it might be overwhelming.

We do have junior folks who have done very, very well. And one of the things that's exciting to me: a job description will always have some requisite years-of-experience requirement, and those are bullshit a lot of the time.

I think we all viscerally kind of suspect that it's bullshit, but what's nice about our platform is that if you're junior and you're interviewing like you're senior, then you should have the opportunity to get those jobs.

It’s not perfect, but we’ve seen people get hired for senior positions even though maybe they’ve been coding for a year. Those people are remarkable and I’m happy that we were able to give them that little nudge forward.

So I'd say if you're familiar with the technical interview process, then you will probably get value out of our platform. If you are not, then there are a bunch of other great resources that you can probably hit up first, and then revisit our tool.

The only exception is, earlier this year we launched a university offering. With that we are trying to kill career fairs because we think they’re stupid and a waste of everyone’s time.

It also sucks, and we wrote about this on our blog, how companies typically just go to the same five schools and then sing a song about how much they care about diversity. That's not how it works.

We want to create a platform where it doesn’t matter where anybody went to school at all, and if you want smart students, no matter who they are, where they come from, we will surface them for you. Because we know how they do in practice, and then all of a sudden, everything else but performance is off the table.

In that case, we're very, very open to junior candidates, whether they're college juniors (intern candidates) or college seniors (new grads). We have a separate funnel for that. When you're using it, you just get on our platform and take a quick five-to-ten-minute coding challenge, and if you do well on that, you're in. Then you can use us to get internships and new grad positions.

Max: Is there a tremendous amount of interest in the internship path? Is that a common use case for anybody?

Aline: It's very interesting for students. With companies, it varies a bit. This is our first time doing it, so we're still kind of figuring out how to best package that up and sell it. But we've had success with companies like Lyft and Quora, which is very exciting to us. We're talking to a few larger companies now, as well.

It's weird. A lot of companies are very bought into career fairs, because that's how it's been for so long. So time will tell whether we can disrupt that approach a little bit. But whether it's us or somebody else, I hope somebody does. I've been to those events; I know everybody thinks they're stupid, and people keep doing it anyway. That's a sign that somebody should fix it, and I hope it's us.

Max: I think in many cases, it’s an excuse to get out of the office, like go grab a coffee. I’ve done my fair share, too.

Aline: Yeah, I mean, they can be fun. I’ve been to them and it’s fun to talk to students, right? Because they’re all so eager and excited, especially to talk to people that are doing the things they’re thinking about doing.

At the same time, though, you have to think about the cost of the engineering time, especially if you're a smaller company.

God, even the table cost. Like, I went to MIT, and I think going to a career fair there costs something like $20k for a table that's not in a dark basement.

To get any kind of prime real estate, it’s prohibitively expensive to anybody but the companies with the largest bankrolls. That kind of perpetuates this cycle of inequality in a lot of ways for a lot of students. People always want to work with these large companies, but for many of them, I think, getting into a startup would be a better approach. Not for everyone, right?

But I think people should have the opportunity to make an informed decision, rather than only ever being exposed to one of a few things.

Max: Another aspect of the campus career fair that also makes me think about your guys’ service is how, at a campus career fair, you get students with majors in every department.

I'm curious whether this interviewing problem, this technical screening problem, is unique to computer science and web development.

Or whether you guys have any plans to broaden your service offering to non-engineering interview processes?

From all the guests we've had on previously, this seems like a pretty unique problem in the labor market: software engineers are the ones who have these bizarre technical screens. So I'm curious whether you guys note any parallels in other verticals or job markets.

Aline: That’s a great question. We got this question a lot when we were fundraising, so I think I have an answer, but you tell me what you think.

The honest answer is that I don’t know.

We've been around for a couple of years, and even now we are primarily equipped for backend and full-stack software engineers, which is a pretty narrow slice of what engineering is. Right? We still haven't done much with mobile, for instance, or devops or data engineering.

Data science is an adjacent vertical, but there's enough overlap, and we already have the tooling. We can do SQL and Python, and we can import various toolkits. I think we'll have to tackle those first before we go outside the engineering umbrella.

However, more broadly, I ask myself two questions to see if a vertical might be suitable for this approach. One is: is hiring people for companies in this vertical really, really hard?

The reason for that is if it’s not hard then it’s not a “hair-on-fire” problem, and then why would they change anything about what they do?

I mean, what we do is really fucking weird. From a company’s perspective, it’s like, “Hey, instead of knowing stuff about candidates, we’re just going to have you talk to randos. You’re not going to know anything and you’re going to need precious engineering time to do it, upfront.” That’s absurd, right? Compared to how things happen, it’s crazy.

Now, the reason it works is because we know how people have done in interviews. Once you explain that, we can say, "Your typical conversion rates in a phone screen with us are going to double or triple, because past interview performance magically predicts future performance." Well, at least it correlates with future interview performance.

But, on the surface, it’s weird. It requires a process change, and the only way companies, especially larger ones, which is where the money is, make process changes is if they’re desperate. So, that’s one thing, and we’re fortunate that they are. Our hope is that we can get companies to buy into this approach while the market is the way it is, and then maybe when it turns we’ll already be a more accepted way of doing things and it’ll be like the thing you do. Let’s hope.

Max: I would argue, and I would put money on the software engineering job market going up by whatever metric you’d like to use.

Aline: Certainly we hope so. And certainly, when we pitch what we do, we lean in that direction. But it's weird. When you're a founder, you have to believe two very different things at all times.

One is that you’re going to succeed no matter what. And the other one is that you’re going to fail.

You have to believe both of those things acutely. So it’s an interesting head space to be in.

The other thing, though, that I ask myself about different verticals – and I think you alluded to this when you talked about the bizarre way we interview people – is there a way to gauge ability without a lot of context or prior knowledge or infrastructure? At least at a first pass. So, technical interviews are a pretty good example, right?

If I ask you to reverse a string, or something like FizzBuzz, one of these stupid questions everybody hates, and you can't do it, that's a pretty strong signal. If you can't reverse a string as an engineer, it's not perfect, but it's at least a pretty good proxy, and it takes like five minutes for me to figure that out.
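For readers who haven't seen these screening questions before, both are small enough to sketch in a few lines of Python (one reasonable version of each; interviewers vary the details, and may ask for an explicit loop rather than a slice):

```python
def reverse_string(s: str) -> str:
    """Return the characters of s in reverse order."""
    # Python's extended slice makes this a one-liner; in an interview you
    # might also be asked to do it with a loop or in place on a list.
    return s[::-1]

def fizz_buzz(n: int) -> list[str]:
    """Classic FizzBuzz for 1..n: multiples of 3 become 'Fizz', multiples
    of 5 become 'Buzz', multiples of both become 'FizzBuzz'."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(reverse_string("interview"))  # -> weivretni
print(fizz_buzz(15)[-1])            # -> FizzBuzz
```

The point Aline makes is exactly this: the questions are trivially small, so failing them in five minutes is a strong negative signal even though passing them proves little.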

In a field like design for instance it’s a little harder to do that. You can ask somebody a question, but you probably want to see their portfolio. Designers live and die by their portfolio and that changes the dynamic a little bit.

With product management, I don’t know that there’s a very quick question you can ask somebody that’s going to gauge whether they can do the job without also looking at the kinds of product that they’ve managed in the past.

One vertical that I think could lend itself well to this, at least from this perspective, is management consulting. They all do case studies in those interviews. It doesn't matter who you are; if you do well in these case studies then you can potentially move forward. But I don't know what the labor market looks like for that.

But those are the two criteria. We are definitely open to it, but I have no idea if we’ll do it or not.

Max: Okay. Well, I think it would be a good point in the video to plug how people can sign up for you guys, and what it’s like for a first-time Interviewing.io user.

Aline: Yeah, awesome. Thanks for the opportunity. I love plugging our stuff. It's pretty simple: you can go to Interviewing.io. There's a link that says, I think now, "Give it a try." We did some A/B testing. It used to say, "Sign up," but I think "Give it a try" converts 10 or 15% better than "Sign up." So there's a little nugget of unexpected info. Either way, there's a sign-up button, and after that, we're still officially in private beta, although we're going to be out of it very soon, so there might be a bit of a wait. It's all going to open up later this year, and for many people there isn't a wait any longer. We're trying out a few different things in that regard.

Once you’re in you’ll be able to schedule an anonymous practice interview almost immediately. You’ll just see time slots and you pick a time slot. Then if you connect your calendar it goes on your calendar, and then all you have to do is show up at that time and you’ll be matched with some rando on the internet that will ask you questions.

But these randos typically know what they’re doing, or so we hope. At the end of that interview you’ll get some feedback and you’ll also have a recording of your interview that you can look at later and hopefully use to improve.

We also have a repo of public interviews that people have done that they’ve been willing to share. Our users have told us that being able to watch other people’s interviews has been illuminating. Because I think everybody wants to know what good looks like and none of us actually know. So being able to peel back the curtain on that a little bit has been nice.

But, yeah, that’s it. You sign up. You pick a time slot, you do your thing. After a few interviews, if you do well, you’ll get an email saying you’ve unlocked our jobs portal. Then you can go and look at that and book interviews at top companies without having to talk to people or schedule stupid stuff.

Max: Our audience should absolutely give it a try–I did!

Aline, thank you for coming on. I hope we have a chance to circle back!

Aline: I would very much appreciate that, and thank you for having me.

If any of your audience is interested in technical interview data and some of the counter-intuitive results that we’ve gotten from that, just go to blog.Interviewing.io, it has all the graphs, and it talks about how everything we know about everything is wrong, and interviewing is a mess. So if you feel that interviewing is a mess, you might enjoy seeing some data around that to confirm your biases!