How to Reduce Bias When Hiring Software Engineers Virtually

Bias-test your questions, train your interviewers on clear communication, and implement standardized scoring rubrics.
Shannon Hogue
Expert Columnist
April 16, 2020
Updated: August 20, 2020

Many engineering leaders I’ve talked to over the past month have expressed a genuine concern: as organizations tighten their spending, diversity and inclusion initiatives will be on the chopping block. The concern isn’t limited to budgeting, either. An often-overlooked side effect of decentralization, such as taking hiring and interviewing remote, is that it adds variables. And variables create bias.

You can combat and reduce bias in virtual hiring by using structured interviews, training interviewers, and standardizing scorecards.


Ensure Your Interview Questions Don’t Have Bias Baked In

First and foremost, you should be using standardized interview questions. Letting each interviewer invent their own questions is a fundamentally flawed way to generate a strong hiring signal, because it makes it impossible to compare performance across your (now distributed) network of interviewers.

To create standard interview questions, think about how they’re structured. A well-structured question will make clear to the candidate what is expected of them and will reduce the risk of false negatives.

One example of a non-structured interview question asks candidates to diagram a well-known software system on a whiteboard. This question is so widely used it has become an industry cliché, and I’ve even heard of companies trying to recreate it in a virtual setting. Unfortunately, the question is bad at evaluating the skills of candidates who haven’t previously been exposed to the definitions of the boxes and lines on the board, and it is even worse at being inclusive of candidates who aren’t set up with at-home whiteboarding equipment or digital pens. It is a perfect example of introducing bias against candidates from non-traditional backgrounds.

A more effective system-design question could be based on presenting a simple, concrete mechanism to the candidate and asking them how they would approach scaling it or adding a new feature. This question is meant to be open-ended and clarifiable. Allow the candidate to ask follow-up questions so that they understand the system you are discussing, as well as what kind of answers you're looking for. And because it’s a discussion, it doesn’t have to involve any haphazard virtual whiteboard workarounds!


Train Your Interviewers to Recognize and Reduce Bias

Bias was already something we needed to be mindful of when hiring in the pre-COVID-19 world, but that challenge is only magnified in a remote setting. In virtual interviews, it’s harder for the interviewer to build rapport and demonstrate empathy. This increases the likelihood that interviewers and hiring managers will "connect" with individuals who are similar to themselves. 

There are plenty of easy-to-find training resources on recognizing unconscious bias, but for the purposes of interviewer training, focus on communication. Specifically, clarity, encouragement, and guidance.

Clarity reduces the variables and bias that come with ambiguity. I touched on this in my column last month, but the key is to make sure interviewers are clear about the competencies they are assessing. This is the most predictive (and fair) way to evaluate a candidate’s problem-solving abilities.

Encouragement is another important element. Combative and aggressive interviewers are something we’ve all probably experienced at some point. However, unless your goal is to have a combative and aggressive engineering culture, performing under those stressful conditions isn’t a predictive indicator of on-the-job success. Believe that your candidate can succeed, regardless of their background.

Guidance, or “handholding” as we call it in technical interviews, is the easiest way to skew the results of an interview and introduce bias: a carefully hidden tip to a candidate you feel good about can totally transform an interview. It can be okay to give hints, especially on a piece of knowledge that is not a strong signal for success in the role; technical interviews shouldn’t be trivia games, after all. But always record the hints that are given, to avoid the false-positive bias that comes with candidates who “connect” with their interviewer. Unrecorded hints lead to wasted time in bad final-round interviews with candidates who don’t have the required competencies, or worse...a bad hire.
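To make hint-recording concrete, here is a minimal, hypothetical sketch of how a shared notes template or interview platform might capture hints alongside the interview. All names here are illustrative, not part of any specific tool:

```python
# Hypothetical sketch: log every hint given during an interview so
# reviewers can later weigh its impact. Names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class InterviewHintLog:
    candidate: str
    hints: list[str] = field(default_factory=list)

    def record_hint(self, hint: str) -> None:
        # Store the hint verbatim; a reviewer comparing candidates can
        # then see who needed which nudges, and how substantial they were.
        self.hints.append(hint)

log = InterviewHintLog(candidate="Candidate A")
log.record_hint("Pointed out that repeated lookups could be cached")
```

Even a plain text field in your existing scorecard works; the point is that the hints become part of the record rather than disappearing into the interviewer’s memory.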


Use Standardized Scoring Rubrics

I also covered consistent measurement in my last piece, but it’s an area that I’ve received a ton of questions about over the past month, so it’s worth revisiting.

Structured scoring rubrics make it clear to the interviewer which competencies matter, and they help the interviewer decide what they should spend more time evaluating.

Create one by first identifying which competencies are both relevant and important to assess during the interview. Next, for each competency, list observable behaviors and results as checkboxes (select all that apply) and/or radio buttons (choose one).

For example, if Technical Communication is a relevant competency, you might list “Technical Communication” on the rubric with a specific scale: 

  • ( ) Notably high-quality communication
  • ( ) Clear explanations
  • ( ) Confusing or disorganized explanations
  • ( ) Notably negative communication
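As an illustration only, a rubric like this can be represented as structured data so every interviewer fills in exactly the same fields. The names and schema below are hypothetical, not a prescribed format:

```python
# Hypothetical sketch: a standardized scorecard as structured data.
# All names and scale values are illustrative, not a prescribed schema.
from dataclasses import dataclass, field

# A "choose one" scale for one competency, mirroring the radio buttons above.
TECH_COMMUNICATION_SCALE = [
    "Notably high-quality communication",
    "Clear explanations",
    "Confusing or disorganized explanations",
    "Notably negative communication",
]

@dataclass
class CompetencyScore:
    competency: str       # e.g. "Technical Communication"
    observed: str         # one value chosen from that competency's scale
    evidence: str = ""    # concrete behavior backing up the selection

@dataclass
class Scorecard:
    candidate: str
    interviewer: str
    scores: list[CompetencyScore] = field(default_factory=list)
    # Deliberately no "hire / don't hire" field: the hiring manager
    # reviews the competency data itself.

card = Scorecard(candidate="Candidate A", interviewer="Interviewer 1")
card.scores.append(CompetencyScore(
    competency="Technical Communication",
    observed=TECH_COMMUNICATION_SCALE[1],  # "Clear explanations"
    evidence="Walked through the scaling trade-offs step by step.",
))
```

Keeping the scale values as a fixed list means interviewers select an observation rather than improvising one, which is what makes scores comparable across interviews.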

Finally, you’ll notice that the above rubric example does not include a hiring decision. The results of an interview should be looked at objectively, without an explicit “hire” or “don’t hire” conclusion. This lets others review the results and check for potential indications of bias. This also arms the hiring manager with the most objective data possible to make the best decision.

Getting technical interviewing right is hard, even under ideal conditions. When things get difficult, it’s easy to cut corners, especially at a time when organizations are scrambling to maintain even the most basic business processes. But cutting corners means introducing variables, which means introducing more bias. Teams and companies that get it right during this time will come out with stronger engineering teams and cultures. They’ll have stronger employer brands and more diverse teams, and their candidates will have fair and enjoyable experiences that bolster retention in the long term.
