Recruiting

High-Volume Recruiting Especially Susceptible to Unconscious Bias

Any hiring process comes with the potential for unconscious bias to play a role. Some companies have methods for combating such biases, with an emphasis on delaying the point at which bias can enter. In other words, by hiding a candidate’s identifying information until after the person’s qualifications have been considered, the recruiting manager has time to learn what’s important about that candidate before learning his or her age, race, or sex. That takes time, though, and large-scale hiring operations often don’t have the time or resources to take those steps.
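To make the idea concrete, here is a minimal sketch in Python of what such a “blind screening” step might look like: identifying fields are stripped from an application before a reviewer sees it. The field names are hypothetical and this is only an illustration of the concept, not any particular vendor’s implementation.

```python
# Minimal sketch of "blind screening": hide identifying fields until after
# the substance of an application has been reviewed. Field names are
# hypothetical; a real system would also handle photos, free-text PII, etc.

IDENTIFYING_FIELDS = {"name", "age", "gender", "photo_url", "date_of_birth"}

def redact_for_review(application: dict) -> dict:
    """Return a copy of the application with identifying fields removed."""
    return {k: v for k, v in application.items() if k not in IDENTIFYING_FIELDS}

candidate = {
    "name": "Jane Doe",
    "age": 42,
    "gender": "F",
    "years_experience": 9,
    "skills": ["forklift certification", "inventory management"],
    "work_history": "Warehouse lead, 2015-2023",
}

# Reviewers score the redacted record first; identity is revealed only
# after an initial assessment has been recorded.
print(redact_for_review(candidate))
```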

I recently spoke with Eyal Grayevsky, an expert in hiring bias and the CEO and Cofounder of Mya Systems, an artificial intelligence (AI) recruiter. He is very aware of the problems of bias during hiring, remarking that “there’s several systemic unconscious bias issues through the recruitment process that impact people adversely.”

Why Unconscious Bias Is a Problem

Put briefly, unconscious bias creates unfair disadvantages for applicants who are discarded because of it. It’s especially devastating because while everyone has unconscious biases, few people are aware of the critical role those biases play in their decision-making. If people apply their biases to, say, drivers from specific states during a commute, not many life-altering decisions will result. But if a hiring manager applies a similar bias during hiring, he or she may very well overlook excellent candidates for no particularly good reason. Worse, the manager may even create liabilities should unfair hiring trends arise.

Biases Exist for Every Imaginable Characteristic

Bias is often equated with race or gender, and bias on those grounds certainly happens. But it does not stop there. Biases can attach to clothing, hair styles, accents, nationality, religion, or even a candidate’s education level, to name a few. Yet none of these things defines a candidate’s value. The inherent worth of any employee is tied to a host of factors that are difficult to assess by reading a résumé or conducting an interview.

For that reason, even overt preferences, like the tendency to view candidates from a select set of schools as more desirable, are a poor basis for evaluating a candidate’s true potential, says Grayevsky. Of that specific case, he adds, “it’s an antiquated way of thinking. When you take that approach, you’re really putting certain groups with systemic issues around access to education or other systemic issues that have maybe put certain groups at a disadvantage.” In other words, it’s not a very good idea.

Large-Scale Recruiting

Every recruiting program has the potential to be negatively affected by unconscious bias. However, Grayevsky notes that large-scale recruiting is especially susceptible, saying that “when organizations are hiring at scale and dealing with a tremendous amount of applications, they are spending a lot of time sourcing.” And some organizations are filling thousands of roles. He continues, “Given the pace and the scale of the recruitment process in those scenarios, hiring managers are rapidly assessing applications to try to rapidly fill those roles. Given the speed at which recruiters are forced to operate,” there is an increased chance of unconscious bias.

Yes, all of the ordinary problems with unconscious bias in recruiting programs are amplified by scale. When recruiters have to sift through hundreds or thousands of applications, they will almost certainly rely on their instincts to make the workload manageable. And instinct is at the root of the unconscious bias problem. Snap judgments based on little more than a gut feeling do not lend themselves to reasoned, informed decision-making.

How an AI Recruiter Can Help

AI like what Grayevsky has developed can help take human foibles out of the early stages of the process and give recruiters a somewhat purer view of candidates. For example, AI chatbots can gather a great deal of valuable information about each candidate’s viability directly from the candidates themselves. That information can then be presented to hiring managers, giving them a chance to make a decision based not on biases but on useful information.

The AI, according to Grayevsky, can ask questions like “What are you looking for in your role?” and “What do you feel makes you most qualified for this opportunity?” Typically, such questions are asked in a conversation well after hiring managers have already decided to interview a specific candidate and not others. Arming hiring managers with this information before they decide whom to interview gives them a better idea of each candidate’s viability.
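As a rough illustration of that workflow (a sketch of the general idea, not a description of Mya Systems’ actual product), a pre-screening bot can be thought of as a script that asks every applicant the same structured questions and packages the answers for the hiring manager before any interview decisions are made. The data structure, candidate ID, and simulated answers below are assumptions made for the example.

```python
# Rough sketch of a pre-screening "chatbot" workflow: every applicant is
# asked the same questions, and the answers are packaged for the hiring
# manager before interview decisions are made. Illustrative only.

from dataclasses import dataclass, field

SCREENING_QUESTIONS = [
    "What are you looking for in your role?",
    "What do you feel makes you most qualified for this opportunity?",
]

@dataclass
class PreScreen:
    candidate_id: str
    answers: dict = field(default_factory=dict)

    def record_answer(self, question: str, answer: str) -> None:
        self.answers[question] = answer

    def summary(self) -> str:
        """Format the structured answers for a hiring manager's review."""
        lines = [f"Candidate {self.candidate_id}:"]
        lines += [f"  Q: {q}\n  A: {a}" for q, a in self.answers.items()]
        return "\n".join(lines)

# Simulated answers; in a real deployment these would come from a chat interface.
simulated_answers = [
    "A stable schedule and room to grow into a team-lead role.",
    "Five years of call-center experience and strong CRM skills.",
]

screen = PreScreen(candidate_id="A-1041")
for q, a in zip(SCREENING_QUESTIONS, simulated_answers):
    screen.record_answer(q, a)
print(screen.summary())
```

Because every applicant answers the same questions in the same format, hiring managers can compare responses side by side before any name, photo, or school is attached to them.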

There are other advantages to those solutions, as well. When organizations hire, especially at high volumes, they have a period of time during which they collect applications. Then, they sort them and choose who goes to the next stage. That means an interested, qualified candidate may not hear from a recruiter for some time. AI chatbots can talk to applicants immediately or within a few hours. That provides the engagement many applicants crave. Our own research has shown that applicants’ number one complaint is a lack of communication during various stages of the process. Such a solution can really address that problem.

Final Thoughts

While not everyone has the budget to run out and buy an AI chatbot, the capabilities such a tool offers can help you identify what might be missing from your current hiring process. What steps are you taking to eliminate unconscious bias during hiring?
