A Cheat Sheet to Preventing Unconscious Bias in Hiring
Unconscious bias continues to drive poor hiring decisions for companies of all shapes and sizes. Whether we’re aware of it or not, many hiring managers and recruiters base their candidate reviews on factors that have no bearing on a role’s requirements or qualifications, causing many teams to lack diversity, perspective, and potential for growth. This checklist is designed to help you identify areas for improvement in your hiring strategy and ensure you don’t lose top-notch candidates to biased decision-making.
Add Diversity To Your Team’s Perspective
If there are just one or two team members involved in your candidate screening process, this is the perfect place to start. Basing hiring decisions solely on a couple of opinions, without evaluating the proper criteria, can result in unknowingly biased decision-making and even pose the risk of a bad hire. Reducing the impact of unconscious bias and diversifying your team with numerous perspectives, backgrounds, and experiences begins with diversified decision-makers:
Implement hiring panels. Think about stakeholders at your organization involved in hiring decisions. How similar are they? Perhaps a candidate grew up in your hometown or attended the same university as you. Naturally, you’re likely to favor a candidate if they’re connected to your childhood or bleed the same college pride that you do. The more varied backgrounds you bring to the table, the more effective your decision-making will be. Involving a larger group of decision-makers allows you to look beyond similarities between yourself and the candidate, ensuring your team makes the right hiring decision based on the most important criteria for the role.
The more, the merrier, right? While your hiring panel’s size is vital to well-thought-out decisions, so is the representation of different backgrounds. Your hiring panel may include ten employees, but if they’re all from the same university, that panel is unlikely to surface unique perspectives. To improve representation in your panel, aim for a relatively even distribution across race, gender, and educational background.
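One lightweight way to spot a homogeneous panel is to check how large a share any single group holds for a given attribute. The sketch below is illustrative only: the roster, attribute names, and values are hypothetical examples, not data from any real system.

```python
from collections import Counter

# Hypothetical panel roster; attribute values are illustrative only.
panel = [
    {"name": "A", "alma_mater": "State U"},
    {"name": "B", "alma_mater": "State U"},
    {"name": "C", "alma_mater": "City College"},
    {"name": "D", "alma_mater": "Tech Institute"},
]

def max_share(panel, attribute):
    """Return the largest share any single group holds for an attribute.
    A share near 1.0 means the panel is dominated by one background."""
    counts = Counter(member[attribute] for member in panel)
    return max(counts.values()) / len(panel)

share = max_share(panel, "alma_mater")  # 2 of 4 panelists share an alma mater
```

Running this kind of check before interviews begin gives you a concrete signal, rather than a gut feeling, that a panel needs more varied backgrounds.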
But what about cultural fit? Of course, connecting with a candidate over a shared life experience, hobby, or quirky fun fact is exciting, and cultural fit and personality are just as important as a candidate’s work ethic and skill set. The goal isn’t to ignore these considerations, but to keep criteria irrelevant to the role at hand from outweighing role-specific requirements.
Use blind resumes. Unconscious bias can start with the first thing you see on a resume: the candidate’s name. Unfortunately, studies show that applicants with foreign-sounding names are 28% less likely to get a callback, regardless of birth country, education, and work experience. To prevent this, try replacing names with numbers on resumes. You can also reduce the impact of bias by pre-screening candidates with advanced conversational AI technology, so basic qualifications are assessed before the humans behind your hiring process ever see the candidate’s resume.
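The "replace names with numbers" step above can be sketched in a few lines. This is a minimal illustration, assuming resumes arrive as simple records exported from your applicant tracking system; the field names and sample data are hypothetical.

```python
import itertools

# Hypothetical resume records; real ones would come from your ATS export.
resumes = [
    {"name": "Amara Okafor", "education": "State University", "skills": ["SQL", "Python"]},
    {"name": "John Smith", "education": "City College", "skills": ["Excel", "SQL"]},
]

def blind(resumes):
    """Replace identifying names with sequential candidate numbers,
    keeping a separate lookup table so HR can de-anonymize after scoring."""
    counter = itertools.count(1)
    lookup = {}
    blinded = []
    for resume in resumes:
        candidate_id = f"Candidate #{next(counter)}"
        lookup[candidate_id] = resume["name"]
        anonymized = {k: v for k, v in resume.items() if k != "name"}
        anonymized["id"] = candidate_id
        blinded.append(anonymized)
    return blinded, lookup

blinded, lookup = blind(resumes)
```

The key design choice is keeping the lookup table separate from the blinded records: reviewers score only the anonymized copies, and names are re-attached after the evaluation is complete.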
Include a Predefined Mix of Qualitative and Quantitative Analysis of Candidate Fit
As recruiting professionals, we already know that qualitative and quantitative criteria for candidate fit are equally important. But does your current recruitment process reflect that? Many HR departments still lack this structure and ‘wing it’ on the spot when reviewing candidates. Here’s what you can do to support an equal distribution:
Re-evaluate performance predictors. Studies indicate that recruiters spend 80% of their resume-review time looking at name, title, and education (i.e., the top of the resume), not accomplishments from previous roles or skills acquired. This stems from anchoring bias: humans spend more time scanning what comes first than what follows. So why are we still instructing candidates to place “essential” information at the top? Think about what other questions you can ask in your applications that highlight personality and data-backed accomplishments.
Hold structured and unstructured interviews. To reduce the impact of bias, your hiring process needs structure. Start all of your candidates off with a structured interview, where you ask each candidate the same questions. Your second round of interviews can be unstructured, giving you the opportunity to assess the candidate on a more personal level and gauge qualities such as cultural fit and personality.
Create weighted evaluations. When evaluating candidates, many recruiters accidentally fall under the spell of the halo effect, a type of unconscious bias in which we judge someone as fundamentally good based on one good quality. By implementing a rating scale for both your behavioral-based questions and your subjective assessment of a candidate, you’ll have a healthier mix of qualitative and quantitative criteria to better support your hiring decisions.
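A weighted evaluation like the one described above can be reduced to simple arithmetic. The criteria names and weights below are hypothetical examples; your rubric would define its own, but the mechanics are the same: every candidate is rated on every criterion, and no single impression can dominate the total.

```python
# Hypothetical scoring rubric; criteria and weights are examples, not a standard.
WEIGHTS = {
    "role_skills": 0.40,   # quantitative: skills-assessment score
    "behavioral": 0.35,    # structured behavioral-question ratings
    "culture_add": 0.25,   # subjective panel rating, capped by design
}

def weighted_score(ratings):
    """Combine 1-5 ratings into a single weighted score.
    Requires a rating for every criterion so no dimension is skipped."""
    missing = set(WEIGHTS) - set(ratings)
    if missing:
        raise ValueError(f"missing ratings: {missing}")
    return sum(WEIGHTS[c] * ratings[c] for c in WEIGHTS)

# Weighted total: 0.40*4 + 0.35*5 + 0.25*3 = 4.1
score = weighted_score({"role_skills": 4, "behavioral": 5, "culture_add": 3})
```

Because the subjective "culture" weight is fixed up front, a reviewer charmed by one shared hobby can move the total by at most that weight, which is exactly the guardrail against the halo effect.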
Reduce Recruiter Load
Invest in your recruiting team. Your recruiting team will shine when they aren’t bogged down with initial screening and administrative tasks vulnerable to first-impression biases. When you invest in AI technology, recruiters can shift away from the daily admin work that comes naturally with the job and focus more energy on building relationships with potential hires who have already been screened for experience and skill set. Investing in your recruiters’ tech arsenal gives them the opportunity to do what they do best: build relationships and vet candidate character.
Turn to a conversational AI recruiting assistant. When choosing an AI tool, seek out conversational AI specifically. In short, conversational AI is a set of technologies that enable computers to simulate real, human-like conversations.
Conversational AI serves a deeper purpose than simply answering FAQs. It helps build strong relationships with candidates in a way that wasn’t previously possible with existing bots. Rather than handling a simple input and output, a conversational AI can carry complex initial candidate conversations: it learns the nuances of our language and considers how we say something rather than just what we say. This way, the candidate experience begins with a dialogue that feels more human and less robotic.
Awareness of bias is only the first step; the second is to create a plan of action to address it. Consider your current sourcing, recruiting, and hiring process. Where are the potential blind spots where unconscious bias could come into play? While recruiting and hiring is just one component of the labor market, the negative impact of unconscious bias undermines any organizational effort to create a more equitable world of work. In what areas of your hiring process could bias occur, and what are the potential solutions?
We all like to think we’re incapable of biases. Still, the reality is that your hiring managers and recruiters are fighting to prevent unconscious bias from impacting their hiring decisions, and they’d welcome some help. With the right processes supported by the right tech, you can mitigate the impact of unconscious bias, and your company can take another step towards creating a more equitable world of work for all.
Whether you use conversational AI to support your hiring or not, the steps in this cheat sheet will make a positive impact on your processes. Remember: unconscious bias is not only unfair to your pool of candidates, it also drives poor hiring decisions, leaving more opportunities for your competitors to hire the top talent you overlook.
If you’re ready to find out more about conversational AI, rise above your competitors, and impress your candidates, say Hello to Mya today!