How Can AI Recruiting Support Diversity?
If you’re a leader in the staffing or talent acquisition space, you already know that a diverse workforce is a better workforce: it drives productivity, engagement, innovation, a broader range of skills, and everything in between. Knowing you need to build a diverse team is a great start, but knowing it alone isn’t, and never has been, enough.
Market data shows that many recruiting leaders don’t fully understand the impact AI can have on their diversity efforts. That isn’t surprising, because the AI recruiting space is cluttered and confusing even for those most ready to take on new technology. If you really care about improving diversity, you need to recognize that choosing the right technology can help you eliminate bias and make more diverse hires than ever before.
So, let’s address the issues with the different types of recruiting technology and what to consider when evaluating AI recruiting for its diversity impact.
How AI Recruiting Can Create Bias in the Hiring Process
From the outside looking in, many would think, “Why wouldn’t artificial intelligence be an objective third party? How can there be any bias?” But AI doesn’t just appear from the ether. It’s created by humans, who are inherently biased creatures. And just because an action is automated doesn’t mean the action itself is objective. Consider this:
1. Was the team that built the AI diverse?
You may have heard of the AI built for, and by, companies like Amazon and Facebook that was trained on datasets dominated by white male engineers and built by teams of white male engineers. The AI naturally inherited their bias. When evaluating AI for your recruiting initiatives, ask who built it and with what data. If the team isn’t diverse, the product is likely biased from the start.
2. Is it used for resume parsing?
You want to avoid resume parsing. With parsing, AI looks for keywords in resumes, which exacerbates the known issues with resume screening. Studies show that companies are more than twice as likely to call minority applicants for interviews when they submit ‘whitened’ resumes than when candidates reveal their race—whether or not the company claims to value diversity. A staffing or recruiting leader may not even be aware this is occurring. When a vendor says its technology looks for ‘best fit’ keywords, consider what those keywords include, what they omit, and which candidates you may be missing out on in the process.
Don’t Feel Like There’s No Form of AI Out There That You Can Trust.
Don’t feel like there’s no form of AI out there that you can trust. With the right vetting of your tech, you can still make a positive impact on diversity.
1. Look for diverse teams.
Look for AI recruiting companies with diverse engineering teams and established processes for preventing bias from entering their AI. If the platform was designed with avoiding bias in mind, the company has likely worked toward a genuinely objective product. Make sure it was built by teams that include women and people of color, and that the data backs it up.
2. Seek out chatbots and conversational AI.
In lieu of resume parsers, consider recruiting chatbots and conversational AI. These automation solutions focus on gathering candidate data that’s actually relevant to the individual’s ability to perform the role—by way of a conversation. A pointed conversation can reveal information that resumes alone won’t show.
3. Focus on conversation design.
Seek out AI solutions that focus on conversation design. Data from job description studies shows that certain words used during the hiring process dissuade both women and minorities from pursuing roles. The same concept rings true for AI recruiting. Companies with a keen focus on conversation design have dedicated teams of professional linguists who carefully consider each word the AI uses, preventing biased language that would steer women and minorities away.
The need for a diverse workforce will never disappear, and making vital decisions at your company, like choosing the right AI recruiting technology for diversity, is at the core of your future success. Discover how Mya’s all-in-one Conversational AI platform puts the conversation first and removes bias from your recruiting process.