Ep 224 – Eliminating Algorithmic Bias in Hiring and Employment

This Workology Podcast is sponsored by Workology. The business case for artificial intelligence in HR and your workplace is growing by leaps and bounds every single day. Employers and HR leaders have real concerns about biased hiring when using AI, a problem commonly referred to as algorithmic bias. In today’s podcast episode we’re going straight to the source: talking with one AI company that works in HR about its approach to reducing algorithmic bias and the steps that employers can take.

Episode 224: Eliminating Algorithmic Bias in Hiring and Employment with Steve Feyer (@SteveFeyer)

This episode of the Workology Podcast is part of our Future of Work series powered by PEAT, the Partnership on Employment & Accessible Technology. In honor of the upcoming 30th anniversary of the Americans with Disabilities Act this July, we’re investigating what the next 30 years will look like for people with disabilities at work, and the potential of emerging technologies to make workplaces more inclusive and accessible. Today, I’m joined by Steve Feyer. He’s the Director of Product Marketing at the HR technology company Eightfold.AI.

How Artificial Intelligence Can Help in Hiring People with Disabilities 

AI technologies offer a great way to scale and automate your hiring in a number of different ways. Steve shares with us how artificial intelligence helps organizations find hidden talent. We’ve seen in our podcast series with PEAT on the future of work how AI can be used to match candidates with disabilities to jobs. We’ve talked with lawyers as well as PhDs about how AI might help or hurt employment prospects for candidates whose work experiences, education, and other areas might limit their qualifications in an AI-driven process. What’s critical for employers to understand is that applying AI is a partnership between the technology provider and the employer itself. An AI algorithm isn’t one size fits all. It must be constantly monitored and is designed to change and evolve over time. AI technologies help identify talent in a variety of different ways. It is the responsibility of employers to educate themselves on the application and use of AI when it comes to hiring and recruiting employees of all experiences and backgrounds, including people with disabilities.

AI for HR and Recruiting Is a Continuous Process 

There’s been a lot of news about the benefits and perils of AI. One of the best-known case studies of AI gone wrong is Amazon’s scrapped recruiting tool. I’ll include the link to this case study in the resources section of this podcast episode. Steve talks to us about how AI technologies like Eightfold are in a constant flow of continuous improvement using what he calls an Equal Opportunity Algorithm. The Equal Opportunity Algorithm is a series of statistical tests that Eightfold runs against its AI. In the podcast interview, Steve walks us through an example of how this check and balance works using the topic of gender. He shares that this method looks at all the factors the model is considering and tells you whether any of them are actually creating a bias against a gender. For example, the process might flag the college a candidate or prospect attended as a factor introducing gender bias into the algorithm’s output. Using this form of check and balance, the employer can modify the data the computer is allowed to use, tell it to forget a factor, and run the test again to confirm that the bias is no longer present.
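Eightfold hasn’t published the internals of its Equal Opportunity Algorithm, but the remove-a-factor-and-retest loop Steve describes can be sketched with a common adverse-impact check, the four-fifths (80%) rule. Everything below is a hypothetical illustration, not Eightfold’s method: the toy scoring model, the “Elite U” college bonus, and the sample candidates are all invented for the example.

```python
# Illustrative sketch only: a statistical check for group bias in a model's
# shortlist, followed by dropping the suspect factor and re-running the check.
# The four-fifths (80%) rule is a widely used adverse-impact test.

def selection_rates(candidates, selected):
    """Selection rate per gender group."""
    rates = {}
    for group in {c["gender"] for c in candidates}:
        pool = [c for c in candidates if c["gender"] == group]
        picked = [c for c in pool if c in selected]
        rates[group] = len(picked) / len(pool)
    return rates

def passes_four_fifths(rates):
    """True if the lowest group's rate is at least 80% of the highest's."""
    lo, hi = min(rates.values()), max(rates.values())
    return hi == 0 or lo / hi >= 0.8

def score(candidate, use_college):
    """Toy scoring model; 'college' is the hypothetical biased factor."""
    s = candidate["skills"]
    if use_college and candidate["college"] == "Elite U":
        s += 5  # this bonus happens to correlate with gender in the sample
    return s

# Invented sample data: equal skills across groups, but only the men
# attended the college that the scoring model rewards.
candidates = [
    {"gender": "F", "skills": 9, "college": "State U"},
    {"gender": "F", "skills": 8, "college": "State U"},
    {"gender": "F", "skills": 7, "college": "State U"},
    {"gender": "M", "skills": 9, "college": "Elite U"},
    {"gender": "M", "skills": 8, "college": "Elite U"},
    {"gender": "M", "skills": 7, "college": "Elite U"},
]

def run_check(use_college):
    """Shortlist the top 4 and test the result for adverse impact."""
    ranked = sorted(candidates, key=lambda c: score(c, use_college),
                    reverse=True)
    selected = ranked[:4]
    return passes_four_fifths(selection_rates(candidates, selected))

print("With college factor:   ", run_check(use_college=True))   # fails
print("College factor removed:", run_check(use_college=False))  # passes
```

In this sample, the shortlist fails the four-fifths test while the college bonus is in play and passes once that factor is dropped, mirroring the modify-the-data, re-run-the-test cycle described above.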

The key point Steve makes about algorithms is that they are not perfect and require a series of checks, balances, and constant analysis to ensure that bias is not creeping in. Once bias is detected, it is up to the employer, in partnership with the AI technology provider, to make the change and adjustment. It’s a continuous process aimed at eliminating conscious and unconscious bias in the hiring process.

HR leaders should ask their AI providers how they address bias and how the AI reaches a specific prediction without it. Be sure that your vendor has considered this issue and can offer proof. - @stevefeyer #recruiting #hiring #ai


Eightfold AI’s transparent approach is what we need from more AI tech companies, especially in the HR and recruiting industry. Because of employment laws, HR needs to work hand in hand with its HR technologies, especially those that use AI. As HR leaders, it’s our responsibility to be educated and aware of the benefits and potential pitfalls that exist when using AI. This starts with educating yourself on the fundamentals of artificial intelligence. I’m linking to a number of podcast interviews we’ve done in the past on AI as well as additional resources to help you get started in diving into this fast-moving technology. 

The future of work series in partnership with PEAT is one of my favorites. Thank you to PEAT as well as our podcast sponsor Workology.

Connect with Steve Feyer on LinkedIn



– Buy IT!—Your Guide for Purchasing Accessible Technology

– Ep 185: Making Artificial Intelligence Inclusive

– Ep 121: How Artificial Intelligence Creates Discrimination in Hiring and HR

– Amazon Scraps Secret AI Recruiting Tool That Showed Bias Against Women

– AI Fairness for People with Disabilities 

– Episode Transcript 


How to Subscribe to the Workology Podcast


Stitcher | PocketCast | iTunes | Podcast RSS | Google Play | YouTube | TuneIn

Find out how to be a guest on the Workology Podcast.

Jessica Miller-Merrell

Learn more about Jessica Miller-Merrell, SPHR, SHRM-SCP, the founder of Workology, a workplace HR resource, and the host of the Workology Podcast. More of her blogs can be found here.
