Unconscious bias in hiring algorithms

Nov 26, 2021


Artificial intelligence has become an integral part of the workplace, and for good reason: AI reduces the time it takes to perform tasks, minimizes human error, and operates around the clock without interruptions or breaks.

When it comes to employment and hiring practices, AI is used to: 

  • advertise open positions
  • sort through resumes and job applications 
  • select applicants for interviews
  • conduct pre-selection interviews

But an AI program is only as good as the person who creates it. And because we all carry some unconscious biases, those biases can make their way into artificial intelligence systems. For example, an employer could configure its AI system to reject the resumes and applications of anyone who has been unemployed for a long time. Such an algorithm could discard countless qualified candidates with disabilities, whose employment gaps often reflect medical or accessibility needs rather than ability. Examining and adjusting AI algorithms is not enough, though: employers and hiring managers also need to reform their hiring practices to ensure every qualified job applicant is considered.
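To make the mechanism concrete, here is a minimal sketch of how such a screening rule plays out. All names and fields (`Resume`, `months_since_last_job`, the six-month cutoff) are hypothetical illustrations, not any vendor's actual system:

```python
from dataclasses import dataclass

@dataclass
class Resume:
    name: str
    skills: set
    months_since_last_job: int

# Hypothetical screening rule: discard anyone unemployed
# longer than six months, regardless of qualifications.
MAX_GAP_MONTHS = 6

def passes_screen(resume: Resume) -> bool:
    return resume.months_since_last_job <= MAX_GAP_MONTHS

candidates = [
    Resume("Candidate A", {"python", "sql"}, 2),
    # Fully qualified, but a 14-month disability-related gap
    # means this resume never reaches a human reviewer.
    Resume("Candidate B", {"python", "sql", "accessibility"}, 14),
]

shortlist = [c.name for c in candidates if passes_screen(c)]
print(shortlist)
```

The rule never mentions disability, yet it systematically screens out applicants whose gaps stem from medical leave or inaccessible past workplaces. That is the quiet way unconscious bias becomes automated bias.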

Employers looking to improve their hiring approach would greatly benefit from our three iDisability® training modules dedicated to interviewing individuals with disabilities. The Inclusive Practices in Remote Screening module specifically covers how AI and other elements of the remote interview process may be affected by unconscious bias.

Creating a hiring process that doesn’t automatically discount candidates with disabilities benefits both an organization and its workforce. iDisability® has helped over 4 million business users across a variety of industries promote a more inclusive workforce. Discover how we can help you today.
