When I applied for my job at the Office of Disability Employment Policy a few years ago, a human resources officer reviewed my resume and scheduled interviews with potential colleagues. That’s a familiar process for many job applicants, though artificial intelligence tools like automated resume screening and interview analysis are increasingly common. While these tools can simplify the complicated hiring process, research shows that AI hiring tools can reflect the biases of the people who create them.
I got hired. But being disabled makes me wonder: Would AI tools have screened me out? Now it’s my job to answer questions like this, figuring out the impacts emerging technologies can have on disability inclusion at work, both good and bad.
Advancing Diversity, Equity, Inclusion and Accessibility
My work with the ODEP-funded Partnership on Employment and Accessible Technology creates resources to help organizations build inclusive workplaces. Our goal is to make workplace technology accessible and equitable for all, including people with disabilities.
For example, in celebration of National Disability Employment Awareness Month, the partnership launched a new AI & Disability Inclusion Toolkit. It helps organizations navigate the potential risks and biases that come with implementing AI technologies, learn best practices for making AI implementations more equitable and make the business case for equitable AI to organizational leaders.
The toolkit includes an Equitable AI Playbook, which provides a blueprint to foster disability inclusion while procuring, developing and implementing AI technologies.
The Risks of AI
We created the AI & Disability Inclusion Toolkit to offset the risks of AI tools that evaluate people, especially in the hiring process. Although organizations tend to have good intentions, people can still perpetuate biases that inadvertently exclude qualified candidates.
For example, imagine an interview in which you must complete a series of virtual games meant to measure cognitive abilities such as reaction time, attention span and ability to focus under pressure. Such a test may not accurately measure your suitability for the job, especially if you have a disability that affects your score. In addition, the testing platform itself could be inaccessible or require you to request an accommodation, forcing you to disclose your disability.
Another example is AI-powered tools that track a job candidate’s eye movement during a video interview. This tracking is intended to measure a candidate’s engagement but can disadvantage candidates whose disabilities affect vision, eye contact, movement and other traits.
AI for Good
As with any new technology, organizations will need to refine their approach and ensure the AI tools they use are truly accessible. The toolkit and playbook chart a clear path toward inclusive, equitable AI that helps identify the best-qualified candidates, the ultimate goal of any hiring process.
Check out these resources to learn more about AI equity:
- AI & Disability Inclusion Toolkit
- Equitable AI Playbook
- Podcast: AI and Workplace Inclusion
- AI for Good Shows Promise to Enhance Work Accessibility
- Podcast: Ethics and Bias in Artificial Intelligence (AI) Technology
Nathan Cunningham is a senior policy advisor for the U.S. Department of Labor’s Office of Disability Employment Policy.