AI Discrimination Against People With Disabilities Revealed

AI is quickly taking the world by storm, working its way into nearly every aspect of everyday life. From writing school essays to outlining business proposals, AI is humanity's newest invention, and it seems to be here to stay.

Controversy Spreads

Image Credit: Shutterstock / VesnaArt

However, the employment of these systems is quickly sparking controversy, most notably in the hiring process. The controversy centers on the unchecked biases in how AI scans applications.

Overusing AI

Image Credit: Shutterstock / Song_about_summer

An overwhelming majority of companies depend on AI to filter resumes and cover letters. In fact, a 2021 report from Harvard Business School found that 99% of Fortune 500 companies use AI in their recruitment practices.

Biases Abound

Image Credit: Shutterstock / Monkey Business Images

Leaning so heavily on these systems raises a number of concerns, most commonly from demographics that face discrimination in the workforce. This technological prejudice often reflects the lack of diverse thought in those who develop these programs.

A New Law Emerges

Image Credit: Shutterstock / Shutter z

In an attempt to assuage public anxiety around the issue, New York City legislators enacted the Automated Employment Decision Tool Act. This law requires companies using AI in the hiring process to pass specific audits designed to detect bias.

Limits in Diversity

Image Credit: Shutterstock / fizkes

The new act only checks for discrimination connected to “sex, race-ethnicity, and intersectional categories.” However, it fails to outline what standards are being measured.

Who Is Left Behind?

Image Credit: Shutterstock / fizkes

Additionally, the act does not include tracking for biases against people with disabilities, which is a commonly reported form of identity-based discrimination.

Slow to Learn

Image Credit: Shutterstock / Gorodenkoff

The AI algorithms in question are built by coding experts over a period of years. Because of the time involved in their creation, they are incredibly difficult to change.

Not So Easy

Image Credit: Shutterstock / REDPIXEL.PL

Consequently, companies cannot find an easy solution to the shortcomings caused by the embedded bias uncovered in audits of these programs.

Aiming for the Ideal

Image Credit: Shutterstock / Gorodenkoff

The programs are trained on data about ideal candidates or current employees in order to select the strongest applicants. However, by searching for patterns based on those ideals, biases against people with disabilities linger in the shadows of the code.

Across the Board

Image Credit: Shutterstock / fizkes

Individuals with one or more disabilities have historically endured mistreatment in the workplace across industries.

Slow and Not So Steady

Image Credit: Shutterstock / Roman Samborskyi

While many have proposed retraining AI programs by feeding them data from a more diverse pool of applicants, success seems unlikely. These programs are slow to integrate even minute shifts in data, making the relearning process glacial at best.

Defining “Disability”

Image Credit: Shutterstock / William Potter

Legally, the term “disability” covers a wide variety of conditions that affect each individual’s abilities differently. While this flexibility may be a safeguard in judicial matters, not all AI systems are programmed for that level of nuance.

Hidden From Sight

Image Credit: Shutterstock / tsyhun

Disabilities can vary from person to person and may not be visible or obvious. Likewise, the intersectional relationship between disability and other forms of marginalization can keep AI from spotting patterns.

Accommodations Unmet

Image Credit: Shutterstock / fizkes

Another problem, as seen in AI tools like Pymetrics, is that some candidates with disabilities may require essential accommodations to succeed in the workplace. This can lead to systems deeming these applicants unqualified.

Not Meant for Camera

Image Credit: Shutterstock / Marko Aliaksandr

Some of the most common slip-ups in these systems occur in processes like video applications, where the programs measure behaviors such as eye movement, posture, and speech.

What Goes Unaccounted

Image Credit: Shutterstock / bunny pixar

Without accounting for neurodivergence, physical disabilities, or speech disorders, the AI will sort these candidates into the “rejection” pile, an outcome that might turn out differently if the applications were reviewed by human eyes.

Failing to Serve the Public

Image Credit: Shutterstock / Gorodenkoff

It’s no secret that AI comes with a host of pre-programmed biases. That being said, members of the public find it concerning that New York lawmakers passed an act that fails to account for disability.

A More Personal Approach

Image Credit: Shutterstock / fizkes

Attempts to incorporate discrimination audits are not a catch-all solution to the issue. Actual progress will only come when corporations wean themselves off their dependence on AI for personnel decisions.

A Small Step

Image Credit: Shutterstock / GamePixel

All in all, legislation like the AEDTA seems to be a move in the right direction. These steps may not go far enough, though, as failing to effectively account for all forms of discrimination places the burden on the potential employee rather than the company.


Featured Image Credit: Shutterstock / Marko Aliaksandr.

For transparency, this content was partly developed with AI assistance and carefully curated by an experienced editor to be informative and ensure accuracy.
