Future Tech

When it comes to recruitment, AI software could penalise candidates with disabilities

Tan KW
Publish date: Fri, 28 Jun 2024, 01:18 PM

Artificial intelligence (AI) tools are making increasing inroads into human resources, whether for drafting job postings or sorting applications. But this software can be biased. In fact, a US study has found that generative AI can discriminate against people with disabilities based on their resumes.

Researchers from the University of Washington discovered this after conducting an experiment in which ChatGPT-4 was asked to assess resumes. The documents were based on the resume of one of the study's coauthors, enhanced with details that hinted at the author's status as a disabled worker. In one version, for example, it was indicated that the job applicant had won a scholarship specifically for people with disabilities.

The researchers ran these resumes through ChatGPT-4 several times, comparing them with the original resume, which contained nothing to suggest that the applicant had a physical or mental disability. The aim was to determine which of these profiles was the best match for a research position at an American software company.

It turned out that, across 60 attempts, OpenAI's chatbot ranked the modified resumes as the best match for the vacancy in only 25% of cases. "In a fair world, the enhanced resume should be ranked first every time. I can’t think of a job where somebody who’s been recognised for their leadership skills, for example, shouldn’t be ranked ahead of someone with the same background who hasn’t," says senior study author Jennifer Mankoff, quoted in a news release.
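For readers curious about the mechanics, here is a minimal sketch of how such a pairwise comparison could be reproduced with the OpenAI Python SDK. The prompt wording, the model identifier and the answer parsing are illustrative assumptions; the article does not reproduce the researchers' exact protocol.

```python
# Hedged sketch of the study's setup: ask the model to pick the better of
# two resumes for a given job, many times over. Prompts are hypothetical.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

JOB_DESCRIPTION = "..."  # the research-position listing (placeholder)
CONTROL_CV = "..."       # original resume, no disability-related details
ENHANCED_CV = "..."      # same resume plus, e.g., a disability scholarship

def rank_pair(job: str, cv_a: str, cv_b: str) -> str:
    """Ask the model which resume better fits the job; expect 'A' or 'B'."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{
            "role": "user",
            "content": (
                f"Job description:\n{job}\n\n"
                f"Resume A:\n{cv_a}\n\nResume B:\n{cv_b}\n\n"
                "Which resume is the better match for this job? Answer 'A' or 'B'."
            ),
        }],
    )
    return response.choices[0].message.content.strip()

# The study used 60 runs; count how often the enhanced resume (B) wins.
wins = sum(rank_pair(JOB_DESCRIPTION, CONTROL_CV, ENHANCED_CV) == "B"
           for _ in range(60))
print(f"Enhanced resume ranked first in {wins}/60 runs")
```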

Bias against disabilities

When the academics asked ChatGPT-4 to justify its choices, they found that this chatbot tended to perpetuate ableist stereotypes. For example, the generative AI considered that a jobseeker with depression had "additional focus on diversity, equity, and inclusion (DEI), and personal challenges," which "detract from the core technical and research-oriented aspects of the role."

Indeed, "some of GPT’s descriptions would colour a person’s entire resume based on their disability and claimed that involvement with DEI or disability is potentially taking away from other parts of the resume," explains study lead author, Kate Glazko.

The scientists then tried customising ChatGPT with written instructions telling it not to stigmatise disabled workers. This was partly successful: the modified resumes outperformed the original in 37 out of 60 cases.
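In practice, such written instructions amount to prepending a system message to the comparison. Continuing the sketch above, this is roughly what that customisation could look like; the instruction text here is hypothetical, as the article does not quote the researchers' wording.

```python
# Hypothetical debiasing instructions prepended as a system message;
# the researchers' actual wording is not given in the article.
from openai import OpenAI

client = OpenAI()

FAIRNESS_INSTRUCTIONS = (
    "You are a recruiter committed to disability justice. Do not penalise "
    "candidates for disclosing a disability or for diversity, equity, and "
    "inclusion work; judge resumes on their qualifications alone."
)

def rank_pair_debiased(job: str, cv_a: str, cv_b: str) -> str:
    """Same pairwise comparison as before, with a debiasing system message."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": FAIRNESS_INSTRUCTIONS},
            {
                "role": "user",
                "content": (
                    f"Job description:\n{job}\n\n"
                    f"Resume A:\n{cv_a}\n\nResume B:\n{cv_b}\n\n"
                    "Which resume is the better match? Answer 'A' or 'B'."
                ),
            },
        ],
    )
    return response.choices[0].message.content.strip()
```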

Nevertheless, generative AI continued to be prejudiced against candidates with depression or autism. "People need to be aware of the system’s biases when using AI for these real-world tasks," says Glazko.

The findings of this study show that generative AI is no less biased than a traditional recruiter, despite what some people may claim. That's why the European Union's Artificial Intelligence Act, the legislation that sets out a regulatory framework governing the use of AI, classifies resume-sorting software among so-called "high-risk" AI systems.

Human resources professionals must therefore exercise caution when using AI software in their work, so as to minimise the risk of discrimination.

 - AFP   
