The Times Australia
The Times World News


What will a robot make of your résumé? The bias problem with using AI in job recruitment

  • Written by Melika Soleimani, Senior Data Analyst, Massey University

The artificial intelligence (AI) revolution has begun[1], spreading to almost every facet of people’s professional and personal lives – including job recruitment.

While artists fear copyright breaches[2] or simply being replaced, businesses and managers are becoming increasingly aware of the possibilities for greater efficiency in areas as diverse as supply chain management, customer service, product development and human resources (HR) management.

Soon all business areas and operations will be under pressure to adopt AI in some form or another. But the very nature of AI – and the data behind its processes and outputs – means human biases are being embedded in the technology.

Our research[3] looked at the use of AI in recruitment and hiring – a field that has already widely adopted AI to automate the screening of résumés and to rate video interviews by job applicants.

AI in recruitment promises greater objectivity and efficiency[4] during the hiring process by eliminating human biases and enhancing fairness and consistency in decision making.

But our research shows AI can subtly – and at times overtly – heighten biases. And the involvement of HR professionals may worsen rather than alleviate these effects. This challenges our belief that human oversight can contain and moderate AI.

Magnifying human bias

Although one of the reasons for using AI in recruitment is that it is meant to be more objective and consistent, multiple studies[5] have found the technology is, in fact, very likely to be biased[6]. This happens because AI learns from the datasets used to train it. If the data is flawed[7], the AI will be too.

Biases in data can be made worse by the human-created algorithms supporting AI, which often contain human biases in their design[8].
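This feedback loop can be illustrated with a toy sketch. The data and the scoring rule below are entirely hypothetical, not from our study: a naive screening model that rates candidates by the hire rate of similar past applicants simply reproduces whatever bias the history contains.

```python
# Hypothetical historical records: (qualification_score, group, hired).
# Past recruiters hired group A more often than group B at the same
# qualification level.
history = [
    (7, "A", True), (7, "A", True), (7, "A", False),
    (7, "B", True), (7, "B", False), (7, "B", False),
]

def predicted_hire_rate(group, score, records):
    """Score a new candidate by the hire rate of past applicants
    with the same group and qualification score."""
    matches = [hired for (s, g, hired) in records if g == group and s == score]
    return sum(matches) / len(matches)

# Two equally qualified candidates get different scores purely because
# of the historical preference the model was trained on.
rate_a = predicted_hire_rate("A", 7, history)  # 2/3
rate_b = predicted_hire_rate("B", 7, history)  # 1/3
print(rate_a, rate_b)
```

The model contains no explicit rule disfavouring group B; the disparity comes entirely from the training data, which is the point.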

In interviews with 22 HR professionals, we identified two common biases in hiring: “stereotype bias” and “similar-to-me bias”.

Stereotype bias occurs when decisions are influenced by stereotypes about certain groups, such as preferring candidates of the same gender, leading to gender inequality.

“Similar-to-me” bias happens when recruiters favour candidates who share similar backgrounds or interests to them.

These biases, which can significantly affect the fairness of the hiring process, are embedded in the historical hiring data then used to train the AI systems. This leads to biased AI.

So, if past hiring practices favoured certain demographics, the AI will continue to do so. Mitigating these biases is challenging because algorithms can infer hidden personal attributes from other, correlated information.

For example, in countries with different lengths of military service for men and women, an AI might deduce gender based on service duration.
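The proxy effect can be sketched in a few lines. The numbers here are hypothetical: suppose conscription is 24 months for men and 12 for women. Even if gender is stripped from the data, any model fed the service-duration feature can recover it with a simple threshold.

```python
# Hypothetical records: (service_months, gender). Gender is kept only
# to check accuracy; the "model" never sees it as an input.
records = [(24, "M"), (24, "M"), (12, "F"), (12, "F"), (24, "M"), (12, "F")]

def infer_gender(service_months):
    # The kind of threshold any model would implicitly learn
    # from this feature.
    return "M" if service_months >= 18 else "F"

correct = sum(infer_gender(s) == g for s, g in records)
print(correct / len(records))  # the proxy fully recovers gender here
```

Removing the sensitive column is therefore not enough; the correlated feature carries the same signal.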

This persistence of bias underscores the need for careful planning and monitoring to ensure fairness in both human and AI-driven recruitment processes.

Can humans help?

As well as HR professionals, we interviewed 17 AI developers. We wanted to investigate how an AI recruitment system could be developed that would mitigate rather than exacerbate hiring bias.

Based on the interviews, we developed a model wherein HR professionals and AI programmers would go back and forth in exchanging information and questioning preconceptions as they examined data sets and developed algorithms.

However, our findings reveal that the difficulty in implementing such a model lies in the educational, professional and demographic differences between HR professionals and AI developers.

These differences impede effective communication, cooperation and even the ability to understand each other. While HR professionals are traditionally trained in people management and organisational behaviour, AI developers are skilled in data science and technology.

These different backgrounds can lead to misunderstandings and misalignment when working together. This is particularly a problem in smaller countries such as New Zealand, where resources are limited and professional networks are less diverse.

Does HR know what AI programmers are doing, and vice versa? Getty Images

Connecting HR and AI

If companies and the HR profession want to address the issue of bias in AI-based recruitment, several changes need to be made.

Firstly, the implementation of a structured training programme for HR professionals focused on information system development and AI is crucial. This training should cover the fundamentals of AI, the identification of biases in AI systems, and strategies for mitigating these biases.

Fostering better collaboration between HR professionals and AI developers is also important. Companies should look to create teams that include both HR and AI specialists, which can help bridge the communication gap and better align their efforts.

Moreover, developing culturally relevant datasets is vital for reducing biases in AI systems. HR professionals and AI developers need to work together to ensure the data used in AI-driven recruitment processes are diverse and representative of different demographic groups. This will help create more equitable hiring practices.
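One concrete check such a joint team could run (a minimal sketch with made-up numbers, not a prescribed method) is to compare each group's share of the training data against its share of the applicant population and flag under-representation.

```python
from collections import Counter

# Hypothetical training data and population shares.
training_groups = ["A"] * 70 + ["B"] * 30   # 70/30 split in the data
population_share = {"A": 0.5, "B": 0.5}     # 50/50 split in reality

counts = Counter(training_groups)
total = sum(counts.values())

for group, target in population_share.items():
    actual = counts[group] / total
    # Flag groups more than 10 percentage points below their
    # population share (an arbitrary threshold for illustration).
    flag = "UNDER-REPRESENTED" if actual < target - 0.1 else "ok"
    print(group, round(actual, 2), flag)
```

A check like this will not remove bias on its own, but it makes the representativeness of the data visible to both professions before training begins.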

Lastly, countries need guidelines and ethical standards for the use of AI in recruitment that can help build trust and ensure fairness. Organisations should implement policies that promote transparency and accountability in AI-driven decision-making processes.

By taking these steps, we can create a more inclusive and fair recruitment system that leverages the strengths of both HR professionals and AI developers.

References

  1. ^ revolution has begun (www.cbsnews.com)
  2. ^ copyright breaches (www.researchgate.net)
  3. ^ research (www.igi-global.com)
  4. ^ greater objectivity and efficiency (www.researchgate.net)
  5. ^ multiple studies (academic.oup.com)
  6. ^ very likely to be biased (www.mdpi.com)
  7. ^ data is flawed (www.mdpi.com)
  8. ^ often contain human biases in their design (www.vox.com)

Read more https://theconversation.com/what-will-a-robot-make-of-your-resume-the-bias-problem-with-using-ai-in-job-recruitment-231174