Bots, AI Are Analysing Your Anger, Happiness During Interviews; But Can This Be Biased Against Women?

Over the years, the recruitment process has become more advanced as job applications keep rolling in. To keep up with the never-ending demand, many companies have adopted innovative hiring methods, such as predictive hiring tools.

These A.I. tools do the work of several thousand humans efficiently and at a lower cost. The algorithms analyze a candidate's facial expressions and tone of voice for traits such as "anger", "nervousness" or "confidence" to decide whether the candidate is fit for the job.
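To make the idea concrete, here is a rough, hypothetical sketch of how such a screen might turn emotion scores into a pass/fail decision. Real vendors' models are proprietary; the trait names, thresholds and function below are invented purely for illustration.

```python
# Hypothetical illustration only: real screening models are proprietary.
# Assumes an upstream emotion classifier that returns probabilities for
# traits like "anger", "nervousness" and "confidence".

def screen_candidate(emotion_scores: dict,
                     min_confidence: float = 0.6,
                     max_anger: float = 0.3) -> bool:
    """Return True if the candidate passes this made-up automated screen."""
    confident_enough = emotion_scores.get("confidence", 0.0) >= min_confidence
    calm_enough = emotion_scores.get("anger", 0.0) <= max_anger
    return confident_enough and calm_enough

# A nervous but otherwise capable candidate can be filtered out by the thresholds alone
print(screen_candidate({"confidence": 0.45, "anger": 0.10, "nervousness": 0.70}))  # False
```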


A.I. Enhances The Hiring Process

Intentionally or unintentionally, the recruitment process is often driven by first impressions, and biases remain common despite increased awareness of discriminatory hiring practices.

A.I. tools are already helping businesses with hiring by creating convenience for both sides: candidates can be interviewed whenever they want, and recruiters are free to review the recordings according to their own schedules.

Using algorithm-based video interviews along with aptitude tests, Axis Bank was able to standardise and scale up its interview process, hiring 2,000 customer service officers from a pool of 40,000 applicants.

Insurance provider Bajaj Allianz has successfully hired more than 1,600 employees through robotic video assessment tools, with minimal error.

Concerns About Software Accuracy And Bias Against Women

One of the major drawbacks of these A.I. tools is that whatever goes in is what comes out. If bias already exists in the recruitment process, such as favouring men over women in tech or certain skin tones over others, the A.I. bot only knows what it is told.

This was one of the reasons Amazon discarded its recruitment tool: it had a built-in gender bias against women.

After scanning candidates from past job postings, the recruitment tool learned that most applicants were men, concluded that men were the "ideal" fit for the job, and penalised resumes containing phrases such as "women's golf club member" as well as those from candidates who went to women's colleges.
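A toy sketch, not Amazon's actual system, shows how this happens: a model trained on historical hiring outcomes in which most successful applicants were men will penalise words associated with women, even though those words say nothing about ability. The resumes and labels below are invented for illustration.

```python
# Toy illustration (not Amazon's actual system): a classifier trained on
# past hiring decisions in which most successful applicants were men
# learns to penalise words associated with women.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "captain of the men's chess club, python developer",      # hired
    "men's football team, software engineer",                 # hired
    "women's golf club member, python developer",             # not hired
    "graduated from a women's college, software engineer",    # not hired
]
hired = [1, 1, 0, 0]  # historical (already biased) outcomes used as training labels

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Two otherwise identical resumes: the model scores the "women's" one lower,
# because the word "women" only ever appeared in rejected training examples.
for text in ["men's chess club captain, python developer",
             "women's chess club captain, python developer"]:
    prob = model.predict_proba(vectorizer.transform([text]))[0, 1]
    print(f"{prob:.2f}  {text}")
```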

Could This Software Disadvantage Less Expressive People?

Since the software judges emotions based on facial expressions and voice, it could work against people who are less expressive, such as those who are autistic.

Knowing that your facial expressions are being closely watched and analyzed can be an unnerving experience, and many people become self-conscious.

Our moods change easily. What if a person appears for the interview with a severe headache? The bot will pick up on the distressed expression and may put the candidate on the reject list.

But This Does Not Mean It's The End For A.I. In Hiring

One way to curb built-in bias is to control the data the program is crunching and configure it to exclude data points that could reflect biased hiring decisions from the past.
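As a minimal sketch of that idea, assuming applicant records arrive as a table, one could drop columns known to act as proxies for gender before the model ever sees them. The column names here are hypothetical.

```python
# Minimal sketch of the mitigation described above: strip out features
# that directly encode or proxy for gender before training or scoring.
# The column names are hypothetical.
import pandas as pd

GENDER_PROXY_COLUMNS = ["gender", "womens_college", "gendered_club_membership"]

def strip_proxy_features(applicants: pd.DataFrame) -> pd.DataFrame:
    """Return a copy of the applicant table without gender-proxy columns."""
    present = [c for c in GENDER_PROXY_COLUMNS if c in applicants.columns]
    return applicants.drop(columns=present)
```

Dropping obvious columns is only a first step; as the Amazon example shows, bias can still leak in through correlated words and features, so the training data itself has to be audited.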

A.I. is a great way to make the hiring process less tedious and more efficient.

