Editorial: Humans supply missing link for AI
- September 24, 2025
- Posted by: Web workers
- Category: Finance
Human resources professionals have been trying to use more objective standards in recruitment for decades. Rather than relying on personal connections to find candidates and intuition to assess their abilities, organizations have turned to an array of methods to ensure they are hiring or promoting the right person for the job.
Some hiring aids have been more accepted than others. As someone whose handwriting a 6-year-old would be ashamed of, I, for one, have always been skeptical of the value of graphology in assessing a job candidate’s suitability for a position, for example.
Others, though, have been widely acknowledged as successful methods to remove bias in hiring. Blind auditions, where musicians play to appraisers from behind a screen, have been shown to greatly reduce gender and racial bias in hiring by orchestras.
As we report here, artificial intelligence is the latest tool being used to make recruitment more efficient and to reduce bias in the process.
But like many other technology applications, AI needs to be used in tandem with human assessments when it is tapped to help fill positions.
One of the major concerns with using AI in hiring and promotion is that, despite the apparent objectivity of a machine, AI can introduce bias. That’s because the algorithms used are trained on historical data that may include biases, and if that information is accepted unchecked, it can perpetuate those biases rather than reduce or eliminate them.
In addition, anyone who has ever been involved in the hiring process knows that it’s an art as well as a science. While it’s imperative to be aware of potential preconceptions when making decisions, qualities such as a candidate’s creativity or ability to collaborate are not easily assessed by an algorithm.
The same is true for promotions. While data on performance and achievements play an important role, qualities such as leadership, interpersonal skills and other soft skills must also be assessed.
Lawsuits alleging bias by companies that use AI in hiring have already been filed; more will follow, and if any succeed, a flood of litigation will no doubt begin.
In addition, states and municipalities are introducing measures to regulate the use of AI in hiring, and the Equal Employment Opportunity Commission has signaled that the issue is on its radar.
Such concerns should not deter companies from using AI to streamline and improve what can be a difficult and laborious process. To mitigate potential liabilities, though, organizations should perform regular audits of their AI-based hiring systems, ensure that legal, HR and risk management collaborate to manage the exposures, and ultimately make certain that the process of hiring people is controlled by people.