When AI is replaced by humans…


Yes, you read that correctly: initiatives led by artificial intelligence can fail, and fail badly. Humans are not replaceable in all instances, it seems…

We read with interest an article in The Guardian newspaper in October last year, telling us that Amazon had finally ditched its attempt to replace humans with AI in the world of recruitment. This followed a problem with its AI recruitment tool, which failed one of the most fundamental requirements in recruiting: picking the right person for the right job. It appears that artificial intelligence can be gender biased! Who knew?

Amazon’s experimental hiring tool, which had been under development since 2014, apparently did not like women.

The fundamental flaw purportedly arose because the algorithm was trained on the previous 10 years' worth of CVs that had been submitted. These CVs came predominantly from men, and the algorithm wrongly inferred that being male made candidates more successful in posts such as software development. Entirely on its own, the system had taught itself that men were best for positions such as these!
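The mechanism is easy to illustrate. Below is a minimal sketch, using entirely hypothetical data (the real Amazon system and its training set are not public), of how a simple word-scoring model trained on biased historical hiring outcomes reproduces that bias: because most hired candidates in the history are men, words associated with female candidates, such as "women's", end up with negative weights.

```python
from collections import defaultdict

# Hypothetical historical CVs as (set of words, hired?) pairs.
# The history is skewed: hired candidates are mostly men, so the word
# "women's" (as in "captain of the women's chess club") appears mainly
# on rejected CVs.
history = [
    ({"software", "engineer", "chess", "captain"}, True),
    ({"software", "developer", "football", "captain"}, True),
    ({"engineer", "java", "chess"}, True),
    ({"software", "engineer", "women's", "chess", "captain"}, False),
    ({"developer", "java", "women's", "club"}, False),
]

def train(data):
    """Weight each word by its hired-rate minus its rejected-rate."""
    hired, rejected = defaultdict(int), defaultdict(int)
    n_hired = sum(1 for _, h in data if h)
    n_rejected = len(data) - n_hired
    for words, h in data:
        for w in words:
            (hired if h else rejected)[w] += 1
    return {w: hired[w] / n_hired - rejected[w] / n_rejected
            for w in set(hired) | set(rejected)}

def score(weights, cv_words):
    """Score a CV by summing the learned weights of its words."""
    return sum(weights.get(w, 0.0) for w in cv_words)

weights = train(history)

# Two identical CVs, except one mentions a women's chess club:
cv_a = {"software", "engineer", "chess", "captain"}
cv_b = {"software", "engineer", "women's", "chess", "captain"}
print(score(weights, cv_a) > score(weights, cv_b))  # prints True
```

Nothing in the code mentions gender explicitly; the penalty emerges purely from the historical outcomes the model was shown, which is exactly why this kind of bias is so hard to spot.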

Despite the company’s failed experiment, there is still keen interest in utilising AI in recruitment, and social media giants such as LinkedIn are now implementing it to a degree. LinkedIn offers employers algorithmic rankings of candidates based on their fit for job postings on its site.

John Jersin, vice-president of LinkedIn Talent Solutions, said the service is not a replacement for traditional recruiters.

“I certainly would not trust any AI system today to make a hiring decision on its own,” he is quoted as saying in the same article. “The technology is just not ready yet.”

We learned that Amazon has found another way of using its failed technology and is now utilising the software to filter duplicate candidate profiles from databases – a far cry from its aspiration of creating an independent recruitment tool devoid of human emotions and bias. But we would like to ask: are human emotions and bias a bad thing?

Recruitment is more than picking someone with the right skills on paper. The best qualified person may not be a great fit if they don’t suit the culture or share the company’s values.

Interviewers are often looking for a common spark of interest; people who not only have the right skills, but those who also fit in.

When Horwood Köhler begins looking for the right candidate, we start with the technical requirements and then drill down to really understand the sort of character you are looking for. We would argue that our process is far more robust than the ‘dating site’ approach that matches just your fundamentals. It was this ‘dating site’ approach that led to the failure of Amazon’s AI recruiter as it tried to make sense of the language we use to describe ourselves and our skills.

The bottom line is that the human brain is capable of so much more subtlety when considering the whole picture. So much so, that it simply can’t currently be replaced by a machine: we have opinions, observations and emotional intelligence as part of our toolkit, and no amount of analysing the make-up of the words on a bit of paper can beat that!
