The technical team at Amazon.com in Seattle, US, discovered that their new artificial intelligence-based recruiting tool, which was used to review resumes and job applications and shortlist the right talent, was showing a preference for men!
The Company had been trying out the new hiring tool to rate aspiring candidates and select the best. However, after a while it became obvious that the selections made by the tool were gender biased. This was because the computer vetted applications on the basis of patterns in the resumes received over the preceding decade. Since most of those resumes came from men, the engine learned that male candidates were the preferable choice, and was apparently downgrading resumes that included the word 'women's'.
Although the programs were reportedly edited to neutralise these terms, the Company first stopped relying on the tool alone for new recruits and eventually stopped using it for the evaluation of hires altogether. The programming team that worked on the tool was also disbanded.
This only goes to prove that even though machines and artificial intelligence can go a long way in making tasks easier for organisations and in saving time and costs, their decisions and results cannot be relied on blindly.