
Amazon’s AI recruitment tool dumped for being biased against women


Online giant Amazon has abandoned an algorithm that was being tested as a recruitment tool after it turned out to be sexist.

According to a Reuters report, the artificial intelligence system was trained on data submitted by applicants over a 10-year period, much of which came from men.

Members of the team working on the system told Reuters that the system effectively taught itself that male candidates were preferable.

Amazon has not responded to the claim, but Reuters spoke to five members of the team that developed the machine-learning tool in 2014, all of whom asked not to be named publicly.

According to them, the system was intended to review job applications and give candidates a score ranging from one to five stars.

“They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those,” said one of the engineers who spoke to Reuters.

The report, however, revealed that by 2015 it was clear the system was not rating candidates in a gender-neutral way, because it was built on data accumulated from CVs submitted to the firm mostly by men.


The system started to penalise CVs that included the word “women”. The program was edited to make it neutral to the term, but it became clear that the system could not be relied upon, Reuters was told.
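Neither Amazon nor Reuters has published details of the model, but the failure mode described, a scorer trained on historical hiring outcomes that learns to penalise a gendered word, is easy to reproduce in miniature. The sketch below is purely illustrative: the sample resumes, the labels, and the choice of a bag-of-words logistic regression are assumptions for demonstration, not Amazon’s actual system.

```python
# Hypothetical sketch (NOT Amazon's system): a toy bag-of-words resume
# scorer trained on skewed historical hiring outcomes. Because past
# "hired" labels correlate with male-coded resumes, the model learns a
# negative weight for the token "women" even though gender was never
# an explicit input field.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Synthetic training set: resumes mentioning "women" are
# under-represented among past hires.
resumes = [
    "software engineer java leadership",        # hired
    "machine learning python research",         # hired
    "software engineer women in tech mentor",   # rejected
    "women's chess club captain python",        # rejected
    "java backend developer leadership",        # hired
    "python data analysis women coders group",  # rejected
]
hired = [1, 1, 0, 0, 1, 0]  # skewed historical outcomes, not merit

# Turn each resume into word counts and fit a simple classifier.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Inspect the learned weight for the token "women": it comes out
# negative, so any resume containing the word is scored lower.
idx = vectorizer.vocabulary_["women"]
print("weight for 'women':", model.coef_[0][idx])
```

On this toy data the printed weight for “women” is negative, mirroring the reported behaviour: the model never sees a gender field, yet it marks down any CV containing the word. It also suggests why Amazon’s edit fell short, since deleting one token leaves other correlated signals in the data untouched.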

The project was abandoned, although the report claimed that it was used for a period by recruiters, who looked at the recommendations generated by the tool but never relied solely on them.

Amazon currently has a global workforce that is split 60:40 in favour of men.

The Amazon recruitment case is also not the first time doubts have been raised about the reliability of algorithms trained on potentially biased data.

In May last year, a report claimed that a computer program used by a US court to assess reoffending risk was biased against black people, flagging them as twice as likely to reoffend as white people.

 
