Summary: Robot recruiters: can bias be banished from AI hiring?

Highlights of the article in The Guardian by Jonathan Barrett and Stephanie Convery

An HR AI chatbot helps Australian companies screen and hire

Can AI technology help reduce implicit bias in recruiting?

  • The text-based system asks applicants five questions about past work situations, then sends a score and traits profile to the employer and a personality report to the applicant (a hypothetical sketch of this flow follows this list).

  • The Melbourne-based startup Sapia.ai created a demonstration of the structured interview process used by its clients.

  • A personality test is designed to help employers know whether someone is amiable or will turn up on time.

  • AI hiring can automate costly and time-consuming processes for businesses and government agencies, especially in large recruitment drives for non-managerial roles. Sapia's biggest claim might be that it is the only way to give someone a fair interview.
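
The first bullet above describes the process at a high level. The minimal Python sketch below models only that data flow: five questions, a score and traits profile sent to the employer, and a personality report sent back to the applicant. The question wording, trait names, and scoring logic in `assess` are invented placeholders, not Sapia.ai's proprietary questions or models.

```python
# Hypothetical sketch of the chat-interview data flow described above.
# Nothing here reflects Sapia.ai's actual questions, models, or scoring,
# which are proprietary; the values below are illustrative placeholders.
from dataclasses import dataclass

QUESTIONS = [  # five open questions about past work situations
    "Tell us about a time you dealt with a difficult customer.",
    "Describe a situation where you had to follow a strict process.",
    "How have you handled competing deadlines?",
    "Give an example of solving a problem with a team.",
    "What motivates you to turn up and do your best work?",
]

@dataclass
class InterviewResult:
    score: float               # overall score sent to the employer
    traits: dict[str, float]   # traits profile sent to the employer
    feedback: str              # short personality report sent to the applicant

def assess(responses: list[str]) -> InterviewResult:
    """Placeholder for the proprietary text analysis of the five answers."""
    traits = {"conscientiousness": 0.5, "agreeableness": 0.5}  # dummy ratings
    score = sum(traits.values()) / len(traits)
    feedback = "You come across as dependable and easy to work with."
    return InterviewResult(score=score, traits=traits, feedback=feedback)

result = assess(["..."] * len(QUESTIONS))       # five free-text answers
employer_view = (result.score, result.traits)   # what the employer receives
applicant_view = result.feedback                # what the applicant receives
```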

A patchy track record

  • In 2022, a quarter of Australian public sector agencies had used AI-assisted technology in recruitment. Applicants are often unaware that they will be subject to an automated process, or on what basis they will be assessed.

  • The commissioner's office cautions that AI may assess candidates on something other than merit, and may cause statistical bias.

  • Amazon quietly scrapped a candidate ranking tool that systematically downgraded women's CVs, and elevated those that used verbs more commonly found on male engineers' CVs.

  • Messenger-style apps rely on natural language processing models that are typically trained on people who speak standard English. If English is not your first language, the app might misread your responses.

  • Candidates may not know if they need reasonable adjustments in a chat or video interview.

  • Sheard says employers are legally required to make adjustments for disability in the hiring process, but applicants don't always disclose a disability upfront.

Australia has no laws specifically governing AI recruitment tools. The Department of Industry has developed an AI ethics framework, but it is voluntary and employers are reliant on vendors to provide oversight.

A question of diversity

  • Hyman says client feedback and independent research show that the broader community is comfortable with recruiters using AI. Her company's untimed, low-stress, text-based system attracts a more diverse pool of applicants.

  • The Diversity Council of Australia and Monash University recommend that recruiters be transparent about the due diligence protocols they have in place to ensure AI-supported recruitment tools are "bias-free, inclusive and accessible".

  • Sapia generates brief notes of personality feedback for the interviewee based on how they rate on various markers.

  • Sapia said its chat-interview software analyzed language proficiency and included a profanity detector; the data itself is proprietary (a toy illustration follows this list).
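
A second toy illustration, under the same caveat: the marker names, thresholds, feedback wording, and word list below are invented, since Sapia describes its analysis and data as proprietary. The sketch only shows the general shape of mapping marker ratings to brief feedback notes and running a naive profanity check.

```python
# Invented illustration only; not Sapia's markers, wording, or detector.
PROFANITY = {"damn", "heck"}  # placeholder word list

FEEDBACK_TEMPLATES = {  # (note if rating is high, note if rating is low)
    "conscientiousness": (
        "You describe your work in an organized, reliable way.",
        "You may prefer flexibility over fixed routines.",
    ),
    "agreeableness": (
        "You come across as easy to work with.",
        "You tend to be direct rather than accommodating.",
    ),
}

def contains_profanity(answer: str) -> bool:
    """Naive word-list check standing in for the proprietary detector."""
    return any(word in PROFANITY for word in answer.lower().split())

def feedback_notes(ratings: dict[str, float], threshold: float = 0.5) -> list[str]:
    """Turn each marker rating into a brief feedback note for the applicant."""
    notes = []
    for marker, rating in ratings.items():
        high, low = FEEDBACK_TEMPLATES[marker]
        notes.append(high if rating >= threshold else low)
    return notes
```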

You’re (not) hired!

  • The AI demonstration concludes that Guardian Australia's application is not a good fit for the receptionist role because the role requires repetition, routine, and following a defined process.
