NSW government agencies have been warned not to use AI to screen candidates or make hiring decisions, given the technology’s hallucinations, bias, and inaccuracy.
The NSW Office of the Public Service Commissioner has published its Use of AI in Recruitment guidance, which states that AI should not be used for “making recruitment-based decisions”.
“Agencies that choose to use AI during recruitment processes must ensure AI use is lawful, clearly and transparently documented, and that risks are appropriately identified and mitigated. Critically, AI must not replace humans in making decisions,” said the report.
“AI should not be used to filter out applications or make other decisions about candidates.”
The report also highlighted key issues that make AI unreliable when applied to potential job candidates.
Bias, whether in automation and AI outputs or built into the tool itself through its training or developer programming, could result in the wrong candidate being chosen.
Furthermore, AI may assess candidates in unreliable or invalid ways, may make decisions based on video input such as skin tone, facial hair, or head coverings, and could outright hallucinate, as the technology is known to do.
To prevent bias, AI tools should be developed inclusively, accounting for people with “diverse attributes” such as culturally diverse backgrounds, disability, and more. AI outputs should also be screened for bias and regularly audited for exclusion patterns.
AI datasets that include “names, experiences, qualifications and language patterns common to Aboriginal and Torres Strait Islander applicants, culturally diverse communities, and people from rural and regional areas” should also be built and used.
The Public Service Commission said that, as with any recruitment process, candidates should know why they were or were not hired, which requires human oversight.
“Agencies should also be able to clearly identify who made the decision, under what authority and how that decision was made,” it said.
Furthermore, candidates should be able to provide feedback to flag any issues they encountered with AI tools.
The report acknowledges that not all agencies will have their own AI tools and that some will be using third-party “off-the-shelf” AI, which presents additional concerns.
“While a product might work in one agency, it might create unforeseen risks in another,” the report added.
“When procuring an AI product, it is important to closely review the marketing and contract material to ensure it meets the [NSW AI Assessment Framework] requirements.
“This is particularly important given that suppliers may be reluctant to provide any information on the algorithmic decision making used by the AI product and also because the AI product may have only been tested in limited ways.”
Third-party AI in particular raises security concerns around sensitive candidate data, as information collected to assist the hiring process could potentially be used to train the AI. Agencies considering off-the-shelf AI products should ensure that the data the AI collects remains independent of any data used for training.
That being said, the NSW Office of the Public Service Commissioner acknowledges that AI can be a productive tool in the hiring process.
This could include using AI to “assist with development of recruitment documentation, assessment materials and communications”, or “identifying key themes from candidate responses” or interviewer notes, provided privacy requirements are met and the AI is screened for bias.