ChatGPT and AI generally are changing the HR and recruitment landscape. But is automation really the solution to everything in such a human-focused industry? With industry expert Matt Burney on board, we explore some of the challenges recruiters are facing if they want to adopt new AI technology, as well as what the future holds.

What ChatGPT is (and what it’s not)

ChatGPT is a language model, meaning recruiters can use it to create human-sounding text. But it’s important to note that it’s not a knowledge model, which means anyone who uses it will still need to check that the information it provides is correct. That’s especially important if you’re using it to write content about your employer brand.

This means that ChatGPT has strengths and weaknesses for recruiters and candidates alike. Natural language processing tools like ChatGPT are relatively new to recruiters in this capacity, but harnessed correctly, they might help rather than hinder your processes.

The recruitment landscape and GPT language models

For recruiters, there are plenty of ways in which a language model such as ChatGPT can be used to streamline processes. Take, for example, using a prompt to write a job description, advertisement or questions for an interview.

Recruiters can input a prompt like ‘create 10 questions for a competency-based interview’, and the AI tool will provide some text-based solutions. They can also use it to generate popular keywords for a recruitment description or ad, potentially targeting them more effectively at relevant candidates.
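As an illustration only – the prompt wording, model name and helper functions here are our own assumptions, not part of the article’s advice – a prompt like the one above could also be sent through the OpenAI Python SDK rather than the chat interface, making it easy to reuse across roles:

```python
def build_prompt(role: str, count: int = 10) -> str:
    """Assemble a reusable prompt for competency-based interview questions."""
    return (
        f"Create {count} questions for a competency-based interview "
        f"for a {role} role. Number each question."
    )


def generate_questions(role: str) -> str:
    """Send the prompt to a chat model and return the generated questions."""
    # Imported lazily so build_prompt() works even without the SDK installed.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice, not a recommendation
        messages=[{"role": "user", "content": build_prompt(role)}],
    )
    return response.choices[0].message.content
```

As the article stresses, the output is text, not verified knowledge: a human should still review every generated question before it reaches a candidate.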

With businesses competing more intensely in an increasingly globalised marketplace, it’s no surprise that recruiters are having to turn to new tools in order to fine-tune their processes. We've found that building a network of talent is one great way to hire within a smaller-than-expected talent pool, and it looks like AI recruitment might help you become more competitive here.

The development of these apps is helpful to recruiters, who are always trying to automate processes. In the case of using AI language tools for HR processes, it could save around 35% of your working week, according to Matt Burney, Senior Strategic Advisor at Indeed and AI expert. We're already finding that automation in general is saving organisations from being overwhelmed by growing HR data collections. It’s worth bearing in mind, however, that extensive automation might tempt recruiters and HR teams to try to do more with less.

So outside of employer brand and job advertisement writing, where else can a natural language processing model become useful to recruiters? Burney says that ChatGPT:

‘... can be used to handle a lot of unstructured data… A really good example is interviews. If you go through an interview, everyone does it differently. There are very few uniform ways to interview people.

'But if you use a language model to look at those interviews, to take notes and structure those notes and understand what worked and what didn’t work, you can really start to understand quickly who is good at interviewing, who is bad at interviewing, what different people look like, what different behaviours look like.’

However, there are some potential consequences that come with providing a third-party tool like ChatGPT with unstructured data. The main issue is that it’s not clear where this information goes once you’ve input it.

So if you’re inputting a company spreadsheet as a prompt into a third-party language model, you can’t be sure where that information will end up, or how it’ll be used in the future.

Potential compliance issues and ChatGPT

This leaves you open to potential data leaks and data privacy compliance issues. Burney advises: '[My] guidance to everybody is whenever you’re using a GPT model, you shouldn’t be putting any personal information into it. You shouldn’t be putting any company information into there. I’ve seen people say: “Oh, I've put a massive spreadsheet into one of these tools”. Now who owns that data?

'That’s data on your business, your need, your demand plan, your finances, all sorts of stuff. The risk is pretty obvious. And I think GDPR is probably the most obvious thing that you’re going to run into a compliance issue with.’

One of the clearest considerations, therefore, for companies whose staff are using language models is making sure that they are well-trained in compliance, data protection and GDPR. 

Take, for example, the well-known case of Samsung’s employees accidentally leaking source code data via ChatGPT, as well as meeting minutes – as reported in The Register. Whether ChatGPT is compliant with Australian data privacy laws is also being hotly debated in the legal and tech industries.

One possible solution is for companies to develop their own AI tools to manage unstructured data – a ‘walled garden’ approach to company data, so that it doesn’t leak out. Instead, all the data stays within the company's own database and isn't made available to a third party.

Other tips include never using information from personnel reviews, client or customer data, or any confidential company information as a ChatGPT prompt – and always keeping a human in the loop.
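To make that ‘no personal information’ rule a little more concrete, here’s a deliberately simple sketch – the patterns and placeholder labels are assumptions for this example, and it’s no substitute for proper compliance training – of scrubbing obvious identifiers from text before it’s ever pasted into a third-party tool:

```python
import re

# Illustrative patterns only: real PII detection needs far broader coverage
# (names, addresses, IDs) and should be reviewed by your compliance team.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}


def scrub(text: str) -> str:
    """Replace obvious personal identifiers with labelled placeholders."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label} removed]", text)
    return text
```

Even with a pre-check like this in place, the human-in-the-loop advice still applies: someone should review what’s being sent, not just rely on automated filtering.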

Candidates are using ChatGPT to create job applications: but what are the implications?

According to the Australian Financial Review, it’s already becoming difficult to tell the difference between a cover letter written by a person and one written by ChatGPT. But when such candidates reach the interview round, their application might not truly represent them and their communication skills – so they might not be successful at this stage.

While sympathetic to candidates who have been struggling to secure interviews without the help of an AI language model, Burney suggests that it might be more effective to see these tools as a ‘co-pilot’ rather than as a means to craft an entire application.

He advises that: ‘... it’s not replacing anything that you’re doing. It’s merely a tool to help you, and spot problems that you might not have seen… It can help you [with rewording], [adding in keywords] etc.’ 

From a recruiter’s perspective, it may also be wise to consider whether or not to filter out candidates who have used a tool like ChatGPT to write their application. This would screen out candidates whose applications don’t reflect their abilities before they reach the interview process – potentially saving time and money.

Recruiters might choose to strike a compromise with candidates instead. This could look like accepting applications which have been edited with the help of ChatGPT, but haven’t been fully written by the tool (and therefore demonstrate the candidate’s own communication skills).

Employer brand and ChatGPT

Another big ethical consideration for companies is that ChatGPT can paint a picture of your employer brand that isn’t truly representative of your company. This might be a problem during the recruitment process if you’re not giving potentially interested candidates a clear representation of who you are.

Burney suggests that some good considerations for employers to make are whether the AI copy says anything about who you are as a business, or whether you’re ‘just putting together buzzwords that will resonate with people to try and get them over the line'.

The future of AI in recruiting

So what do recruiters need to take on board for the future if they choose to automate their recruitment processes? Burney describes it as a ‘garbage in, garbage out’ problem, and emphasises that:

'If you think about it really, we can feed it the wrong thing and it’ll do a bad job; if we feed it the right thing, it will do a good job. But we also need to consider: when we’re doing it, why we’re doing it, and how we’re doing it.

'Because there is a very human element in recruitment. We need to avoid dialling out the human element, and make sure that when we create new tools or adopt new tools, that we’re doing that to go and automate the manual processes, to free up time and to allow people to go off and do the human piece of work.’

ChatGPT: it’s what recruiters make of it

While ChatGPT offers solutions to many of the problems recruiters face, the quality of an AI’s output very much depends on the information you feed it. Training staff to use it correctly – and in a way that’s compliant with data regulations – is key to making sure you get the most out of the tools becoming increasingly available.