Is AI a Powerful Tool or Dangerous Shortcut When It Comes to Recruiting Staff?

Let me start by saying that this isn’t another ‘what is AI?’ article. We’re all familiar with the benefits it offers, but as specialist recruiters, we are seeing some worrying developments too. Whether you are an employer or a candidate, you need to be aware of the potential damage AI could do, as well as the benefits it brings.

AI has rapidly become part of everyday business life. From marketing to customer service, many organisations are experimenting with AI tools to save time and streamline processes. For the most part, that is all to the good, and recruitment has not been immune to the trend. Used well, AI saves time, relieves teams of monotonous jobs and is, generally speaking, pretty useful technology.

While AI has clear benefits, its growing role in recruitment raises an important question: just how important is the human element in recruitment, one of the most vital processes in any successful business?

For employers looking to build strong finance teams and candidates hoping to present themselves effectively, relying too heavily on AI can create real problems, and when recruitment goes wrong, the financial and career costs can be huge.

The Rise of AI in the Hiring Process

Do a quick Google search, and you’ll see that AI is now being used at multiple stages of recruitment. Candidates use it to draft CVs and cover letters. Some companies use it to write job descriptions, job specs or adverts. Others are even experimenting with AI tools to screen candidates or conduct early-stage interviews. Agentic AI, which is designed to act on a set goal and autonomously make decisions, plan actions, and execute tasks with minimal human intervention, clearly has a lot of potential.

At first glance, this seems efficient. AI can quickly generate structured text, summarise job responsibilities, or filter large numbers of applications, all for a low cost.

As we all know, though, when something seems too good to be true, it usually is.

Take a step back, though, and you realise that when every stage of the process is influenced by AI, something strange happens: there is an uncomfortable shift away from the traditional human-led process. Now, instead of people handling each step:

  1. The candidate may use AI to generate a CV.
  2. The employer may use AI to create the job description.
  3. A recruitment platform may use AI to screen the applications.

If this happens, the first genuine human interaction only occurs somewhere around the interview stage, and by then any misunderstandings will already have crept into the process. For roles in finance and accountancy, where accuracy, detail and accountability are essential, this is far from ideal.

When AI Creates CVs That Don’t Reflect Reality

One of the most common issues we have seen is candidates using AI to write their CVs without fully checking what the technology produces.

For example, a candidate may ask AI to write a CV for a specific job role. They upload the text from the advertisement and their own job history, and back comes the CV for them to use. In theory, it should all be fine. It is only later, when that CV has been used to shortlist the candidate and you ask them about some relatively small but very specific skill, that you hear:

“Oh, sorry, I don’t have that experience. AI must have assumed I had because I have done a similar role, so it added it to my CV.”

This type of situation highlights a major risk. AI tools can generate an accurate description of a role, but they cannot verify whether an individual has actually performed those duties. Instead, they make assumptions based on the user’s background and the job specification, a behaviour known as ‘hallucination’: the AI decides that the candidate ‘should’ have a given skill, and so writes as if they ‘must’.

The result can be CVs that sound like a perfect match, but don’t reflect the candidate’s real suitability for the role.

This creates problems for everyone involved:

  • Candidates risk undermining their credibility in interviews
  • Employers may develop inaccurate expectations
  • Everyone spends additional time verifying information that should have been accurate from the start

In some sectors this may not be a big issue, but in finance recruitment, precision absolutely matters. A CV that looks polished but contains generic or inaccurate content could cause real problems. In an area like financial compliance, for example, an underqualified employee… well, I probably don’t need to spell out the potential for harm.

Generic Content That Fails to Stand Out

Another side effect of AI-generated content is how similar it can appear across different candidates and job adverts.

Again, a quick online search will show that these AI writing tools are large language models, trained on vast quantities of existing text. They essentially learn patterns of commonly used language and, as a result, tend to produce safe, structured, and often very, very generic content.

This can often lead to ‘soundbite’ CVs and job ads filled with phrases such as:

  • “Results-driven professional”
  • “Highly motivated team player”
  • “Strong attention to detail”

While these statements may sound impressive, they mean little without the right context. What you end up with is thousands of CVs that do little to differentiate one candidate from another.

The same issue can affect employers. If job descriptions and adverts are written entirely by AI, they may lack the insight and nuance that come from real experience within a specific sector.

We are not saying using AI to help with structure or content is a bad idea, but you cannot rely on it if you want your vacancy or CV to stand out.

This is where specialist recruitment expertise becomes invaluable. A well-written job specification, or CV advice grounded in years of industry knowledge, will always provide more clarity than a generic AI summary.

AI Interviews and the Ethical Questions They Raise

As I mentioned earlier, some organisations are now experimenting with AI-driven interviews or automated video assessments. These systems can apparently analyse things such as speech patterns, facial expressions, or word choice to evaluate candidates.

While this may appear efficient, it actually raises serious ethical questions.

I don’t claim to be an expert in AI, but there does seem to be a very worrying potential problem with AI interviewing.

AI systems are trained on historical data, and if that data contains hidden bias, the system may replicate or even amplify it.

For example, an AI system trained predominantly on successful candidates from a particular demographic group might unintentionally favour applicants who look or sound similar. This has apparently happened in past experiments with AI candidate selection.

Potential risks include bias related to:

  • Accent or regional speech patterns
  • Age or facial characteristics
  • Cultural communication styles
  • Neurodivergent behaviours

These are complex ethical challenges that will likely become more prominent as AI interviewing develops, so, for the foreseeable future, we recommend letting humans make the decisions in this area.

Recruitment Is Still a Human Process

For all its capabilities, AI cannot replicate some of the most important aspects of recruitment.

It cannot judge the impact a person makes when they walk into a room.
It cannot assess personality, motivation or cultural fit in the same way an experienced recruiter can.
And it certainly cannot replace the intuition, knowledge and experience that can only be developed through years of working within a specific industry.

In finance recruitment, technical capability must be balanced with softer factors such as reliability, cultural fit and personality, and these only become clear through conversation and interaction.

You cannot train a personality, and no algorithm can truly evaluate it.

AI Is a Tool, It’s Not a Replacement

None of this means AI has no place in recruitment; in fact, it is a very useful tool. Building a successful finance team requires more than matching keywords on a CV to a job description, though. It requires insight, experience and meaningful conversations. We have spent over two decades in financial recruitment, and the experience and understanding that brings are things technology simply cannot replicate.

In a world of ChatGPT-generated CVs, AI-written job descriptions, and automated screening tools, the risk is that recruitment becomes detached from the people and processes it is meant to serve. The reality though, is that businesses succeed because of people, and identifying the right people still depends on human expertise. That is the ‘people’ part of recruitment, and that is what makes good recruitment work.

Technology may continue to evolve, but when it comes to recruitment, for the foreseeable future, the human factor will still matter most.
