AI is revolutionizing recruiting, boosting productivity and slashing costs by up to 40%—a game-changer no organization can afford to ignore. But there’s a twist: amid all this automation, are we overlooking the most critical voice in the room—the candidate’s?

How are candidates really reacting to these tech-driven changes? Are they excited? Skeptical? Do they trust this wave of innovation reshaping how we hire? And more importantly, how can we ensure we earn that trust?

But here’s the kicker: every action sparks a reaction.

What strategies might candidates deploy to navigate or even counteract these automations? Could they redefine the rules of engagement in unexpected ways?

Let’s explore the hidden dynamics shaping the future of recruitment.

The answers might surprise you.


This article is a chapter of my eBook AI In Recruiting: Separating Hype from Reality.

Click here to download the whole eBook.


What Do Candidates Believe About AI Systems in Recruiting?

The most concrete research on the subject was conducted by Pew Research Center. I highly recommend you read it here.

Most Americans Are in the Dark About AI in Hiring—Here’s What They Really Think

Did you know that 61% of Americans have heard nothing about AI being used in hiring?

It’s no surprise: we’re still in the early days of this tech revolution. Yet the remaining 39% have at least some awareness, with only 7% tuned in closely.

But here’s where it gets interesting: while the majority strongly oppose letting AI make the final hiring decision, opinions are far more divided when it comes to using AI to review applications.

So, what’s driving this mix of skepticism and curiosity? And how might these perceptions shape the future of recruitment? Let’s unpack the story behind the stats.

Most oppose AI making the final hiring decision but are relatively OK with AI screening resumes

Can AI Truly Eliminate Biases in Hiring?

Many believe AI has the potential to treat all applicants equally, removing human biases from the equation.

Sounds promising, right? But here’s the catch: candidates aren’t entirely convinced.

They doubt AI can rival human judgment when it comes to spotting soft skills, assessing cultural fit, or recognizing untapped potential in applicants.

To them, AI feels like a “dry calculator”—precise but lacking the empathy needed to uncover the traits and growth potential a human interviewer might see.

So, is AI the fairness solution we’ve been waiting for, or does its lack of human intuition hold it back? Let’s explore.

Few believe that AI can spot soft skills better than a recruiter

Americans Aren’t Sold on AI in Hiring—Here’s Why

The Pew Research survey reveals a striking insight: most Americans aren’t ready to embrace an AI-driven hiring process.

In fact, about two-thirds (66%) say they wouldn’t apply for a job at a company using AI for hiring decisions, while only 32% are open to the idea.

It’s clear that AI adoption in recruitment is still in its infancy.

But here’s the surprising part: far from boosting brand appeal, using AI for hiring might actually harm a company’s reputation.

So, what’s fueling this resistance, and what can companies do to win over skeptical candidates? Let’s dig deeper.


Why Do Candidates Avoid AI-Driven Hiring?

The biggest turnoff for candidates when it comes to AI in recruitment? The absence of a human touch.

Many feel that an AI-driven process lacks the personal factors that make hiring feel fair and empathetic.

On the flip side, candidates who don’t mind AI in hiring see its potential to reduce bias and level the playing field.

So, is the divide about trust, fairness, or something deeper? And how can companies bridge the gap to make AI work for everyone? Let’s unravel the mystery.

Let’s look at some individual responses and opinions from candidates regarding the use of AI in recruitment.

Negative Responses About AI in Recruiting

Some express concern about AI’s inability to make human-like judgments or to see “intangibles” that they consider important to hiring:

“AI can’t factor in the unquantifiable intangibles that make someone a good coworker … or a bad co-worker. Personality traits like patience, compassion, and kindness would be overlooked or undervalued.”

Without humans in the hiring mix, people fear the process would become impersonal and that the lack of person-to-person interaction would be detrimental to both the employer and the prospective employee. Some discussed these concerns generally, while others noted that certain fields require qualities AI cannot see:

“That takes all the personalization out of it. I wouldn’t want to make a decision whether or not to join a company without being personally selected and without meeting my potential employer directly, and without them meeting me to see if I would be a good fit for their employees.” Woman, 30s

“I work as a bartender. My job requires me to be social, current on social and timely topics. I also need to multitask at times, and get along as a team player. I’m not sure AI will see those attributes.” Woman, 50s

Another 10% of people who say they would not want to apply describe concerns that the design of AI could be flawed – for some, it is too focused on keywords or absolutes, screening people out unnecessarily:

“[To AI] … I’m not a person, just a series of keywords and if I don’t fit the exact hiring model I’m immediately discarded. Hiring manager doesn’t care, they don’t actually read anything.” Man, 40s

Others in this group discuss more fundamental problems with AI’s design or the data it uses.

“It’s a ‘garbage in, garbage out’ problem. AI in itself could be useful, but in general the parameters that it’s given are poor. There has always been a gap between the Human Resources personnel and the supervisor or team who know the actual needs, and that is exaggerated with AI. Who do you think programs the AI?” Woman, 50s

And another 3% specifically mention design flaws in AI systems that could lead to bias, unfair treatment or discrimination:

“AIs are typically trained on real-world data which can be (and often is) inherently and systemically biased to favor privileged groups. Use of AI for decision-making perpetuates the biases we have in human decision making. Hiring is an area where biased decision making is especially dangerous for our society.” Man, 20s

Small shares of those who would not want to apply also express a more general wariness, saying they do not trust AI or feel comfortable with technology (5%); worry they would not fare well if AI were used (4%); or feel too “in the dark” about what AI is or what it can do (3%).

Positive Responses About AI in Recruiting

Turning to the 32% of Americans who say they would want to apply for a job like this, the most common reason relates to the prospect that AI could be objective, fair, have little to no bias, or treat everyone equally.

Some 28% of those open to applying mention one of these factors as the primary reason:

“If the AI were properly informed, it could remove/minimize any personal bias of the human who would otherwise be making hiring decisions.” Woman, 70s

Another 14% of those open to applying for a job with AI in the hiring decision process say that fact would not stop them from applying or simply does not matter to them.

“If I was looking to change jobs, I would apply to potential employers because of the quality of their culture and how the job that is being offered matched my goals and skill sets, and much less how AI is used in the selection process.” Man, 60s

“Because I need a job if I am applying. It’s not like I have much of a choice.” Woman, 20s

About one-in-ten (9%) of those open to applying argue that AI would be thorough and accurate – possibly more so than humans:

“I think the AI would be able to evaluate all my skills and experience in their entirety where a human may focus just on what the job requires. The AI would see beyond the present and see my potential over time.” Man, 50s

Still, 4% of this group say humans should still be involved at some level. And small shares also note positives like AI giving them personally a leg up, being curious to try it or making hiring efficient (4% each).

“I have been part of a company’s hiring process in the past, and having to sort through thousands of applications was time-consuming and tedious. Using AI to streamline that process sounds like a good advancement.” Woman, 20s

A majority of candidates say racial and ethnic bias in hiring is a problem, and about half of them say increased use of AI would help ease those issues


Another great piece on the applicants’ perspective on recruiting with AI is a study published in ScienceDirect, which is also a recommended read.

Applicants’ perception of artificial intelligence in the recruitment process

This groundbreaking study explores a rarely addressed topic: how job applicants perceive AI-enabled recruitment tools. With limited research in this area, the paper fills an important gap by surveying 552 job seekers from diverse nationalities and industries, analyzing their experiences through the lens of the Technology Acceptance Model (TAM).

The findings? Surprisingly positive. Most candidates view AI tools as both useful and easy to use, aligning with TAM’s core variables: Perceived Usefulness (PU) and Perceived Ease of Use (PEoU). Here are the standout stats:

  • 38% of participants find AI tools helpful in recruitment.
  • 63% report that these tools are easy or very easy to use.

Candidates linked their positive perceptions to several benefits: faster response times, efficient application handling, and an improved overall experience. Many noted that AI tools give them more time to prepare and reduce the stress that often accompanies traditional interviews.

This study highlights a key takeaway: when AI tools simplify and enhance the application process, job seekers welcome their inclusion. Could this signal a turning point in how we approach recruitment?

But there is one point on which this research agrees with the Pew Research study.

The Hidden Concerns About AI in Recruitment: What Candidates Want You to Know

While AI tools bring efficiency to recruitment, nearly 70% of job seekers feel these tools lack the human touch, making the process seem impersonal. But that’s just the beginning of the story.

Here’s what else candidates flagged:

  • Low accuracy and reliability: Many believe AI tools overlook unique circumstances and fail to provide tailored responses.
  • Immature technology: Issues like biases in algorithms and poor text or speech recognition remain major concerns.
  • Transparency troubles: A significant number of participants feel uneasy about not knowing how AI evaluates their applications or the criteria used in decision-making.

Although ethical, privacy, and legal concerns were highlighted by fewer respondents (less than 40%), the fear of bias and opaque decision-making stood out as top issues.

The takeaway? Candidates are calling for more transparent and explainable AI systems. If companies want to build trust, addressing these concerns could be the key to the future of AI-driven hiring.


In the meantime, some candidates feel the need to respond to the disconnect between their hunt for a dream job and recruiters who lean heavily on AI tools without adding value for candidates.

The AI Arms Race in Recruitment: Candidates vs. Employers

As AI transforms recruitment, candidates are arming themselves with their own tech tools to level the playing field. Here’s how they’re countering AI-driven hiring processes:

  1. CV Optimization Tools: These refine resumes to highlight the most relevant skills and experiences based on specific job descriptions (see the sketch after this list).
  2. Automated Job Applications: Tools that scrape job boards and auto-apply to positions matching a candidate’s criteria are on the rise.
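
To make the first category concrete, here is a minimal, hypothetical sketch of the kind of keyword matching many CV optimization tools boil down to. The function names, stopword list, and scoring below are illustrative assumptions, not any specific vendor’s implementation.

```python
import re

# Hypothetical illustration: score how well a CV covers the keywords of a
# job description. Real tools are more sophisticated, but many reduce to
# some form of term overlap like this.

STOPWORDS = {"and", "or", "the", "a", "an", "with", "for", "of", "to", "in"}

def extract_keywords(text: str) -> set[str]:
    """Lowercase the text, keep word-like tokens, drop short words and stopwords."""
    tokens = re.findall(r"[a-zA-Z+#]+", text.lower())
    return {t for t in tokens if t not in STOPWORDS and len(t) > 2}

def coverage_score(cv_text: str, job_description: str) -> tuple[float, set[str]]:
    """Return the share of job-description keywords present in the CV,
    plus the keywords that are still missing."""
    job_keywords = extract_keywords(job_description)
    cv_keywords = extract_keywords(cv_text)
    if not job_keywords:
        return 0.0, set()
    missing = job_keywords - cv_keywords
    return 1 - len(missing) / len(job_keywords), missing

if __name__ == "__main__":
    job = ("Senior recruiter with experience in sourcing, ATS tools "
           "and stakeholder management")
    cv = "Recruiter experienced in sourcing and stakeholder management across tech teams"
    score, missing = coverage_score(cv, job)
    print(f"Keyword coverage: {score:.0%}, missing keywords: {missing}")
```

A candidate would run something like this against each posting and rework their bullet points until the missing keywords disappear, which is exactly why recruiters end up seeing near-identical CVs tuned slightly differently for every opening.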

The result? Candidates are streamlining their job hunt, but recruiters are noticing a surge in bot-like applications—multiple CVs from the same candidate for different job positions with minor tweaks.

This trend is sparking an AI showdown: candidate tools designed to beat recruiting algorithms versus hiring AIs aiming to efficiently and transparently screen applicants. In just a year or two, job applications per opening could skyrocket due to automation.

But this evolution isn’t necessarily a bad thing. An optimist might conclude that this showdown could lead to a better solution overall: an intelligent cloud for the job market.

Imagine an “intelligence cloud,” where personal data is stored and enhanced with AI, much like the cloud transformed data storage. Could this be the next big leap in hiring technology?

An intelligence cloud could help candidates land their dream job with insights and feedback, and give recruiters more time for the value-adding human touch.

The push to do more with fewer resources has strained recruiters. The human touch has become the premium. We need to develop tools that bring it back for both candidates and recruiters.

The Future of Hiring: An Intelligent Marketplace Driven by AI

Imagine a world where candidates store their CVs, skills, and preferences in an “intelligence cloud,” ready to connect with employers who post matching opportunities. Applications happen automatically, and algorithms handle the initial stages—streamlining the process for both sides.
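
To ground the idea, here is a deliberately simplified, hypothetical sketch of the matching step such a marketplace could run. The profile fields, score, and shortlist threshold are assumptions made for illustration, not a description of any existing platform.

```python
from dataclasses import dataclass

# Hypothetical sketch of the matching step inside an "intelligence cloud".
# Profile fields, the scoring rule, and the shortlist threshold are
# illustrative assumptions, not a specification of any existing platform.

@dataclass
class CandidateProfile:
    name: str
    skills: set[str]
    preferred_locations: set[str]
    min_salary: int

@dataclass
class JobPosting:
    title: str
    required_skills: set[str]
    location: str
    salary: int

def match_score(candidate: CandidateProfile, job: JobPosting) -> float:
    """Skill overlap drives the score; location and salary act as hard gates."""
    if not job.required_skills:
        return 0.0
    skill_fit = len(candidate.skills & job.required_skills) / len(job.required_skills)
    location_ok = job.location in candidate.preferred_locations
    salary_ok = job.salary >= candidate.min_salary
    return skill_fit if (location_ok and salary_ok) else 0.0

def auto_shortlist(candidates: list[CandidateProfile], job: JobPosting,
                   threshold: float = 0.6) -> list[tuple[str, float]]:
    """Return candidates above the threshold, best match first, so recruiters
    spend their time on conversations instead of first-pass screening."""
    scored = [(c.name, match_score(c, job)) for c in candidates]
    return sorted((s for s in scored if s[1] >= threshold),
                  key=lambda s: s[1], reverse=True)

if __name__ == "__main__":
    job = JobPosting("Data Analyst", {"sql", "python", "dashboards"}, "Berlin", 60000)
    pool = [
        CandidateProfile("A", {"sql", "python", "excel"}, {"Berlin"}, 55000),
        CandidateProfile("B", {"sql"}, {"Remote"}, 50000),
    ]
    print(auto_shortlist(pool, job))  # e.g. [('A', 0.666...)]; B is filtered out
```

The design choice worth noting: hard preferences such as location and salary act as gates, while skill overlap drives the ranking, so the algorithm only hands recruiters candidates who are genuinely in play.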

Here’s the game-changer: this frictionless system doesn’t replace the human touch—it enhances it. With mundane tasks out of the way, applicants and recruiters gain more time for meaningful interactions.

This intelligent marketplace isn’t just a dream—it’s possible with today’s AI technology. By transparently combining tools for both candidates and recruiters, we can create a system that’s efficient, fair, and human-centric.


Hope you enjoyed this chapter of my eBook AI in Recruiting: Separating Hype from Reality. Feel free to download the whole book here.


In a following article, which is also a chapter of this eBook, we’ll explore the critical role of regulations, ethical considerations, and Diversity, Equity, and Inclusion (DEI) parameters in shaping this AI-driven future.

The road ahead is exciting!


Help Us Shape the Future of Recruitment – We Need Your Input!

We’re building an innovative AI platform designed to give recruiters more time to focus on what truly matters—the human connection with candidates. It’s not just about efficiency; it’s about bringing the personal touch back to recruitment.

And here’s where you come in:

We’re looking for beta testers to help us refine this platform and make sure it delivers real value. It’s completely free, no strings attached—just your honest feedback.

If you’re interested in transforming the way you recruit and being part of something exciting, we’d love to have you on board!

Reach out to us:

📩 Email: alex@manxmachina.com

💼 LinkedIn: Message me directly here

Let’s make recruitment better together! 🚀


REFERENCES

https://www.pewresearch.org/internet/2023/04/20/americans-views-on-use-of-ai-in-hiring
https://www.sciencedirect.com/science/article/pii/S2451958823000362
https://www.forbes.com/sites/benjaminlaker/2023/07/07/the-dark-side-of-ai-recruiting-depersonalization-and-its-consequences-on-the-modern-job-market
https://www.theguardian.com/us-news/2022/may/11/artitifical-intelligence-job-applications-screen-robot-recruiters
https://www.tidio.com/blog/ai-recruitment
https://www.staffingindustry.com/editorial/it-staffing-report/ai-adoption-in-recruitment-soars-report-says
