AI is creating trust issues for recruiters, hiring managers, and job-seeking tech professionals, according to The Trust Gap in Tech Hiring 2025 report from Dice. The survey found that 80% of tech professionals trust a fully human-driven hiring process, while only 46% trust hybrid approaches that combine AI and human review, and just 14% said they’d trust fully AI-driven processes. Nearly half also said they’d choose to opt out of AI résumé screening if given the choice.
Main concerns from respondents include worries that AI screening tools will favor keywords over qualifications (63%), fears that qualified candidates will be rejected due to narrow criteria (63%), and a belief that a human will never see their résumé (56%).
Distrust is further amplified by a challenging tech hiring market that follows years of layoffs, economic instability, and uncertainty, as well as new concerns around job displacement due to AI. Candidates have a lower frustration tolerance for minor process issues compared to when it was easier to find a tech role, according to the Dice report. However, despite this rise in distrust, the 2025 AI in Hiring Report from Insight Global found that a staggering 99% of hiring managers report using AI in the hiring process, with 98% saying they saw significant improvements in hiring efficiency using AI.
“AI tends to blur the line between confidence and embellishment,” says Dice president Paul Farnsworth. “When candidates feel they have to beat the algorithm, it shifts focus away from real skills and experience.”
This results in a lot of candidates who look the part on paper but may not actually be the right fit, he adds. And an influx of candidates using AI to hone their résumés means more applications to sift through.
“It can also create longer hiring cycles, which means more time spent filtering through inflated résumés and more interviews that don’t lead to the right hire,” Farnsworth says. “Over time, that erodes trust in the system from both sides.”
How AI is used in hiring
In tech hiring, AI is currently most used for scanning responses, ranking candidates, and automating communication or scheduling steps. “It’s helping recruiters manage volume and improve response speed, and that’s a clear win for efficiency,” says Sara Gutierrez, chief science officer at SHL, a global talent assessment company.
AI’s greatest strength in tech hiring, though, is automating certain processes to free up recruiters and hiring managers to focus on interviews and on identifying qualified candidates. Companies need a clear strategy for where AI will be implemented and where humans will lead the charge. It’s still important to demonstrate transparency around AI in hiring so it doesn’t override the human side.
“The challenge comes when AI decisions are built on data that were never meant to indicate job success, like résumé phrasing, education keywords, or past job titles,” says Gutierrez.
As Farnsworth points out, while AI can help streamline the application process and spot patterns across candidates that might have been missed otherwise, there’s always the chance that AI will also filter out great people for the wrong reasons. For example, if the AI system leans too hard on keywords or rigid templates, he adds, you risk missing out on potential talent.
“We’re seeing AI pop up in a bunch of different ways like job ad targeting and even early-stage interviews, and that can be helpful, but only if it’s used responsibly,” says Farnsworth. “The overall goal is to give hiring teams better signals so they can make smarter decisions. When AI is treated like a tool, not a gatekeeper, that’s when it adds real value.”
Lack of authenticity and individuality
Dice also found that 78% of candidates feel they need to embellish qualifications to get noticed, while 65% say they’ve modified their résumés using AI to improve their chances of being seen. This raises further concerns. “When candidates feel they have to exaggerate just to stay competitive, it chips away at authenticity and trust,” says Farnsworth.
It’s getting to the point where it’s becoming the hiring version of an arms race, says Gutierrez. Candidates are using AI to tailor résumés for algorithms, and employers are using AI to scan them, so the resulting noise makes it harder to distinguish genuine capability from AI-optimized presentation.
The more candidates use AI to structure their résumés, the more the résumés start to look alike, which only makes it more difficult to sift through applications to find the best fit. AI might polish résumés, but it often doesn’t reflect the person’s real experience, says Farnsworth. “It’s hard to maintain your professional brand and voice if you’re running your résumé through an AI chatbot that’ll potentially strip your personality from the final draft,” he says. “Ideally, hiring should be about who’s genuinely the best fit, not who wrote the most machine-friendly bullet points.”
Gutierrez adds that the future doesn’t require choosing between AI and authenticity, but rather finding the balance where AI handles the administrative burden, freeing humans to focus on connection and context. It’s important to establish exactly how AI will support recruiters, hiring managers, and candidates, as a well-defined strategy will build trust on both sides of the hiring process.
Leading with AI transparency
Transparency is the key factor when implementing AI in the hiring process if companies want to preserve trust. Candidates need to understand how and where AI is used, and how it functions as an assistant. There are several steps companies can take to reassure candidates that they’re being evaluated by humans, not overlooked by an algorithm.
“That kind of openness not only eases candidate anxiety, but reinforces that the tech is there to enhance the process, not replace fairness or human judgment,” says Farnsworth.
Companies should focus on transparency around how AI is used in hiring, and according to Dice, this can include:
- Assuring candidates there’s a human review for applications.
- Offering secondary human review options for AI rejections with regular audits of AI decisions.
- Establishing recruiter accountability metrics and mandatory response timelines.
- Using AI to identify potential candidates rather than eliminate them.
- Introducing match scoring to show a fit percentage for candidates.
- Keeping AI focused on admin and humans focused on high-level thinking.
- Confirming with candidates when applications are received and reviewed by humans, and notifying candidates when a position is filled.
- Offering specific feedback to rejected candidates and steering clear of generic form responses.
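The match-scoring item above can be illustrated with a minimal sketch: a fit percentage computed from the overlap between a candidate’s listed skills and a role’s required skills. This is a hypothetical example for illustration only; the function name and skill lists are invented, and no vendor’s actual scoring model is this simple.

```python
def match_score(candidate_skills: list[str], required_skills: list[str]) -> float:
    """Hypothetical fit percentage: share of required skills the candidate lists.

    Case-insensitive exact matching; a real system would also handle
    synonyms, related skills, and experience levels.
    """
    if not required_skills:
        return 100.0  # nothing required, trivially a full match
    required = {s.lower() for s in required_skills}
    candidate = {s.lower() for s in candidate_skills}
    return round(100 * len(required & candidate) / len(required), 1)


# A candidate covering 3 of 4 required skills scores 75.0
score = match_score(
    ["Python", "SQL", "Docker", "Terraform"],
    ["python", "sql", "docker", "kubernetes"],
)
print(f"Fit: {score}%")
```

Surfacing a score like this, rather than a silent binary rejection, supports the list’s broader point: candidates can see why they matched or didn’t, and recruiters can use the score to identify potential candidates instead of eliminating them.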
The fears candidates express are cause for alarm: that AI will favor keywords over qualifications, that they’ll be rejected for not fitting narrow criteria, and that a human will never even see their résumé. Dice found these fears have impacted tech workers to the point that 30% of respondents say they’re considering leaving the industry due to hiring frustrations, while 24% say they’re currently committed but growing frustrated.
“That stat should be a wake-up call,” says Farnsworth. “If people feel like they’re shouting into the void, they won’t stick around. Companies need to make sure the hiring experience doesn’t feel like a black box.”

