
Neurodivergence and the Rise of AI in Hiring: What HR Needs to Know

As AI recruitment tools become more common, from CV screeners to video interview platforms, employers are celebrating faster, more “objective” hiring decisions. But there’s a growing problem. Many of these tools are excluding the very people you’re trying to include.

For neurodivergent candidates, automated systems can create invisible barriers. And if your hiring tech isn’t built or reviewed with neuroinclusion in mind, you might be losing out on brilliant talent without ever knowing it.

In this blog, we’ll explore how AI is reshaping hiring, why it often works against neurodivergent applicants, and what HR teams can do to use these tools ethically and inclusively.

AI in Hiring: What Are We Talking About?

Let’s start with what AI hiring actually means. These tools are designed to save time, reduce human bias, and streamline decision-making. Common examples include:

- CV and application screeners that filter candidates based on keywords, grammar, or career history
- Video interview platforms that analyse facial expression, eye contact, and tone of voice
- Timed online assessments and automated tests

They promise speed, consistency, and fairness, but when it comes to neurodivergent applicants, they can unintentionally do the opposite.

How AI Can Disadvantage Neurodivergent Candidates

Most AI tools are built using datasets that reflect typical behaviour, speech patterns, and communication styles. The problem is that neurodivergent candidates often don’t fit these norms, and AI isn’t good at recognising different as equally capable.

Here’s where things go wrong:

1. Scoring based on eye contact or facial expression
Autistic candidates might avoid eye contact or show flat affect due to social or sensory processing differences. An AI system might interpret this as lack of enthusiasm or dishonesty, even when the candidate is fully engaged and competent.

2. Filtering based on grammar, spelling, or tone
A dyslexic applicant might make minor spelling mistakes. Someone with ADHD might write in a less structured way. But if your system penalises these factors without understanding context, it’s screening out talent based on neurotype, not ability.

3. Time-limited assessments
Many neurodivergent people need extra time to process instructions or formulate responses. Timed tasks can cause unnecessary stress and disadvantage candidates who think differently but work brilliantly.

4. Penalising gaps or career shifts
Neurodivergent candidates may have taken time off for mental health or worked in non-linear roles. AI systems that reward neat, conventional career paths may overlook great people with diverse life experience.

Legal and Ethical Risks

Using AI in hiring doesn’t remove your legal obligations. In fact, it may increase your risk if the tech results in discrimination.

Under laws like the UK’s Equality Act 2010, employers have a duty to make reasonable adjustments for disabled applicants, including those with neurodivergent conditions. If your hiring system indirectly disadvantages these groups, you could be in breach, even if it’s an algorithm making the decision.

That’s why the UK’s Equality and Human Rights Commission (EHRC) and the Information Commissioner’s Office (ICO) have warned that automated recruitment tools must be used carefully, with regular review and human oversight.

What HR and Talent Teams Can Do

The solution isn’t to stop using AI; it’s to use it responsibly. Here’s how to start.

1. Audit your tools for bias
Ask your vendors direct questions:

- What data was the model trained on, and did it include neurodivergent people?
- Has the tool been tested for adverse impact on disabled candidates?
- Can individual scoring criteria, such as eye contact or tone, be switched off?
- How often is the tool reviewed for bias, and can you see the results?
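You don’t have to rely solely on vendor answers: if you can get anonymised pass/fail counts out of a screening stage, you can run a simple adverse-impact check yourself. The sketch below uses the widely cited “four-fifths rule” (a US EEOC guideline, often used informally elsewhere as a red-flag threshold); all the counts in it are hypothetical placeholders, and the function names are ours, not part of any vendor’s API.

```python
# Minimal adverse-impact check for an automated screening stage.
# The "four-fifths rule" flags a stage when one group's selection rate
# falls below 80% of the highest group's rate.
# All counts below are hypothetical; substitute your own anonymised data.

def selection_rate(passed: int, applied: int) -> float:
    """Fraction of applicants who passed this stage."""
    return passed / applied

def adverse_impact_ratios(groups: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Ratio of each group's selection rate to the highest group's rate."""
    rates = {name: selection_rate(p, a) for name, (p, a) in groups.items()}
    best = max(rates.values())
    return {name: rate / best for name, rate in rates.items()}

# Hypothetical counts: (passed, applied) per self-identified group.
groups = {
    "disclosed_neurodivergent": (12, 60),   # 20% pass rate
    "not_disclosed": (90, 200),             # 45% pass rate
}

for name, ratio in adverse_impact_ratios(groups).items():
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{name}: impact ratio {ratio:.2f} ({flag})")
```

A ratio well below 0.8 doesn’t prove discrimination on its own, but it is exactly the kind of signal that should trigger the human review described below.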

2. Always offer reasonable adjustments
No AI system is one-size-fits-all. Make it easy for candidates to request adjustments, such as more time, alternative formats, or bypassing automated steps altogether. And include this clearly in job ads and on your careers page.

3. Build in human checks
Even if AI narrows down candidates, ensure humans review decisions before rejecting applicants. Sometimes, the best fit is the person the algorithm didn’t understand.

4. Avoid assessments that judge social presentation
Eye contact, tone, smiling, or “personality fit” can all be shaped by neurotype, culture, and mood. Don’t let algorithms penalise someone for not performing a scripted version of enthusiasm.

5. Collect feedback from candidates
Ask neurodivergent applicants how they found the process. If certain tools consistently lead to dropouts or complaints, it’s time to make changes.
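Feedback works best alongside hard numbers. A simple way to spot a problem stage is to compare drop-off rates through the hiring funnel between candidates who requested adjustments and those who didn’t. This is a minimal sketch with hypothetical counts and made-up stage names; the point is the comparison, not the specific figures.

```python
# Sketch of a stage-by-stage drop-off check for the hiring funnel.
# If candidates who requested adjustments abandon one automated stage far
# more often than others, that stage deserves scrutiny.
# All numbers are hypothetical placeholders.

stages = ["application", "online_test", "video_interview"]

# Candidates remaining at the start of each stage and at offer,
# split by whether they requested adjustments (hypothetical counts).
remaining = {
    "requested_adjustments": [40, 35, 18, 6],
    "no_adjustments":        [400, 360, 300, 90],
}

def drop_off(counts: list[int]) -> list[float]:
    """Fraction of candidates lost between consecutive stages."""
    return [1 - after / before for before, after in zip(counts, counts[1:])]

for group, counts in remaining.items():
    for stage, loss in zip(stages, drop_off(counts)):
        print(f"{group}: {loss:.0%} dropped after {stage}")
```

If one group consistently loses a much larger share at the same automated stage, that tool, not the candidates, is the likeliest culprit.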

A Better Approach to Neuroinclusive Tech

Some organisations are already shifting towards more inclusive AI use. Here’s what it looks like in practice:

- AI is used to shortlist and sort, while humans make the final decisions
- Facial expression, eye contact, and tone analysis are switched off or avoided entirely
- Adjustments are offered up front, not only on request
- Candidate feedback is collected regularly and acted on

In other words, the tech doesn’t make the decisions. It supports them.

Final Thoughts

AI can help make hiring faster, but if it isn’t inclusive, it’s not fair. Neurodivergent candidates already face barriers that aren’t always visible to neurotypical teams. Introducing automated systems without scrutiny can amplify those barriers and push great talent away.

Being neuroinclusive in your hiring doesn’t mean abandoning efficiency. It means embedding equity into your systems from the start.

Start by asking better questions, offering real adjustments, and being willing to adapt when the tech gets it wrong.

Because great candidates don’t always look how the algorithm expects.

Want to know if your hiring process is neuroinclusive?
We offer recruitment audits, neurodiversity training for managers and teams, and strategic guidance to help you build inclusive hiring systems, with or without AI. Learn more at enna.org.
