
GDPR-Compliant CV Screening: What UK Employers Need to Know

A practical guide to using AI CV screening tools while staying compliant with UK GDPR. Covers lawful basis, automated decision-making rules, data retention, and candidate rights.

24 February 2026 · Updated 24 February 2026 · 11 min read · Dan Vernon, Founder at Marxel

GDPR-compliant CV screening means using AI tools to evaluate candidates while meeting UK data protection requirements, including having a lawful basis for processing, providing transparency about automated decisions, and respecting candidates' rights to explanation and human review. Under the UK GDPR and Data Protection Act 2018, employers must handle candidate data with the same care as any other personal data processing.

This guide covers exactly what UK employers need to know to screen CVs with AI tools without creating compliance risk.

Why GDPR Matters for CV Screening

CV screening involves processing personal data at scale. A single CV contains names, addresses, employment history, education, and sometimes protected characteristics like age or nationality. When AI tools process this data automatically, several GDPR provisions are triggered.

The ICO (Information Commissioner's Office) has been increasingly active in the employment and AI space. In 2024, the ICO published updated guidance on AI and data protection, emphasising that organisations using AI for decisions about people must be transparent about how those decisions are made.

Key statistics:

  • The ICO issued £15.2 million in fines during 2023-2024 for data protection violations (ICO Annual Report 2023-24)
  • 67% of UK employers now use some form of automated screening technology (CIPD Resourcing and Talent Planning Survey 2024)
  • Only 34% of UK organisations have conducted a Data Protection Impact Assessment for their recruitment AI (Recruitment & Employment Confederation, 2025)

The Legal Framework

Three pieces of legislation govern AI CV screening in the UK:

  • UK GDPR: lawful basis for processing, data minimisation, transparency, rights of data subjects
  • Data Protection Act 2018: UK-specific provisions, including employment exemptions and safeguards for automated decisions
  • UK Equality Act 2010: screening criteria must not directly or indirectly discriminate on protected characteristics

Lawful Basis for Processing

You need a lawful basis under UK GDPR Article 6 to process CV data. Three bases are most relevant to recruitment:

Legitimate interest (Article 6(1)(f)): Most common for initial screening. You have a legitimate interest in efficiently evaluating candidates, and processing is necessary and proportionate to that interest. You must document this via a Legitimate Interest Assessment (LIA).

Consent (Article 6(1)(a)): Sometimes used but problematic in recruitment. The power imbalance between employer and candidate means consent may not be "freely given." The ICO has cautioned against relying solely on consent in employment contexts.

Contract (Article 6(1)(b)): Applicable when processing is necessary to take steps at the candidate's request before entering a contract (i.e., they applied for the job).

For most organisations, a combination of legitimate interest and pre-contractual steps provides the strongest legal footing.

Automated Decision-Making: Article 22

This is the provision most relevant to AI CV screening. UK GDPR Article 22 states that data subjects have the right not to be subject to decisions based solely on automated processing that produce legal or similarly significant effects.

What This Means in Practice

If your AI screening tool automatically rejects candidates without any human involvement, Article 22 applies. You must either:

  1. Ensure meaningful human oversight: a human reviews the AI's recommendations before any rejection decision is made
  2. Obtain explicit consent: the candidate explicitly agrees to automated decision-making
  3. Demonstrate necessity: the automated decision is necessary for entering into or performing a contract

The practical solution: Most organisations satisfy Article 22 by maintaining human oversight. AI screens and recommends; a human makes the final decision. This is both legally sound and better practice: AI should augment human judgment, not replace it.
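The "AI recommends, human decides" pattern can be sketched in code. This is a minimal illustration, not any particular tool's API; the `ScreeningResult` record and function names are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ScreeningResult:
    candidate_id: str
    ai_recommendation: str            # e.g. "advance" or "reject" -- advisory only
    human_decision: Optional[str] = None
    reviewer: Optional[str] = None

def finalise(result: ScreeningResult, decision: str, reviewer: str) -> ScreeningResult:
    """Record the human decision; only this step can finalise an outcome."""
    result.human_decision = decision
    result.reviewer = reviewer
    return result

def is_rejected(result: ScreeningResult) -> bool:
    # A candidate counts as rejected only once a named human has confirmed it;
    # the AI recommendation alone never constitutes a decision.
    return result.human_decision == "reject" and result.reviewer is not None
```

The design point is that the data model itself separates the AI's recommendation from the decision, so an audit trail of who made each final call falls out for free.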

Right to Explanation

Even with human oversight, candidates can request meaningful information about the logic behind automated processing. This means your AI tool must be able to explain:

  • What criteria were evaluated
  • How those criteria were weighted
  • Why a candidate scored the way they did
  • What data points influenced the decision

Tools that provide only a ranking number without reasoning create compliance risk. Tools with detailed per-candidate explanations, showing which criteria matched, what raised concerns, and what was unclear, make compliance straightforward.
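As a sketch of what "explainable" means in practice, here is a hypothetical weighted-criteria scorer that retains the per-criterion breakdown alongside the score (the structure and field names are illustrative assumptions, not a standard):

```python
def score_candidate(criteria: dict, matches: dict) -> dict:
    """Score a candidate against weighted criteria (weights in percent) and
    keep the per-criterion breakdown needed to explain the result later."""
    breakdown = []
    total = 0
    for name, weight in criteria.items():
        matched = bool(matches.get(name, False))
        contribution = weight if matched else 0
        total += contribution
        breakdown.append({
            "criterion": name,
            "weight": weight,          # relative importance, in percent
            "matched": matched,
            "contribution": contribution,
        })
    return {"score": total, "max_score": sum(criteria.values()), "breakdown": breakdown}
```

Storing the breakdown rather than only the final number is what makes a later right-to-explanation request answerable without re-running anything.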

Data Protection Impact Assessment (DPIA)

Under UK GDPR Article 35, you must conduct a DPIA before processing that is "likely to result in a high risk to the rights and freedoms of natural persons." AI-based screening of job applicants meets this threshold because it involves:

  • Systematic and extensive evaluation of personal aspects (profiling)
  • Automated decision-making with significant effects
  • Processing of data concerning vulnerable individuals (job seekers)

What Your DPIA Should Cover

  • Processing description: what data is collected, how AI processes it, what outputs are generated
  • Necessity and proportionality: why automated screening is needed, what alternatives were considered
  • Risks to individuals: potential for discrimination, inaccurate decisions, lack of transparency
  • Mitigation measures: human oversight, bias auditing, candidate notification, appeal process
  • Consultation: input from your DPO, HR team, and potentially affected individuals

The ICO provides a DPIA template on their website. If you haven't done a DPIA for your recruitment AI, this should be your first compliance action.

Practical Compliance Checklist

Before You Start Screening

  • Conduct a DPIA for AI-assisted recruitment processing
  • Document your lawful basis for processing candidate data via AI tools
  • Update your privacy notice to mention automated screening and explain the logic involved
  • Sign a Data Processing Agreement (DPA) with your AI screening tool provider
  • Verify data residency: know where candidate data is stored and processed
  • Check data retention policies: how long does the tool keep CV data?

During Screening

  • Maintain human oversight: never automatically reject candidates without human review
  • Review AI-generated criteria before processing to catch potentially discriminatory requirements
  • Document decisions: keep records of why candidates were advanced or rejected
  • Ensure explainability: use tools that show their reasoning, not just scores

After Screening

  • Respond to data subject requests: candidates can ask what data you hold and how it was processed
  • Handle right-to-explanation requests: provide meaningful information about AI screening logic
  • Delete data when no longer needed: don't retain CVs indefinitely
  • Audit outcomes periodically: check for demographic patterns that could indicate bias

Common GDPR Mistakes in CV Screening

Mistake 1: Relying on Consent Alone

The ICO has consistently stated that consent in employment contexts is problematic due to the power imbalance. If a candidate feels they must consent to AI screening to be considered, that consent isn't freely given.

Better approach: Use legitimate interest or pre-contractual steps as your primary lawful basis, with transparency about what processing occurs.

Mistake 2: No Human in the Loop

Fully automated rejection of candidates triggers Article 22 obligations. Some organisations set up AI screening to automatically discard candidates below a threshold without any human review.

Better approach: AI categorises candidates; humans review the categories and make final decisions. A human review of the "reject" pile satisfies the oversight requirement only if it is meaningful: the reviewer must genuinely engage with each case and have the authority to overturn the AI's recommendation, not simply rubber-stamp it.

Mistake 3: Opaque AI Tools

Using a screening tool that outputs only a score or ranking without explanation creates risk. If a candidate exercises their right to explanation, you need to be able to provide meaningful information about the logic.

Better approach: Choose tools that provide per-candidate reasoning (which criteria matched, what raised concerns, confidence levels). This makes responding to data subject requests straightforward.

Mistake 4: Keeping CVs Forever

Some organisations retain candidate CVs indefinitely "in case a future role opens up." Without a clear retention policy and legal basis, this violates the data minimisation and storage limitation principles.

Better approach: Set a clear retention period (6-12 months is common for unsuccessful candidates), inform candidates in your privacy notice, and delete data when the period expires.
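A retention policy like this reduces to a simple, auditable rule. The sketch below assumes a 12-month period for unsuccessful candidates; the period and function names are illustrative, and your own policy should be whatever your privacy notice documents:

```python
from datetime import date, timedelta

# Assumed policy: 12 months for unsuccessful candidates (document yours
# in your privacy notice and DPIA).
RETENTION = timedelta(days=365)

def due_for_deletion(application_date: date, today: date, hired: bool) -> bool:
    """Return True when an unsuccessful candidate's CV data has outlived
    the retention period. Successful hires' data moves into the employment
    record instead, so it is never flagged here."""
    if hired:
        return False
    return today - application_date > RETENTION
```

Running a check like this on a schedule, and logging what was deleted and when, is itself useful evidence of compliance with the storage limitation principle.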

Mistake 5: Not Auditing for Bias

UK Equality Act obligations don't disappear when you use AI. If your AI tool's criteria indirectly discriminate (for example, requiring a degree from a specific set of universities that correlates with socioeconomic background), you're still liable.

Better approach: Periodically audit screening outcomes by demographic group. Review AI-generated criteria before processing to catch potentially discriminatory requirements.
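One widely used heuristic for such an audit is to compare selection rates across groups, borrowing the "four-fifths" threshold from US adverse-impact analysis as a rough screen (it has no formal status in UK law; a flag is a prompt to investigate, not proof of discrimination). A minimal sketch:

```python
def selection_rates(outcomes: dict) -> dict:
    """outcomes maps group label -> (advanced, total applicants)."""
    return {group: advanced / total for group, (advanced, total) in outcomes.items()}

def adverse_impact_flags(outcomes: dict, threshold: float = 0.8) -> list:
    """Flag groups whose selection rate falls below `threshold` times the
    best-performing group's rate (the 'four-fifths' heuristic)."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    if best == 0:
        return []
    return sorted(group for group, rate in rates.items() if rate / best < threshold)
```

For example, if group A advances 40 of 100 applicants and group B only 20 of 100, group B's rate is half of group A's, well below the 0.8 threshold, and the disparity warrants a closer look at the criteria.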

Choosing a GDPR-Compliant Screening Tool

When evaluating AI screening tools for UK use, verify:

  • Explainability: can the tool show why each candidate was scored or categorised the way they were?
  • Data Processing Agreement: does the provider offer a DPA that meets UK GDPR requirements?
  • Data residency: where is candidate data stored? Is it processed outside the UK?
  • Data retention: can you configure retention periods? Can data be deleted on demand?
  • AI training: is candidate data used to train the provider's AI models?
  • Sub-processors: who else has access to the data? Where are they located?
  • Security: what encryption and security measures are in place?
  • Audit trail: does the tool maintain records of processing activities?

For a comparison of tools on these criteria, see our best AI CV screening tools for the UK guide.

The ICO's Direction of Travel

The ICO has signalled increasing scrutiny of AI in employment. Key developments to watch:

  • AI and employment guidance: the ICO has published guidance specifically on AI tools used in recruitment and employment decisions
  • Algorithmic transparency: growing expectation that organisations can explain how AI tools reach their conclusions
  • Bias auditing: emerging expectation that organisations proactively audit AI tools for discriminatory outcomes

Organisations that invest in GDPR-compliant screening now will be better positioned as regulatory expectations increase.

Frequently Asked Questions

Is AI CV screening legal in the UK?

Yes. AI CV screening is legal under UK GDPR provided you have a lawful basis for processing, maintain human oversight of decisions, provide transparency to candidates, and conduct a DPIA. The key is that AI should assist human decision-making, not replace it entirely.

Do I need to tell candidates I'm using AI to screen their CVs?

Yes. Under UK GDPR's transparency requirements (Articles 13-14), you must inform candidates that automated processing is being used. This should be included in your recruitment privacy notice, explaining the logic involved and the significance of the processing.

Can candidates opt out of AI screening?

Under Article 22, candidates have the right not to be subject to decisions based solely on automated processing. In practice, this means you must offer a route for human review if requested, or ensure your process already includes meaningful human oversight.

How long can I keep CV data?

There is no fixed legal period. You must retain data only as long as necessary for the purpose it was collected. For unsuccessful candidates, 6-12 months is common practice. For successful hires, data becomes part of the employment record. Document your retention periods in your privacy notice.

What if a candidate asks how the AI assessed them?

You must provide meaningful information about the logic involved: not the algorithm's source code, but an understandable explanation of what criteria were evaluated, how they were weighted, and what the outcome was. Tools with per-candidate explanations make this straightforward.
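If your tool stores a per-criterion breakdown, turning it into a candidate-facing answer is mechanical. A sketch, assuming a hypothetical breakdown format of `{"criterion", "weight", "matched"}` dicts (not any particular tool's output):

```python
def render_explanation(breakdown: list) -> str:
    """Turn a stored per-criterion breakdown into the kind of plain-English
    summary that could accompany a response to a candidate's request."""
    lines = ["Your application was assessed against the following criteria:"]
    for item in breakdown:
        status = "matched" if item["matched"] else "not matched"
        lines.append(f"- {item['criterion']} (weight {item['weight']}%): {status}")
    return "\n".join(lines)
```

The point is that explainability is cheap when the reasoning was recorded at screening time, and effectively impossible to reconstruct afterwards when it wasn't.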

Does using AI screening reduce or increase GDPR risk?

Both, depending on implementation. AI screening can reduce risk by providing consistent, documented, and auditable decisions, unlike manual screening where reasoning is often unrecorded. But poorly implemented AI (opaque scoring, no human oversight, no DPIA) increases risk significantly.


Building compliant AI screening into your process doesn't have to be complex. The fundamentals are: be transparent, keep humans in the loop, choose tools that explain their reasoning, and document your processing. Get these right and AI screening becomes both legally sound and practically superior to manual review. See how Marxel handles GDPR compliance →

