Self-assessment for TA leaders and hiring managers

Hiring Process Diagnostic

13 criteria across 5 dimensions. Score your process honestly. Find out where the risk actually is.

Most hiring processes feel functional until someone has to justify a decision. A rejected candidate asks why. Legal asks for the documentation. A hiring manager asks what criteria were used to shortlist. And from August 2026, the EU AI Act requires that any automated screening tool used in hiring be able to produce a written explanation of how it reached its conclusions.

This diagnostic does not measure whether your team is strategic or well-resourced. It measures whether your process is documented, consistent, and defensible - the three things that matter when something goes wrong.

222 - average applications per role in Q1 2024, nearly 3x the 2021 level (Greenhouse)
3% - of applicants typically reach the interview stage in a standard process (CareerPlug)
<20% - of European employers report being "very prepared" for EU AI Act compliance in hiring (2025 survey)

The assessment takes 10 to 15 minutes. Score yourself against the current state of your process, not the process you intend to build. The number you finish with places you in one of four bands, each with a clear starting point for what to address first.

How to complete this assessment
  1. Read all four levels (0, 1, 2, 3) for each criterion before choosing your score.
  2. Select the level that most accurately describes your process today - not your intended process or your best-case scenario.
  3. Write your score in the space provided at the end of each criterion.
  4. Add up the subtotal for each dimension at the bottom of each section.
  5. Transfer dimension subtotals to the scoring table at the end. Add them up. Find your band.
Dimension 1: Process Documentation and Consistency (max 9 pts)

1. Process definition and enforcement
0 - No formal process. Every recruiter improvises and each hire runs differently.
1 - A written process exists but is rarely enforced. People follow it when convenient.
2 - Process is documented and mostly followed, with occasional deviations under pressure.
3 - Fully documented workflow covering steps, roles, and timelines. Enforced consistently for every hire regardless of urgency.
My score (0-3): ____

2. Job description quality
0 - JDs are outdated or boilerplate. Rarely reviewed or connected to actual screening criteria.
1 - JDs are broadly accurate but updated infrequently, often after a vacancy has already been posted.
2 - JDs are clear, reviewed before posting, and core requirements are specific enough to screen against.
3 - JDs are precise, pre-approved, and map directly to the scoring criteria used in screening. Updated before every new campaign.
My score (0-3): ____

3. Hiring manager alignment before the process begins
0 - Hiring managers define requirements on the fly or change them mid-process. No formal sign-off before posting.
1 - A briefing takes place but criteria frequently change after the vacancy goes live.
2 - Hiring managers agree on core criteria in advance. A calibration session takes place before shortlisting begins.
3 - Full alignment before posting: scoring rubric agreed, JD approved, interview format confirmed. Criteria do not change once the process is live.
My score (0-3): ____

My subtotal (max 9): ____
Dimension 2: Candidate Evaluation Consistency (max 9 pts)

4. Scoring rubric adoption
0 - No rubric. Screening and interview decisions are entirely subjective and undocumented.
1 - A rubric exists but is optional or inconsistently applied, especially under time pressure.
2 - Defined scorecards are used by most interviewers on most roles.
3 - Weighted scoring rubrics are mandatory on every role. All candidates are evaluated against each criterion before a shortlist decision is made.
My score (0-3): ____

5. Interview structure
0 - Interviews are unstructured. Questions vary by interviewer and by session. Only 19% of companies enforce structured interview guides across all hires.
1 - Some standard questions exist but interviews regularly drift off-script.
2 - Most interviews follow a structured guide with predetermined, job-relevant questions.
3 - Every interview is standardised: consistent panel composition, predetermined competency questions, and uniform rating scales applied across all candidates for the role.
My score (0-3): ____

6. Evaluation documentation
0 - No written records. No documented reason why a candidate was rejected or progressed to the next stage.
1 - Interviewers take informal notes but they are not centralised or consistently retained.
2 - Interview notes and scores are entered into the ATS for most candidates, most of the time.
3 - Every candidate has a documented evaluation record in one system: scores, notes, and decision rationale. Records are complete enough to be produced if challenged.
My score (0-3): ____

My subtotal (max 9): ____
Dimension 3: Compliance and Audit Readiness (max 6 pts)

7. Data privacy and AI compliance
0 - No formal review of compliance obligations. Screening tools are in use without documentation of how they reach decisions. No consent process in place.
1 - Basic steps completed (consent collected, standard disclaimers), but no audit trail and no impact assessment for automated tools.
2 - Consent is recorded. Automated tools are logged. An impact assessment has been completed at least once for the main tools in use.
3 - Full compliance documented: rationale for automated decisions is captured, algorithm audits are scheduled, a candidate explanation process exists, and EU AI Act obligations for August 2026 are mapped and being addressed.
My score (0-3): ____

8. Non-discrimination safeguards
0 - No formal safeguards. Decisions frequently rely on "culture fit" without defined criteria. No bias awareness training in place.
1 - General equality training completed. Occasional blind screening or diverse shortlisting attempted on specific roles.
2 - Structured safeguards in place for most hires: skills-based criteria, diverse interview panels, current bias-awareness training for interviewers.
3 - Hiring outcomes are monitored by demographic. Bias training is mandatory and current. The process is designed so that consistent decisions are produced regardless of which individual is reviewing.
My score (0-3): ____

My subtotal (max 6): ____
Dimension 4: Process Efficiency (max 9 pts)

9. Time-to-hire visibility
0 - No tracking. Roles stay open indefinitely with no accountability for how long they take.
1 - Time-to-hire is recorded but rarely acted on. No target and no root-cause analysis when a role runs over.
2 - Time-to-hire is measured against a benchmark (UK median: approximately 40 days). Delays are investigated when they occur.
3 - Consistently meets or beats target across most role types. Stage-level bottlenecks are identified and resolved. Improvement is tracked over time.
My score (0-3): ____

10. Funnel conversion tracking
0 - No funnel data. Application volume, interview numbers, and offers are not tracked by stage.
1 - Headline figures are captured (total applications, total hires) but not by stage.
2 - Stage-by-stage conversion rates are tracked and compared to benchmarks (approximately 3% of applicants reach interview; approximately 27% of interviews result in a hire).
3 - Detailed conversion ratios tracked by role type and source channel. Process is adjusted when ratios fall outside expected ranges. Drop-off points are diagnosed and fixed systematically.
My score (0-3): ____

11. Offer acceptance rate
0 - Below 60% of offers accepted. No analysis of why candidates decline.
1 - Acceptance tracked (typically 60-80%). Decline reasons occasionally explored but rarely acted on.
2 - Target of at least 85% acceptance. The team gathers structured feedback when offers are declined.
3 - Consistently at or above 90%. Offers are made within 48 hours of final interview. Warm handoffs are in place to prevent candidate dropout between offer and start date.
My score (0-3): ____

My subtotal (max 9): ____
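The benchmark ratios quoted in Criterion 10 can be turned into a back-of-envelope funnel estimate. This is a minimal sketch, not a tracking tool: the 3% and 27% conversion rates are the benchmarks cited in this assessment, the 222-application volume is the Greenhouse figure quoted earlier, and the function name is illustrative.

```python
# Back-of-envelope funnel estimate from the benchmarks in Criterion 10:
# roughly 3% of applicants reach interview, roughly 27% of interviews
# result in a hire. Figures are the ones quoted in this assessment.

APP_TO_INTERVIEW = 0.03   # applicants who reach the interview stage
INTERVIEW_TO_HIRE = 0.27  # interviews that convert to a hire

def expected_funnel(applications: int) -> dict[str, float]:
    """Estimate interviews and hires from an application volume."""
    interviews = applications * APP_TO_INTERVIEW
    hires = interviews * INTERVIEW_TO_HIRE
    return {
        "applications": applications,
        "interviews": round(interviews, 1),
        "hires": round(hires, 1),
    }

# At the Q1 2024 average of 222 applications per role:
print(expected_funnel(222))  # {'applications': 222, 'interviews': 6.7, 'hires': 1.8}
```

The point of the sketch: at benchmark conversion rates, an average role produces fewer than two hires' worth of signal from 222 applications, which is why stage-level tracking (level 2 and above) matters.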
Dimension 5: Candidate Experience and Feedback (max 6 pts)

12. Communication cadence
0 - Candidates hear nothing until the role is filled. Silence is the default. 52% of candidates report being ghosted in a hiring process (Greenhouse, 2024).
1 - Automated acknowledgment is sent at application, but updates are infrequent and generic throughout the process.
2 - Candidates receive status updates at each stage via ATS notifications or email.
3 - Timely, clear communication at every step including rejection. Candidates always know where they stand and what happens next.
My score (0-3): ____

13. Rejection feedback quality
0 - Generic "no thanks" messages sent without reason, or no response at all.
1 - Standard rejection templates. No personalisation and no connection to the specific criteria used.
2 - Brief, respectful feedback is available to shortlisted candidates who ask for it, based on documented evaluation criteria.
3 - Constructive feedback is proactively offered to shortlisted candidates and tied directly to documented scoring criteria. The rejection could be produced, explained, and defended if a candidate challenged it.
My score (0-3): ____

My subtotal (max 6): ____
My Scores
Dimension 1: Process Documentation and Consistency   ____ / 9
Dimension 2: Candidate Evaluation Consistency   ____ / 9
Dimension 3: Compliance and Audit Readiness   ____ / 6
Dimension 4: Process Efficiency   ____ / 9
Dimension 5: Candidate Experience and Feedback   ____ / 6
Total   ____ / 39
What your score means
0-9 out of 39 - Broken
The process has significant gaps in documentation, consistency, and compliance. A rejected candidate, a regulatory enquiry, or a failed hire would expose these gaps quickly. Prioritise Dimensions 1 and 2 first - a documented process with consistent scoring is the foundation everything else depends on.

10-19 out of 39 - Reactive
Basic structure exists but the process relies on individuals doing the right thing rather than systems enforcing it. Decisions are being made, but the documentation to defend them is incomplete. The vulnerability usually sits in evaluation documentation (Criterion 6) and compliance readiness (Dimension 3). These can be addressed without rebuilding the whole process.

20-28 out of 39 - Functional
A solid foundation. The process is broadly consistent and most roles produce a defensible record. The gap between functional and audit-ready is usually Dimension 3 (compliance) and the completeness of evaluation documentation at scale. Both are fixable without structural change.

29-39 out of 39 - Audit-ready
The process is consistent, documented, and designed to survive a challenge. Hiring decisions can be explained. Evaluation records are complete. Compliance obligations are mapped and being addressed. The focus now is maintaining standards as hiring volume grows and as AI Act enforcement begins in August 2026.
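The tallying described above - five dimension subtotals, a total out of 39, and a band lookup - can be sketched as a short script. The dimension maxima and band thresholds come directly from this assessment; the function name is illustrative.

```python
# Score tally for the Hiring Process Diagnostic.
# Dimension maxima and band thresholds are taken from the assessment itself.

DIMENSION_MAX = {
    "Process Documentation and Consistency": 9,
    "Candidate Evaluation Consistency": 9,
    "Compliance and Audit Readiness": 6,
    "Process Efficiency": 9,
    "Candidate Experience and Feedback": 6,
}

BANDS = [  # (lower bound, upper bound, label), inclusive on both ends
    (0, 9, "Broken"),
    (10, 19, "Reactive"),
    (20, 28, "Functional"),
    (29, 39, "Audit-ready"),
]

def band(subtotals: dict[str, int]) -> tuple[int, str]:
    """Validate each subtotal against its dimension maximum, then return (total, band)."""
    for name, score in subtotals.items():
        if not 0 <= score <= DIMENSION_MAX[name]:
            raise ValueError(f"{name}: {score} is outside 0-{DIMENSION_MAX[name]}")
    total = sum(subtotals.values())
    label = next(lbl for low, high, lbl in BANDS if low <= total <= high)
    return total, label

# Example: a team scoring mid-range in every dimension.
print(band({
    "Process Documentation and Consistency": 5,
    "Candidate Evaluation Consistency": 4,
    "Compliance and Audit Readiness": 2,
    "Process Efficiency": 5,
    "Candidate Experience and Feedback": 3,
}))  # (19, 'Reactive')
```

A score of 19 sits at the very top of the Reactive band - one more point in any dimension moves the process into Functional, which is why the lowest-subtotal dimension is the right place to start.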

Most processes sit in the Reactive band - not because the people running them are careless, but because documentation and defensibility were never built into the design. They were assumed to happen naturally, and they rarely do without a structure that enforces them.

The dimension that most often surprises TA leaders is Compliance and Audit Readiness. The EU AI Act's August 2026 deadline requires that any automated tool used in screening can produce a written explanation of how it reached its conclusions. Most teams have not yet assessed what that means for the tools they are already using today. That is not a criticism - it is just where things stand, across most organisations, right now.

Whatever your score, the most useful next step is the same: look at the dimension with your lowest subtotal, and start there. The goal is not a perfect score. The goal is a process you can explain to a hiring manager, defend to a rejected candidate, and document for a regulator - all at the same time.