Most engineering interview processes are simultaneously burdensome for candidates and inaccurate as selection mechanisms. Larson’s humane interview process framework addresses both problems at once — the practices that reduce candidate burden typically also improve predictive validity.
The Seven Principles
1. Interview for the actual job
- Evaluate candidates on work resembling the real role, not abstract puzzles
- A staff engineer doing 90% system design should be assessed on system design, not linked-list reversal
- Failure mode: whiteboard algorithmic challenges for roles that never require them
2. Avoid gotcha questions
- Questions designed to catch candidates out test memorised trivia, not job-relevant judgment
- Replace gotchas with open-ended questions that reveal how candidates reason
- Failure mode: “What’s the difference between a mutex and a semaphore?” when the team uses higher-level concurrency abstractions
3. Use a consistent scoring rubric
- Pre-define evaluation criteria and a scoring framework before interviews begin
- Without rubrics, interviewers default to “would I enjoy working with this person?” — a bias-prone proxy
- Failure mode: free-form written feedback where every interviewer assesses different dimensions
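Combined with principle 7 below, a rubric is essentially a small data structure: pre-defined dimensions, behaviorally anchored score levels, and a rule that every score must cite observed evidence. A minimal sketch in Python — the dimensions, anchors, and field names here are illustrative assumptions, not taken from Larson's book:

```python
from dataclasses import dataclass

# Hypothetical rubric: dimensions mapped to behaviorally anchored levels.
# The dimensions and anchor wording below are illustrative only.
RUBRIC = {
    "system_design": {
        1: "Cannot decompose the problem into components",
        2: "Produces a workable design with prompting",
        3: "Produces a workable design independently",
        4: "Weighs multiple designs and justifies trade-offs",
    },
    "communication": {
        1: "Cannot explain reasoning",
        2: "States conclusions but not reasoning",
        3: "Explains reasoning clearly",
        4: "Adapts explanation to the audience",
    },
}

@dataclass
class Score:
    dimension: str
    level: int
    evidence: str  # principle 7: every score cites observed behavior

def validate(score: Score) -> Score:
    """Reject scores outside the rubric or missing concrete evidence."""
    if score.dimension not in RUBRIC:
        raise ValueError(f"unknown dimension: {score.dimension}")
    if score.level not in RUBRIC[score.dimension]:
        raise ValueError(f"level {score.level} has no anchor")
    if not score.evidence.strip():
        raise ValueError("a score must cite concrete evidence")
    return score

# Evidence-backed feedback passes; "candidate seems smart" would not,
# because it names no dimension, level, or observed behavior.
feedback = validate(Score(
    dimension="system_design",
    level=4,
    evidence="Compared queue- and log-based designs; chose the log for replay",
))
```

The design choice worth noting: because every interviewer scores the same pre-defined dimensions against the same anchors, feedback becomes comparable across interviewers and auditable after the fact — the property free-form write-ups lack.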
4. Calibrate interviewers
- New interviewers shadow experienced ones before solo interviews
- Regular calibration sessions compare scores on borderline candidates to surface drift
- Failure mode: interviewers who have never compared their scoring to others’ on the same candidate
5. Keep it short
- Four to five interviews capture most of the available signal; six or more return diminishing information
- Multi-day on-sites filter for endurance, not job fit; candidates are often using PTO
- Failure mode: seven-round loops designed to achieve consensus rather than accuracy
6. Reduce performance anxiety
- Whiteboard coding under observation introduces cognitive load unrelated to job performance
- Take-home exercises, pair programming sessions, or design discussions reduce the anxiety component
- Anxiety in interviews ≠ anxiety on the job
- Failure mode: standardising on the format that is easiest for interviewers to run, not easiest for candidates to perform in
7. Collect evidence, not opinions
- Feedback should read: “candidate explained the trade-off between X and Y, demonstrating understanding of Z”
- Not: “candidate seems smart” or “I liked their energy”
- Failure mode: adjective-heavy feedback that cannot be audited for bias or inconsistency
The Implicit Principle
Fairness and accuracy align. A process that respects candidate time and reduces anxiety tends to elicit more authentic performance, which improves predictive validity. Optimising for candidate experience is not a trade-off with hiring quality — it reinforces it.
Connection to the Hiring Funnel
The humane interview process governs the Evaluate stage of the Hiring-Funnel. Poor evaluation design degrades the entire funnel: sourcing effort from Three-Candidate-Sources and cold outreach is wasted if the interview process filters for the wrong signals or drives candidates to withdraw.
Related Concepts
- Hiring-Funnel
- Three-Candidate-Sources
- Calibration-System-for-Performance
- Performance-Management-System
- Larson-2019-An-Elegant-Puzzle
Sources
- Schmidt, Frank L. and John E. Hunter (1998). “The Validity and Utility of Selection Methods in Personnel Psychology: Practical and Theoretical Implications of 85 Years of Research Findings.” Psychological Bulletin, Vol. 124, No. 2, pp. 262–274. DOI: 10.1037/0033-2909.124.2.262.
  - Definitive meta-analysis showing structured interviews have substantially higher predictive validity (~0.51) than unstructured interviews (~0.38) for job performance
- Larson, Will (2019). An Elegant Puzzle: Systems of Engineering Management. Stripe Press. ISBN: 978-1-7322651-8-9.
  - Chapter 6.2: Interview process design principles
- Campion, Michael A., David K. Palmer, and James E. Campion (1997). “A Review of Structure in the Selection Interview.” Personnel Psychology, Vol. 50, No. 3, pp. 655–702. DOI: 10.1111/j.1744-6570.1997.tb00709.x.
  - Identifies fifteen dimensions of interview structure and their impact on reliability and validity; foundational framework for rubric design
- Bohnet, Iris (2016). What Works: Gender Equality by Design. Harvard University Press. ISBN: 978-0-674-08903-3.
  - Chapter 6 addresses how structured evaluation criteria reduce affinity bias in hiring; empirical evidence from field experiments
- Macan, Therese Hoff (1994). “Time Management: Test of a Process Model.” Journal of Applied Psychology, Vol. 79, No. 3, pp. 381–391.
  - Cited here for its treatment of extraneous stressors in evaluation contexts; basis for the anxiety-reduction principle
Note
This content was drafted with assistance from AI tools for research, organization, and initial content generation. All final content has been reviewed, fact-checked, and edited by the author to ensure accuracy and alignment with the author’s intentions and perspective.