Introduction
Hiring decisions are among the most important decisions you will make as a manager or supervisor. Interviews are an excellent selection tool but can be problematic if not conducted correctly. This guide was developed to help you interview and select the applicants who best suit your advertised position.
The structured interview process, applied correctly, can largely remove human bias from recruitment, reduce the likelihood of a hire who is a poor fit for the position, and allow hiring managers to compare candidates in an “apples to apples” manner.
Structured interviews are designed to be consistent, measurable, and directly tied to the skills and behaviors required for success in a position.
Although this guide was developed specifically for non-pedagogical administrative employees, the concepts and techniques should be applied to all recruitment activities that involve interviews, to ensure an equitable, inclusive process.
What is Bias and How Does it Affect the Interview Process?
Bias refers to conscious or unconscious prejudices and mental shortcuts that cause interviewers to evaluate candidates on subjective factors, such as appearance, shared interests, or stereotypes, rather than on merit. These biases, including affinity bias, the halo/horn effect, confirmation bias, first impression bias, stereotype/group bias, the contrast effect, non-verbal bias, cultural noise, and recency bias, undermine hiring fairness, reduce diversity, and often lead to poor hiring decisions.
Definitions and Examples
Affinity Bias
Favoring candidates who share personal characteristics, interests, or backgrounds.
Example: Hiring a candidate because you both attended the same university or enjoy the same hobbies.
Halo/Horn Effect
Halo: A single positive trait makes you believe the candidate is perfect overall.
Example: Assuming a candidate is highly competent because they worked for a prestigious company.
Horn: A single negative trait overshadows all positive ones.
Example: Rejecting a candidate because they were nervous at the beginning, ignoring their strong technical skills.
Confirmation Bias
Forming an immediate opinion and only asking questions or listening for information that supports that initial thought.
Example: Deciding a candidate is "too junior" in the first minute, then focusing only on their lack of experience while ignoring their relevant, high-level projects.
First Impression Bias
Relying on the first few minutes of an interview, handshake, or appearance rather than the entire conversation.
Stereotype/Group Bias
Relying on generalizations about a group rather than an individual's skills.
Example: Assuming a female candidate cannot handle a high-travel role or that an older candidate is not tech-savvy.
Contrast Effect
Evaluating a candidate against the previous interviewee rather than against the job requirements.
Example: Rating a good candidate as "poor" because the person before them was exceptional.
Non-Verbal Bias
Making judgments based on body language, dress, or mannerisms rather than the content of answers.
Example: Disliking a candidate because they did not make enough eye contact.
Neurodivergent conditions, such as Tourette syndrome and autism spectrum disorder (ASD), can also affect a person’s body language; judging candidates on these cues can lead you to pass over a strong hire despite their strengths.
Cultural Noise
Taking at face value answers a candidate gives because they seem socially acceptable, rather than probing for the candidate’s true, honest responses.
Recency Bias
Remembering and heavily weighing only the last few things a candidate said, or the last few people interviewed.
Conscious vs. Unconscious Bias
Unconscious biases act as mental filters that shape how we:
- Develop interview questions
- Interpret candidate responses
- Evaluate information we see as “important”
Bias may be:
- Conscious or Explicit: Explicit, or conscious, bias involves deliberate, prejudiced beliefs that an interviewer is aware of, such as "I don't think women are suited for this role."
- Unconscious or Implicit: Implicit bias involves unconscious, automatic mental shortcuts, based on stereotypes and past experiences, that influence decisions without the interviewer's awareness.
Selective perception can cause interviewers to focus on information that confirms existing beliefs. An interviewer may ignore vague answers from a candidate with a shared background (in-group favoritism) or over-focus on one small, irrelevant detail that confirms an initial, perhaps biased, impression.
Real‑World Example:
An interviewer sees a candidate who attended the same college as their sibling. Without realizing it, they assume the candidate is hardworking and friendly. During the interview, they inadvertently smile more, probe less deeply, and rate answers more favorably, despite weaker evidence.
Sources of Bias
Common factors that can introduce bias during interviews include:
- Cognitive overload: too much information to evaluate objectively
- Personal preference: liking a candidate based on similarity or familiarity
- Personal circumstances: the interviewer's own mood, fatigue, or unrelated negative experiences (e.g., a traffic incident) can influence their assessment of the candidate.
Real‑World Examples:
- Cognitive overload: An interviewer screens 25 resumes in an hour. By resume #20, they start skimming and unintentionally overlook qualified candidates with nontraditional backgrounds.
- Personal preference: A candidate mentions that they love the same sports team as the interviewer. The conversation becomes more casual, which influences the interviewer’s score.
- Personal circumstances: Afternoon interviews consistently receive lower scores than morning interviews due to diminished attention; an interviewer fails to take detailed notes because they are distracted due to an incident at their child’s school.
Competency‑Based Interviewing
Competency-based interviews are a structured interviewing technique that evaluates candidates based on specific skills, behaviors, and attributes required for success in a role. Instead of focusing on hypothetical questions, this method requires candidates to provide real-life examples of how they have demonstrated key competencies in past experiences.
Competency-based interviewing improves hiring accuracy by linking interview questions directly to job requirements. It reduces bias by standardizing evaluation criteria, ensures fairness across candidates, and provides predictive insights into how someone is likely to perform in the future. This approach is particularly valuable for roles where soft skills such as leadership, problem-solving, or teamwork are critical.
Interviews should evaluate:
- Skills
- Experience
- Education
- Certifications
Competencies must directly relate to job requirements and expectations.
Common Use Cases/Examples
- Asking a candidate to describe a time they resolved a conflict within a team to assess collaboration and communication.
- Using the STAR method (Situation, Task, Action, Result) to structure candidate responses.
- Evaluating competencies such as leadership, adaptability, or decision-making in managerial positions.
- Aligning interview questions with predefined role competencies developed through analysis of position responsibilities.
- Providing hiring panels with standardized scoring rubrics for consistent evaluation.
Real‑World Example: A hiring team for a data analyst role defines competencies: Excel proficiency, ability to interpret datasets, and communication skills. One candidate has an impressive resume but gives vague responses about analyzing data. Another has less experience but provides clear descriptions of past data‑cleaning, modeling, and visualization. Competency‑based scoring ensures the second candidate is rated higher—despite the resume gap.
Structured Interviewing
A structured interview is a standardized, consistent assessment method where every candidate is asked the same set of predetermined, job-related questions in the same order, and responses are evaluated using a uniform rating system.
A structured interview includes:
- Planned, standardized questions used for every candidate
- Objective scoring criteria
- Evaluation rubrics or sheets
- Consistent interviewer behavior
- Diverse panel approach
Structured interviewing reduces bias and improves the quality of hiring decisions.
Real‑World Example: A central division adopts a structured protocol for all hires. Every candidate receives the same six questions aligned to competencies such as effective communication skills, an equity mindset, and Microsoft Office proficiency. Scores become consistent across interviewers, and the division sees a 30% improvement in new‑hire effectiveness ratings.
Designing Effective Questions (STAR Method)
The STAR method (Situation, Task, Action, Result) is an approach utilized in developing behavioral questions for structured interviews. This method prompts the candidate to detail the context (S), objective (T), steps taken (A), and measurable outcomes (R).
Use the Situation–Task–Action–Result (STAR) framework to design behavioral questions:
- Situation: Ask the candidate to describe a past scenario
- Task: Ask the candidate to describe the task, problem, or responsibility
- Action: Ask what action they personally took
- Result: Ask about the outcomes of their actions
Behavioral vs. Situational Questions
Structured interviews use predetermined, job-related questions to assess candidates, primarily through behavioral (past-focused) and situational (future-focused) types.
- Behavioral Questions: Ask for past examples ("Tell me about a time...") to predict future performance.
- Situational Questions: Pose hypothetical scenarios ("What would you do if...") to gauge problem-solving and decision-making skills.
Real‑World Example (Behavioral Question): “Tell me about a time you had to resolve a conflict with a colleague.” A strong STAR answer might describe:
- S: A disagreement over how to divide responsibilities during a busy enrollment season
- T: The candidate needed to divide responsibilities fairly across the available staff after listening to all opinions
- A: The candidate initiated a meeting, clarified expectations, and proposed a shared calendar
- R: Miscommunications decreased and the team met all deadlines
Real‑World Example (Situational Question): “Imagine an employee becomes upset about a schedule change. What would you do?” A high‑quality answer includes steps like acknowledging concerns, offering multiple solutions, and following communication protocols.
Evaluation Criteria
Structured interview question rating scales are standardized scoring systems, typically using a 1-5 rating scale, designed to evaluate candidate responses objectively based on pre-defined proficiency levels for specific job competencies.
Effective scales use behavioral anchors that map answers to scores, such as "1 - Unacceptable" through "5 - Excellent," reducing bias and ensuring consistency.
Key Strategies for Effective Rating Scales:
- Use Behavioral Anchored Scales: Instead of just numbers, define what each rating means (e.g., 1-Does not meet, 3-Meets, 5-Exceeds) to ensure consistent interpretation across interviewers.
- Implement 5-Point Scales: A 5-point scale is recommended because it offers enough nuance to capture variance while remaining intuitive for evaluators.
- Focus on Specific Competencies: Rate candidates on specific, pre-determined criteria (e.g., technical skill, communication) rather than a general, subjective feeling.
- Combine Quantitative and Qualitative Data: Supplement numerical ratings with comments or examples to provide context to the score.
- Standardize the Process: Ensure all interviewers understand the scale and criteria, allowing for direct comparison of total scores to identify the best candidate.
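The arithmetic behind a standardized scoring sheet is simple enough to sketch in a few lines of code. This is only an illustration of the approach described above; the competency names, anchor labels, panel sizes, and scores below are hypothetical examples, not part of any official template:

```python
# Illustrative sketch of a structured scoring sheet. Competencies, anchors,
# and ratings are hypothetical; a real rubric would come from the job analysis.
from statistics import mean

ANCHORS = {1: "Does not meet", 3: "Meets", 5: "Exceeds"}
COMPETENCIES = ["communication", "problem_solving", "technical_skill"]

def score_candidate(panel_ratings):
    """panel_ratings maps each competency to a list of 1-5 ratings,
    one per panelist. Returns per-competency averages and an overall total."""
    per_comp = {c: mean(panel_ratings[c]) for c in COMPETENCIES}
    return per_comp, sum(per_comp.values())

# Two hypothetical candidates rated by a three-person panel.
candidate_a = {"communication": [4, 5, 4],
               "problem_solving": [3, 3, 3],
               "technical_skill": [5, 4, 4]}
candidate_b = {"communication": [3, 3, 3],
               "problem_solving": [5, 5, 4],
               "technical_skill": [4, 4, 5]}

for name, ratings in [("Candidate A", candidate_a), ("Candidate B", candidate_b)]:
    per_comp, total = score_candidate(ratings)
    print(name, {c: round(v, 2) for c, v in per_comp.items()},
          "total:", round(total, 2))
```

Because every panelist rates the same competencies on the same anchored scale, the totals are directly comparable across candidates; qualitative comments would be recorded alongside these numbers to give each score context.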
Before the interview:
- Identify what a strong or weak answer looks like
- Align scoring with competencies and job expectations
Real‑World Example: For the competency “problem‑solving,” a school operations manager role uses the following evaluation criteria:
- 5: Identifies root cause, proposes multiple solutions, shows data‑driven thinking
- 3: Provides a workable solution but misses underlying issues
- 1: Provides vague or irrelevant answers
Before implementing this rubric, interviewers often rated candidates inconsistently. After using it, scores correlated more strongly with later job performance.
Resume Screening Best Practices
Consider using a structured, objective approach based on a detailed job description, such as creating a checklist of required skills, education, and experience. Recruiters should use standardized scoring to minimize bias, look for career progression, and check for attention to detail.
For a more thorough process, sort resumes into "Yes," "Maybe," and "No" piles, and consider having a committee chair review the findings.
- When reviewing resumes, avoid being distracted by bias triggers, such as personal details, unrelated experience, and assumptions based on format or style.
Real‑World Example:
A recruiter notices two resumes:
- Candidate A: Highly polished resume with professional formatting
- Candidate B: Less polished formatting, but stronger technical certifications and relevant experience
Using structured screening criteria (years of experience, certifications, demonstrated competencies), Candidate B scores higher—avoiding aesthetic bias.
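A checklist-based screen like the one in this example can be expressed as a small script that scores only objective, job-description criteria. The skills, certification, and experience thresholds below are illustrative assumptions for a hypothetical data analyst role, not a prescribed checklist:

```python
# Hypothetical screening checklist: criteria come from the job description,
# so resume formatting and polish contribute nothing to the score.
REQUIRED_SKILLS = {"sql", "excel", "data_visualization"}
REQUIRED_CERTS = {"data_analytics_cert"}   # illustrative placeholder credential
MIN_YEARS = 3

def screen(resume):
    """Score a resume dict on objective criteria only (0-3 points)."""
    points = 0.0
    points += len(REQUIRED_SKILLS & set(resume["skills"])) / len(REQUIRED_SKILLS)
    points += len(REQUIRED_CERTS & set(resume["certs"])) / len(REQUIRED_CERTS)
    points += 1 if resume["years_experience"] >= MIN_YEARS else 0
    return round(points, 2)

# Candidate A: polished resume, weaker credentials.
candidate_a = {"skills": ["excel"], "certs": [], "years_experience": 5}
# Candidate B: plain formatting, stronger certifications and skills.
candidate_b = {"skills": ["sql", "excel", "data_visualization"],
               "certs": ["data_analytics_cert"], "years_experience": 4}

print(screen(candidate_a))  # partial credit on skills and experience only
print(screen(candidate_b))  # full marks: 3.0
```

Sorting the resulting scores reproduces the "Yes"/"Maybe"/"No" piles with an auditable rationale for each placement.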
Conducting the Interview
Best practices for structured interviews include asking all candidates the same, job-related, open-ended questions using a predefined rubric to minimize bias. Key actions involve using a panel of interviewers, taking objective notes that avoid personal opinions or assumptions, and using anchored rating scales to evaluate responses. Interviews should focus on behavioral questions and allow candidates sufficient time to respond. Interviewers are permitted to ask probing follow-up questions. Candidates should also be permitted to ask questions at the end of the structured portion of the interview.
Real‑World Example
During a search process for an Executive Director position, interviewers were trained to take objective notes.
- Objective note: “Candidate described leading a team of 12 to develop a new literacy program; reported 8% ELA growth in one year.”
- Subjective note: “Candidate seems confident and enthusiastic.”
The shift led to clearer decision‑making and more defensible hiring documentation.
Areas of Caution (Avoid These Topics)
When conducting a structured interview, avoid straying from the set list of questions, asking legally prohibited or biased questions (age, race, family status), relying on "gut feeling" rather than scoring rubrics, and failing to use open-ended, behavior-based questions. Other pitfalls include asking leading or compound questions, interrupting candidates, and neglecting to take consistent notes.
Interviewers must avoid:
- Questions related to protected classes, including race, gender, age, marital status, etc.
- Salary history questions in alignment with Executive Order 21
- Disability‑related inquiries, unless they relate to essential job functions and follow ADA guidance
Real‑World Examples
- Protected class questions: An interviewer casually asks, “Do you have kids?” intending to create rapport. This violates guidance, as parental status is a protected category in many states.
- Salary history (Executive Order 21): A hiring manager begins to ask, “What were you making in your last role?” Another interviewer intervenes, reminding them that salary history requests are prohibited.
- Disability‑related inquiries: A candidate reveals they have a condition requiring occasional rest breaks. The interviewer properly shifts to: “Here are the essential functions of the role. Can you perform them with or without reasonable accommodations?”
Summary
An effective structured interviewing approach increases the predictive validity for future job performance and makes the hiring process more legally defensible.
Effective structured interviewing requires:
- Understanding and mitigating bias
- Preparing standardized, competency‑based questions
- Using an objective, structured approach to resume screening
- Utilizing a diverse panel that has been trained to minimize bias
- Using consistent evaluation tools
- Maintaining objective, job‑related notes
Real‑World Example
A municipal HR department instituted structured interviewing across all departments. Within the first year:
- Hiring decisions became more consistent
- Candidate experience improved
- Departments saw fewer complaints about unfair treatment
- Managers reported feeling more confident and prepared
- The quality of hires improved
Standardizing the process created a more equitable, transparent environment for both candidates and interviewers.
Resources
Practice Scenarios (By Competency)
Below are realistic, job‑agnostic scenarios you can use for interviewer practice. Each includes:
- The interview question
- A sample strong answer (STAR‑based)
- A partial answer
- A weak answer
Scenario 1: Problem‑Solving
Interview Question: “Tell me about a time you identified a problem before others noticed it. What did you do?”
Strong STAR Answer:
- S: “At my previous organization, our student enrollment numbers were fluctuating weekly, and no formal tracking system existed.”
- T: “I analyzed incoming data, noticed patterns, and determined that a simple dashboard would be beneficial to the department for tracking and analyzing the fluctuations in enrollment numbers.”
- A: “I built a simple dashboard to track changes in real time. I also trained the team on how to use it.”
- R: “The dashboard helped us predict staffing needs more accurately and reduced last‑minute scheduling issues by 40%.”
Partial Answer: Describes a general process (“I like to stay ahead of problems…”) but offers no clear example or result.
Weak Answer: Focuses on blaming a coworker or describing a trivial issue; no steps taken, no measurable outcome.
Scenario 2: Communication
Interview Question: “Describe a time you had to deliver complex information to someone unfamiliar with the topic.”
Strong STAR Answer:
- S: “Our team needed to explain a new compliance requirement to school staff unfamiliar with legal terminology.”
- T: “I determined what the most relevant information related to the new compliance requirement was for the team.”
- A: “I created a one‑page visual guide and held a brief Q&A session using everyday examples.”
- R: “Compliance issues dropped immediately, and the training model was adopted districtwide.”
Partial Answer: Explains the information but not how they tailored it to the audience.
Weak Answer: “I just sent an email and hoped they understood.”
Scenario 3: Collaboration & Teamwork
Interview Question: “Tell me about a time you had to work with a difficult team member.”
Strong STAR Answer:
- S: “On a cross‑department project, another team member consistently missed deadlines.”
- T: “I reviewed the other team members’ workflows to determine whether the assigned work was imbalanced, and scheduled a meeting with the team member.”
- A: “At the meeting, I clarified their constraints and adjusted our workflow so tasks were redistributed while maintaining accountability.”
- R: “The project was completed on time, and communication improved across the group.”
Partial Answer: Provides the situation but avoids describing actions they took.
Weak Answer: “They were difficult, so I did the work myself.”
Scenario 4: Equity & Inclusion Mindset
Interview Question: “Give an example of how you ensured an inclusive environment or equitable outcome.”
Strong STAR Answer:
- S: “During family‑engagement nights, participation was low for multilingual families.”
- T: “I needed to understand why this was happening, so I sent out a survey to past participants. The responses indicated that the events did not support multilingual speakers. The other notable feedback was that the events began too early for working parents to attend.”
- A: “In response to the feedback I received, I organized interpreters, redesigned flyers in multiple languages, and adjusted meeting times.”
- R: “Attendance from those families increased by over 60%.”
Partial Answer: Says they “value inclusion” but gives no example.
Weak Answer: “I treat everyone the same.”
Scenario 5: Adaptability
Interview Question: “Describe a time when priorities changed quickly. How did you adjust?”
Strong STAR Answer:
- S: “A key data system went down during our busiest reporting week.”
- T: “I performed a quick analysis to determine how we could manually keep the work moving forward until the system came back up.”
- A: “I shifted to a manual process, established a temporary workflow, and communicated timelines to stakeholders.”
- R: “We still met our reporting deadline with no errors.”
Partial Answer: Talks about being flexible but no tangible example.
Weak Answer: “I don’t like last‑minute changes.”
Quizzes for Interviewers
Use these as quick assessments in training sessions.
Quiz A: Identifying Bias
Which of the following is an example of affinity (similarity) bias?
- A. Rating a candidate lower because they seem nervous
- B. Rating a candidate higher because you share the same hometown
- C. Focusing only on the first answer a candidate gives
- D. Assuming a degree from a top school equals high performance
Correct Answer: B
Which note is objective?
- A. “Candidate seemed uncertain when discussing experience.”
- B. “Candidate probably wouldn’t get along with the team.”
- C. “Candidate led a team of 10 on a logistics project.”
- D. “Candidate is definitely a great leader.”
Correct Answer: C
Which question violates legal standards?
- A. “Are you able to perform the essential functions of the job?”
- B. “What was your previous salary?”
- C. “Tell me about a time you overcame a challenge at work.”
- D. “What aspects of this role interest you?”
Correct Answer: B
Quiz B: STAR Model Practice
Which of the following reflects a complete STAR answer?
- A. Describes actions but no outcome
- B. Provides situation and result but no action
- C. Provides situation → task → action → result
- D. Provides personal opinions without examples
Correct Answer: C
Quiz C: Structured vs. Unstructured Interviewing
True or False:
- Using the same questions for every candidate reduces bias. True
- Structured interviews prevent interviewers from asking follow‑up clarifying questions. False (Clarifying is allowed; new substantive questions are not.)
- Subjective impressions should be included in candidate scoring. False
Downloadable Templates
These templates and guides provide resources to assist you in developing targeted structured interview questions and rating scales, and include downloadable templates of individual and group scoring rubrics.
