Whether you’re reviewing your graduate program for design improvements or assessing recent changes, feedback is only as good as the question you’re asking. A good survey question is one that helps you get clear insights and action-oriented information about your program and participants.
Here are three considerations to help you ask effective survey questions.
1. Survey design starts with purpose. What’s yours?
Begin by identifying what it is you want to learn from your survey, and then work backwards.
Here are some questions to prompt you:
- Why are you asking for feedback?
- What are the answers you want to see?
- What is the area of focus for this feedback and why?
- What do you want to measure with this feedback?
- What existing benchmarks do you want to compare to?
- What will you do with this feedback?
- How will the respondents benefit from this feedback process? (i.e., what actions will you take?)
These points are essential to know for two reasons:
- They’ll help you figure out which questions need to be asked and how to frame them.
- You can better communicate the process to your intended respondents, which in turn encourages engagement, higher response rates and better-quality data.
Just make sure you can deliver what you promise.
2. What kind of survey is it?
There’s more to a survey than simple yes/no questions. The number and intensity of questions can affect the answers and could even lead to survey fatigue (respondents will either quit the survey or pay less attention to the remaining questions just to complete it).
Think about the answers you need and the reason you need them and consider what kind of survey works best.
- Quick pulse check (5-10 questions)
- Thorough and intensive review (ideally max. 20 questions)
Generally speaking, a ‘pulse check’ survey is shorter and conducted on a regular basis, allowing you to track and compare results over a period of time (e.g., your graduates’ satisfaction with their program experience).
A ‘feedback survey’ is usually more in-depth and seeks to understand a topic in more detail, as well as give participants an opportunity to contribute more detailed feedback or ideas (e.g., a candidate’s recruitment experience, or a graduate’s post-rotation experience).
Examples of topics where you might use either a pulse check or a feedback survey:
- Rotation Experience
- Program Experience
- Leader Support
- Program Support
- On-the-job Experience
- Candidate Experience
- Onboarding Program
- Learning Program
- Program Review
Understanding the kind of survey you’re creating helps you choose the right number of survey questions and question types.
3. Which questions need to be asked?
Now that you know your purpose for the survey and the information you most need to collect, you can flesh out the right questions to ask. Remember that the type of question you use will affect the answers you get and the kind of analysis you can do.
The 6 main types of survey questions are:
- Open-ended questions. Used for free-for-all feedback, quality data, testimonials.
- Closed-ended questions with pre-selected choices (yes/no, rating scale, multi-choice). Used for creating graphs, showing trends.
- Nominal questions (usually multiple choice > ‘Which workshop did you find most valuable? a) b) c)’). Used for creating graphs, showing trends.
- Likert scale (respondents rate their level of agreement or satisfaction along a scale > ‘How satisfied were you with the workshop resources provided?’ – very dissatisfied to very satisfied). Used to validate a known or suspected sentiment.
- Rating scale (‘How would you rate your candidate experience?’). Used to visualise and compare trends with a numerical value. E.g., 45% of candidates rated their experience as exceptional.
- Yes / No questions (straightforward and straight up > ‘Did you find our career site helpful?’). Used to quickly gauge opinion to identify areas needing further review. Also, a good question type to start your survey.
Keep in mind the number of responses you’re likely to receive. For example, if you are surveying a large group, you may want to limit the number of qualitative questions in your survey and stick to mostly quantitative questions that can be easily compiled and analysed.
What not to do when asking survey questions
When designing your questions, you’ll want to pose them in a way that provides maximum value. One of the most common mistakes is forcing respondents into a specific answer at the expense of the most accurate and valid response – these are known as leading or loaded questions.
This usually happens when surveyors are trying to validate an action instead of collecting the most accurate feedback.
For example, “Do you agree with changing the structure of rotations from 4 to 6 months? Y/N” (you’re forcing a yes/no answer on a change the respondent may have a more nuanced view of).
“What makes our program better than our competitors’?” (what if it’s not?)
A fun way to approach it may be to consider whether a lawyer could ask your question of a witness in a courtroom, or whether the judge would determine that you’re “leading the witness”.
You also want to be wary of wording and language changes.
Keep your wording consistent. This is particularly important if your survey is used to compare historical data or benchmarked against your program’s success criteria. Small changes in wording can affect the way participants respond and the way you compare results over a period of time.
Ready to start writing your questions?
Keep your language concise and clear, and write questions in an active voice. You may even wish to consider whether second person (‘you agree’) or first person (‘I agree’) is the best approach for your brand and survey.
There are many aspects to a program that you as the Program Manager will want to monitor and capture feedback on. So, keep in mind the number and frequency of your surveys so that your participants don’t feel overwhelmed or alternatively start to ignore them.
Looking after the welfare of your graduates is just as important as the delivery of the program, so using surveys and keeping in regular contact with your graduates is important for understanding whether they are on track, and when you might need to intervene.
If you’re reviewing your graduate program, you can also check out the free Graduate Talent Maturity Model for guidance.
About the writer: Kelly Stone gets paid to write clever and charismatic recruitment communication that enhances the candidate experience (and consequently, brand engagement). She’s on a mission to phase out jargony corp-speak, so employers better engage with young attention spans.