User surveys are popping up on websites and mobile apps everywhere. Well-designed ones yield helpful data. Poorly executed ones are a waste of time for users and researchers. Crafting a good questionnaire is a science, but luckily, substantial published research is available on the topic. It’s easy to write good questions when “survey best practices” are just a Google search away…
Indeed. There are about 68,300,000 results. All that information can be overwhelming to those new to survey writing. UX Booth is here to help.
In this article, I will focus solely on writing effective questions and answers—the critically important and challenging part between a proper introduction and conclusion. Skipping the basics of question types, I’ll dive into proven techniques researchers rely on to craft effective inquiries and response options to gather useful data and results.
(If you need help with survey planning, defining goals, or understanding the basics of structure and question types, check out How to Create an Effective Survey.)
Question Wording and Structure
Effective survey questions are essential to accurately measure participants’ opinions. If the questions are poorly worded, unclear, or biased, the responses will be useless. A well-written question means the same thing to every respondent: it communicates the desired information so that all participants interpret it the same way and understand the expected type of response.
Use these guidelines for writing survey questions to yield informative and accurate information.
Be clear, specific, and direct
Failure to clearly explain the intent of the question can lead to confusion and misinterpretation. Be very specific and avoid imprecise or vague words. Present the topic and define the behaviors, events, or timeframe. This will help ensure every participant is providing the same type of response.
Vague: What is your income?
For what time period? For just the respondent or the entire household? Before or after taxes?
Specific: What was your household’s yearly income before taxes in 2016?
Use the participants’ vocabulary
Consider the education level of the survey audience, and use words that will be easily understood. Avoid jargon, complex terms, and undefined abbreviations and acronyms. Use simple language and never assume knowledge; always provide the information the respondent needs to understand what is being asked. Define any concepts or terms the respondent must understand in order to answer. If referencing something participants might not be familiar with, add details to help explain it.
Unclear: How likely would you be to subscribe to UIE’s library?
Whose library? The International Union for Electricity? What kind of library: documentation, podcasts, ebooks?
Clear: User Interface Engineering’s library offers online seminars by experts in UX design. You can access the recordings anytime for only $25 a month. How likely would you be to subscribe?
Tip: If the question requires a lengthy explanation, consider separating it from the question itself to help make the information easier to digest.
Talk like a real person and treat the questions like a conversation
Group similar topics together and order the questions in a logical way to create a natural flow, as if having a conversation. The voice and tone of the survey should match who it is from and who it is designed for. The writing can be friendly and familiar, but don’t sacrifice clarity for cuteness. Consider the MailChimp writing tone guideline, “It’s always more important to be clear than entertaining.”
Formal: Would you be willing to participate in our 10-question feedback survey? Your responses to this questionnaire will help us improve your experience with Corporation ABC’s website.
Informal: Hi! Are you willing to answer a few quick questions? It won’t take more than five minutes. (And there’s a really good prize!)
Tip: Although I’m focusing on question writing rather than introductions, it’s worth noting that being up front about the time investment and offering incentives can also help with response rates.
Ask only one question at a time
Each question should focus on a single item or concept. This generally means that questions should have one subject and verb. Double-barrel questions ask a respondent to evaluate more than one thing in a question yet only allow for a single response.
Double-barrel: Was the content marketing seminar worthwhile and entertaining?
What if the seminar was educational but the presenter was a dreadful bore, and the response options are Yes or No? A double-barrel question is also known as a compound question. This is a common mistake, and it can be corrected by breaking the question into two. Let’s look at another example and how to correct it:
Double-barrel: How satisfied are you with your work environment and compensation?
Single and separate:
- How satisfied are you with your work environment?
- How satisfied are you with your compensation?
By breaking the double-barrel question into two, one about satisfaction with the work environment and another about pay, the participant can now respond to each inquiry separately.
Practice good grammar
Keep the questions simple and grammatically correct. Maintaining a parallel structure and consistently using the same words and phrases improves respondents’ comprehension. Avoid two-part or complex questions, which can be hard to interpret, as can double negatives.
Double Negative: Do you agree or disagree that user interface designers should never not know how to code?
Better: User interface designers should know how to code.
An agreement scale goes well with this reworked question—more on that later.
Avoid bias and loaded words
A biased question will lead participants toward a particular answer. Some phrases, particularly adjectives and adverbs, can add bias to questions. Depending on how a question is presented, people can react in different ways (for example, asking a question using the word “loss” versus “gain”). Emotional, politically charged, or sensitive words can also trigger a different response. Remain neutral regardless of topic, and watch for wording that skews positive or negative.
Biased: We think this UX Booth article on Survey Question Writing is very helpful. How helpful do you think this article is?
Unbiased: What do you think of this UX Booth article on Survey Question Writing?
Start with broad, general questions and progress to specific and harder ones
Beginning with basic, easier questions can encourage a respondent to continue. When possible, try to balance simple and complex inquiries. Interspersing easier questions among more challenging ones can make the survey seem less burdensome and help reduce abandonment. And remember to save sensitive questions, like income, for the end and make them optional.
Keep the survey short and don’t be greedy!
Don’t waste people’s time: only ask for what you really need. (Requiring answers to questions will slow people down; it won’t necessarily get you what you want, and it will increase drop-off rates.) If there aren’t too many questions, and respondents can immediately understand what is being asked, they are more likely to be willing and able to provide useful information.
If the answers are also well-crafted…
Answer Wording and Structure
Since this article concentrates on writing, I’ll focus on answers to closed-ended questions, where the responses are crafted by the survey designer. How each answer is described, the number of options, and their order can all influence how people respond. Whether presented as multiple choice, checklists, or scales, the answers, just like the questions, should use precise, clear wording. Here’s how to make that happen.
Present all the possibilities
The number of answers should be kept relatively small but include all the possible choices. Answers need to be balanced both ways (e.g. positive to negative, high to low frequency).
All respondents need to be able to find an answer that fits their situation—including opting out. If there could be a situation where none of the answers apply, provide the option to select “don’t know,” “not applicable” or “prefer not to answer” for sensitive questions. Including an “Other,” with a free-form text field to provide a custom answer, is a great way to learn about alternative responses not provided in the defined answer set.
Incomplete and Unbalanced:
- Very Important
- Moderately important
- Slightly important
What if it is not important at all? Or not even applicable to the participant?
Complete and Balanced:
- Extremely important
- Very important
- Moderately important
- Slightly important
- Not at all important
- Not applicable
Say “no” only when necessary
Dichotomous questions present only two clearly distinct options. These answers, like yes/no and true/false, can produce less helpful data because they don’t provide context or specificity. (Though when skip logic is used, these responses can often be appropriate; see the sketch after the example below.) Formatting responses as scales that measure things like attitude or frequency yields more information-rich results. These answers can make a single question work harder.
Yes/No: Do you use the mobile app?
Frequency: How often do you use the mobile app?
- Always
- Very Often
- Sometimes
- Rarely
- Never
Tip: These answers also follow the first guideline to cover all the possibilities in a balanced way, ranging from high to low or not at all. An even stronger set of choices would include references for the time period to clearly define what “sometimes” is versus “rarely.” Check out the UX Booth blog example below.
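As for the skip logic mentioned above: if you’re wiring up the survey yourself rather than using an off-the-shelf tool, a yes/no question can simply route around an irrelevant follow-up. Here’s a minimal sketch, assuming a hand-rolled Question shape (the ids, fields, and routing function are made up for illustration, not taken from any real survey platform):

```typescript
// A minimal sketch of skip logic. The Question shape and ids below are
// hypothetical; real survey tools define their own formats.
type Question = {
  id: string;
  text: string;
  options: string[];
  // Given the chosen answer, return the id of the next question
  // (or null to end this branch of the survey).
  next?: (answer: string) => string | null;
};

const questions: Question[] = [
  {
    id: "usesApp",
    text: "Do you use the mobile app?",
    options: ["Yes", "No"],
    // A "No" makes the frequency question irrelevant, so skip it.
    next: (answer) => (answer === "Yes" ? "appFrequency" : null),
  },
  {
    id: "appFrequency",
    text: "How often do you use the mobile app?",
    options: ["Always", "Very often", "Sometimes", "Rarely", "Never"],
  },
];
```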
Keep answers mutually exclusive
If a participant can select only one response, then each answer should be distinct, with no overlap. For example, options might be 0-5 and 6-10 rather than 0-5 and 5-10; having the “5” in both answers makes them not mutually exclusive. The same goes for answers that aren’t numbers:
Not Distinct: Where is your work location?
- In an office building.
- From my home.
- In the city.
The person could work in an office building in the city or from their home in the city.
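For the work-location example, the fix is to rewrite the options so that no respondent fits more than one. For numeric answer buckets, a quick programmatic check can catch overlaps before the survey goes out. Here’s a minimal sketch, assuming a hypothetical Bucket shape with min and max fields (nothing here comes from a real survey tool):

```typescript
// A minimal sketch for checking that numeric answer buckets don't overlap.
// The Bucket shape and labels are hypothetical, for illustration only.
type Bucket = { label: string; min: number; max: number };

function bucketsAreMutuallyExclusive(buckets: Bucket[]): boolean {
  const sorted = [...buckets].sort((a, b) => a.min - b.min);
  for (let i = 1; i < sorted.length; i++) {
    // A bucket that starts at or before the previous one ends overlaps it.
    if (sorted[i].min <= sorted[i - 1].max) return false;
  }
  return true;
}

bucketsAreMutuallyExclusive([
  { label: "0-5", min: 0, max: 5 },
  { label: "5-10", min: 5, max: 10 }, // "5" appears in both buckets
]); // => false
```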
Remove universal statements
Words like “never, none, always, all” are extreme choices that respondents might be hesitant to commit to. Rather than absolute words, provide answers with specific references for behaviors or timeframes.
Absolute:
- I always read UX Booth’s blog.
- I never read UX Booth’s blog.
Referenced Alternatives:
I read UX Booth’s blog:
- Once a week
- 2-3 times a week
- 4 or more times a week
- I do not read UX Booth’s blog.
Use ratings and scales
The Likert scale, where respondents indicate their level of agreement or disagreement, is the most commonly used approach to scaling options when measuring attitudes or behaviors. Likert scales should be symmetrical and balanced: they should contain equal numbers of positive and negative responses, with the distance between each point being the same.
The expert debate about scales, including the number of levels (5, 7, or 10) and whether to include a neutral midpoint (neither agree nor disagree), is too big to tackle in this article. Consult the many resources on Likert scale best practices. SurveyMonkey suggests five scale points for unipolar scales and seven for bipolar ones. (My personal opinion is that five to seven is best; the higher the number, the harder it is for people to gauge.) Always include word labels, not just numbers, to identify what each point on the scale means.
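If you ever define a scale in code rather than in a survey tool, here’s a minimal sketch of what a balanced, fully labeled five-point agreement scale might look like (the ScalePoint shape is hypothetical, purely for illustration): every point carries a word label, and the positive and negative options mirror each other around a neutral midpoint.

```typescript
// A minimal sketch of a balanced, fully labeled five-point agreement scale.
// The ScalePoint shape is hypothetical, for illustration only.
type ScalePoint = { value: number; label: string };

const agreementScale: ScalePoint[] = [
  { value: 1, label: "Strongly disagree" },
  { value: 2, label: "Disagree" },
  { value: 3, label: "Neither agree nor disagree" }, // neutral midpoint
  { value: 4, label: "Agree" },
  { value: 5, label: "Strongly agree" },
];
```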
Common Scales:
- Agreement: Disagree to Agree
- Familiarity: Not Familiar to Very Familiar
- Frequency: Never to Always
- Importance: Not Important to Extremely Important
- Likelihood: Not Likely to Extremely Likely
- Quality: Poor to Excellent
- More Examples (Iowa State PDF)
Use the expected, “natural” order for answer scales because it makes it easier for people to respond. For ranges (e.g., excellent to poor), it’s okay to reverse the order, such as starting with the favorable answer and ending with the unfavorable one, but keep in mind that the order itself can influence choices.
Tip: Read “There is a Right Way and Wrong Way to Number Rating Scales.”
Good survey design leads to good data.
The unfortunate result of bad survey design is bad data. Asking good questions and providing solid answers is not easy. Take advantage of what other researchers and academics have done, and use starter templates when appropriate. It is the survey designer’s responsibility to be clear and unbiased. Make sure the questions are informative, the answers accurate, and the insight you gain leads to actionable results.