Guest Post by Sarah Fisher
Originally posted @ Qualtrics.com
Learning how to write survey questions is both art and science. The wording you choose can make the difference between accurate, useful data and just the opposite. Fortunately, we’ve got a raft of tips to help.
Figuring out how to make a good survey that yields actionable insights is all about sweating the details. And writing effective questionnaire questions is the first step.
Understanding the different types of survey questions and how they work is essential for success. Each format calls for a slightly different approach to question-writing.
In this article, we’ll share how to write survey questionnaires and list some common errors to avoid so you can improve your surveys and the data they provide.
Did you know that Qualtrics provides 23 question types you can use in your surveys? Some are very popular and used frequently by a wide range of people from students to market researchers, while others are more specialist and used to explore complex topics. Here’s an introduction to some basic survey question formats, and how to write them well.
Familiar to many, multiple choice questions ask a respondent to pick from a range of options. You can set up the question so that only one selection is possible, or allow more than one to be ticked.
When writing a multiple choice question…
Asking participants to rank things in order, whether it’s order of preference, frequency or perceived value, is done using a rank structure. There can be a variety of interfaces, including drag-and-drop, radio buttons, text boxes and more.
When writing a rank order question…
Slider structures ask the respondent to move a pointer or button along a scale, usually a numerical one, to indicate their answers.
When writing a slider question…
Also known as an open field question, this format allows survey-takers to answer in their own words by typing into the comments box.
When writing a text entry question…
Matrix structures allow you to address several topics using the same rating system, for example a Likert scale (Very satisfied / satisfied / neither satisfied nor dissatisfied / dissatisfied / very dissatisfied).
When writing a matrix table question…
Now that you know your rating scales from your open fields, here are the 7 most common mistakes to avoid when you write questions. We’ve also added plenty of survey question examples to help illustrate the points.
Likert scales are commonly used in market research for single-topic surveys. They’re simple and among the most reliable formats for combating survey bias. For each question or statement, respondents choose from a range of possible responses, which typically include: Strongly agree, Agree, Neither agree nor disagree, Disagree, Strongly disagree.
There are countless examples of well-written survey questions, but how do you know whether yours will perform well? We’ve highlighted the seven most common mistakes made when gathering customer feedback with online surveys.
Subtle wording differences can produce great differences in results. For example, non-specific words and ideas can introduce confusing ambiguity into your survey. “Could,” “should,” and “might” all sound about the same, but they may produce a 20% difference in agreement with a question.
In addition, strong words such as “force” and “prohibit” represent control or action and can bias your results.
Example: The government should force you to pay higher taxes.
No one likes to be forced, and no one likes higher taxes. This agreement-scale question makes it sound doubly bad to raise taxes. When survey questions read more like normative statements than questions seeking objective feedback, measuring that feedback becomes difficult.
Neutral wording alternatives are easy to develop. Try simple statements such as “The government should increase taxes” or “The government needs to increase taxes.”
Example: How would you rate the career of legendary outfielder Joe DiMaggio?
This survey question tells you Joe DiMaggio is a legendary outfielder. This type of wording can bias respondents.
How about replacing the word “legendary” with “baseball,” as in: How would you rate the career of baseball outfielder Joe DiMaggio? A rating scale question like this gets more accurate answers from the start.
Multiple choice response options should be mutually exclusive so that respondents can make clear choices. Don’t create ambiguity for respondents.
Review your survey and identify ways respondents could get stuck with either too many or no single, correct answers to choose from.
Example: What is your age group? (0–10 / 10–20 / 20–30 / 30–40)
What answer would you select if you were 10, 20, or 30? Survey questions like this will frustrate a respondent and invalidate your results.
Example: What type of vehicle do you own?
This question has the same problem. What if the respondent owns a truck, hybrid, convertible, cross-over, motorcycle, or no vehicle at all?
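One way to catch the numeric version of this problem before fielding a survey is to check that answer brackets neither overlap nor leave gaps. The helper below is a hypothetical sketch, assuming inclusive integer brackets like those in the age-group example; the function name and input format are ours:

```python
def ranges_valid(brackets):
    """Check that numeric answer brackets are mutually exclusive and
    leave no gaps. Each bracket is an inclusive (low, high) pair."""
    ordered = sorted(brackets)
    for (lo1, hi1), (lo2, hi2) in zip(ordered, ordered[1:]):
        if lo2 <= hi1:       # overlap: a respondent could fit two options
            return False
        if lo2 != hi1 + 1:   # gap: a respondent fits no option
            return False
    return True

# Overlapping brackets: where does a 20-year-old go?
print(ranges_valid([(0, 10), (10, 20), (20, 30), (30, 40)]))  # False
# Clean brackets: every age fits exactly one option.
print(ranges_valid([(0, 9), (10, 19), (20, 29), (30, 39)]))   # True
```

Categorical options like vehicle types need the same review by hand: every plausible answer, including “none,” should map to exactly one choice.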
Questions that are vague and do not communicate your intent can limit the usefulness of your results. Make sure respondents know what you’re asking.
Example: What suggestions do you have for improving Tom’s Tomato Juice?
This question may be intended to obtain suggestions about improving taste, but respondents will offer suggestions about texture, the type of can or bottle, about mixing juices, or even suggestions relating to using tomato juice as a mixer or in recipes.
Example: What do you like to do for fun?
Finding out that respondents like to play Scrabble isn’t what the researcher is looking for, but it may be the response received. It is unclear that the researcher is asking about movies vs. other forms of paid entertainment. A respondent could take this question in many directions.
Sometimes respondents may not want you to collect certain types of information or may not want to provide you with the types of information requested.
Questions about income, occupation, personal health, finances, family life, personal hygiene, and personal, political, or religious beliefs can be too intrusive and be rejected by the respondent.
Privacy is an important issue to most people. Incentives and assurances of confidentiality can make it easier to obtain private information.
While current research does not show that “Prefer Not to Answer” (PNA) options increase data quality or response rates, many respondents appreciate this non-disclosure option.
Furthermore, different cultural groups may respond differently. One recent study found that while U.S. respondents skip sensitive questions, Asian respondents often discontinue the survey entirely.
These types of questions should be asked only when absolutely necessary, and they should always include an option not to answer (e.g., “Prefer Not to Answer”).
Do you have all of the options covered? If you are unsure, conduct a pretest version of your survey using “Other (please specify)” as an option.
If more than 10% of respondents (in a pretest or otherwise) select “other,” you are probably missing an answer. Review the “Other” text your test respondents have provided and add the most frequently mentioned new options to the list.
Example: You indicated that you eat at Joe’s fast food once every 3 months. Why don’t you eat at Joe’s more often?
There isn’t a location near my house
I don’t like the taste of the food
Never heard of it
This question doesn’t include other options, such as healthiness of the food, price/value or some “other” reason. Over 10% of respondents would probably have a problem answering this question.
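The 10% pretest rule above is easy to check with a short script. This is an illustrative sketch, assuming pretest answers are recorded as (choice, write-in text) pairs; the function name and data layout are ours, not part of any survey tool:

```python
from collections import Counter

def flag_missing_options(pretest_answers, threshold=0.10):
    """Return the share of 'Other' picks and the most frequent write-ins.
    A share above the threshold suggests the option list is incomplete."""
    counts = Counter(choice for choice, _ in pretest_answers)
    other_share = counts["Other"] / len(pretest_answers)
    write_ins = Counter(text for choice, text in pretest_answers
                        if choice == "Other" and text)
    return other_share, write_ins.most_common(3)

answers = [("No location near me", ""), ("Other", "Too expensive"),
           ("Other", "Too expensive"), ("Never heard of it", ""),
           ("Other", "Unhealthy"), ("I don't like the taste", "")]
share, top_write_ins = flag_missing_options(answers)
print(share > 0.10)  # True: the write-ins point to missing answer choices
```

Here the frequent “Too expensive” write-in signals that a price/value option should be added to the list.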
Unbalanced scales may be appropriate for some situations and promote bias in others.
For instance, a hospital might use an Excellent – Very Good – Good – Fair scale where “Fair” is the lowest customer satisfaction point because they believe “Fair” is absolutely unacceptable and requires correction.
The key is to correctly interpret your analysis of the scale. If “Fair” is the lowest point on a scale, then a result slightly better than fair is probably not a good one.
Additionally, scale points should be equidistant; that is, each point should be the same conceptual distance from the next.
For example, researchers have shown the points to be nearly equi-distant on the strongly disagree–disagree–neutral–agree–strongly agree scale.
Set your bottom point as the worst possible situation and top point as the best possible, then evenly spread the labels for your scale points in-between.
Example: What is your opinion of Crazy Justin’s auto-repair?
The Best Ever
Fantastic
Pretty Good
This question puts the center of the scale at “Fantastic,” and the lowest possible rating is “Pretty Good.” A question like this cannot collect respondents’ true opinions.
There is often a temptation to ask multiple questions at once. This can cause problems for respondents and influence their responses.
Review each question and make sure it asks only one clear question.
Example: What is the fastest and most economical internet service for you?
This is really asking two questions. The fastest is often not the most economical.
Example: How likely are you to go out for dinner and a movie this weekend?
Even though “dinner and a movie” is a common phrase, this is two questions as well. It is best to separate the activities into different questions or give respondents options such as:
Dinner and a movie
Dinner only
Movie only
Neither
Here are 5 easy ways to help ensure your survey results are unbiased and actionable.
Structure your questionnaire using the “funnel” technique. Start with broad, general interest questions that are easy for the respondent to answer. These questions serve to warm up the respondent and get them involved in the survey before giving them a challenge. The most difficult questions are placed in the middle – those that take time to think about and those that are of less general interest. At the end, we again place general questions that are easier to answer and of broad interest and application. Typically, these last questions include demographic and other classification questions.
In social settings, are you more introverted or more extroverted?
That was a ringer question and its purpose was to recapture your attention if you happened to lose focus earlier in this article.
Questionnaires often include “ringer” or “throw away” questions to increase interest and willingness to respond to a survey. These questions are about hot topics of the day and often have little to do with the survey. While these questions will definitely spice up a boring survey, they require valuable space that could be devoted to the main topic of interest. Use this type of question sparingly.
Questionnaires should be kept short and to the point. Most long surveys are not completed, and the ones that are completed are often answered hastily. A quick look at a survey containing page after page of boring questions produces a response of “there is no way I’m going to complete this thing.” If a questionnaire is long, the respondent must either be very interested in the topic, be an employee, or be paid for their time. Web surveys have some advantages because the respondent often can’t view all of the survey questions at once. However, if your survey’s navigation sends them page after page of questions, your response rate will drop off dramatically.
How long is too long? The sweet spot is to keep the survey under five minutes, which translates into about 15 questions. The average respondent can complete about three multiple choice questions per minute, and an open-ended text response question counts for about three multiple choice questions, depending, of course, on its difficulty. While only a rule of thumb, this formula gives a reliable estimate of your survey’s limits.
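That rule of thumb can be expressed as a quick back-of-the-envelope calculation. The helper below is an illustrative sketch of the formula just described (about three multiple choice questions per minute, with each open-ended question counting as roughly three of them); the function name is ours:

```python
def estimated_minutes(n_multiple_choice, n_open_ended):
    """Rough completion-time estimate from the rule of thumb:
    ~3 multiple choice questions per minute, and each open-ended
    question counts as roughly 3 multiple choice questions."""
    effective_questions = n_multiple_choice + 3 * n_open_ended
    return effective_questions / 3

# 12 multiple choice + 1 open-ended = 15 effective questions = 5 minutes,
# right at the edge of the recommended limit.
print(estimated_minutes(12, 1))  # 5.0
```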
The best survey questions are always easy to read and understand. As a rule of thumb, the level of sophistication in your survey writing should be at the 9th to 11th grade level. Don’t use big words. Use simple sentences and simple choices for the answers. Simplicity is always best.
We know that being the first on the list in elections increases the chance of being elected. Similar bias occurs in all questionnaires when the same answer appears at the top of the list for each respondent. Randomization corrects this bias by randomly rotating the order of the multiple choice matrix questions for each respondent.
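Per-respondent randomization of matrix items can be sketched as below. Note that it is the items being rated that get shuffled, not the points of a rating scale, whose order carries meaning. The function is a hypothetical illustration, not a feature of any particular survey tool:

```python
import random

def randomized_items(items, rng=None):
    """Return a per-respondent shuffled copy of the matrix items,
    so no single item always appears first and order bias averages
    out across respondents."""
    rng = rng or random.Random()
    shuffled = list(items)
    rng.shuffle(shuffled)
    return shuffled

# Matrix rows each respondent rates on the same satisfaction scale:
items = ["Price", "Taste", "Packaging", "Availability"]
print(randomized_items(items, random.Random(42)))
```

Each respondent sees the same items in a different order, so no item benefits systematically from appearing first.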
While not totally inclusive, these seven mistakes are the most common offenders when writing survey questions, and the five tips above should steer you in the right direction.
Focus on creating clear questions and having an understandable, appropriate, and complete set of answer choices. Great questions and great answer choices lead to great research success. To learn more about survey question design, download our eBook, The Qualtrics survey template guide or get started with a free survey account with our world-class survey software.