In this module, we're going to talk about basic guidelines for writing survey questions, guidelines that should apply to any survey question that you might have. So, here's what you're going to learn in this module. One is guidelines for word choices in survey questions: how do we think about the actual writing process when we're writing our survey questions? Then, the second thing we'll talk about is tips for picking amongst different survey question types. So, the first thing you want to consider when you're thinking about how to write a survey question is the appropriate question format, and you want to play around with different formulations of your question. You want to start with how you are going to use that data and work backwards. If it is going to go into your report, is it going to go into a graph? Are you going to have anecdotes? Are you going to try to pull quotes? Are you going to show bar charts? How you're actually going to present the data might shape exactly how you collect it. Trying different question formats can show you the power of these different methods as well. So, for instance, here's a good survey question: which of these five statements best describes the CEO of the company? Now, if you look at these, they are a nominal set of categories, and we would ask the respondent to pick one of them. If they pick one, how would we use this data at the end of the day? You might imagine a report showing that 20 percent of our respondents said the person is a born leader and 30 percent said they're a real innovator. You could imagine a report that looks like this, but there might be other ways we could ask this question that might be more compelling. So, if we had a different set of presentation goals, we might instead ask: to what extent has the CEO of the company demonstrated strong leadership skills?
Now, what we've done is we've taken the concepts of leadership and innovation and separated them into two separate questions. We've changed the answer categories into ordinal response categories instead of nominal, and now we have a scale. What you might do here, then, is relate how people responded to each of these questions to other demographics we know about them. If you're going to relate variables, this type of data might be more useful than the nominal data that we saw in the last question. So, again, you can see how it all boils down to: how do I want to use this data? A third option is to force a choice between the two concepts that we're interested in. So, in this question, we can see: which of the following do you feel best describes the CEO of the company? Here, we offer a set of nominal options, and we're forcing people to pick more concretely between leader and innovator. Now, this again creates a set of nominal variables, or nominal responses, that we might use to describe the population, but it would be harder to use this data to correlate with other characteristics of the person responding. A second tip for writing survey questions is to make sure that the question applies to the respondent. One common question that we often see in surveys, for instance, is: do you have a cellphone? Or: how often do you use your cellphone to do X task, and why? While cellphone penetration, at least in the United States, has exploded, and there's been a huge uptake of cellphone technology, not everybody will have one. A common way to make sure that questions apply to people in surveys is to use what are called skip patterns, or skip logic. You can see here an example I pulled from the Qualtrics survey tool (we're going to talk more about survey tools in another module) where I have created a skip pattern. If a person answers yes to the question, do you have a cellphone? they get directed to another question further down the instrument.
If they answer no, they follow a different pathway. Skip patterns, especially in web surveys, are great ways to let the questions that are relevant to people shape their experience of the survey. Now, of course, skip patterns in print surveys are much harder. I think we've all seen examples of this, where it is a really hard pathway to follow if you're running a mail survey or a paper form. Skip patterns work best in computerized web surveys or in phone surveys. So, you can see as an example, if I were to click Yes on this button, I get taken to another question; if I click No, I might get exited out of the survey. It's a very common method for making sure that all questions are relevant to the person you're asking. Another tip for writing good survey questions is to only ask one question at a time. Now, this might seem obvious, but you'd be surprised how often, because people are trying to get as much information from as little respondent time as they can, and are trying to be very conservative in how much they ask of respondents, they bundle a whole set of things they care about into one question. The term double-barreled question is another fun new survey term we get to talk about in this module. It basically means a survey question where two concepts are included in the same question stem. Let's look at an example of that: in your opinion, how would you rate the speed and attractiveness of this website? You can see here that speed and attractiveness are, of course, two separate concepts. A person answering this question might think, "Well, I like the speed, but the attractiveness isn't quite there, so I'm going to combine those and answer Fair." Even though speed is excellent and attractiveness is poor, they will work around it. However, that might not be what the survey researcher intended; they wanted to have separate questions about speed and attractiveness.
Of course, separate questions would be the stronger way to elicit people's feelings about those two concepts. So, you could take that double-barreled question we just saw in the last set of slides and, of course, separate it so that you're only asking one question per concept. In this case, we have the question, in your opinion, how would you rate the speed of this website? Then another question: in your opinion, how would you rate the attractiveness of this website? In general, this is a much better question-writing process, because now you have two concepts, they each get their own question, and the respondent is much less likely to be confused about which one you're asking about, or about how to combine them into one response category. Another type of ambiguity that you can add to questions is to ask about things that a person might not have actually experienced. So, here are two questions that get at this in slightly different ways. How many minutes did it take you to prepare dinner last night? A certain subset of your population might have missed dinner last night, or they might have gone out to dinner. For everybody that you ask a question, you want to make sure that the question applies to them. Another form of this is: do you use your cellphone at least daily for voice calls, access to the Internet, or to send text messages? Now, this is actually another type of double-barreled question. You can see we might want to separate voice calls, access to the Internet, and sending messages into their own questions, but also, what if the person doesn't have a cellphone? These are two different ways that we might ask a question that's not relevant to the respondent: either we're asking about an event that many people could miss in the normal course of their day, or we're asking about something, or an experience, that a certain portion of the population is systematically missing out on.
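To make the skip-pattern idea from earlier concrete, here is a minimal sketch of data-driven skip logic using the lecture's cellphone example. This is an illustration only, not how Qualtrics is implemented; the data structure, the `show_if` field, and the function name are all made up for this sketch.

```python
# Hypothetical data-driven survey: each question may carry a "show_if"
# condition on an earlier answer. Questions without a condition are
# always shown.
SURVEY = [
    {"id": "has_cellphone", "text": "Do you have a cellphone?"},
    # Skip pattern: this follow-up is only shown to respondents who
    # answered "Yes" to the cellphone question above.
    {"id": "daily_voice_calls",
     "text": "Do you use your cellphone at least daily for voice calls?",
     "show_if": ("has_cellphone", "Yes")},
]

def questions_to_show(survey, answers):
    """Return the questions this respondent should see, applying skip logic."""
    visible = []
    for question in survey:
        condition = question.get("show_if")
        if condition is None or answers.get(condition[0]) == condition[1]:
            visible.append(question)
    return visible
```

A respondent who answers "No" to the first question never sees the follow-up, so every question they do see is relevant to them.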
Another really important thing to think about when you're writing your survey questions is to use simple and familiar words. Now, honestly, this really gets into just good writing practice overall. It turns out that writing a survey question is pretty much like writing everything else. You want to make sure that you're writing survey questions that avoid jargon and phrases that people aren't going to understand. Especially if you're shooting for a general population or a broad group of people, you want to aim for an eighth-grade reading level for your respondents; that's about the average reading level in the United States. You could have fancy words in there if you need to, but in general, we want to use simpler words when we can. You want to go through your survey questions and do edits where you look at what my grandmother would have called $10 words and convert them into words that are simpler and more commonly understood by everybody. So, instead of exhausted, you can use tired. Instead of leisure, you can use free time. Instead of approximate, you can use about. Especially coming from academia, we tend to use these larger words very frequently. It can be a lot of work to look at a survey question, examine it objectively, and think: what words here could be converted to a simpler word that more people are going to understand? Another example of this is to avoid using jargon that's either really specific to your field or rests on an assumption about your population that's not true. So, here's an example: how often do you disable cookies on your browser? To the Internet savvy, to people who have some computer literacy, this does not seem like that complicated a question. But a lot of work has shown that many people in the United States don't understand what cookies are, for instance. A smaller, but still non-zero, percentage would not know what a browser is.
So, these two jargony terms, cookies and browser, could be systematically excluding some portion of the population that you're interested in. Another example is: what is your overall impression of the user interface of this application? To people in the user experience field, all of these words make a ton of sense. But for the average Internet user, or the average computer user, user interface could mean multiple things, or it could mean nothing at all. You want to really look and think: what are the types of questions that my engineers would answer, or that I would answer as a professional, and what do my end users really understand about what they're answering? Now, this isn't to say that people are dumb or that you should talk down to people. It's just to say: be as clear as possible and use as simple terms as possible. As part of that, you want to use simple rather than complex phrases. So, for instance, instead of your responses, you can use your answers. A common frame in surveys is to ask about occupants of this household, when you could just talk about people who live here. Instead of educational attainment, years of school. Again, this comes back to looking critically at your survey questions and thinking: am I picking this word to seem fancy, or because that's how I understand this concept? Or am I really picking these words to help the most people possible understand the concept I'm trying to get at? You also want to use specific and concrete words to specify the concepts you're after clearly. So, what's an example of this? Let's unpack an example from the Dillman book, a relatively straightforward question: how many times did you eat together as a family last week? If we think about unpacking this question: what does eat together as a family mean? Does my whole family have to be there, or could it be just me and my child, or just me and my wife? What constitutes a meal?
Again, you can see here that the response category mismatches the question stem a little bit. So, this is a question that could be very confusing for people. How do we fix it? One way to specify is: how many meals did you eat together as a family at home last week? This time, what we've done is include the words at home to try to take away all those instances where we took grandma out for dinner or something like that, all right? So, that might help specify for a population: all right, this means meals at my house. But does that mean pizza night in front of the TV? It's still a somewhat vague question. An even more specific version of this question is: how many meals did you sit down to eat at home as a family last week? This version is more specific. It specifies sitting down, it specifies eating at home, it specifies with your family, and hopefully it will lead to people having a common understanding of what the researcher is really after. There's no guarantee; I'm sure it's easy for you to imagine ways that this question could also be misinterpreted by potential respondents. But the more specific and concrete we can be about the terms we use in these questions, the less chance for misunderstanding there is. Now, of course, you can go overboard on this. You don't want to say: how many meals did you sit down in chairs at a table with forks and spoons to eat? That can get a little ridiculous. But you want to find a happy balance in how much specificity is in your question, to help people be very clear about what it is you're asking. Again, this gets back to good writing advice, but use as few words as possible to pose the question. George Orwell has a quote: "If it is possible to cut a word out, always cut it out." He believed strongly in editing out extraneous words. Especially in academic writing, I hate to say it, those extraneous words pop up a lot. So, as some examples: due to the fact that could really be simplified as because.
In order that is really for. Provide guidance for: guides. Make a decision: decides. Has the ability: can. In the majority of instances: usually. These are phrases that I find commonly in professional and academic writing that in general should probably be edited out, but especially for survey questions, where you really want to be brief and clear, they should definitely be edited out of your survey writing process. Finally, use complete sentences that take a question form. Now, this sounds obvious, but you'd be surprised how many surveys skip this step. So, an example here: a form that just has the label Your county, followed by a blank. I don't know if people don't have enough time to write survey questions or if they're trying to save space, but often people will just put a label like this and assume respondents will understand. It's much better practice to say: in what county do you live? So, for example, in what Michigan county do you live? Now, this has two purposes. One is that it's a way for people to engage more cognitively with the question. It turns out that our brains actually trigger on that question mark, and we're much more able to process a question format than a label like the one in the example above. But also, in this particular example, the word county can definitely be confused with country. In one study, about 20 percent of the people who answered this question wrote the United States for the county that they lived in. That's not correct, of course, and we could clarify it. But again, the more clear and complete you are, and the better the writing you use for your questions, the less opportunity for misunderstanding there is by your respondents. As part of that, you want to avoid double negatives. An example is: would you favor or oppose not having this feature on the application? So, if I favor, I'm in favor of not having this feature; it gets very confusing very quickly. These aren't that common, but they can be really challenging for people.
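The wordy-phrase substitutions above can be sketched as a simple editing pass over a draft question. This is just an illustration of the idea, not a tool from the course; the replacement table adapts the lecture's examples (with the grammar evened out so the pairs match), and the function name is made up.

```python
# Hypothetical table of wordy phrases and their plain equivalents,
# adapted from the lecture's examples.
SIMPLER = {
    "due to the fact that": "because",
    "in order that": "for",
    "provide guidance for": "guide",
    "make a decision": "decide",
    "has the ability": "can",
    "in the majority of instances": "usually",
    "exhausted": "tired",
    "leisure": "free time",
    "approximate": "about",
}

def simplify(question):
    """Replace each wordy phrase in a draft question with a plainer one."""
    result = question.lower()
    for wordy, plain in SIMPLER.items():
        result = result.replace(wordy, plain)
    return result
```

For example, `simplify("In the majority of instances, do you feel exhausted?")` yields `"usually, do you feel tired?"`. A pass like this is no substitute for reading each question critically, but it shows how mechanical many of these edits are.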
Another thing you want to avoid is bias in your question stems. Now, you might think: I would never do that. I'm a good researcher, a good UX researcher; I would never include bias. But some people, of course, do want to include bias in their question stems. This allows us to introduce another fun term in this lecture, which is the idea of a push poll. A push poll is a survey that intentionally includes biased language in the question stem in order to shape the results. This is very common in political campaigns, and it allows for a news release that says X number of people believe this should happen, when the very basis of the question is biased. So, here's a great example of a biased question from our own research area: Wikipedia has had notorious problems with bad information. How would you rate the quality of its information? Now, you can imagine that a respondent prompted by that prime about bad information is going to respond differently than if that first sentence weren't there. A surprising number of surveys try to provide context for a question by making a statement like this, but those statements, if you're not careful, can bias the results you get. If we rewrite this question, we can see a better question, which is: how would you rate the quality of information on Wikipedia? It's the same question without the biasing prompt that might shape our results. That was an extreme example. A more subtle example is: how good was your experience on this site? I know it's a bad question overall, but I'm trying to highlight the framing word good. This question would prime a certain response, because the word good here, especially matched with the term good in the response categories, is going to create an acquiescence bias that will shape your results and lead to very different results than if you reframed the question. Another way to reframe this question would be: how would you rate your experience on this site?
Here you can see this is much more neutral. It helps people think about the categories without a priming word that matches the response categories, like we saw in the last question. Now, to hearken back to some of our previous points, the term experience here is again an imprecise term and might be something that our respondents struggle with. But that's why writing questions is so hard. So why do we care so much about writing good questions? Besides just getting good data, how we write questions and how we make our question choices can really have a strong effect on the results that we get. Don Dillman, who's one of the authors of our book, and his colleagues did a study where they found that, based on the way they framed a question, either 71 percent, 55 percent, or 30 percent of students claimed they studied for more than 2.5 hours per day. Now, how much the students actually studied per day of course didn't change. The only thing that changed was how they reported how much they studied, based on how the researchers had asked the question. That shows how powerfully results can be shaped by asking good or bad questions. Because how we write questions can have such a strong effect on the results we get, it's really important to be considerate and careful when we're writing questions overall. You want to make sure that you're only writing one question per concept; don't have double-barreled questions in your instrument. Make sure that you keep your words and your phrases as simple as you can; our goal here is clarity, not complexity, when we're writing our questions. We also want to make sure that we use complete sentences that are questions; it's cognitively easier for respondents to answer a question if it's actually framed as a question. You want to make sure that your questions are applicable to all of your respondents.
Don't ask questions that are going to leave out a systematic group of the people who are going to respond to your survey. Finally, you want to make sure that you're specific in the words that you use to measure the concepts you're interested in. We saw earlier how easy it is to have a vague concept that can be misunderstood by different people, and how hard it is to find the right set of terms to really measure that concept. You want to test that systematically, and really do a lot of thinking about how the words you're using, the operationalization of your idea, are going to match the concept you're really interested in. In the next set of modules, we're going to talk more specifically about different types of questions, and dive deeper into specific types of questions that we might be interested in.