As PR professionals, we care a lot about our clients’ image and reputation. An easy way to find out how a brand is perceived is to send out surveys and ask for feedback. You may get responses you’ve already anticipated, but you are also very likely to uncover some interesting findings. So we asked a friend of Clearpoint Agency, Alex Genov, PhD, lead user researcher at a software company in San Diego, for some tips on creating highly effective surveys. Drawing on his psychology background and extensive experience in research, design, and innovation, he offers some valuable advice.
Surveys or questionnaires are one way of getting people to tell us about their internal states like motivations, feelings, attitudes, aspirations, and so on. If you think that it is easy to put together a good survey, think again. There are many aspects to rigorous survey design that require a solid background in research methodology and years of experience. Among those aspects are:
- Purpose of survey
- Type of questions (open-ended, rating scales, etc.)
- Number of scale points
- Labeling of scale points
- Length of survey
- Branching logic
- DIY survey tools
1. Always start the survey-creation process with a clear purpose in mind
What actions do you hope will result from the survey? The goal may be to find out what segments exist among your customers so you can create relevant messaging. Or it may be to find out how satisfied your customers are with your products and services so you can improve their level of satisfaction.
2. Avoid, at all costs, leading questions that steer respondents toward the answers you are looking for
For example, “This product is easy to use, isn’t it?” or “Don’t you love these product features?” Surveys are best used to find out new information or to confirm a hypothesis in an objective and dispassionate way.
3. Minimize the number of open-ended text questions
Although these are good for uncovering interesting qualitative insights, they will provide you with large amounts of unstructured data. That is, after the survey is done, someone has to go through all the verbatim responses and code them manually to quantify the data and detect patterns. More often than not, such data remains completely unused.
4. Use structured questions when you know all possible answer choices
For example, if you want to find out how often people do certain physical activities use self-report questions like: “About how often do you participate in the following activities?”, present a list of physical activities, and let people pick one of the following options: (1) About once a year; (2) 4-6 times a year; (3) About once a month; (4) 2-3 times a month; (5) About once a week; (6) 3-4 times a week; (7) About once a day.
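A structured question like this maps cleanly onto plain data. The sketch below shows one possible way to represent the frequency question from the tip above in Python; the variable names and activity list are illustrative, not part of any particular survey tool.

```python
# Illustrative sketch: the structured frequency question from the tip
# above as plain Python data (names and activities are hypothetical).
frequency_question = {
    "prompt": "About how often do you participate in the following activities?",
    "activities": ["Running", "Swimming", "Cycling"],
    "choices": [
        (1, "About once a year"),
        (2, "4-6 times a year"),
        (3, "About once a month"),
        (4, "2-3 times a month"),
        (5, "About once a week"),
        (6, "3-4 times a week"),
        (7, "About once a day"),
    ],
}

# Because every answer is one of the known codes, each response is a
# single number per activity, ready for quantitative analysis.
response = {"Running": 5, "Swimming": 2, "Cycling": 3}
```

The payoff is exactly the point of tip #4: no manual coding is needed, because the answer codes are fixed in advance.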
5. In some cases include an “Other” answer choice as a last option
Use this answer choice in conjunction with an open text field to collect answers you never thought of. Note, however, that this brings back the unstructured-data challenge outlined in #3 above.
6. Use a 7-point Likert scale to measure attitudes, motivations, preferences, etc.
It is best practice to use the following question phrasing and answer format: “Indicate to what extent you agree or disagree with the following …” Then provide the attitude statement, for example “I like ice cream,” and the answer choices in the format of a scale: (1) Strongly Disagree; (2) Disagree; (3) Somewhat Disagree; (4) Neither Agree nor Disagree; (5) Somewhat Agree; (6) Agree; (7) Strongly Agree.
7. A few pointers about formatting Likert-type scales
Always make the middle point neutral. The more points you have, the higher-fidelity analyses you can do: 3 points are too few, 11 are too many, and 7 strikes a good balance. Make sure that during the analysis stage all negative end-points are coded as 1 (for example, all “Strongly Disagree”, “Not at all likely”, etc. should correspond to 1). You do not have to label every point on the scale; it is fine to have a scale like (1) Strongly Disagree; (2) …; (3) …; (4) Neither Agree nor Disagree; (5) …; (6) …; (7) Strongly Agree.
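If a survey tool happens to export an item with the scale reversed (7 = “Strongly Disagree”), the fix at analysis time is a simple recode. This is a minimal sketch, assuming a 7-point scale and a hypothetical `reverse_code` helper:

```python
# Make sure the negative end of every Likert item is coded as 1 before
# analysis. On a 7-point scale, flipping means 7 -> 1, 6 -> 2, and so on.
SCALE_POINTS = 7

def reverse_code(value: int) -> int:
    """Flip a 1-7 response so the negative endpoint becomes 1."""
    return SCALE_POINTS + 1 - value

raw = [7, 6, 4, 1]                        # exported with 7 = Strongly Disagree
recoded = [reverse_code(v) for v in raw]  # now 1 = Strongly Disagree
print(recoded)                            # [1, 2, 4, 7]
```

Doing this consistently means every item’s scores point the same direction, so averages and correlations across items are meaningful.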
8. Make sure that your survey is not too long
A survey that takes participants more than 20 minutes to complete is too long and will result in a dismal completion rate. The standard “cold email” survey response rate is about 2%. Providing material incentives for completing the survey will naturally improve the response rate.
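That 2% figure translates directly into how many people you need to reach. A back-of-the-envelope calculation, with an illustrative `emails_needed` helper (the function name and target numbers are assumptions, not from the original):

```python
import math

# Estimate how many cold emails are needed for a target number of
# completed surveys, given an expected response rate (default ~2%).
def emails_needed(target_completes: int, response_rate: float = 0.02) -> int:
    return math.ceil(target_completes / response_rate)

print(emails_needed(100))  # 5000 emails for ~100 completes at 2%
```

Incentives raise the response rate, which shrinks the required list size proportionally.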
9. More complex surveys involve branching logic
This means that some questions or sections of the survey are visible only to some of the participants depending on their prior answer choices.
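Conceptually, branching logic is just a routing rule keyed on a prior answer. A minimal sketch, with hypothetical section names and a made-up screening question about product ownership:

```python
# Illustrative branching logic: which section a participant sees next
# depends on an earlier answer choice (names are hypothetical).
def next_section(owns_product: bool) -> str:
    """Route a participant based on a prior screening answer."""
    if owns_product:
        return "satisfaction_questions"  # shown only to current owners
    return "awareness_questions"         # shown only to non-owners

print(next_section(True))   # satisfaction_questions
print(next_section(False))  # awareness_questions
```

Most DIY survey tools let you express rules like this through their skip-logic settings rather than code, but the underlying idea is the same.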
This post was contributed by Alex Genov, PhD. Alex is an experienced customer researcher who applies his psychology background and his passion for research, design, and innovation to the software industry. If you’d like to learn more about persuasive design, customer experience and research, visit his blog at http://persuasivedesign.wordpress.com/.