Surveys: You Get What You Ask For

Recently a product manager mentioned that he was doing a "quick little survey" of his customers. Being the curious sort, I asked "how quick and how little?" and he said "oh, with Survey Monkey I can do a thirty-question survey in a couple of days."

I let it go at that, but I could have asked "should you do it in a couple of days?"

Yes, it's possible to post a survey and get results very quickly, especially if you have a pre-identified or captive audience and limit it to two or three concise questions. But if you whip up a few dozen questions and expect to pop the results straight into PowerPoint, you may be disappointed.

First, asking the right questions is tricky. If you ask too many, or they're too complex, people won't complete the survey. Worse, if you ask ambiguous questions, you'll get answers that may be impossible to analyze, or may even mislead you. It's crucial to use the target person's language, not industry jargon. If the respondent has any doubt about what you meant, the answers are not reliable and should be dumped.

Stick with clearly worded multiple-choice, yes/no, or "scale" (high/low, frequency, etc.) questions, rather than open-ended or essay questions. You'll get a much higher response rate. It's often a good idea to give an "out" option, such as "I don't know" or "not applicable" - otherwise, people will select incorrect options because they have no other choice.

There's an art to framing concise questions that are easy to answer but cut to the heart of the relevant issues. The key to success is refining your survey down to the minimum number of questions, using clear language, and asking about no more than one data point in any single question.

The wording you use may have a huge influence on the answer. Professional pollsters know that they can slant survey results by phrasing a question in certain ways. For example, asking "is the sky blue?" may produce a very different result from asking "do you believe the sky is actually blue?" or "what color is the sky right now?" Do you want answers based on experience, opinion, hearsay, personal wishes, scientific fact, religious belief, or some other criterion? It's easy to influence the answer by building assumptions into the question, or by leading the respondent to a specific answer, whether intentionally or not. That may be poor polling practice, but news agencies do it frequently.

Second, it's important to pick the right audience for a survey, and figure out how to reach them. The more targeted your survey is, the more useful it will be. Usually the best audience is your target customers, though it may focus on buyers, system administrators, or some subset of users. If you don't have an easy way of reaching these folks, such as a customer email list, you'll have to go out and find a way. Doing this effectively can make or break a survey - you want a high response rate, but surveying the wrong people can be worse than getting no responses at all.

Third, once you've run the survey on Survey Monkey - the easiest part of the process - you'll have a pile of data to analyze. You can create a set of quick graphs of individual questions using Excel or Survey Monkey's built-in tools, and you can quickly show "how often do respondents use our product?" But the real payoff comes when you do crosstabs - asking things like "how often do respondents who frequently use social networking sites use our products?" This is where you can begin to paint detailed portraits of your target customers. Analysis is, in fact, the most creative part of conducting a survey.
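If you're comfortable pulling the raw responses into Python, here's a minimal sketch of that kind of crosstab using pandas. It assumes you've exported the results to a CSV file and uses hypothetical column names (social_network_use, product_use_frequency) purely for illustration:

    # A rough sketch of crosstab analysis on exported survey data.
    # Column names below are made up for the example.
    import pandas as pd

    responses = pd.read_csv("survey_responses.csv")

    # Single-question summary: how often do respondents use our product?
    print(responses["product_use_frequency"].value_counts())

    # Crosstab: product usage broken down by social networking habits.
    crosstab = pd.crosstab(
        responses["social_network_use"],
        responses["product_use_frequency"],
        normalize="index",  # row percentages instead of raw counts
    )
    print(crosstab)

Normalizing by row makes it easier to compare segments of very different sizes, which is usually the point of slicing the data this way.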

Of course, if you've asked open-ended or essay questions, you'll have to do one-at-a-time analysis and reporting. You'll get the benefit of hearing answers that you might not have thought of, in the respondent's own words, but this analysis can be daunting in a large survey.

Finally, presenting your data is both a challenge and an opportunity. You can choose to communicate everything you've learned and help your audience draw conclusions. Or you can pick the information that you feel is most valuable. As soon as you begin to edit, you become very powerful: the picture you choose to paint will likely have important consequences. If you asked an ambiguous question and choose to interpret it a certain way, you may be bending the truth a bit. Likewise, if you omit responses that don't support your business case or product plan. Choosing what to present, and how to present it, is often the most critical part of the survey process.

The next time I see the PM who's planning the "couple of days" survey, I'll be curious to find out how long it actually took. Thorough surveys usually take weeks to plan, run, analyze, and present. That time can be worth every minute, while the results from sloppy surveys are often not worth the few hours spent on them.
