What Survey Mode is Best?
Summary: What survey mode is best to use? A recent Cint survey seemed to answer that question: 91% (sic) of respondents said they prefer web-based, smartphone, or SMS surveys. However, this survey suffers from severe sample bias, given that the survey was likely conducted electronically. Just as response rates cannot be generalized across surveys done in different industries in different countries, so too the best survey mode or method cannot be generalized. Know your group of interest and determine the best survey mode for that group.
~ ~ ~
Ever seen survey results that are too good to believe? Well, as the old adage goes, if it’s too good to believe, then you shouldn’t believe it. I call these — with all due deference to Homer Simpson — “D’oh! Results.” Even Homer in hindsight knew he should have seen that coming. Yet, it seems that many people accept survey results without critically thinking about them. I have some upcoming articles on the blind acceptance of the Net Promoter Score® and the Customer Effort Score, but I recently came across an example that displays a survey administration bias that should be obvious to even a casual reader.
Cint recently published the results of a market intelligence study of theirs. (Note: Cint removed the press release, otherwise I would link directly to it.) The study had several interesting findings, among them a discussion of survey methodology preferences by the public.
In terms of methodology, new technology was by far the most popular means of undertaking market research, with over 91% stating their preference as “smartphone”, “web” and “SMS”. Only 4% of those surveyed would choose to undertake market research by mail as their first choice, and just 1% would like to be surveyed over the telephone.
Wow, 91%!! Nowhere in the article is the survey’s methodology presented, but the first sentence of their online article gives a pretty good hint.
17 January 2012: A survey out today from global provider of smart solutions for gaining market insight, Cint…
I sent an inquiry to Cint asking about their methodology, but to date I have received no response. (I actually find that fact alone very interesting for a firm engaged in customer engagement.) Anyone want to make a bet on the methodology that Cint used in their study? I am pretty confident that they surveyed using a “smart technology,” that is, a web survey.
If so, isn’t it obvious that people responding to a web-based survey would say they prefer to be surveyed via the web? D’oh! Isn’t 91% an awfully high percentage? That number jumped out at me as so large that it didn’t pass a face validity test. It immediately made me skeptical of the results and eager to learn more.
Sample bias such as this is not unusual. Years ago, I recall seeing a survey conducted at a conference on CRM technologies asking attendees about their senior management team’s involvement in CRM initiatives. Isn’t it senior management that approved sending someone to the conference? The administration bias is obvious. D’oh!
In fact, I am guilty of a similar bias. My Survey Workshop series spans three days. I wonder whether I should deliver the classes Monday to Wednesday, Tuesday to Thursday, or Wednesday to Friday. I have a personal preference, but I really want to do what’s best for my market. So, on the feedback survey at the end of my class, I decided to ask which set of days is preferable. Guess what I find from this convenience sample, a common source of sample bias? If the class is a Monday-to-Wednesday class, that’s the preference. If it’s a Tuesday-to-Thursday class, that’s the preference. Etc. I have an obvious sample bias. D’oh! But at least I know that and can interpret my results in that light.
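A toy simulation makes the mechanism concrete. Everything below is invented for illustration: assume the true market splits its preference evenly across the three schedules, and that people are far more likely to enroll in a class whose schedule matches their preference. Surveying only enrollees then wildly inflates the apparent preference for whichever schedule was offered.

```python
import random

random.seed(0)
SCHEDULES = ["Mon-Wed", "Tue-Thu", "Wed-Fri"]

# Assumed true market: preference split evenly, one third per schedule.
market = [random.choice(SCHEDULES) for _ in range(100_000)]

def enrolls(preference: str, offered: str) -> bool:
    """Self-selection: enrollment is much more likely when the offered
    schedule matches a person's preference (both rates are invented)."""
    return random.random() < (0.60 if preference == offered else 0.10)

for offered in SCHEDULES:
    attendees = [pref for pref in market if enrolls(pref, offered)]
    share = attendees.count(offered) / len(attendees)
    print(f"{offered} class: {share:.0%} of attendees prefer {offered}")
```

Under these made-up enrollment rates, each class reports roughly 75% preference for its own schedule even though the true split is a flat one third each. That’s the D’oh! pattern in miniature.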
The Cint folks make no mention of the shortcomings of this aspect of their study’s findings. And it seems that others miss this critical point.
An American Express Open Forum article discussed the Cint study and included this synopsis of the impact of the survey methodology findings:
Get online. Consumers overwhelmingly prefer tech when it comes to surveys; 91 percent cite “smartphone”, “Web”, or “SMS” as their preferred methods. Just 4 percent like mail in surveys, and a mere 1 percent want to be surveyed by phone.
A Business Wings review of the article included:
If, after reading this, your business is now more likely to consider undertaking market research, you may wish to understand the preferred methodology. The clearest lesson is not to call people, as only 1% of those surveyed said they liked being contacted by telephone, compared to 91% who prefer to offer opinions via smartphone apps, SMS or online surveys.
Neither writer looked critically at how Cint’s survey methodology would have shaped these findings, nor questioned whether the astonishing 91% was realistic. D’oh!
As a university professor of 25 years, I can tell you that academics for years have stressed the need to build “critical thinking skills” among our students. We still have much work to do. D’oh!
Just as response rates cannot be generalized across surveys done in different industries in different countries, so too the best survey mode or method cannot be generalized. Know your group of interest, or “population” in surveyor lingo. What’s the best survey mode for your group?
So how could you design a study that would properly measure survey mode preferences? Here’s an example.
Snap Surveys and Silver Dialogue conducted a study on multi-mode surveying methods. [Unfortunately, the study is no longer available online.] They queried visitors at National Trust properties in the United Kingdom. (These are historic properties, many of which I have visited on trips to Britain. I highly recommend them!) Visitors were asked by an interviewer upon entering the property if they would be willing to take a survey after their visit, and how they would prefer to be surveyed.
Here are their findings, quoted verbatim from their study (a quick arithmetic sketch of the multi-mode lift follows the list):
- The most preferred questionnaire modes were paper/post 30% and web 26%.
- Point of visit methods, e.g. paper and kiosk, had far higher response rates (91%) than post visit, e.g. web and smartphone.
- Using a multi-mode approach increased the overall visitor survey response rate to 61% from around 45% achieved in previous paper based visitor surveys.
- A multi-mode approach reached a wider audience than previous surveys.
- Asking people to “pledge” to complete a survey after their visit helped to boost response rates.
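The multi-mode lift is easy to see with simple arithmetic. The mode shares and per-mode response rates below are hypothetical (only the 30% paper and 26% web preference shares echo the Snap findings above); the point is the mechanism, not the exact numbers.

```python
# Hypothetical mode mix: (share of visitors choosing the mode, response rate).
# Point-of-visit modes convert far better than post-visit modes, per Snap.
modes = {
    "paper (point of visit)":  (0.30, 0.91),
    "kiosk (point of visit)":  (0.18, 0.91),
    "web (post visit)":        (0.26, 0.40),
    "smartphone (post visit)": (0.14, 0.35),
    "post/mail":               (0.12, 0.30),
}

# Blended rate: each visitor responds in the mode they themselves chose.
blended = sum(share * rate for share, rate in modes.values())
print(f"Multi-mode blended response rate: {blended:.0%}")             # ~63%
print(f"Web-only response rate: {modes['web (post visit)'][1]:.0%}")  # 40%
```

Letting each visitor respond in the mode they will actually complete captures the high point-of-visit rates where possible, instead of forcing everyone through a single lower-yield channel.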
Notice how these results conflict with Cint’s. Web garnered a 26% vote. The approach used here determined the best survey mode for this transactional survey setting, but it may not work for all settings. I certainly would not apply their findings to other survey programs, and they didn’t claim you could. Now how confident do you feel in following Cint’s prescription?
Further, note that Cint asked what survey mode people said they “preferred”. They didn’t conduct empirical research to measure the impact of meeting someone’s preferred survey mode upon response rates, as Snap did. The impact of survey mode upon response rates is what we, as surveyors, really want to know, along with the impact of survey mode upon the scores respondents provide. (I am currently co-authoring a paper on this latter topic. Stay tuned. It’s pretty dramatic.) Self-reported preference for survey mode isn’t the outcome of interest; the outcome of interest is the survey mode that yields the highest response rate, along with the most valid data.
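To measure mode impact directly rather than ask about preference, the standard design is random assignment: split the invitation list across modes and compare realized response rates. Here is a minimal sketch with invented counts; the two-proportion z-test is standard, but the specific numbers are mine, not from any of the studies above.

```python
from math import erf, sqrt

def two_proportion_z(resp_a: int, n_a: int, resp_b: int, n_b: int):
    """Two-sided z-test comparing two response rates (proportions)."""
    p_a, p_b = resp_a / n_a, resp_b / n_b
    pooled = (resp_a + resp_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF, computed via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Invented experiment: 1,000 invitees randomly assigned to each mode.
p_web, p_paper, z, p = two_proportion_z(310, 1000,   # web: 310 of 1,000 responded
                                        460, 1000)   # paper: 460 of 1,000 responded
print(f"web: {p_web:.0%}  paper: {p_paper:.0%}  z = {z:.1f}  p = {p:.2g}")
```

Because assignment is random, any difference in response rate is attributable to the mode itself rather than to who happened to be reachable through each channel.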
Here’s another example of the distinction between survey mode preference and response rates. If students at universities were queried about how they would like to complete end-of-semester course evaluations, I suspect they would want a web survey as opposed to the traditional scannable paper surveys administered in class.
Two places where I’ve taught, the Harvard Certificate in Management Program and Babson College, moved from paper surveys administered in class to a web-based survey method. Hey, ya gotta use these new smart technologies, right? Response rates fell in half. And they were surprised. And these are academics who are schooled in research design! D’oh!