Automated Phone Surveys

Surveys can be conducted using many different technologies. Phone surveys, mail surveys, and web surveys are the best known means. A more recent addition to this list is the IVR survey, also known as the automated phone survey. IVR stands for “Interactive Voice Response.” When you call a company for service, your phone call — and all those annoying phone menus — is managed by an IVR, and IVR technology also provides the capability to conduct surveys. In this article I will examine the strengths and weaknesses of the IVR survey approach, using a recent experience with a Sears customer satisfaction IVR survey to illustrate some points.

What is an IVR survey?

An IVR survey shares many characteristics with a telephone survey since it is conducted over the telephone. But rather than having a live interviewer conduct the survey, the survey is delivered using a prerecorded script. The respondent enters responses using the telephone keypad and verbalizes free-form comments. (IVR survey programs can even analyze the tone of the responses.) Since there’s no interviewer to clarify questions, the script must be clear and must anticipate all the possible responses a respondent may have.

IVR surveys are most commonly used to measure a customer’s reaction to service just delivered through a call center, a use known as a transactional survey. You may have had this experience. When you call a company, you may be asked, before being connected to an agent, whether you would be willing to take a short survey at the end of the call. When the service is complete and the agent disconnects, the caller is automatically transferred into the survey program. It’s important that the agent not know who is going to be surveyed, to keep the measurements clean.

IVR surveys need not be limited to this situation. You could invite a respondent group to take an IVR survey by calling them or by emailing them with a toll free number to call to take a survey.

My recent experience with a Sears IVR survey shows the IVR method’s strengths and weaknesses.

What are the Strengths of the IVR Survey Method?

Speed of Survey Data Collection. The most striking advantage of the IVR survey method is how quickly a company can collect feedback from an individual. Within 5 minutes after a service interaction is finished, the company can get feedback from the customer about her experiences. No faster means of collecting feedback exists.

Service Recovery Flag. That completed survey from the respondent gets analyzed immediately by the IVR survey program. Management can establish criteria to flag a specific survey response for review. For example, if someone provides a very low rating for overall service quality, information about that service transaction and the subsequent survey can be forwarded to a designated manager. The customer could get a call immediately to address the issue. The immediacy helps ensure that the customer is by the phone to take the call.
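As a rough sketch, the flagging criteria described above might be implemented along these lines. The field names, threshold, and escalation step here are all hypothetical illustrations, not any vendor's actual interface:

```python
# Hypothetical sketch: route low-scoring IVR survey responses to a
# manager for immediate service recovery. All names and the threshold
# are illustrative assumptions.

FLAG_THRESHOLD = 2  # flag overall-quality ratings of 2 or below on a 1-to-5 scale

def flag_for_recovery(response: dict) -> bool:
    """Return True if this survey response should be escalated for review."""
    return response.get("overall_quality", 5) <= FLAG_THRESHOLD

# Two responses captured by the IVR program moments after the calls ended.
responses = [
    {"caller_id": "555-0101", "overall_quality": 5},
    {"caller_id": "555-0102", "overall_quality": 1},  # very low rating
]

for r in responses:
    if flag_for_recovery(r):
        # In practice this would notify the designated manager, who can
        # call back while the customer is still by the phone.
        print(f"Escalate: call back {r['caller_id']}")
```

The point of the sketch is simply that the criteria are management-defined data rules evaluated the moment the survey completes, which is what makes the immediate callback possible.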

Lower Survey Response Bias. Response bias results from predispositions a respondent brings to the surveying process. Common forms are acquiescence, where the respondent gives the answer he feels the surveyor wants to hear, and auspices bias, where answers are colored by the respondent’s feelings about the organization conducting the survey. Since the respondent is interacting with a machine, the likelihood of these biases is much reduced. We may not understand the script, but few of us will be intimidated by the process. However, the flip side may prove true should a low score result in an immediate follow-up phone call from a designated manager. That call could prove intimidating to some — and make them leery of taking future IVR surveys.

Lower Cost for Larger Survey Programs. IVR surveys have a cost model that has a high fixed cost, but lower variable cost. The script must be developed and recorded, which can be costly. But the cost to deliver each individual survey is relatively small.

What are the Weaknesses of the IVR Survey Method?

Lack of Anonymity. Like a telephone survey, no anonymity exists. We may feel confident that our responses will be treated confidentially, but heck, they have our phone number. (If you call them on a toll free number, they get your phone number, even if you block caller ID.) They know who we are. Thus, any survey research purpose that relies heavily on anonymity, such as an employee survey, is best done through some other method.

Keep it Very Simple. Again, like telephone surveys, the questionnaire or script must be kept simple. Complex survey questions that need to be administered with the visual aid of a printed page or web page cannot use an IVR survey approach. For the most part, IVR survey designers are limited to interval-rating scale questions, that is, questions that ask for a response on a 1-to-5 scale or something similar. (The much touted Net Promoter Score® survey method is awkward by IVR since it uses an 11-point, 0-to-10 scale, and a phone keypad offers only the single digits 0 through 9.) Checklist (or categorical) questions are difficult to administer since all the options must be read to the respondent.

Narrow Scope. When an IVR survey is conducted as a transactional survey at the end of an interaction with a contact center, e.g., a phone call, the only thing that can really be evaluated is what happened during that immediate interaction. Yet that interaction is likely one part of a broader transaction, which is composed of a series of interactions.

In the case of the Sears customer survey, the IVR survey was used not at the end of my call with the 800-4MyHome call center, but rather to measure my feelings about the technician’s visit to my home. This shows that the IVR survey method can be used to measure the broader transaction, although Sears, through the stupidity of its functional silos, chose to measure just the transaction involving the technician.

Telephone Turn-off. Some people just dislike telephone interaction, including telephone surveys. We can thank telemarketers, especially those who use surveys as a pretense for a sales call. Some people also don’t like interacting with a machine. And those of us who have grown to hate IVR phone trees — and I count myself fully in that group — may also hate anything that involves an IVR.

In summary, IVR surveys can be a very useful tool for getting very quick feedback on transactional surveys, and they are flexible enough to be used in other situations. However, care must be taken in the survey questionnaire design, care that the Sears survey designers definitely did not take.

How to Select a Web Survey Tool

When deciding on an online web survey tool, what are the key features to consider, and how should you go about the selection process? This article reviews important features of online survey tools and the decision process for selecting survey software, such as SurveyMonkey or QuestionPro.

Tips for Selecting an Online Survey Software Tool

Internet survey software tools make it easy for anyone to do a survey. SurveyMonkey, the best known of these web survey tools, states its goal is “to enable anyone to create professional online surveys quickly and easily.”

That’s true. SurveyMonkey, Zoomerang™, QuestionPro, and all the rest of the online survey tools DO make it simple to create a survey and launch it. They can be a boon to capturing feedback from customers, employees, and other stakeholder groups quickly and simply.

However, using these survey software tools can be like driving without a rear view mirror and without a map — and perhaps with a distorted windshield. You may think you’ve reached your intended destination using the best route. But in fact, you may not have taken the best route. Worse, you may not even be at your intended destination, and you won’t know it. And another destination may have been a better one to meet your needs.

In this article, I’ll define survey software tools, and then discuss their benefits, the two types of web survey tools, their key features, and — to conclude the thought above — the dangers of “driving” with these online survey tools. (Hint: they can’t tell you how to create a good survey.)

What are Survey Software Tools?

Simply put, survey software tools provide the capability to create a survey where the painstaking survey administration tasks are automated. Typically, we think of these automation tools as online survey tools that generate an HTML form to post to the internet, allowing your group of interest to submit their responses via the web. That’s the focus of the remainder of this article. However, there are other survey automation tools, such as:

Computer Aided Telephone Interviewing (CATI) software that manages surveys delivered by telephone interviews.

Interactive Voice Response (IVR) surveys, which are like telephone surveys, except that the scripts are recorded and the respondent enters answers using the keypad of their telephone. These are frequently used in call centers at the conclusion of customer service calls.

Optical Mark Recognition (OMR) or Optical Character Recognition (OCR) software that allows you to scan a paper-based survey to capture the responses. (Think back to your high school SAT exams…)

Benefits of Web Survey Tools

The key benefits that can be derived from internet survey software tools are:

Simplicity. You can create a survey quite quickly — once you learn how to use the tool. Survey invitations are typically sent out electronically through an email. That email will contain a link to an HTML form that is the survey questionnaire. Responses are then captured electronically, either through incoming emails or by direct loading into a database, depending on the type of survey tool used. This data collection function is by far the key benefit of online survey tools. If you’ve ever processed a hardcopy survey by postal mail and keyed in the data, you’ll see the benefit from survey automation tools very quickly!

Low Cost. The out-of-pocket cost of these tools is quite low. You can buy a PC-based tool for $500 or less or you can buy it as a service in a hosted application, typically paying a monthly or annual subscription fee. The fees can be as low as $15 to $20 per month for a basic tool. If you need more sophisticated features, the monthly cost could be much higher. As noted above, the major cost is not the acquisition cost of the survey capabilities, but rather, the time to learn how to use the survey software tool.

Scalability. Unlike telephone or postal mail surveys, you can survey 10,000 people as easily as 10 people. This scalability is part of what makes this a low cost method.

Quick Data Capture Turnaround. Since you may be soliciting feedback information about organizational performance, such as a customer satisfaction survey, how quickly you get the data back is important, especially if this is a transactional or event survey. Quick data capture allows addressing a customer issue promptly, which is a key element in successful service recovery practices.

With email, most of us practice a “do or delete” approach. People will either take the survey or delete the invitation. So, you’ll find with a web form survey that most of your responses will come back in one to two days. Responses will quickly tail off to nothing. Only IVR surveys can be faster, since they are administered at the close of the transaction.

Response Rate. When web-form surveys were a novelty, response rates were quite high, perhaps above 50%. Based upon my own survey project work and feedback from those in my Survey Design Workshop, response rates typically are in the 20% to 40% range. However, the relationship you have with your group of interest is the single most important factor in driving the response rate, not the administration method.

Questionnaire Complexity. Internet web form surveys are fairly flexible. (We can argue forever about which is more flexible: a printed page or a web form.) You can create a fairly complex survey questionnaire with these tools, using extensive branching and other available survey features. Do note that the survey tool you choose will set constraints on the questionnaire you design! It’s better to determine what functionality you need and then select the tool, rather than the other way around.

Dangers in Blind Reliance Using Online Survey Tools

Yes, Dangers. I opened this article with some seemingly twisted logic about survey software taking you blindly to a wrong destination without you realizing you were blinded. Let me explain…

Survey software tools make it easy to write a good survey. True. They also make it easy to write a lousy survey! And you’re probably better off doing no survey than doing a lousy survey. Why? Because a poorly designed survey can generate misleading data that provides you with delusions of knowledge. Researchers refer to this as “instrument validity,” which simply means that the survey instrument measures what you intend to measure. That may sound like a no-brainer. It’s not! Examples of invalid surveys abound, many done by professional market research organizations.

These online survey tools cannot tell you how to create a good survey. They also can’t tell you that you have a bad survey! Only knowledge of good survey design practices will help prevent you from designing a bad survey. (Obviously, this is why I recommend you buy my Customer Survey Guidebook or attend one of my Survey Design Workshops!)

The Internet web form survey approach exacerbates this blind spot. With a paper-based survey, respondents will give you feedback on the quality of your survey questionnaire. You’ll see big question marks and comments in the margin. (I’ve been known to do this on occasion…) With telephone surveys, you can listen to respondents formulate questions or talk with the interviewers to find out where the survey instrument has shortcomings.

How does the respondent provide you this feedback about an online, web survey? Perhaps you’ll see something in a comment field. But I doubt it. This is an incredible danger zone for this surveying method — and one that the survey software tool vendors never mention.

Technical knowledge of how to use the survey software tool does not equate to knowledge of how to design a valid survey questionnaire!

A twelve-year-old can push a car’s gas pedal and turn the steering wheel. Would you want one driving a car? Yet today, many well-intentioned people are using SurveyMonkey and the like, creating survey questions that lack validity and then making business decisions on that bogus data.


Survey Sample Selection: The Need to Consider Your Whole Research Program

A counterintuitive result of a survey project is that a survey’s results are likely to pose as many new questions as it answers. An email to me recently posed a dilemma that can result from a particular approach to a survey program. The company had conducted a survey and now they wanted to ask some follow-up questions based upon the learning from the first survey. The question is: whom should they invite to take the survey?

“Do we send the survey to the same people again, or do we contact the people who replied? What is the best way? We didn’t do sampling, as we just sent the survey to every user.”

Having attempted a census — sending an invitation to everyone — this company runs the risk of creating “survey burnout” if they invite everyone again. Customers will only give you so much time to help out your research efforts, and if you ask too much, you risk alienating them.

Since surveying is frequently used for quality control purposes, let’s draw a parallel to our manufacturing colleagues. Frequently, to measure the quality of a tangible product coming off an assembly line, we have to destroy it to know the limits of its performance. (Think of those automobile crash test commercials.) This is known as destructive testing, since the tested product cannot be sold. When conducting quality control surveys of our intangible service product, we don’t want to engage in destructive testing, that is, burn out and annoy our customers!

So what’s the solution?

The power of statistics means we probably do not need to attempt a census for any survey — unless your group of interest is very small. In another article, I talked about response rate requirements and statistical confidence. Surveying a sample will draw a reasonably accurate profile of how customers, or any other group of interest, feel about certain issues or operational practices. The structure of a survey program that I recommend is to survey a sample, and then if a follow-up survey is desired, we can generate another random sample excluding those who previously responded without risking survey burnout.
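That recommended structure can be sketched in a few lines of code. The customer list, sample sizes, and random seed below are purely illustrative, and for simplicity the follow-up draw excludes everyone in the first sample (a slightly stricter rule than excluding only those who responded):

```python
import random

# Illustrative sketch of the sampling structure recommended above:
# draw a random sample for the first survey, then draw the follow-up
# sample only from customers who were not in the first draw, avoiding
# survey burnout. The IDs and sample sizes are made up.

customer_ids = [f"C{n:04d}" for n in range(1, 1001)]  # 1,000 customers

rng = random.Random(42)  # seeded so the draw is reproducible
first_sample = set(rng.sample(customer_ids, 200))

# Follow-up survey: exclude everyone already invited the first time.
eligible = [c for c in customer_ids if c not in first_sample]
second_sample = set(rng.sample(eligible, 200))

assert first_sample.isdisjoint(second_sample)  # no one is surveyed twice
```

The design choice worth noting is that both draws are random from the full customer base, so each survey still yields a representative profile while no customer is asked twice.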

The questioner specifically asked about surveying just the respondents to the first survey. I responded, “Surveying only the respondents [to the first survey] does introduce a bias to the results since your sample is not random.” That bias may be fine if you’re just looking for more detailed information and not trying to establish a profile of customer feelings or practices. “If you’re looking to develop more granular information about things learned from a previous survey, let me suggest not doing a survey, but, instead, conducting some in-depth research, e.g., interviews or focus groups. These [surveys and focus groups] are complementary research techniques.” Don’t feel that you are constrained to one research tool. In a previous article, I discussed ways you can generate more actionable data on a survey, but don’t feel you cannot go beyond conducting surveys.

The key lesson here is that you should think through the entire survey program before you conduct your first survey; otherwise, you could back yourself into a corner. Over-surveying your customers is certainly not recommended. Remember, a survey’s goal is to measure satisfaction, and how the survey program is conducted will affect that very satisfaction. A survey is a CRM transaction.

Survey Statistical Confidence: How Many is Enough?

Response rates and statistical confidence are always concerns for those tasked with a survey project. This article outlines the factors that determine the number of responses needed for a level of accuracy and shows how to determine the statistical confidence that can be placed in survey results.
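As a companion to that discussion, the standard sample-size formula for estimating a proportion, with the finite population correction for small groups of interest, can be computed as follows. This is a generic statistics formula, not tied to any particular survey tool; the defaults assume 95% confidence and a ±5% margin of error:

```python
import math
from typing import Optional

def sample_size(z: float = 1.96, margin: float = 0.05,
                p: float = 0.5, population: Optional[int] = None) -> int:
    """Responses needed to estimate a proportion p within +/- margin.

    z is the z-score for the desired confidence level (1.96 ~ 95%);
    p = 0.5 is the most conservative assumption about the true proportion.
    """
    n = (z ** 2) * p * (1 - p) / margin ** 2
    if population is not None:
        # Finite population correction: small groups need fewer responses.
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

print(sample_size())                 # very large population: 385 responses
print(sample_size(population=1000))  # customer base of 1,000: 278 responses
```

Note that these are completed responses, not invitations; you must divide by your expected response rate to size the invitation list.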

An Old Dog’s New Trick: Postal Mail Surveys

Cutting edge is where it’s at. We’re all trained to believe that. Yet many times the new tools may not best address the core needs or requirements of the business process. New and high tech does not necessarily mean better. We’ve all seen solutions looking for problems, and many times when a legitimate new solution is found, we dismiss the old solution because it’s not a sexy, cutting edge approach. In the world of customer research — in particular customer satisfaction surveys where I spend much of my professional life — one old dog, administering surveys by postal mail, might just perform the best tricks.

The web has automated the process of conducting surveys, lowering the cost through its scalability and increasing the quality of the surveying effort through the quickness of data collection. It has also led to a higher response rate — or so we think — or so maybe it once did. Most people I talk to about surveying want to use a web form administration method. There are strong merits to the method, which seems like a win-win-win solution.

In my workshops on surveying practices, I present the pros and cons of each survey administration method, and one of my students, Pamela Kellerstrass, Director of Marketing & Client Services for Motor Cargo, decided to conduct her first customer survey using the old dog of postal mail. Motor Cargo is a freight company located outside Salt Lake City, Utah that specializes in less-than-truckload (LTL) shipments. They had never asked their customers for feedback on the quality of service. Pamela didn’t have a huge budget, but she did have some company resources at her disposal. For her customer base, email was a relatively new medium, so Motor Cargo did not have email addresses for the entire customer base. This effectively eliminated electronic survey administration, as surveying only the email-reachable customers would have introduced a serious bias to the results.

Pamela turned to the postal mail approach and applied her ingenuity to maximizing the response rate. The survey was mailed in a 10″x7″ bubble envelope.  Inside was a cover letter printed on stationery with a coffee motif. The letter started, “Help us wake up and smell the coffee!” Also inside the envelope were the 50-question survey instrument, a self-addressed, stamped envelope — and a small package of ground coffee, whose aroma hits the olfactory senses as soon as the package is opened.

The introductory letter explains Motor Cargo’s commitment to customer feedback, crystallized in their motto, “Determined to Listen. Designed to Perform.” It points out the role of their Customer Loyalty Manager and how the results of the survey will be used to improve service. Pamela also promised to “compile the results and report back to you changes we have made as a result of your good feedback.” Finally, she promised to send respondents a company coffee mug.

There are five steps to getting responses to a survey. Get the survey:

  • Noticed,
  • Opened,
  • Read,
  • Completed,
  • Returned. 

Consider how Pamela addressed each of these points. The unusual envelope certainly gets it noticed in the postal mail In-Box — and opened. The envelope’s contents will likely spur the recipient to read the material, and the phrasing of the letter, along with the coffee mug, will help motivate people to complete the survey. The clean layout and design of the instrument also promotes survey completion. Motor Cargo also made a follow-up phone call to the recipient list to gently push people to complete the survey. Finally, the SASE will help ensure it gets returned.

What were the results? Pamela got 182 responses out of 300 invitations, better than a 60% response rate! While we’d like 100% participation, 60% is a very admirable response rate. More importantly, they learned that customers weren’t happy with the process for getting a billing error corrected, a fact the president communicated in a follow-up letter. According to Pamela, “the survey also showed us what we were doing right. We always knew our good drivers were our best asset, but this just confirmed it. What a great opportunity to reinforce with our drivers the importance of the customer’s positive perception and how they are appreciated for their good work.”

Could Motor Cargo have gotten the same response using electronic surveying? Perhaps, but consider how many emails we all get. Do you notice the survey invitations you get buried among the scores of emails in your inbox at the start of the day? Many people tell me they just automatically delete emails from people they don’t recognize. If you don’t get your survey invitation noticed, the rest of your effort is moot. Perhaps the best way to get visibility and 10 minutes of your customer’s mind share is to return to the old dog of postal mail surveys.