Check out our Survey Glossary with definitions of key survey terminology — and links to articles where we discuss the concepts.

CNN Survey Data Analytics: Explaining the Covid-19 Comfort Gap by Political Party

CNN’s June 2020 poll found a huge partisan difference in comfort levels with resuming regular routines during Covid. But is the gap driven by partisanship or by other factors? What can we learn for our organizational surveys?

CNN Proves Conformity Bias — or Sample Bias

CNN’s poll in June 2020 attempted to measure people’s comfort level in returning to their regular routines. However, this poll proves the existence of sample bias, conformity bias, or both.

Conformity Bias: How Social Desirability Can Affect Survey Research

Conformity Bias occurs when the respondent gives an answer that is socially desirable rather than an answer that reflects their true feelings. This bias in the data compromises the value of the data and our analysis.

Ranked Choice Voting – The Strategy To Winning RCV Elections

Recent experience with an RCV election shows a lack of transparency, ease, and simplicity, and it’s not obvious what strategy wins an RCV election.

How a Bad Survey Checklist Question Can Confuse Findings

Checklist questions are one of the more common survey question types, and they are also used heavily in data collection forms, which are a form of survey. And for good reason. You can get specific, actionable answers to a question – if the question is written correctly. A poorly designed checklist question can hide problems and confuse interpretation.

Ranked Choice Voting: The Good, The Opaque, The End Game

Ranked Choice Voting (RCV) is a different voting process that is being pushed for its purported benefits. This article describes the concept along with its pros and cons.

Ambiguous Questions: The Biggest Mistake in Survey Question Writing

Surveys are conducted to learn how some group feels. If the survey questions are flawed, then we don’t learn and may be misled. Ambiguous questions — questions whose phrasing leads to multiple interpretations — are the single biggest mistake made by survey designers. And perhaps a fatal one.

Airline Customer Loyalty Ain’t for the Dogs

Customer loyalty should not be defined with a short-term view but, rather, looking at the lifetime of interactions. Otherwise, a company risks driving away long-standing customers, as United Airlines almost did.

Open Road Tolling Leads to Hole in Customer Journey Map

Good customer experience management is not a one-time event, but requires reviewing customer journey maps whenever something in a process has changed. The implementation of the lovely new open-road tolling in Massachusetts missed a key part of the customer experience: how to tell the customer when something is amiss with his transponder. And this could be — and should be — proactively pushed to the customer.

MassMoves Report – Misleading Research on Transportation Priorities in Massachusetts

The MassMoves report on transportation priorities in Massachusetts is not based on sound methodology and should not be used as the basis for decision making. It’s a wonderful example of how to concoct a research program to deliver the “findings” the sponsor wants. Shame on the Mass State Senate.

Biased Survey Samples

Meaningful survey results require a valid questionnaire and an unbiased administration of the survey. The CTE brain trauma study of NFL players used biased survey samples, which clouds conclusions from the study.

TSA Shifting Bottlenecks

The new TSA service design requires removing large electronic items. The goals are to improve the efficacy of the scans and to speed up passenger processing. However, this may not speed up passenger processing due to increased workload at passenger work areas. This is a great example of shifting a bottleneck and, unfortunately, of a lack of attention to all the “workers” and job tasks in the overall process flow.

Implications of the Federal Agency Customer Experience Act of 2017

The Federal Agency Customer Experience Act of 2017 promises to make agencies more responsive to their citizen-customers. But will it? By itself, will the bill’s requirements achieve its goals? Is the feedback program required by the bill properly designed? What elements of the feedback program should be part of a Congressional bill versus assigned to some administrator to determine in conjunction with survey experts? This article will explore those questions. This bill needs to be rethought with input from people who know something about survey programs.

Survey Question Type Choice: More Than One Way to Skin That Importance Cat

Various survey question types can be used to measure something. The choices have trade-offs between analytical usefulness of the data and respondent burden.

The Collision of Biases: Some Things Are Just Hard To Measure in Polls & Surveys

A confluence of survey biases – response, interviewer & instrumentation – likely overwhelmed what the NY Times’ surveyors think they measured regarding people’s feelings about having a female presidential candidate.

The Anxious Survey Response Scale Returns

I’m anxious about your reaction to this article.

Unclear what I mean by that? That’s exactly the point. When designing survey questions and response scales for interval rating questions, it is critical to have “clarity of meaning” and “lack of ambiguity.” Without those, you won’t be capturing valid, useful data that are free of instrumentation bias. “Anxious” is an anchor that has multiple meanings and thus should not be used in political surveys. Yet it is.

Back to Basics at the TSA — Lean Operational Improvements

TSA lines aren’t as bad as feared, due to the implementation of new process designs and technology. It’s really just back to basics of operational management and Lean philosophy. No earth-shattering innovations, just stuff they could have — and should have — done years ago.

TSA Capacity Management Leads to Long Lines

We’ve seen that the head of the VA doesn’t understand service delivery models, and now we’re hearing the details that show the TSA doesn’t understand basic process analysis. This shouldn’t be a surprise to anyone with an understanding of capacity analysis – it certainly wasn’t to me – but it is astounding that we have […]

VA Wait Times are an Important Service Measurement

VA head Robert McDonald treats wait times for health services as unimportant, arguing that Disney doesn’t measure them. He needs to develop a proper understanding of service models and service measurement.

The Importance of Good Survey Question Wording: Even Pros Make Mistakes

Proper survey question wording is essential to generate valid, meaningful data for organizational decisions. A survey question wording bias will lead to misleading interpretations and bad decisions. Here we examine a Pew survey on use of mobile devices in public settings.

Good question phrasing is an art form, and even the pros can make mistakes. Here we’ll show a question wording bias example from a survey done by Pew Research Center. Ambiguity in question wording likely led to incorrect data and conclusions. It provides some useful lessons for all survey designers.

Survey Question Types: When to Use Each

Each of the five survey question types has potential value in a survey research project. This article presents brief definitions of each question type, the analysis you can do, and some key concerns with each question type.

Survey Glossary & Survey Article Index

Terminology frequently used in a survey project is defined in our survey glossary. Links are provided to our articles discussing the terms.

Survey Statistical Accuracy Defined

Survey accuracy numbers tell us how much we should believe the survey results as an indicator of how the group of interest feels. This article describes how to interpret accuracy numbers and the factors that affect accuracy.

Survey Sample Size Calculator

We offer a free Excel-based survey sample size calculator. Here we discuss how to use it and provide a form to request the calculator.
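For readers curious about the arithmetic behind such a calculator, here is a minimal Python sketch of the standard sample-size formula with a finite population correction. It assumes a 95% confidence level and maximum variability (p = 0.5); it is an illustration of the general approach, not a reproduction of our Excel calculator.

```python
import math

def sample_size(population: int, margin_of_error: float = 0.05,
                z: float = 1.96, p: float = 0.5) -> int:
    """Responses needed for a given margin of error (illustrative only).

    Uses the standard formula n0 = z^2 * p * (1 - p) / e^2, then applies
    a finite population correction for the size of the group of interest.
    """
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    n = n0 / (1 + (n0 - 1) / population)  # finite population correction
    return math.ceil(n)

# Example: a group of interest of 5,000 people at +/-5% margin of error
print(sample_size(5000))  # about 357 responses
```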

TSA Blames Passengers for Missed Flights

TSA (US airport security screeners) frequently tell us during busy times to arrive at the airport extra early. Early arrival does NOT increase capacity. If everyone arrives 2 hours early, then you are no more likely to make your plane than if everyone arrives 1 hour early. Arriving early just creates more congestion, but now TSA can shift the blame onto you for missing your flight. Airport capacity management is the real issue! If only journalists would ask the right questions…

A Very Short Customer Journey Map

Customer Journey Maps are typically long & involved, but my journey out of American Express’ customer base was very short due to an ill-conceived process. What lessons can we learn about designing customer interaction processes?

Making the Case for Service Recovery Strategies — Customer Retention

The goal of service recovery is to identify customers with issues and then to address those issues to the customers’ satisfaction to promote customer retention. However, service recovery doesn’t just happen. It is a systematic business process that must be designed properly and implemented in an organization. Perhaps more importantly, the organizational culture must be supportive of the central tenet of service recovery strategies — that customers are important and their voice has value.

Impact of Mobile Surveys — Tips for Best Practice

Summary: Mobile survey taking has shot up recently. When a survey invitation is sent via email with a link to a webform, up to a third of respondents take the survey on a smartphone or equivalent.  Survey designers must consider this when designing their surveys since some questions will not display properly on a mobile […]

Misleading (or Lying) With Survey Statistics

Survey questionnaire design has a critical impact on the value of the data generated by the survey program. The survey scale design in particular can skew the responses generated to create almost perfect “top box scores.” This article examines the Ritz Carlton survey, a transactional survey, and shows how the scale design impacts the value of the findings.

Bribes, Incentives, and Video Tape (er, Response Bias)

Incentives are part of virtually every email request to take a survey. Recently, I saw this taken to its logical extreme. I was blatantly offered a bribe to give high scores. This article will cover the pros and cons of incentives. Incentives are a two-edged sword. But this experience highlights in the extreme the risks to validity inherent in offering incentives. Are the data real?

Why the Election Polls Were Wrong — Response Bias Combined with Non-Response Bias

Pollsters for the 2014 midterm elections in the US did not cover themselves in glory. This article explains the likely reasons, which appear to be a combination of response bias and non-response bias. Yes, that seems like a dialectic or a contradiction, but it’s not.

Survey Question Design: Headlines or Meaningful Information?

Surveys can be designed to generate meaningful information or to generate a compelling headline. This article examines the forced-choice, binary-option survey question format and shows how an improper design can potentially misrepresent respondents’ views.

Service Recovery at United Airlines

Effective service recovery at airlines should be second nature given how much practice they get, but at United the bromides were plentiful while meaningful explanations were sorely lacking. This article examines United’s service recovery efforts in the context of a service recovery model.

Employees Strike in Support of a CEO?

Employees strike against CEOs all the time, but have you ever heard of employees going on strike in support of a CEO? Employees at the Market Basket supermarket chain in greater Boston have done just that in support of Arthur T. Demoulas, and customers are supporting the boycott. This is an odd event.

Effortless Experience: Statistical Errors

The book “The Effortless Experience” presents impressive sounding statistics to show that the Customer Effort Score is a good predictor of customer loyalty. But should we believe the statistics? The authors use generous interpretations of some statistics and appear to perform some outright wrong statistical analyses. The mistakes cast doubt upon all the statistics in the study. This is the final review page of the book.

Effortless Experience: Questionnaire Design & Survey Administration Issues

The book “The Effortless Experience” posits that the Customer Effort Score is a good predictor of customer loyalty. This part of the review addresses the shortcomings of the research execution. The description of the survey execution leaves many unanswered questions, and any of these issues would seriously compromise the validity of the research data. By their own admission, the researchers do not know how to write good survey questions, but the issues go far beyond that.

Effortless Experience: Weak & Flawed Research Model

The book “The Effortless Experience” claims that the Customer Effort Score (CES) is a good predictor of customer loyalty. This part of the review addresses the shortcomings of the research model. Because the research model does not include measurements of actual customer loyalty behaviors, claims that CES is a good predictor of loyalty are greatly weakened.

“The Effortless Experience” Book Review

The book “The Effortless Experience” presents a great deal of “findings” to support their argument that companies should reduce customer disloyalty by creating more seamless experiences. The recommendations are logical and are likely to do no harm, but the authors are on very shaky ground claiming their research demonstrates a causal link between customer effort and loyalty – and that therefore the Customer Effort Score is a loyalty predictor.

Mixed-Mode Surveys: Impact on NPS and Survey Results

Survey data are affected by all aspects of the survey process. This article examines the impact of a mixed-mode survey — telephone and webform — using actual data from a B2B company’s transactional survey as the research study.

Telephone surveys garner higher scores than the identical web-form survey, caused by a scale-truncation effect. The differences between survey administration modes are amplified by the threshold effects in the “net scoring” statistic. Consumers of survey data, especially when doing cross-company comparisons, should understand the impact upon survey data resulting from differences in questionnaire design and survey administration practices before making decisions based on the survey data.
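To make the threshold effect concrete, here is a minimal Python sketch with hypothetical numbers (not the B2B data from the study). Shifting the same set of responses up one scale point, as a phone-versus-web mode effect might, moves the mean by less than a point but swings the net score dramatically because responses cross the promoter and detractor cut-offs.

```python
def net_score(ratings, promoter_min=9, detractor_max=6):
    """Net score on a 0-10 scale: % promoters minus % detractors."""
    promoters = sum(1 for r in ratings if r >= promoter_min)
    detractors = sum(1 for r in ratings if r <= detractor_max)
    return 100.0 * (promoters - detractors) / len(ratings)

# Hypothetical web-form responses on a 0-10 likelihood-to-recommend scale
web = [8, 8, 8, 8, 7, 7, 7, 6, 9, 10]
# The same respondents shifted up one point, mimicking a phone-mode effect
phone = [min(r + 1, 10) for r in web]

print(sum(web) / len(web), net_score(web))        # mean 7.8, net score 10.0
print(sum(phone) / len(phone), net_score(phone))  # mean 8.7, net score 60.0
```

In this toy example a shift of less than one point in the raw scores produces a 50-point jump in the net score, which is why cross-mode and cross-company comparisons of net scores deserve so much care.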

Local Icons Just Fade Away

Why is it that local companies with a loyal following so often falter when bought by an out-of-town company? In Boston in the fall of 2013 we lost Hilltop Steakhouse, an iconic landmark on the North Shore.

Net Promoter Score® Discussion at Customer Loyalty Forum

Great Brook recently kicked off the Customer Loyalty Forum in the Boston area. On March 10, 2010 we held our first breakfast meeting with over 30 people from Boston area companies discussing the merits of the Net Promoter® Score and other issues in capturing and applying customer feedback. It was a very lively discussion with agreement on many ideas but with different practices elicited across the companies in the group. Here are notes from the meeting which I’ve arranged by topic.

Net Promoter Score — Summary & Controversy

The Net Promoter Score is widely adopted and wildly controversial. What exactly is NPS and what are the various areas of controversy for using this survey question as a customer insight metric? This article provides a summary of the NPS concept and the critical concerns.

Have You “Met” Mayor Menino? Lots have!

Sometimes even well-honed survey questions don’t measure the underlying attribute that the survey designers want. A recent poll by the Boston Globe about Mayor Menino’s popularity shows a clear example of this. The question attempted to objectively measure how many people the Mayor has met, but the results — when thought through to their logical conclusions — show that the question was really measuring something else.

Survey Design Tips — SwissCom Pocket Connect Survey

SwissCom surveys its customers’ experiences with its Pocket Connect MiFi device, which I rented during a stay in Switzerland. This is a real-time video review of the survey as I took it. Learn about the good and bad points of the survey design practices.

Comcast Chat Survey Review — Tips for Survey Design

Comcast surveys its customers’ experiences with its chat support service, which I used when trying to find out the international calling rates. This is a real-time video review of the survey as I took it. Learn about the good and bad points of the survey design practices.

HHS Hospital Quality Survey

The US Department of Health and Human Services (HHS) conducts a survey of patient experiences. The survey results are used to determine, in part, reimbursement by government to specific hospitals based upon the quality of care. But does the survey truly measure hospital quality? This article examines some of the administration biases and instrumentation biases that are present in the survey — to my surprise. In fact, the most important question has the most serious design shortcomings.

Customer Insight Metrics: An Issue of Validity

Metrics that provide insight into customer loyalty are the holy grail of customer measurements. Several have been proposed in recent years, but whether they should be used as the basis for business decisions depends upon the validity of those metrics as true indicators of customer loyalty. This article discusses validity and reproducibility as the basis for evaluating customer insight metrics.

What Sprint Could Learn from Cat Litter

Sprint’s customer loyalty practices could learn a lot from a cat litter company. Really.

Sampling Error — And Other Reasons Polls Differ

The wide discrepancies across polling data raise the question about the sources of survey error. This article will discuss the different types of survey errors within the context of political polls. Even for those conducting feedback surveys for their organizations, lessons can be learned.

Money Grows on Trees — If You Believe the Polls

Political polls — as well as organizational surveys — many times present conflicting results within a poll. The reason is that the surveys have not been designed to force respondents to engage in trade-offs among conflicting options. We see this in the New York Times, CBS News poll of swing states released on August 23, 2012 where the poll indicates that respondents want to keep Medicare as we know it yet spend less on it. Clearly, something is amiss.

An Example of the Impact of Question Sequencing

The New York Times and CBS News released a nationwide poll on July 19, 2012 that conveniently ignores the impact of question sequencing and presents questionable interpretations of the data.

Caveat Survey Dolor: “Show Me the Questionnaire”

“Show me the Carfax” is one of those lines from a TV ad that frankly gets annoying after a while. My version of it is “Show me the survey instrument.” I annoy some organizations when I ask to see the survey instrument before I’ll even contemplate the findings derived from the survey. To most people, examining the instrument would seem an unnecessary annoyance. In this article I will show you why you should always go to the source and verify the validity of the data generated by the survey instrument.

Battling Survey Fatigue

Survey fatigue is a genuine concern for surveyors. As more companies survey, people’s willingness to take all these surveys plummets. Applying the golden rule of surveying can help you stand out, reduce the survey fatigue for your respondents, and increase survey response rates.

What Survey Mode is Best?

What survey mode is best to use? A recent Cint survey seemed to answer that question: 91% (sic) of respondents said they prefer web-based, smartphone, or SMS surveys. However, this survey suffers from a severe sample bias, given that the survey was likely conducted electronically. Just as response rates cannot be generalized across surveys done in different industries and different countries, so too the best survey mode or method cannot be generalized. Know your group of interest and which survey mode is best for that group.

Mind Your Ps and Qs — Processes & Queues

Queue management is an essential factor in determining customer satisfaction in a service operation. Normally, bad queue management results “just” in customer inconvenience or annoyance. This article reviews the queue management for the immigration service at Dulles International Airport, which is so bad that physical injuries could have resulted.

Bolton Local Historic District Survey — A Critique

The survey critiqued here shows how to support an argument with survey statistics. Rather than a survey that is an objective collector of data to understand public opinion, the survey design accomplishes the purpose of driving public opinion. This is achieved by the Bolton Local Historic District Study Committee through intentional instrumentation bias as described.

Was It “A Flight Attendant from Hell” or a Customer Who Should Be in Purgatory?

Are customers always right? A Wall Street Journal article raises the question of whether the customer or the service agent was the disagreeable party.

Pay-to-Play Laws: Good Government or Speech Infringement

Pay-to-play laws are meant to protect us against kickbacks in government contracts, but are they an unwarranted attack on First Amendment freedom of speech?

The Hidden Danger of Survey Bias

Perhaps the most frequently asked-about topic in the field of surveys, especially customer satisfaction surveys, is response rates. What’s a good response rate? What’s the statistical accuracy with a certain response rate? But what is sometimes missed is that the response rate should not be the only concern. Bias in the response group should be an equally important concern. Lots of responses will give good statistical accuracy, but that’s a false sense of security if there’s a sample bias. In particular, we’ll discuss survey fatigue and non-response bias.

Data Collection Form Design Issues

What can the American Recovery and Reinvestment Act (ARRA) of 2009 – also known as Stimulus 1 — teach us about surveying practices? You may think nothing, but you’d be wrong. We can see the need for thinking through the logic of the data collection form and the introduction for a form that is all about demographic data.

What’s the point? An Unactionable Transactional Survey

Transactional surveys are a vital cog for operational improvement and in customer retention. The HomeAway customer experience transactional survey doesn’t generate actionable data, including identifying customers in need of a service recovery event. This article reviews the shortcomings of this customer support transactional survey.

Constant Contact Customer Onboarding Processes

The most important impression you make on a customer is the first one. One company, Constant Contact, goes beyond the automated messages that lack all sincerity and really focuses on the onboarding process. This article describes their onboarding process as a critical first step in customer experience management.

Lost in Translation: A Charming Hotel Stay with a Not-So-Charming Survey

Charming, posh hotels may not have charming surveys if those surveys have been designed by people lacking significant survey questionnaire design experience. This article reviews an event survey for a stay at a hotel that is part of the Relais & Châteaux Association. We see many shortcomings in the survey design, some due to translation issues. This survey article is a collaboration with OmniTouch International’s CEO, Daniel Ord, who personally experienced this survey.

Communicate the Survey Findings — and Your Actions

Survey projects are not done with the presentation of results. You should take action on the findings AND communicate the actions to the group you’re engaging with the survey program. This article discusses how this will further engage the group you have surveyed for further feedback.

Responding to Survey Cries for Help

Surveys, especially transactional surveys, identify customers in need of a service recovery action. The worst thing you can do is ignore screams for help. Better to just forget doing the survey. This article discusses this most important communication element in a survey program.

An Honest Survey Invitation?

A survey invitation may make the first impression of a survey program for those in your respondent pool. A good impression is critical to getting a good survey response rate, but the invitation may present other critical information to the potential respondent. Most importantly, the invitation should be honest. The elements of a good survey invitation are presented in this article in the context of reviewing a poor invitation from HomeAway.com.

What Pollsters Can Teach Us — About Survey Practices

Political pollsters have a lot at stake in their polling predictions — their reputations. Many of their challenges are the same ones we confront in surveying our customers, employees, or members. Recent newspaper articles discuss some of the challenges pollsters are confronting, and we confront similar ones. This article presents some of these challenges to survey design and administration.

Satisfy, Don’t Delight

“Delight, don’t just satisfy” has been the mantra in customer service circles for many years; the underlying assumption was that satisfied customers are not necessarily loyal. Now a research project by the Customer Contact Council of the Corporate Executive Board argues that exceeding expectations has minimal marginal benefit over just meeting expectations. In essence, the authors argue that satisfaction drives loyalty more than the mysterious delight factors do. This article examines the argument, and specifically looks at the shortcomings in how the authors establish the loyalty link.

Survey Resource Requirements

Survey projects are frequently treated without the due diligence required to ensure success. They seem deceptively simple. Why the need to plan? That, as any project manager knows, is a recipe for failure. Some level of resource commitment is needed from an organization to provide a firm foundation for a successful survey project. This article examines those requirements from both monetary and personnel resource perspectives.

Reverse Engineer a Statement of Survey Research Objectives

A successful survey program starts with a firm foundation. Without a clearly articulated Statement of Research Objectives the survey project is likely to meander with disagreements among the players during the project especially about the questions to create for the survey instrument. Failure is more likely. This article outlines the critical elements that should be part of this first step in a survey project and how to reverse engineer the research statement from a currently used survey questionnaire.

Customer Experience Design — Do our designs bring out the best or the worst in our customers?

Customer Experience Design requires that we show respect for our customers. Lean Six Sigma systems bring out the best in employees by applying a sociotechnical systems thinking perspective. When applied to service operations, we must remember that customers are part of the production system. Incidents like the encounter between Steven Slater, the Jet Blue flight attendant, and a passenger raise the question of whether our service systems bring out the best or the worst in our customers.

An Insulting Yahoo Merchant Survey

Surveys are supposed to collect information from customers, but some surveys are so poorly designed that they just tick off the respondents. This Yahoo Merchant Survey is perhaps the most insulting survey we have ever encountered in how they waste the respondents’ time and use disparaging language.

Everyone in the Pool! Good for Parties But a Muddied Customer Service Design at United Airlines

“Everyone in the pool” may make for a good summer party, but it’s a lousy design for customer service organizations. Customer service at companies such as United Airlines and Comcast used the pooled approach to answer customer email inquiries. While the approach lends the impression of greater efficiency, this approach leads to more rounds of email exchanges, customer aggravation, and wasted effort on the part of the company. The failure lies in the performance metrics used to measure the organization.

Develop Products Right! Design for Supportability

During new product development, R&D engineers need to consider a wide range of often-conflicting requirements, including product features, cost, quality and manufacturability. Therefore, it is not surprising that support requirements are often neglected. However, support is an essential factor for achieving customer satisfaction in many industries, so ensuring that products are easy and economical to support is a priority. Customer support managers know this but often have problems convincing their R&D colleagues. This article looks at the issues involved in achieving Design for Supportability – the full evaluation of support requirements at the design stage.

Surveys’ Negative Impact on Customer Satisfaction

A long understood, but seldom followed, truism of organization design is that reporting for operational control and management control should not be mixed. Tools designed to provide front-line managers information about operational performance will become compromised when used for performance measurement. This is true for customer satisfaction surveys used for operational control, including Reichheld’s Net Promoter Score®. It was intended to be an operational control tool, but when used for performance measurement, we can see the deleterious effects. Customer feedback surveys are one element in an organization’s measurement systems, and the survey program needs to be considered in full context to be sure the measurements are not corrupted, and — more importantly — that the results of the survey program don’t create customer dissatisfaction in the process of attempting to measure customer satisfaction.

When Accountants Destroy Customer Loyalty

What’s more important when the economy goes into a downturn, capturing profit today or maintaining customer loyalty? Read about the Indian Lakes Conference Center’s focus on short-term profits and the resulting attrition of its customer base.

The Poetry of Surveys: A Respondent’s Survey Design Lessons

Survey designers ignore their respondents at their peril. The author recounts lessons from an ardent survey taker about how the design affected her respondent burden.

Genuine Customer Appreciation

Customer appreciation is one of those terms that is bandied about so much that it has lost its meaning. As a consultant in customer service and customer feedback management, I’m pretty cynical about these attempts at such marketing hype, so imagine my pleasant surprise when I actually felt appreciated as a customer, not from some major retailer with a fancy promotion or some high tech firm who had hired marketing wizards, but from my local restaurant, a pub in Hudson, Mass. Their secret sauce: genuineness.

Automated Phone Surveys

IVR surveys are a relatively new method for survey administration, best used for transactional surveys. In this article, we define IVR surveys and examine their strengths and weaknesses.

Complaint Identification: World Class CRM Surveys

Arguably the most important part of a customer satisfaction survey is not part of the survey process but rather what you do after the survey with the survey data. Companies, such as Sears and Home Depot, frequently miss the golden opportunity to identify customers at risk and engage in Service Recovery.

Sears IVR Customer Satisfaction Survey

Large, sophisticated companies, like Sears, can make devastating mistakes in their survey program design, creating customer disloyalty. Read this review of Sears IVR customer satisfaction survey for its multiple mistakes. You’ll come to understand why Sears has earned its reputation for poor customer service.

Hilton Hotel Customer Survey Program

Hotel surveys are typically transactional or event surveys, used for customer satisfaction measurement and improvement. A key challenge is how to increase response rates to generate actionable data through better survey design. Stephen Hardenburg of the Hilton Family of hotels discusses the challenges he faces managing Hilton’s customer satisfaction research programs.

University Help Desk Survey Creation

To create a survey program for any IT support center can be a challenge, but throw in the limitations faced by a university help desk and the issues can be daunting. Joyce Sandusky of Clayton State University led an effort to design a customer loyalty program for their help desk. She learned many hard lessons, even though she went into the project with her eyes wide open. We interviewed her for this article to get some insights and lessons for others who are undertaking the challenge of creating a help desk survey, especially for a university setting.

A Sweet Service Recovery Turned Sour

The essence of service recovery is fairness. A customer with a problem wants the company to treat him fairly. An intelligent company recognizes the long-term benefits of effective complaint handling and tries to deliver a fair resolution in all its dimensions. This article presents the three dimensions of fairness and shows how Subaru of America missed the mark badly on one dimension — thanks to the lawyers — turning sour an otherwise good service recovery and negating the benefits delivered on the other dimensions.

Measuring Service Effectiveness

Performance measurements help guide organizations to achieve their goals. Service organizations have conflicting objectives. They strive to be both efficient in resource use and effective in the quality of service as perceived by customers. But how to measure service effectiveness? This article presents a framework for service measurement — from call monitoring and mystery shopping to customer surveys, focus groups and complaint solicitation. The key finding is that a portfolio of approaches is needed to provide comprehensive and balanced measures of service effectiveness.

Hazards of Lean Service: Jet Blue’s Valentine’s Day Massacre

Lean production principles can lead to great efficiency improvements through waste (muda) elimination coupled with competitive advantages in lead times and quality. But a too lean service operation with no buffers for uncertainty can lead to disasters like Jet Blue’s Valentine’s Day Massacre. This article examines the uncertainties inherent in a service operation like airlines and explores the logical extent of lean application.

How to Select a Web Survey Tool

When deciding on an online web survey tool what are the key features to consider and how should you go about the selection process? This article reviews important features of online survey tools and the decision process for selecting survey software, such as SurveyMonkey or QuestionPro.

Tips for Selecting an Online Survey Software Tool

Online survey software tools can make it easy for novices to create a survey — or a bad survey! This article reviews the benefits from using web survey tools, such as SurveyMonkey or Zoomerang, including strong concerns about blind reliance upon Internet survey tools, which can result in invalid survey questionnaires and delusions of knowledge.

Home Depot Transaction Survey

Home Depot requests its customers to take a “brief survey about your store visit,” which is a classic example of an event-based survey. Perhaps you’ve taken it. However, this is not just an event survey. The survey design shifts into a relationship survey mode, which lengthens the survey greatly. Plus, the survey displays some questionable design practices. Read this critique.

Event Survey Practical Points

Summary of lessons learned from the Home Depot customer satisfaction survey about the proper use of event surveys.

Customer Experience Management — By Design

Customer Experience Management (CEM) is a catch phrase for business processes that focus on delivering customer experiences that enhance the perceived value of a company’s products and services. Superior CEM requires good process design and solid execution — and a supportive organizational culture. This article presents examples of Good, Bad, and Ugly CEM, concluding with key lessons.

Tech Support as Training Ground: How MathWorks Finds Strategic Value in Customer Support

MathWorks recognized one of customer support’s strategic roles for a company: providing a training ground for new, entry-level employees. By delivering support, the employees learn the products quickly — and they learn customer sensitivity. Read how they make this program work for the employees, the company, tech support, and the customer.

The One Number You Need to Know: (Actually There’s More Than One)

Frederick Reichheld’s Harvard Business Review article, “The One Number You Need to Grow,” may lead readers to believe that a one-question survey design that measures “net promoter score®” will provide all the data needed to grow a business. However, digging deep into the article we find that a meaningful customer feedback program also has to identify the root causes for a customer’s unwillingness to recommend a product or service. This article highlights those points and shows that more than one number is needed to grow a business.

Lessons Learned from Service Recovery (Or Should Have Been Learned)

Effective service recovery requires that robust complaint solicitation and complaint handling systems be built and executed consistently. Part of that “system” is a pervasive corporate culture that supports service recovery. This article reviews the author’s experiences at two hotels, a high-end Marriott and Ritz Carlton, both extremely well known for service and service recovery, and how they bungled customer service issues. The lessons from these experiences are pertinent for most any business.

A Sporting Service Recovery

Service recovery is more than just complaint handling. It is recovering a customer’s positive feelings about a company after a bad experience and taking action to resolve the root cause of the problem. This article looks at service recovery as practiced by the Toronto Maple Leafs and the Seattle SuperSonics and how they create Fan Delight and avoid Fan Outrage.

The Support Bullpen

This article chronicles the similar evolution of the role of relief pitchers in baseball with the role of support organizations. Really. Baseball and strategic customer support. Plus, an ode to Dick Radatz.

Delusions of Knowledge — The Dangers of Poorly Done Research

Imagine you’re planning to conduct a survey to support some key business decision-making process. But you have to get results fast. So, you throw together a quick survey since something is better than nothing. Right? Wrong. Why is this wrong? Because this knee-jerk survey process may contain biases and errors that lead to incorrect and misleading data. Thus, the decisions […]

Survey Question Types: How Question Format Affects Survey Analysis

Choice of the survey question types used in a questionnaire is a critical design decision. The survey question type determines the type of data generated, which in turn determines the type of analysis you can do with the survey data collected. No one best survey question type exists. The appropriate question type is the one that best generates valid, reliable data to answer your research question.

Localized Survey Scales

In my Survey Workshops, I discuss the need to localize survey instruments if they will be delivered internationally. Well, during an exercise where students build a scale, a Texan created a scale regionalized to Texas. Have your own localized contribution?

How Broad, How Deep: Lessons from a CRM Implementation

When a company chooses to implement a CRM system, several critical questions must be addressed. This article outlines a case study where the initial approach was to isolate certain sales and marketing functions as the target for the CRM implementation. Midway through the project, the company recognized that all aspects of the customer order, fulfillment, and support processes needed to be considered even if not all would be re-engineered in the initial stages of the CRM project. They also recognized that certain front-office processes were more logical candidates for the initial re-engineering.

Survey Sample Selection: The Need to Consider Your Whole Research Program

A counterintuitive result of a survey project is that a survey’s results are likely to pose as many new questions as they answer. An email to me recently posed a dilemma that can result from a particular approach to a survey program. The company had conducted a survey and now wanted to ask some follow-up questions based upon the learning from the first survey. The question is: whom should they invite to take the follow-up survey?

Survey Scale Design: The Impact on Performance Measurement Systems

The choice of a survey scale impacts setting performance goals. Scale choice and question wording will affect the way people respond. The article also discusses why (artificially) high scores are not necessarily good — if your goal is to use the survey results for continuous improvement projects, requiring Pareto Analysis.

Setting Goals from Survey Data: The Conundrum of Statistics

Reporting the results of a customer survey is a critical step, and different statistical approaches can be applied to survey data. This article discusses the implications of reporting means versus “top box” cumulative distribution scores, especially as it relates to setting performance goals.
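As a quick illustration of why the choice matters, here is a minimal Python sketch with hypothetical 1-to-5 ratings. The two groups below are ranked differently depending on whether you report the mean or the top-box percentage, which is exactly the kind of discrepancy that complicates goal setting.

```python
def top_box_pct(ratings, top_box=5):
    """Percentage of responses at the top point of a 1-5 scale."""
    return 100.0 * sum(1 for r in ratings if r >= top_box) / len(ratings)

def mean(ratings):
    """Arithmetic mean of the ratings."""
    return sum(ratings) / len(ratings)

# Two hypothetical sets of responses on a 1-5 satisfaction scale
group_a = [5, 5, 5, 5, 3, 3, 3, 3, 3, 3]  # polarized responses
group_b = [4, 4, 4, 4, 4, 4, 4, 4, 4, 4]  # uniformly "good" responses

print(mean(group_a), top_box_pct(group_a))  # 3.8, 40.0% top box
print(mean(group_b), top_box_pct(group_b))  # 4.0,  0.0% top box
```

Group B looks better on a mean-based goal while Group A looks better on a top-box goal; whichever statistic you pick will shape the behavior your performance goal drives.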

What’s My Line

You may recall the game show by the name of this article. A panel of celebrities would try to guess someone’s profession by asking yes or no questions. Imagine if one of your customer service employees were on the show. Would you be able to guess their profession? It likely depends upon how narrowly or broadly their jobs have been defined. This article shows the impact of the two extremes, narrow and broad job definitions.

Generate Actionable Survey Data

The goal of a survey project should be to generate data that will be used to monitor or improve some business process. Generating actionable data from a survey is not a simple task. This article outlines some ways to get more useful data from a research program that includes surveys.

True Customer Care

Many organizations call themselves Customer Care. But do the organizations truly care about their customers? This article relates a personal example of deep customer care in the most trying of circumstances.

Survey Statistical Confidence: How Many is Enough?

Response rates and statistical confidence are always concerns for those tasked with a survey project. This article outlines the factors that determine the number of responses needed for a level of accuracy and shows how to determine the statistical confidence that can be placed in survey results.

The Importance of Measuring Importance — Correctly — on Customer Surveys

Frequently, when business surveys try to measure the importance of various factors, the survey generates useless data. Everything gets rated as important, so nothing is important. This article covers methods of measuring importance, showing the advantages and disadvantages of each. The key is getting the respondent to think about the trade-offs across the factors.

Computer Owners’ Bill of Rights

Is the legislation contemplated by Senator Mark Dayton of Minnesota really needed to correct bad support practices? Should this be a wake-up call for the support industry?

Agathas in Our Midst (Can Lead to Better Products)

Steven Spielberg’s 2002 movie, Minority Report, starring Tom Cruise, presents a world in which homicides are prevented before they happen. Three precognitives’ visions of near future events allow the Department of Precrime to capture these intended killers. Neat idea. If only we in the business world could spot product quality “crimes” before they occur — rather than reacting to problems once customers use our products. Product designers and product managers are tasked with projecting the future, but is there an organizational equivalent for Agatha, the precog who sometimes saw the future differently from her mates, Arthur and Dashiell? Also published in the April 28, 2003 edition of the Boston Business Journal.

An Old Dog’s New Trick: Postal Mail Surveys

Everyone wants to do surveys on the web as the new thing, but perhaps the old techniques still have some life. This article outlines the experiences of one of my students who performed a postal mail survey — and got a response rate over 60%!

Keys to a Successful Survey Project

This article outlines the 7 key elements that are essential to success in a survey project.

Capturing the Value of Customer Complaints

We usually think that our most valuable customers are the ones that buy a lot from us — and that’s true! — but to learn about your business practices, your best customers may be the ones that walk away from you. Of course, this value is only realized if you capture their feedback! Also published in the November 8, 2002 edition of the Boston Business Journal.

Customer Surveying for Small Business: Why Bother?

Common wisdom is that only large companies need customer-research programs. After all, small companies have their feet on the ground and know what customers are thinking, right? On the surface, that seems a reasonable attitude for small-business managers to take. But ask yourself if there is some percentage of customers defecting to competitors each year for part or all of their purchases. Also printed in the July 26, 2002 print edition of the Boston Business Journal.

Customer Satisfaction Surveys: The Heart of a Great Loyalty Program

This article outlines key elements of a customer satisfaction survey and its role in a customer loyalty program. Originally published in the September 1999 issue of Customer Support Management.

Reengineering the Support-Development Interface

A synopsis of Dr. Fred Van Bennekom’s research in this area.

The Color of Feedback

How could buying paint provide lessons on capturing customer feedback? See how. “Viewpoint” article, November 1998 ServiceNews.

Challenging Support Paradigm: Problem Rectification or Problem Prevention?

“Viewpoint” article, February 1997 ServiceNews.