Summary: Pollsters for the 2014 US midterm elections did not cover themselves in glory. This article explains the likely reasons, which appear to be a combination of response bias and non-response bias. (Yes, that seems like a duplication or a contradiction, but it’s not.)

~ ~ ~

With the 2014 US midterm elections over, the political parties are trying to decipher the message(s), but so are the pollsters. Many races deemed close ended in solid victories or near landslides (the Kansas, North Carolina, and Georgia Senate races), and several races had surprise winners (the Virginia Senate and Maryland governor races). Why so many mistakes? The likely reasons appear to be a combination of response bias and non-response bias.

While those may sound the same, they’re really very different. I’ll explain each in context and show their relevance to organizational surveys.

The pollsters’ main challenge is to identify what the voting electorate will look like. Polls include questions about the respondent’s likelihood to vote or whether they voted in the last election(s). These questions are fraught with response bias, which is the bias that the respondent brings to the surveying process.

Voting is a civic responsibility. Asking someone about their intention to vote is akin to asking how many dogs they’ve kicked in the past week. People will give an answer that fits societal norms, a conformity (or social desirability) response bias: “Of course I voted and intend to vote.”

In fact, many of those people didn’t vote. Only slightly more than a third of eligible voters did. (In part that was because some large states, notably California and New York, had top-of-ticket gubernatorial races that weren’t competitive.) Not voting creates a participation or non-response bias. People who don’t vote (and an election is, in effect, a survey) have no voice in the final results, and to the extent their views differ from those who did vote, their absence biases the outcome.

In fact, every political campaign has a strategy to create a participation bias in its favor: suppress turnout for the opponent and drive turnout of its own base beyond its share of the populace. Politicians do not want a representative survey on election day.

Tips for a Successful Survey

Request Chapter 1 of the new edition of our Survey Guidebook for key points in a more effective survey program.


How do response bias and non-response bias apply to our organizational surveys?

If we have low response rates, we likely have a non-response bias in our survey results. How did those who chose not to participate actually feel? Well, we don’t know since they chose not to participate. It’s a bit of a Catch-22.
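To make that concrete, here’s a minimal Python sketch (all numbers are invented) of a survey where less-satisfied people are less likely to respond:

```python
import random

random.seed(42)

# Hypothetical population: satisfaction scores on a 1-to-5 scale.
population = [random.choice([1, 2, 3, 4, 5]) for _ in range(10_000)]

# Assumed response propensities: the happier someone is, the more
# likely they are to answer the survey.
PROPENSITY = {1: 0.05, 2: 0.10, 3: 0.20, 4: 0.35, 5: 0.50}

respondents = [s for s in population if random.random() < PROPENSITY[s]]

true_mean = sum(population) / len(population)
observed_mean = sum(respondents) / len(respondents)

print(f"Response rate: {len(respondents) / len(population):.0%}")  # ~24%
print(f"True mean:     {true_mean:.2f}")                           # ~3.0
print(f"Observed mean: {observed_mean:.2f}")                       # ~3.9, inflated
```

The respondents look considerably happier than the population they came from, and nothing in the survey data itself reveals the gap. That’s the Catch-22 in code.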

I’ve heard people claim that those who didn’t respond must be satisfied. I even saw a LinkedIn discussion propose that non-respondents should be included as “Passives” in the calculation of Net Promoter Scores! Such logic flies in the face of the very concept of surveying, and no professional surveyor would ever recommend it. (If they did, they’d better carry a lot of professional incompetency insurance.)
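The arithmetic shows why. NPS is the percentage of Promoters minus the percentage of Detractors among those who actually answered. Here’s a short sketch with hypothetical numbers showing what happens when the silent majority is shoved into the denominator as “Passives”:

```python
def nps(promoters: int, passives: int, detractors: int) -> float:
    """Net Promoter Score: % Promoters minus % Detractors."""
    total = promoters + passives + detractors
    return 100 * (promoters - detractors) / total

# Hypothetical survey: 1,000 invitations, 200 responses.
promoters, passives, detractors = 80, 60, 60
non_respondents = 800

print(nps(promoters, passives, detractors))                    # 10.0
# Counting non-respondents as Passives dilutes both percentages
# toward zero, dragging the score toward 0 no matter how the
# silent 800 actually feel.
print(nps(promoters, passives + non_respondents, detractors))  # 2.0
```

The second score says nothing about the non-respondents; it simply drags any score, good or bad, toward zero.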

We may also have some types of response bias affecting our survey results. Ever taken a company survey that asked about an employee’s performance when you knew the person wasn’t responsible for your experience, because the problem was systemic and out of the person’s control? Did that change how you answered the question? Arguably, that would be a response bias born of concern for the employee.

I had that experience recently with a United Airlines survey. It asked about my satisfaction with the length of time for the agent to perform the service. It took some time, but the delays were due to slow responses from the code share partner’s systems. I didn’t want the agent to take a hit for something beyond her control. So, I altered my response. That’s a type of response bias.

So how did response bias and participation bias affect the pollsters in so many races?

Either the electorate shifted its views significantly in the final week, or the polls predicted higher turnout among Democrats than actually materialized. The latter seems most likely. In other words, a response bias in the polling questions (respondents overstating their likelihood to vote) skewed the pollsters’ prediction of the participation bias on election day.
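Here’s a toy Python illustration (every figure is hypothetical) of that mechanism:

```python
# Hypothetical poll: 500 Democratic-leaning and 500 Republican-leaning
# respondents, each asked how likely they are to vote.
DEM_N, REP_N = 500, 500

# Response bias: both groups overstate their likelihood of voting,
# so the likely-voter screen sees an evenly split electorate.
STATED_DEM, STATED_REP = 0.75, 0.75

# Actual turnout (the participation bias): in this hypothetical,
# Republican-leaning voters turned out at a higher rate.
ACTUAL_DEM, ACTUAL_REP = 0.30, 0.40

def dem_share(dem_rate: float, rep_rate: float) -> float:
    """Democratic share of the electorate implied by the turnout rates."""
    dem_votes = DEM_N * dem_rate
    rep_votes = REP_N * rep_rate
    return 100 * dem_votes / (dem_votes + rep_votes)

print(f"Predicted Democratic share: {dem_share(STATED_DEM, STATED_REP):.1f}%")  # 50.0%
print(f"Actual Democratic share:    {dem_share(ACTUAL_DEM, ACTUAL_REP):.1f}%")  # 42.9%
```

The stated intentions look identical across the two groups, so the pollster predicts a toss-up; the actual turnout gap turns it into a roughly seven-point miss.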

These people are seasoned pros. Imagine how many biases lurk in the surveys done by novices using SurveyMonkey and the like.

Survey Sample Size Calculator

Get our Excel-based calculator. It can also be used to gauge statistical accuracy after the survey has been completed.

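(For readers who prefer code to spreadsheets, here’s a minimal Python sketch of the standard formulas a calculator like this typically implements; the Excel tool may differ in its details.)

```python
import math

def sample_size(population: int, margin: float = 0.05,
                z: float = 1.96, p: float = 0.5) -> int:
    """Responses needed for a proportion at the given margin of error,
    with a finite-population correction (z=1.96 -> 95% confidence)."""
    n0 = z**2 * p * (1 - p) / margin**2          # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))

def margin_of_error(population: int, responses: int,
                    z: float = 1.96, p: float = 0.5) -> float:
    """Post-survey accuracy: margin of error given the responses received."""
    fpc = math.sqrt((population - responses) / (population - 1))
    return z * math.sqrt(p * (1 - p) / responses) * fpc

print(sample_size(2_000))                    # 323 responses for +/-5%
print(f"{margin_of_error(2_000, 323):.3f}")  # ~0.050, i.e., +/-5%
```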