S — Survey Glossary
Sample, Sampling: A subset of the population selected to receive invitations to participate in the survey. Sampling may be either probabilistic (every member having a known, non-zero chance of selection) or non-probabilistic.
Article: Survey Statistical Accuracy Defined
Sample Bias: Results when creation of the sampling frame excludes a certain type of person, leading to bias in the selected sample and thus in the respondent sample. For example, tasking contact center agents to invite customers to participate in a survey of their experience will likely mean that interactions that did not go well do not receive the invitation. Very similar to a selection bias.
Articles: Creating a Survey Program for a University Help Desk, What Survey Mode is Best?, Impact of Mobile Surveys — Tips for Best Practice
Sample Size Equation: The number of responses needed for the desired statistical accuracy divided by the likely Response Rate. The result is the number of people to include in the Invitation Sample.
Articles: Survey Sample Size Calculator, Statistical Confidence in a Survey: How Many is Enough?
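The equation above can be sketched in Python. This is a minimal illustration, not taken from the articles: the 95% confidence level, ±5% margin of error, and 20% response rate are assumed example values.

```python
import math

def required_responses(confidence_z: float = 1.96, margin: float = 0.05,
                       p: float = 0.5) -> int:
    """Responses needed for a large population, using the standard
    z-score formula with the most conservative proportion p = 0.5."""
    return math.ceil(confidence_z ** 2 * p * (1 - p) / margin ** 2)

def invitation_sample_size(needed_responses: int, response_rate: float) -> int:
    """Sample Size Equation: invitations = needed responses / response rate."""
    return math.ceil(needed_responses / response_rate)

n = required_responses()                 # 385 responses for 95% conf., +/-5% margin
print(invitation_sample_size(n, 0.20))   # at a 20% response rate, invite 1925 people
```

Note that the response rate appears in the denominator: the lower the expected response rate, the larger the invitation sample must be to reach the same accuracy.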
Sample Statistics: Calculated statistics from the Respondent Sample which are presented as indications of the Population Parameters. In other words, the sample statistics are indications of how the entire group of interest would have responded, with some degree of sampling error — and probably some degree of sample and selection bias.
Sampling Error: The error introduced into our survey results by the fact that the data are from a sample rather than from the entire population. The difference between the sample mean and the population mean results from sampling error. Statistical accuracy provides an indication of the level of sampling error.
Sampling Frame: A subset of the population from which the sample is drawn. Sampling frames are used where it is not possible or impractical to draw the sample from the entire population. For example, if we do not have contact information for some people in the population, then the sampling frame would exclude those names lacking contact information.
Article: Caveat Survey Dolor: “Show Me the Questionnaire”
Satisficing: A concept from decision theory where people select a “satisfactory” answer, but not the optimal answer. When presented with a long list of options from which to select, which can cause “fatigue,” the respondent may select the first good-enough answer to save time and move on.
Scales: In common survey usage, a scale is an ordered series of response options, presented verbally, numerically, or ideographically, from which the respondents select to indicate their level of feeling about the attribute being measured. More properly, a scale is a composite score of a number of survey questions that each measure the same attribute. For example, a final exam for a class is a scaled score from multiple questions that measure the student’s knowledge of the subject matter.
Articles: Survey Question Choice: How Question Format Affects Survey Data Analysis, Scale Design in Surveys: The Impact on Performance Measurement Systems — and the Value of Dispersion
Scatter Plots: A visual depiction of the correlation between two variables, found by plotting one variable on the X-axis and the other on the Y-axis.
School-Grade Scale: A response scale that uses school grades — A, B, C, D, and F — as the response options. The scale is culturally dependent.
Section Headings: Short headings, perhaps with brief accompanying text, that lead into a section of a survey. These help set the respondent’s mental frame for the questions that follow.
Article: Proper Use of Section Headings
Segmentation Analysis: In addition to analyzing the survey data set as a whole, analysis is typically desired for specific segments of the population, found by slicing the data set along demographic variables. Common segments are: by income level, by gender, by length of relationship (customer, employee, etc.), and by frequency of experiencing some phenomenon.
Selection Bias: Results when some members of the sampling frame are less likely to participate in the survey due to the way the sample is generated. For example, a webform survey designed for use on a laptop only — not on a mobile device (smartphone) — may result in people who primarily use a mobile device not participating. Very similar to a sample bias.
Articles: Impact of Mobile Surveys — Tips for Best Practice, What Survey Mode is Best?, Creating a Survey Program for a University Help Desk
Self-Selection Bias (also known as Non-Response or Participation Bias): Occurs when members of the Invitation Sample who share some common characteristics choose not to participate in the survey, resulting in a biased Response Sample.
Articles: Bolton Local Historic District Survey, The Hidden Danger of Survey Bias, Why the Polls Were Wrong — Response Bias Combined with Non-Response
Semantic Differential Question: A question type where the respondent is presented a multi-point scale in a horizontal numerical structure whose endpoints are verbally anchored with antonyms. Few webform tools support this question type. Very useful in handling reverse-coded questions.
Semi-Structured Questionnaire: A fluid questionnaire used to guide the discussion in focus groups and interviews. Good moderator skills are needed to apply such a questionnaire.
Sequencing: A response effect where the answer to one question affects the respondent’s interpretation of subsequent questions. Also seen for the sequencing of items in a checklist question. Question randomization can mitigate the effect.
Articles: Survey Question Design: Headlines or Meaningful Information?, Money Grows on Trees — If You Believe the Polls, An Example of the Impact of Question Sequencing
Service Recovery: An action program to address issues that customers have experienced, in order to win them back as customers or to mitigate negative word of mouth. Transactional surveys serve to identify customers in need of a service recovery act.
Articles: Capturing the Value of Customer Complaints, Service Recovery at United Airlines, Complaint Identification: A Key Outcome of World Class CRM Surveys, Service Recovery Turned Sour: Keep the Lawyers from Turning Fairness Foul, Lessons (that should have been learned) from Service Recovery, A Sporting Service Recovery, Communicate the Survey Findings — and Your Actions, Survey Project Resource Requirements, Sears IVR Customer Satisfaction Survey, Hilton Hotel Customer Survey Program, Swisscom Pocket Connect Survey Design Review
Skip and Hit (also known as Branching and Conditional Branching): Where the question path a respondent follows is based upon the response to a particular question. Useful for sparing respondents questions that do not apply to them.
Articles: Creating a Survey Program for a University Help Desk
Spam Trigger Words: Words that email filters use to flag possible spam emails. Avoid using these words in an email invitation, lest the invitation never get to the potential respondent’s inbox.
Stacked Bar Charts: Vertical or horizontal bar charts that display frequency distributions from a survey question. Best used when there’s a limited number of response options for the question.
Standard Deviation: Square root of variance. A measure of the dispersion of values in the data set around the mean.
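A minimal Python sketch of the definition above, using hypothetical 5-point scale scores (Bessel’s correction for sample data is an added detail, not from this glossary):

```python
import math

def standard_deviation(values, sample=True):
    """Square root of the variance: dispersion of values around the mean.
    For a sample, divide by n - 1 (Bessel's correction); for a full
    population, divide by n."""
    mean = sum(values) / len(values)
    squared = sum((v - mean) ** 2 for v in values)
    n = len(values) - 1 if sample else len(values)
    return math.sqrt(squared / n)

scores = [4, 5, 3, 5, 4, 2, 5]        # hypothetical 5-point scale responses
print(round(standard_deviation(scores), 2))  # 1.15
```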
Statistic: A number meant to represent some characteristic of a larger data set.
Statistical Confidence: Commonly used interchangeably with statistical accuracy, it tells us how well a statistic from a sampling process represents the corresponding population parameter.
Stratified Random Sampling: A variation of random sampling to generate the invitation sample used to help ensure a consistent statistical accuracy across demographic segments of comparative interest, for example, product line, business region, school district. Sampling here is a two stage process. First, stratify (group) the members of the sample frame by the demographic variable. Second, random sample within each stratum.
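The two-stage process above can be sketched in Python. The region values and per-stratum count are assumed example inputs, not from this glossary:

```python
import random
from collections import defaultdict

def stratified_sample(frame, stratum_key, per_stratum, seed=42):
    """Two-stage sampling: (1) stratify the sampling frame by a
    demographic variable, (2) simple random sample within each stratum."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for member in frame:                    # stage 1: group by demographic
        strata[stratum_key(member)].append(member)
    sample = []
    for group in strata.values():           # stage 2: random sample per stratum
        sample.extend(rng.sample(group, min(per_stratum, len(group))))
    return sample

frame = [{"id": i, "region": r} for i, r in enumerate(["East", "West"] * 50)]
picked = stratified_sample(frame, lambda m: m["region"], per_stratum=10)
print(len(picked))  # 20: ten invitations from each region
```

Drawing the same count from each stratum equalizes statistical accuracy across segments, even when the strata differ in size.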
Survey: The process of conducting research using survey methodology, though in common parlance “survey” is used to mean the survey instrument or questionnaire.
Survey Administration: The process of inviting people to take the survey and collecting data from them. Care must be taken to avoid introducing administration biases into the data set. See “Administration” for other related items.
Survey Instrument (also known as Questionnaire): The measuring instrument used to gauge how some group feels on the topic of the research study.
Survey Fatigue: When members of the population become so weary of repeated survey invitations that they stop taking the survey, except perhaps when they have an issue. Can introduce a participation bias (a.k.a. non-response or self-selection bias). With more organizations surveying, fatigue has increased.
Articles: Survey Sample Selection: The Need to Consider Your Whole Research Program, Battling Survey Fatigue, The Hidden Danger of Survey Bias
Survey Manipulation: The practice, by those affected by the survey findings, of trying to influence the scores respondents provide. The manipulation may be overt or subtle, but it distorts the true views of the respondents to achieve other goals.
Survey Question: A measuring tool to generate data that will measure some attribute (or characteristic) of interest in the research study. The wording in the question operationalizes the attribute so that the respondent can present their views through some measurement scheme, such as a scale or checkbox. Various question phrasing and question types could be used to measure the same underlying attribute.
Articles: Survey Question Choice: How Question Format Affects Survey Data Analysis
Systematic Sampling: One of the four probabilistic sampling approaches, generally used when the sampling frame is in the form of a list or of people queued in a line. Names are selected at a fixed interval (say, every 8th name) going through the entire list.
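A minimal Python sketch of the interval selection described above; the list of names and the interval of 8 are assumed example inputs, and starting from a random offset within the first interval is a common refinement, not stated in this glossary:

```python
import random

def systematic_sample(frame, interval, seed=None):
    """Select every `interval`-th name from the list, starting at a
    random offset within the first interval."""
    start = random.Random(seed).randrange(interval)
    return frame[start::interval]

names = [f"person_{i:03d}" for i in range(100)]
chosen = systematic_sample(names, interval=8, seed=1)
print(len(chosen))  # roughly len(names) / interval selections
```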