1989 – 2014: 25 YEARS IN MARKET RESEARCH. Your Opinion Counts

July 1989

Overall refusal rates of 38 percent in the 1980s were considered cause for a public education effort. A July 1989 Quirk’s article, “Study tracks trends in refusal rates,” explains how six market research industry associations joined together for the “Your Opinion Counts” Public Education Program.

July 2014

Perhaps the “Your Opinion Counts” campaign worked: according to Pew Research, the response rate for a typical telephone survey was still a relatively healthy 36% in 1997. Fast forward fifteen years and response rates have plummeted. According to Pew’s 2012 study, “Assessing the Representativeness of Public Opinion Surveys,” the response rate for a typical telephone survey had fallen to just 9%. The study found that response rates declined across nearly all types of surveys.

With greater effort and expense, the study showed, response rates could be increased. Pew’s study is based on two national telephone surveys: one used the organization’s standard methodology, while the other was conducted with multiple efforts to increase participation. The difference? “The high-effort survey employed a range of techniques to obtain a higher response rate (22% vs. 9% for the standard survey) including an extended field period, monetary incentives for respondents, and letters to households that initially declined to be interviewed, as well as the deployment of interviewers with a proven record of persuading reluctant respondents to participate.” Lockwood Research uses all of these techniques (when budget and time allow) to increase response rates, including sending a letter or postcard before data collection begins, rather than only after a respondent has declined.

What if your budget and/or timeline don’t allow for a more “rigorous” survey that employs measures to increase the response rate? What are the consequences of lower response rates for the quality of your research findings? Does your sample still replicate the characteristics of your target population? The good news is that the two surveys, despite their very different designs, found little variation (two percentage points) between the “standard” and “rigorous” results. According to the article “Gauging the Impact of Growing Nonresponse on Estimates from a National RDD Telephone Survey,” “These results provide compelling evidence that response rate is not necessarily an indicator of survey quality. It is unclear, however, under what circumstances – or for how long – this finding will hold.” The study also isolated the “hardest-to-reach” respondents from the “rigorous” sample to determine whether and how their responses differed, and whether adding them to the sample improved the quality of the data. It did not. So, for the extra cost and effort required to reach these individuals, little was gained in terms of data quality.

Survey nonresponse is often thought of as covering only those who are never reached and those who refuse to participate. But terminated interviews are also included: these are respondents who start the survey and then quit before the interview is complete. It is not surprising that the most notable difference between this segment and those who complete the entire survey is that those who initially consent and then terminate are more likely to give “don’t know” responses. This group tends to be less interested and engaged in the survey topic. It is for this reason that Lockwood Research interviewers do not try to “hard sell” or coerce respondents into participating. If someone has little knowledge of the topic, including them in the sample simply increases the percentage of “don’t know” responses, which does not seem an efficient use of the client’s resources.
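
To make the bookkeeping concrete, here is a minimal sketch of how counting (or not counting) terminated interviews changes a response-rate calculation. All of the counts below are hypothetical, and the formula is deliberately simplified; in practice the industry uses several standard variants (AAPOR’s Standard Definitions describe response rates RR1 through RR6).

```python
# Minimal sketch with hypothetical counts (not real survey data).
# A deliberately simplified response-rate calculation; AAPOR's
# Standard Definitions describe several more nuanced variants.

never_reached = 600  # numbers dialed, no contact ever made
refusals      = 300  # contacted, but declined to participate
terminations  = 40   # started the interview, quit before finishing
completes     = 60   # finished the entire interview

attempted = never_reached + refusals + terminations + completes  # 1000

# Strict view: only fully completed interviews count as responses.
rate_completes_only = completes / attempted

# Looser view: partial (terminated) interviews count as responses too.
rate_with_partials = (completes + terminations) / attempted

print(f"Completes only: {rate_completes_only:.1%}")  # 6.0%
print(f"With partials:  {rate_with_partials:.1%}")   # 10.0%
```

As the sketch suggests, whether terminations are treated as responses can move the reported rate noticeably, which is one reason a response rate on its own says little about data quality.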

The willingness of potential respondents to participate in a survey, incentivized or not, is a primary concern of all researchers. The good news gleaned from Pew’s experiment, which compared a “standard” and a “rigorous” approach to collecting data from two samples, is that lower response rates did not appear to diminish the quality of the data. While every effort should be made to achieve the highest possible response rate and to include as many members of the sample frame as possible in the survey project, the added time and budget also have to be weighed, and sometimes the extra effort isn’t justified.