American Association for Public Opinion Research

Automated Polls

Automated polls – telephone polls that employ a recorded voice in place of a live interviewer – go by many names, including robopolls and interactive voice response (IVR) polls. These polls make use of a recorded voice to ask questions of respondents, who in turn respond by pressing a key, speaking the number that corresponds to their answer, or simply saying their answer. Different methods exist for connecting potential respondents to the recorded voice. For example, some organizations use automatic dialing-announcing devices, other organizations have respondents dial in directly (say when they have been contacted in another mode, such as mail or Internet), and still others use a recruit-and-switch method in which potential respondents are contacted by a live interviewer and then switched to the recorded voice. Automated polling has proliferated in the last decade, largely due to the lower costs and fast turnaround time associated with collecting data using this method.
Advantages and Disadvantages
Journalists and other consumers of polls should understand the potential advantages and disadvantages of this methodology (Currivan 2008). Polling experts caution that automated surveys face several potential pitfalls (Blumenthal 2005). First, specific state and federal laws restrict, and in some cases prohibit, certain kinds of automatic calling. For example, it is illegal to call mobile phone numbers using automated dialing methods. This could mean that the growing number of Americans who are reachable only by mobile phone will not be represented in the sample unless they are dialed manually.
In comparison with surveys that use a live interviewer, response rates are likely to be much lower for automated surveys. Further, even when respondents begin answering questions, they are much more likely to break off and not complete the interview (Tourangeau et al. 2002). Thus, error due to nonresponse may affect the accuracy of the poll. It is also difficult to randomly select a respondent within the household without the help of a live interviewer, and without random selection within households, the sample may not accurately represent the target population. Automated polls typically interview whoever answers the phone and then “weight” or statistically adjust the data after collection to conform to specific demographic characteristics of the target population (such as age and gender). It should be noted that live-interviewer phone polls are also typically weighted. The absence of a live interviewer also makes it difficult (or in some cases impossible) for respondents to have questions repeated or to obtain clarification of a word or phrase that they do not understand. In addition, without an interviewer to help motivate the respondent and record answers, long interviews and those with open-ended questions are not practical.
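The weighting step described above can be illustrated with a minimal sketch. All numbers below are hypothetical: the sample counts, the single demographic variable (age group), and the population shares are invented for illustration, and real polls adjust on several variables at once.

```python
# A minimal sketch of post-stratification weighting: each respondent in a
# demographic cell gets weight = (population share) / (sample share).
# Sample counts and population shares here are hypothetical.
from collections import Counter

# Hypothetical sample of 100 respondents, tagged by age group
sample = ["18-34"] * 10 + ["35-54"] * 30 + ["55+"] * 60

# Hypothetical population shares for the same age groups
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

counts = Counter(sample)
n = len(sample)

# Per-cell weight: population share divided by the cell's sample share
weights = {cell: population_share[cell] / (counts[cell] / n)
           for cell in counts}

for cell in sorted(weights):
    print(f"{cell}: weight = {weights[cell]:.2f}")
```

Underrepresented groups (here, the 18-34 cell) receive weights above 1, so their answers count for more in the final estimates; overrepresented groups are weighted down.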
Conversely, the absence of an interviewer may offer two advantages to this methodology over an interviewer-administered survey (Currivan 2008). Because they use a pre-recorded voice to read all of the questions in exactly the same way for every respondent, IVR interviews adhere to a higher degree of standardization than interviewer-administered interviews. Also, some research indicates that respondents answer more honestly when they are able to record their answers using an electronic entry method than when they have to verbalize their answers to another human.
Questions to Ask about an Automated Poll
Can we trust polls using automatic dialers and recorded voices? Many in the traditional polling community remain skeptical of the capability of these surveys to produce reliable results. But some believe the track record of some automated polls in recent elections suggests the technique can be valid. To date, academic studies that have compared the track record of automated polls with interviewer-administered surveys have produced mixed or inconclusive results.
To help journalists and the public evaluate the quality of an automated poll, it’s essential that pollsters who conduct IVR surveys publicly disclose how their polls were conducted. Here are some critical questions:
  1. How were households selected to participate? How did the poll address the federal restriction that prohibits calls to mobile phones using an autodialer? If mobile phones were completely excluded, the representativeness of the poll is questionable, since a substantial proportion of the adult population lives in a household with a cell phone but no landline.
  2. How were individuals in the household selected to be respondents? Some IVR surveys begin with a human interviewer to screen for respondents — say, youngest male in the household, registered Republicans — then switch to a computer. Others merely take the answers of whoever answers the phone. A drawback to the recruit-and-switch method is that many respondents hang up during the switch to IVR.
  3. Were respondents capable of answering all the questions to a computer? Human interviewers, while fallible, are able to repeat the question or answer categories to respondents and can clarify a vague answer. If a question is long or has several answer categories, respondents may be confused or forget the choices listed. Journalists should be able to review the questionnaire used.
  4. How were the data adjusted? Were the data weighted to adjust for demographics, lack of mobile phones, or to match the data of other polls released? It is important to know how much adjustment has been done and its impact on the poll’s accuracy.
  5. What is the track record for this particular company's IVR polls, compared with other companies' polls?
    Some companies that use IVR have an established track record that compares favorably with companies that use more traditional methods. Other companies' records may be spottier.
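Question 4 asks about the impact of adjustment on accuracy. One standard way to quantify that impact is Kish's approximation of the effective sample size: heavily varying weights shrink the sample's effective size and widen the margin of error. The per-respondent weights below are hypothetical, chosen to mimic a sample that underrepresents younger adults.

```python
# A sketch of Kish's effective-sample-size approximation:
#   n_eff = (sum of weights)^2 / (sum of squared weights)
# The weights below are hypothetical per-respondent adjustment weights.
weights = [3.0] * 10 + [0.35 / 0.30] * 30 + [0.35 / 0.60] * 60

n_eff = sum(weights) ** 2 / sum(w * w for w in weights)
design_effect = len(weights) / n_eff  # > 1 means weighting cost precision

print(f"effective n = {n_eff:.1f} of {len(weights)} interviews")
print(f"design effect = {design_effect:.2f}")
```

In this hypothetical case the 100 interviews behave, after weighting, like roughly 66 unweighted interviews, so a journalist evaluating the poll should expect a wider effective margin of error than the raw sample size suggests.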
References
Blumenthal, Mark M. 2005. "Toward an Open-Source Methodology: What We Can Learn from the Blogosphere." Public Opinion Quarterly 69: 655-69.
Currivan, Douglas B. 2008. "Interactive Voice Response (IVR)." In Encyclopedia of Survey Research Methodology, edited by Paul J. Lavrakas, 342-344. Newbury Park, CA: Sage.
Tourangeau, Roger, Darby Miller Steiger, and David Wilson. 2002. "Self Administered Questions by Telephone: Evaluating Interactive Voice Response." Public Opinion Quarterly 66: 256-66.
