
2014 Was The Year Automated Polls Got Lucky


Last year, automated phone polls, sometimes called "robopolls," got closer on average to the actual election results than most other surveys. Yet such polls have been repeatedly criticized as less reliable. So how did an oft-disparaged method end up with the most accurate results?

In short, those pollsters got lucky in the 2014 election cycle. And we can use house effects -- the degree to which any pollster's results consistently favor one party over another -- to show how that happened.
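To make that concrete, here's a minimal sketch of one way to compute a house effect: a pollster's average signed error against the actual election margins. This is an illustration only -- the figures in our tables come from the more involved model described in the earlier post -- and the margins below are invented. Margins are expressed as Republican minus Democratic, so a negative house effect means a Democratic lean, matching the convention used in this article.

```python
from statistics import mean

def house_effect(poll_margins, actual_margins):
    """Average signed error of a pollster's margins, in percentage points.

    Margins are Republican minus Democratic, so a negative house effect
    means the pollster's results leaned Democratic relative to outcomes.
    """
    errors = [poll - actual for poll, actual in zip(poll_margins, actual_margins)]
    return mean(errors)

# Hypothetical pollster: showed R+1, R+3 and D+2 in races that were
# decided R+4, R+5 and R+1 -- a house effect of about -2.7, a Democratic lean.
print(house_effect([1, 3, -2], [4, 5, 1]))
```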

Automated phone surveys are just one of several methods that pollsters use to contact people and ask questions. Those in the industry refer to these methods as the survey's "mode" -- as in, mode of obtaining polling data.

The most popular way to reach potential respondents is by calling them. "Live phone" refers to reaching people by telephone, on both household landlines and cell phones, with live interviewers asking the questions and recording answers. Automated phone polling, better known in the industry as interactive voice response (IVR), uses a recorded voice to ask the questions, thereby cutting costs.

As the Internet has become more essential to everyday life, pollsters have found ways to contact respondents online as well. The most prolific Internet pollster in 2014, YouGov, first asks people to sign up for its "panel" -- the list of people from which it chooses a sample -- and then emails the survey to a selection of people from that panel.

The table below shows the average house effects for the 2014 Senate and gubernatorial polls that used these three methods, plus a hybrid IVR/Internet category. A house effect of zero would mean the polls nailed the actual election result exactly. But all of the house effects are negative, which means that, on average, the polls in every category overestimated voter support for Democratic candidates (for more information on how the house effects were calculated and how to interpret them, see our previous post on this topic). But some types of polls leaned more Democratic than others.

IVR polls showed the least skew toward Democrats among the Senate surveys and among all surveys combined; live phone polls leaned the least Democratic among the gubernatorial surveys. These differences could lead you to think that IVR polls performed the "best."

We shouldn't make this assumption, however, without considering what we know about the IVR method and its performance over time.

Since the largest differences among polling methods last year involved Senate polls, the next table focuses on house effects for Senate polls in the last three midterm elections -- 2006, 2010 and 2014 -- to get a longer-term view of how well the various methods have worked. In this table, the house effect figures represent the difference between each method's average house effect and the average house effect for all polls that year. Positive values indicate the method favored Republicans more than all polls on average, and negative values show the method favored Democrats more.
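As a rough sketch of how those relative figures are formed (the per-poll house effects below are invented for illustration, not the actual table data):

```python
from statistics import mean

# Hypothetical per-poll house effects (percentage points) for one cycle,
# grouped by survey mode; negative values lean Democratic.
house_effects = {
    "live_phone": [-1.8, -0.9, -2.1],
    "ivr":        [-0.4,  0.2, -0.6],
    "internet":   [-2.5, -1.7],
}

# Average across every poll in the cycle, regardless of mode.
overall = mean(he for effects in house_effects.values() for he in effects)

for mode, effects in house_effects.items():
    relative = mean(effects) - overall  # positive = more Republican than the field
    print(f"{mode:10s} {relative:+.2f}")
```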

We're limited in which methods we can compare, however. The IVR/Internet hybrid wasn't commonly used in the earlier election cycles. And the data for the Internet method are questionable, since those polls were dominated by the not-necessarily-reliable Zogby Interactive in 2006, and the method was used for very few polls in 2010. Live phone and IVR were the most frequently used methods across the three midterms, so we'll focus on those.

The average house effect for live phone polls varied little from the average of all polls and didn't seem to consistently favor one party or another. IVR polls, however, showed a definite lean -- in a positive direction, meaning toward Republicans. In all three cycles, the average house effect for IVR polls was at least one percentage point more favorable to Republican candidates than the average for all polls. This is most likely due to a coverage issue: Automated polls cannot contact cell phones.

The Federal Communications Commission bars pre-recorded calls to cell phones, even as more and more households have replaced home landlines with mobile phones. Since younger people and minorities are more often reached via cell phone, IVR polls tend to disproportionately miss those groups -- and those two groups are more likely to vote for Democrats. (In 2014, many IVR pollsters used Internet samples to reach the people who are less likely to have landlines, but it's unclear whether that fixed the problem.)

This pattern explains why IVR pollsters appeared to do better than others in 2014: The automated method has trouble reaching groups that usually prefer Democratic candidates, with the result that these polls lean more toward Republican candidates. So in a year when most polls underestimated Republican support, a method that sampled more people likely to lean Republican ended up producing more accurate results. But it doesn't mean we should give automated polls credit for being better. The criticism of the method is still valid.

The lesson here: Despite all the adjustments pollsters make to their data to more closely represent the voting population, if a method of collecting data consistently under-samples specific parts of the population, its results will be skewed.
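A toy calculation (all numbers invented) illustrates the point: if a mode never reaches a group, reweighting the people it does reach cannot recover that group's distinct behavior.

```python
# Suppose 30% of voters are cell-only and vote 60% Democratic, while
# landline-reachable voters vote 45% Democratic. The true Democratic share:
true_dem = 0.30 * 0.60 + 0.70 * 0.45   # 0.495

# A poll that reaches no cell-only voters sees only the landline group.
# Weighting those respondents by age, race, etc. helps only insofar as the
# missing group resembles the weighted sample; here the estimate stays at:
poll_estimate = 0.45

print(f"true {true_dem:.3f} vs. estimate {poll_estimate:.3f}")  # a 4.5-point miss
```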

This article is the second in a series on pollster house effects. The first post focused on calculating house effects and partisanship.
