Posted: October 2, 2014 | Updated: December 2, 2014

Are Accurate Polls Really Accurate?

Regardless of how they choose to do it, pollsters will be forced to innovate if they want to continue providing accurate advice to their candidates and campaigns.

Election Day is near, and we are about to be inundated with polls. Is the Senate destined to turn Red? Will there be a "wave" for either party? What do President Obama's approval ratings mean for the election? After recent big public misses, including Eric Cantor's primary upset, observers rightly ask the question: Can we trust the polls?

To say that recent changes in behavior and technology have impacted the way we communicate and interact with information and with each other is a dramatic understatement. This presents pollsters -- who rely on the ability to reach a representative group of people in order to predict attitudes of a broader audience -- with an acute challenge. If a poll fails to reach everyone, can we have confidence in its findings?

Whether by a tremendous balancing act or sheer brute force, pollsters are adept at making even the most skewed sample predict results with some measure of accuracy, thanks to modeling, post-stratification weighting and other techniques.
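For readers unfamiliar with the mechanics, here is a minimal sketch of post-stratification weighting in Python. The strata, population shares and responses are hypothetical, and real polls weight on several variables at once, but the core move is the same ratio adjustment:

```python
# Minimal sketch of post-stratification weighting (illustrative only;
# the population shares below are hypothetical, not census figures).
import pandas as pd

# A hypothetical, skewed sample: each row is one respondent.
sample = pd.DataFrame({
    "age_group": ["18-34", "18-34", "35-64", "35-64", "65+", "65+", "65+", "65+"],
    "vote":      ["D",     "R",     "D",     "R",     "D",   "R",   "R",   "D"],
})

# Known population shares for each stratum.
population_share = {"18-34": 0.30, "35-64": 0.45, "65+": 0.25}

# Weight = population share / sample share: over-represented strata
# (here, 65+) are weighted down, under-represented ones up.
sample_share = sample["age_group"].value_counts(normalize=True)
sample["weight"] = sample["age_group"].map(
    lambda g: population_share[g] / sample_share[g]
)

# Weighted topline estimate for the vote question.
print(sample.groupby("vote")["weight"].sum() / sample["weight"].sum())
```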

But this sort of effort only addresses the problem at the surface. Frequently lost in this discussion are the implications beneath the surface, that is, all of the other questions a survey asks beyond the vote. If it's getting harder to get the vote right, what about all the other data campaigns are using to set their strategy?

Polls are increasingly finding themselves in lipstick-on-a-pig territory. Data may be made to look good on the surface, but if the fundamental problem of reaching a representative sample goes unaddressed, the data beneath the surface may not reflect reality. For candidates and campaigns, that means all the money spent on ads and all the hours spent in debate prep honing talking points may be based on incorrect conclusions drawn from inaccurate data.

Our recent surveys suggest this is a valid concern. Experimentation with the inclusion of online and mobile (non-voice) respondents suggests that -- all else being equal -- respondents reached online are attitudinally different from those reached by phone. Demographically comparable respondents reached by phone give different responses to key questions such as favorability and message tests than their online counterparts, even as they give the same response on the vote question!

In other words, the vote may be accurate, but what shifts opinion and moves the vote -- the things campaigns need to know to set their strategy and connect with the public -- could be way off.

Take these examples from a recent survey in California:

  • Phone respondents are more ideologically polarized. Online respondents are 17 points more likely to describe themselves as "moderate." This holds true even after controlling for party registration and past vote preferences.

  • Different messages work. Most startlingly from a strategic perspective, looking just at a representative sample of phone-only respondents would lead to different messaging recommendations than including the online respondents would.
  • Online respondents are more upscale and educated. Online respondents who only or mostly use a cell phone (as opposed to a landline) are significantly more upscale than their "cell phone only" and "cell phone mostly" counterparts reached by phone, and are 10 percent more likely to have a college degree.
  • Different feelings about the economy. These same cell-phone-only and cell-phone-mostly online respondents are significantly more negative about the statewide economy (by 10 points) than their phone counterparts, controlling for demographics and party registration.

Ours was not an exhaustive experiment and cannot fully explain why respondents are attitudinally different across modes, but extensive multivariate regression analysis, controlling for the variables typically used in weighting (i.e., age, race, gender, education) and other key variables such as partisanship and ideology, supports our finding that respondents are indeed attitudinally different, as sketched below. The decision to include or exclude these respondents would have a significant impact on a survey's findings.
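To illustrate the kind of mode-effect check described above (a hedged sketch, not the authors' actual analysis; the file and column names are assumptions), one can regress a key attitude on survey mode alongside the standard weighting controls:

```python
# Sketch of a mode-effect check: does survey mode still predict a key
# attitude once the usual weighting variables are controlled for?
# "survey_responses.csv" and all column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey_responses.csv")

# 'favorable' is a 0/1 attitude outcome; 'mode' is phone vs. online.
model = smf.logit(
    "favorable ~ C(mode) + age + C(race) + C(gender) + C(education)"
    " + C(party) + ideology",
    data=df,
).fit()

# A significant mode coefficient after these controls suggests the two
# modes reach attitudinally different people (or measure them differently).
print(model.summary())
```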

Pew Research, Nate Silver and others have pointed out the challenges facing pollsters today and the need for more experimentation. Their voices are joined by a loud chorus of experts, such as political scientists Prof. Loren Collingwood and Prof. Justin Wolfers, calling for experimentation to understand how to continue producing accurate survey data in this changing environment. Even Microsoft has gotten into the act.

But with campaigns, candidates and companies needing accurate data and good advice now, the question is what immediate steps pollsters can take to improve their data's accuracy beyond the vote.

Gone are the days when phone surveys could reliably reach most voters -- response rates in phone surveys have dropped from around 30 percent to under 10 percent in the last 15 years. People now spend more time with digital media (5:46/day) than watching TV (4:28/day), and more of that online time is spent on mobile (2:51/day) than on a laptop or desktop (2:12/day). And almost 9 in 10 people have access to and use the internet. All of this suggests that incorporating online and mobile modes into existing phone survey samples will broaden the reach of surveys and include populations that may not be reachable by phone. Pollsters should be more open to experimenting with blending multiple modes (landline, cell, online, mobile) to produce more representative samples.
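As a sketch of what blending might look like in practice, one simple approach is to pool the phone and online interviews and rake the combined file to shared demographic targets via iterative proportional fitting. The file names and target margins below are assumptions, not a prescribed workflow:

```python
# Sketch: pool phone and online respondents, then rake the combined
# sample to shared demographic targets (iterative proportional fitting).
# File names and target margins are hypothetical.
import pandas as pd

def rake(df, targets, n_iter=50):
    """Adjust weights until each margin matches its target share."""
    df = df.copy()
    df["weight"] = 1.0
    for _ in range(n_iter):
        for var, target in targets.items():
            # Current weighted share of each category of this variable.
            share = df.groupby(var)["weight"].sum() / df["weight"].sum()
            df["weight"] *= df[var].map(lambda c: target[c] / share[c])
    return df

phone = pd.read_csv("phone_sample.csv")
online = pd.read_csv("online_sample.csv")
phone["mode"], online["mode"] = "phone", "online"
combined = pd.concat([phone, online], ignore_index=True)

# Hypothetical margins; a real rake would use census or voter-file
# targets, and often a target mode mix as well.
targets = {
    "gender": {"female": 0.52, "male": 0.48},
    "age_group": {"18-34": 0.30, "35-64": 0.45, "65+": 0.25},
}
weighted = rake(combined, targets)
```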

To be sure, there are challenges with online panel surveys. The Executive Council at AAPOR (the American Association for Public Opinion Research) even put together a Task Force on Non-Probability Sampling to examine the merits of online panels and other non-probability methodologies.

But these challenges are not insurmountable. In spite of them, evidence suggests there are significant advantages to including online respondents, most importantly the ability to reach more -- and different -- people. Pollsters now know the importance of including cell phones in phone surveys: there are entire groups of people -- younger, lower income, more urban, with a higher share of minorities -- who are difficult or impossible to reach by landline. Similarly, it is possible that the people reached online are entirely unreachable by phone, landline or cell.

Regardless of how they choose to do it, pollsters are going to be forced to innovate if they want to continue providing accurate advice to their candidates and campaigns. Adding cell phones and moving away from random digit dialing was a start, but it is no longer enough, and more blended-sample experiments are needed.

So, yes, the public should continue to trust poll results, but pollsters must grapple with the changing realities. Only by continually innovating and experimenting will they be able to maintain a high level of accuracy that the public and candidates -- as well as companies and issue campaigns -- can rely on.

ClearPath Strategies conducts survey research and offers strategic advice to progressive leaders and forces around the world. You can read more about the company and its innovative survey sampling methodology online.
