What polling does and doesn't tell us

Posted by Fernande Dalal on Saturday, July 13, 2024

Polls are powerful – they can influence emotions and shape political fortunes. They can be used to drum up support for campaigns and reveal how closely aligned (or far apart) the general public is on consequential Supreme Court rulings or presidential policies.

Like every election year, the 2022 midterms have unleashed a riptide of numbers as pundits and voters alike try to grasp their significance. At the state level, polls and primary results have left people in some parts of the country wondering how to read surveys. In Kansas, voters surprised political watchers when they rejected the state's attempt to restrict abortion access by a more dramatic margin than earlier polls had suggested. With control of Congress at stake in this year's midterm elections, polls gauging whether people favor Republicans or Democrats are being closely watched and analyzed at every juncture.

Rather than conduct a census of every single person to find out where the public stands on an issue or candidate, surveys sample opinions and can distill information that helps us better gauge our world. At their best, polls can empower the broader public to help influence crucial decisions. But it is important to recognize the limitations of surveys, in addition to the strengths.

Recent national elections have reminded us how problematic it is to treat polls as forecasts of the future rather than as a glimpse of where people stand at a given moment in time. During the 2016 presidential election, several state-based polling firms showed leads in Pennsylvania, Wisconsin and Michigan for then-Democratic presidential candidate Hillary Clinton over her opponent, Republican candidate Donald Trump. Trump won all three states and the Electoral College vote.


Statisticians and polling experts offered the PBS NewsHour these insights on how to navigate polls and statistics and avoid rocky misinterpretations, election year or not.

What is the margin of error?

Polls are estimates – ideally, very good estimates. The margin of error is a range that tells you how close that estimate gets to reflecting “the results that would have happened had we interviewed everyone,” said Barbara Carvalho, who directs the Marist Poll. Generally speaking, the smaller the margin of error, the better the estimate.
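For a simple random sample, the margin of error at 95 percent confidence can be sketched from the sample size alone. This is a textbook simplification, not any particular pollster's formula; real polls also account for design effects from weighting:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case margin of error for a simple random sample of size n.

    p=0.5 maximizes p*(1-p), giving the widest (most conservative) range.
    z=1.96 is the critical value for 95 percent confidence.
    """
    return z * math.sqrt(p * (1 - p) / n)

# A typical national poll of 1,000 respondents:
moe = margin_of_error(1000)
print(f"+/- {moe * 100:.1f} percentage points")  # +/- 3.1 percentage points
```

The square root in the formula is why shrinking the margin of error gets expensive: halving it from 3.1 to about 1.5 points requires roughly quadrupling the sample to 4,000 respondents.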

But not all polls include a margin of error, including non-probability polls such as those you can opt into online. Participants for these polls self-select, or opt in, and there is a risk that “these samples will not resemble the larger population,” according to Pew Research Center. When you’re reading the results of a poll, the absence of that estimated range is a red flag that a poll may not tell you much about the world beyond the exact people who responded to it, Carvalho said.

How is a poll of 1,000 people nationally representative?

Short of conducting a census every time someone wants to know what’s on people’s minds, a sample of that population, or a poll, can be used to understand people’s attitudes on things like the price of gas or job performance of a political leader.

You probably use sampling in your own life. Let’s say you are following a new soup recipe for the first time, and you want to know how it tastes. To find out, you sample a spoonful – there is no need for you to slurp down all of the soup to know what you think.

Similarly, pollsters take a representative sample of the U.S. population when conducting a survey to gauge how the country feels about a particular issue.

“In probability-based polling, the sample is drawn so that every person has a mathematically determined, non-zero chance of being selected,” according to pollster Lee Miringoff, who directs the Marist Institute of Public Opinion. That means recipients are random and not predetermined, and that everyone who meets the criteria has the same chance of getting that call.
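That equal-chance property can be illustrated with a toy simulation. The 10,000-person population below is invented for illustration, not any pollster's actual sampling frame:

```python
import random
from collections import Counter

# Hypothetical sampling frame: 10,000 people, identified only by index.
population = list(range(10_000))

# random.sample draws without replacement, giving every member the same
# non-zero chance of selection on each draw.
counts = Counter()
for _ in range(2_000):                    # repeat many draws to check fairness
    for person in random.sample(population, 100):
        counts[person] += 1

# Each person's expected selection rate per draw is 100/10,000 = 1%.
rate = counts[0] / 2_000
print(f"person 0 selected in {rate:.1%} of draws (expected ~1%)")
```

Over many repetitions, every person's observed selection rate converges toward the same 1 percent, which is exactly the "mathematically determined, non-zero chance" Miringoff describes.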

The science of conducting surveys doesn’t stop there. Samples don’t always mirror the demographics of the population from which they are drawn. So survey methodologists make adjustments that better capture populations with specific demographics and ensure the estimate is more representative of all U.S. adults or households.

Using data such as the American Community Survey or the U.S. Census as a benchmark for their mix of respondents, researchers may weight the sample – adjusting how much each response counts – so that specific demographic groups reflect their share of the nation's population overall. That way, the sample results can offer a more accurate picture of what the broader population thinks about an issue.
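A minimal sketch of that adjustment, using two made-up age groups and invented "census" targets purely for illustration: each respondent's weight is the group's population share divided by its sample share.

```python
# Toy poll of 10 respondents that under-represents adults under 45.
sample = [
    # (age_group, approves_of_policy)
    ("under_45", True), ("under_45", False),              # 2 of 10 = 20%
    ("45_plus", True), ("45_plus", True), ("45_plus", False),
    ("45_plus", False), ("45_plus", False), ("45_plus", False),
    ("45_plus", False), ("45_plus", False),               # 8 of 10 = 80%
]
# Made-up population targets standing in for census benchmarks.
census_share = {"under_45": 0.45, "45_plus": 0.55}

sample_share = {g: sum(r[0] == g for r in sample) / len(sample)
                for g in census_share}
# Weight = population share / sample share, per group.
weights = {g: census_share[g] / sample_share[g] for g in census_share}

unweighted = sum(approves for _, approves in sample) / len(sample)
weighted = (sum(weights[g] * approves for g, approves in sample)
            / sum(weights[g] for g, _ in sample))
print(f"unweighted: {unweighted:.0%}, weighted: {weighted:.0%}")
# unweighted: 30%, weighted: 36%
```

Here the under-45 respondents each count for more (weight 2.25) and the over-45 respondents for less (weight about 0.69), so the weighted estimate moves toward what a demographically balanced sample would have shown.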

Do people lie to pollsters?

When high-profile political outcomes haven’t matched what polls showed, it’s fueled speculation that people taking surveys have deceived the pollsters. In the 2020 presidential election, polls exaggerated then-Democratic presidential candidate Joe Biden’s lead over then-Republican President Donald Trump. “The 2020 polls featured polling error of an unusual magnitude,” a 2021 investigation by the American Association for Public Opinion Research concluded.

Yet “there is no evidence that respondents were lying,” the authors found. The task force wrote that “conclusive statements are impossible” to make about why those estimates were so off – “the highest in 40 years for the national popular vote” – but it is plausible that Trump voters might have been less likely to agree to be polled.

Concern that people lied to pollsters spread, too, following Trump's Electoral College victory over Clinton in 2016. But again, the differences may not boil down to simple fibbing. In 2020, Charlie Cook of the Cook Political Report pointed out that one deciding factor in that election's upset was "how undecided voters break at the end," noting that 85 percent of undecided voters made up their minds within a week of Election Day.

Sometimes, people will say what they think another person wants to hear, even if it's not true. In polling, that tendency is called social desirability bias, and the way a person hears or sees a question (over the phone, in person, on a mailed paper form) can influence how they respond, as can the way questions are framed and ordered in the survey itself.

This concept comes up in election polling, especially when people are asked if they plan to vote, Carvalho said. In a democracy, voting is generally viewed as virtuous, while failure to vote is looked down upon. Rather than risk disappointing someone, even a stranger, people may say they plan to vote even if they have no intention of doing so. To capture whether a person is a likely voter, she said she asks a battery of questions, such as whether they have voted in the past or how intensely they feel about voting. After asking these same questions of enough people over time, Carvalho said, pollsters develop a better estimate of people's actual behavior patterns.

“We know if people answer these questions in a certain way, they are more likely to vote, and if they answer another way, they’re less likely to vote,” she said.
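A likely-voter screen of the kind Carvalho describes can be sketched as a simple scoring rule. The questions, point values, and cutoff below are invented for illustration – they are not the Marist Poll's actual model, which is calibrated against how similar respondents actually behaved in past elections:

```python
def likely_voter_score(answers):
    """Score a respondent from a battery of turnout questions.

    Keys and point values here are hypothetical; real pollsters tune
    such scores against observed turnout in prior elections.
    """
    score = 0
    if answers.get("voted_last_election"):
        score += 2                               # past behavior predicts future
    if answers.get("knows_polling_place"):
        score += 1                               # concrete knowledge signal
    score += answers.get("interest_1_to_4", 0)   # self-reported interest, 1-4
    return score

def is_likely_voter(answers, cutoff=5):
    return likely_voter_score(answers) >= cutoff

respondent = {"voted_last_election": True,
              "knows_polling_place": True,
              "interest_1_to_4": 3}
print(is_likely_voter(respondent))  # True (score 6 >= cutoff 5)
```

The point of combining several questions is that no single answer, least of all "do you plan to vote?", is trusted on its own; the pattern of answers is what separates likely voters from unlikely ones.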

The PBS NewsHour will be adding more answers to common questions about surveys and polling, as well as exploring statistical concepts tied to numbers and nuance through the series, Ask a Pollster.
