Pollsters: ‘Impossible’ to say why 2020 polls were wrong
July 18, 2021

A new, highly anticipated report from the leading association of pollsters confirms just how wrong the 2020 election polls were. But nine months after that closer-than-expected contest, the people asking why are still looking for answers.
National surveys of the 2020 presidential contest were the least accurate in 40 years, while the state polls were the worst in at least two decades, according to the new, comprehensive report from the American Association for Public Opinion Research (AAPOR).
But unlike 2016, when pollsters could pinpoint factors like the education divide as reasons they underestimated Donald Trump and offer specific recommendations to fix the problem, the authors of the new AAPOR report couldn’t put their finger on the exact problem they face now. Instead, they ruled out the idea that they repeated the old mistakes, while pointing to possible new sources of error.
“We could rule some things out, but it’s hard to prove beyond a certainty what happened,” said Josh Clinton, a professor at Vanderbilt University and the chair of the association’s 2020 election task force. “Based on what we know about polling, what we know about politics, we have some good prime suspects as to what may be going on.”
Those “prime suspects” will hardly be comforting to pollsters and those who depend on them, from political campaigns to the news media. The most likely — if far from certain — culprit for off-kilter polling results is that key groups of people don’t answer polls in the first place.
Decreasing response rates have been a major source of concern for pollsters for more than a decade. But the politicization of polling during the Trump era — including the feedback loop from the former president, who has falsely decried poll results he doesn’t like as “fake” or as deliberate attempts to dampen enthusiasm among his supporters — appears to be skewing the results, with some segment of Republicans refusing to participate in surveys.
But pollsters say they can’t be sure that’s the main reason, because you never know exactly whom you’re not talking to.
That makes the problems with polling a lot harder to fix than the diagnosis four years ago, which mostly focused on adjusting surveys to account for Trump’s popularity with voters who haven’t earned college degrees and his corresponding weakness with college degree-holders.
“It seems plausible to the task force that, perhaps, the Republicans who are participating in our polls are different from those who are supporting Republican candidates who aren’t participating in our polls,” Clinton said. “But how do you prove that?”
The task force’s first job was to evaluate the performance of the 2020 public election polls. On that measure, polling earned a failing grade. While the national polls were the worst in four decades, the state-level polls of the presidential, Senate and gubernatorial races were as bad as they’ve been as far back as there are records (20 years).
According to the report, national polls of the presidential race conducted in the final two weeks of the election were off by an average of 4.5 percentage points, while the state polls were off by just over 5 points. Most of the error was in one direction: Looking at the vote margin, the national polls were too favorable to now-President Joe Biden by 3.9 points, and the state polls were 4.3 points too favorable for Biden.
Most of the error came from underestimating Trump’s support, as opposed to overestimating Biden’s. Comparing the final election results to the poll numbers for each candidate, Trump’s support was understated by a whopping 3.3 points on average, while Biden’s was overstated by a point — turning what looked like a solid Biden lead into a closer, if still decisive, race.
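A back-of-the-envelope calculation shows how those two candidate-level errors add up to the error on the margin. This is a minimal sketch, not taken from the AAPOR report; the poll and vote shares below are hypothetical, chosen only so the candidate-level errors match the averages quoted above.

```python
# A minimal sketch of how candidate-level polling errors combine into the error
# on the vote margin. Shares below are hypothetical, for illustration only.

def margin_error(poll_dem, poll_rep, actual_dem, actual_rep):
    """Signed error on the Democratic-Republican margin, in percentage points.

    Positive values mean the poll was too favorable to the Democrat.
    """
    return (poll_dem - poll_rep) - (actual_dem - actual_rep)

# Biden overstated by about 1 point, Trump understated by about 3.3 points.
poll_biden, poll_trump = 51.0, 43.7      # hypothetical poll shares
actual_biden, actual_trump = 50.0, 47.0  # hypothetical certified shares

biden_error = poll_biden - actual_biden    # ≈ +1.0 (overstated)
trump_error = poll_trump - actual_trump    # ≈ -3.3 (understated)

print(f"{margin_error(poll_biden, poll_trump, actual_biden, actual_trump):.1f}")  # 4.3
print(f"{biden_error - trump_error:.1f}")                                         # 4.3
```

In other words, a roughly 1-point overstatement of Biden plus a roughly 3.3-point understatement of Trump produces a margin error of about 4.3 points, in the same range as the margin errors cited above.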
It wasn’t just a Trump effect, either. The polls of Senate and governor’s races were off by an even greater margin: 6 points on average.
“Within the same state, polling error was often larger in senatorial contests than the presidential contest,” the AAPOR report reads. “Whether the candidates were running for president, senator, or governor, poll margins overall suggested that Democratic candidates would do better and Republican candidates would do worse relative to the final certified vote.”
No one methodology performed head-and-shoulders above the others. According to the report, there were only “minor differences” whether polls were conducted on the phone, over the internet or using a mixed methodology, including texts and smartphone apps — or whether they contacted voters randomly versus off a list of registered voters. “Every mode of interviewing and every mode of sampling overstated the Democratic-Republican margin relative to the final certified vote margin,” the report said.
After the 2016 election, AAPOR’s autopsy blamed that year’s polling errors on a number of different factors. First, the organization said, a larger-than-usual number of undecided voters measured in polls flocked toward Trump disproportionately at the very end of the race, giving him an advantage that would be impossible to measure beforehand.
But 2020’s error can’t be blamed on late deciders: Only 4 percent of voters weren’t behind one of the two major candidates in state polls conducted over the final two weeks, and exit polls suggest late-deciding voters split roughly evenly between Biden and Trump.
Another of the 2016 problems — the failure of many pollsters to weight by education — wasn’t to blame last year, either, the report said. Four years earlier, many pollsters adjusted their results to get the right mix of voters by race and gender. But that missed a key, emerging dynamic in the electorate: Increasingly, white voters with college degrees have supported Democrats, while those without degrees have moved rapidly toward Republicans. Studies show voters without college degrees are less likely to participate in polls.
In 2020, however, the majority of state polls adjusted their samples to properly represent non-college voters. But they were still wrong.
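For readers unfamiliar with the mechanics, here is a toy sketch of what weighting by education looks like in practice. All of the shares and support rates below are invented for illustration; real pollsters weight on many variables at once.

```python
# A toy example of post-stratification weighting by education. All numbers are
# invented for illustration; real pollsters weight on many variables at once.

# Suppose non-college voters are 60% of the electorate but only 40% of the sample.
population_share = {"college": 0.40, "non_college": 0.60}
sample_share = {"college": 0.60, "non_college": 0.40}

# Each group's weight is its population share divided by its sample share, so
# under-represented non-college respondents count for more.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Hypothetical Trump support among respondents in each group.
trump_support = {"college": 0.35, "non_college": 0.55}

unweighted = sum(sample_share[g] * trump_support[g] for g in sample_share)
weighted = sum(sample_share[g] * weights[g] * trump_support[g] for g in sample_share)

print(f"Unweighted estimate: {unweighted:.1%}")  # 43.0%
print(f"Weighted estimate:   {weighted:.1%}")    # 47.0%
# Weighting fixes the education mix of the sample; it cannot fix differences
# between the non-college voters who answer polls and those who do not.
```

Weighting of this kind corrects the education mix of the sample, which is why the persistence of large errors in 2020, even after the adjustment, points to a different problem.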
Other 2016-style factors were also dismissed: Voters weren’t lying to pollsters about whom they’d support because of some kind of “shy Trump” theory (otherwise the errors wouldn’t be larger in downballot races). It wasn’t that one candidate’s backers didn’t show up to vote (as evidenced by the record-breaking turnout in last year’s race). And estimating the number of voters who would cast early ballots versus show up on Election Day also wasn’t to blame (the polls mostly nailed that split).
The report is clear on what didn’t cause the 2020 polling miss. But it says “identifying conclusively why polls overstated the Democratic-Republican margin relative to the certified vote appears to be impossible with the available data.”
The most plausible — yet still unproven — theory is that the voters the polls are reaching are fundamentally different from those they are not. And Trump’s rantings about the polls being “fake” or rigged only exacerbate that problem.
“If the voters most supportive of Trump were least likely to participate in polls then the polling error may be explained as follows: Self-identified Republicans who choose to respond to polls are more likely to support Democrats and those who choose not to respond to polls are more likely to support Republicans,” the report reads. “Even if the correct percentage of self-identified Republicans were polled, differences in the Republicans who did and did not respond could produce the observed polling error.”
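A small simulation makes that mechanism concrete. Every number below is invented purely for illustration and is not drawn from the AAPOR report: even when the poll is weighted to the correct share of self-identified Republicans, the estimate comes out low if the Republicans willing to respond are less supportive of Trump than those who are not.

```python
import random

random.seed(0)

# Hypothetical electorate, invented purely for illustration:
# 40% Republicans, 40% Democrats, 20% independents.
# 90% of Republicans back Trump, but Trump-supporting Republicans are assumed
# to be far less willing to answer a poll than other Republicans.
N = 200_000
voters = []
for _ in range(N):
    r = random.random()
    if r < 0.40:
        party = "R"
        backs_trump = random.random() < 0.90
        respond_prob = 0.02 if backs_trump else 0.08  # differential nonresponse
    elif r < 0.80:
        party = "D"
        backs_trump = random.random() < 0.05
        respond_prob = 0.05
    else:
        party = "I"
        backs_trump = random.random() < 0.50
        respond_prob = 0.05
    responds = random.random() < respond_prob
    voters.append((party, backs_trump, responds))

def trump_share(group):
    """Fraction of a group that backs Trump."""
    return sum(backs for _, backs, _ in group) / len(group)

respondents = [v for v in voters if v[2]]

# Weight respondents so each party contributes its true share of the electorate,
# i.e. the poll reaches the "correct" percentage of self-identified Republicans.
true_party_share = {"R": 0.40, "D": 0.40, "I": 0.20}
poll_estimate = sum(
    share * trump_share([v for v in respondents if v[0] == party])
    for party, share in true_party_share.items()
)

print(f"True Trump share in the electorate: {trump_share(voters):.1%}")
print(f"Party-weighted poll estimate:       {poll_estimate:.1%}")
# The weighted poll still understates Trump, because the Republicans who respond
# are not representative of all Republicans.
```

In this toy setup, the weighted poll understates Trump’s true support by several points even though the partisan composition of the sample is exactly right, which is why the task force describes this kind of within-party nonresponse as so difficult to detect or correct.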
AAPOR isn’t the only organization struggling to nail down where things went wrong. A collaborative report conducted by five of the largest Democratic campaign polling firms, released this spring, said “no consensus on a solution has emerged” to fix the 2020 errors.
While explanations remain elusive, pollsters and their clients are hard at work on changes to methodologies. Soliciting poll respondents via text message — or conducting surveys entirely by text — is increasingly popular as fewer Americans are willing to take a 15-minute phone poll. Online polling continues to grow as well.
Public polls commissioned by the media are also changing. NBC News and The Wall Street Journal terminated their more-than-30-year-long polling partnership late last year, a Wall Street Journal spokesperson confirmed to POLITICO. The two news organizations had long worked with a bipartisan pair of major polling firms on regular phone surveys.
Without definitive answers about the causes of the 2020 miss, however, pollsters aren’t sure they’ll be able to get it right in 2022, 2024 or beyond.
“Even seven months after the fact, you’d think you’d be able to know exactly what happened,” Clinton said.
“How certain are we that we can fix this in the future? Well, it’s unclear,” Clinton added. “We’ll have to wait and see what happens — which isn’t a particularly reassuring position. But I think that’s the honest answer.”
Source: https://www.politico.com/