One of our main goals at Carolina Forward is to figure out what North Carolinians – especially its voters – think about the important issues our state faces, and then to share what we find with the public. To accomplish this, we sponsor regular statewide polling to give us answers that are as accurate as possible. Today’s post takes a closer look at just how we do that.
Carolina Forward uses the firm Public Policy Polling as our primary polling vendor. “PPP” is a Raleigh-based pollster that has steadily grown into one of the nation’s most reputable polling firms. The nonpartisan FiveThirtyEight pollster ratings give them a rank of “A-”, which is particularly impressive given how many polls they run. We believe PPP gives as accurate a view of public sentiment as it’s reasonably possible to get.
The science and practice of opinion polling is subject to some controversy these days, for reasons both good and bad. There’s a widespread, and mostly incorrect, view that “the polls got it all wrong” in both the 2016 and 2020 elections. It’s true that, in some states (including North Carolina), polls overstated Democratic performance relative to Republican. This was true in polling from both Democratic-affiliated and GOP-affiliated firms, however, and it reflected a genuine uncertainty over sampling (which we’ll return to in a moment). In many other states, “the polls” were largely correct. Polling error was mostly correlated with the prevalence of certain types of voters in the electorate.
Many people interpret opinion polls, especially head-to-head election matchups, as more reliable than they really are. People change their minds, forget to vote, lie and more. If you could rerun every election thousands of times, Monte Carlo-style, the good polls would come out pretty much accurate on average – but just like in March Madness, elections don’t work that way. Each one is a single game, where weird things can and do happen.
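The Monte Carlo point above can be sketched in a few lines of code. The numbers here are purely hypothetical: a candidate whose "true" support is 52%, plus a random election-day swing (turnout, late deciders) of about 3 points. Even though a perfect poll would correctly report the 52%, the candidate still loses a meaningful fraction of simulated elections:

```python
import random

random.seed(0)

TRUE_SUPPORT = 0.52   # hypothetical "real" support a perfect poll would measure
SWING_SD = 0.03       # hypothetical election-day swing (turnout, late deciders)
N_SIMS = 100_000

# Rerun the "election" many times: each run draws the candidate's actual
# vote share as true support plus random swing, and we count the wins.
wins = sum(
    random.gauss(TRUE_SUPPORT, SWING_SD) > 0.50
    for _ in range(N_SIMS)
)

print(f"The 52% candidate wins about {wins / N_SIMS:.0%} of simulated elections")
```

In other words, a poll can be exactly right about public opinion and the "favored" candidate can still lose on any given Tuesday – roughly one time in four under these made-up assumptions.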
In other words, well-constructed polls are pretty reliable, but they’re not gospel. They’re a snapshot of the general contour of opinion at a single moment. And perhaps most importantly, they’re effectively the only way we have of easily gauging public opinion.
There are also many ways to rig a poll. If you want to produce a poll that supports some opinion you have, it’s pretty easy to do by asking leading questions: “Do you believe legislators should support tax cuts for economic growth, or job-killing tax hikes?” This practice is, unfortunately, extremely common. Tons of polls you see are pure junk because they do this.
Another method is to ask about things most people have never heard of. Carolina Forward is relatively conservative in what we poll and how, because the reality is that the vast majority of people do not know the background to a lot of important issues. For example, very few people know just how the state unemployment insurance system works, so it’s extremely difficult to reliably or meaningfully poll specific proposed changes to it. For this reason, we only poll reasonably well-known issues with widely understood terminology.
And finally, another favorite way to rig a poll is to only ask people who agree with you. This gets into sampling, which is probably the single most difficult thing about doing good polling. Basically, how do you know who to ask? What’s the exact party affiliation, race/sex and regional mix you need to know you’re getting an accurate cross-section of North Carolina’s electorate?
There’s no one right answer to that question. But here’s ours.
How We Poll
In Carolina Forward’s polling, we take care to specifically weight our sampling based on who actually votes. That means a slim majority of our poll respondents are Trump voters, slightly more than half are women, and a larger share still are white.
A sure-fire sign that you’re looking at a dodgy poll is that it’s not weighted by education. A surprising number of opinion polls are not, and it’s an automatic signal that the pollster is either disreputable or deliberately misleading. All of our polls are properly weighted to reflect the educational attainment of the North Carolina electorate. All of these breakdowns reflect the approximate weighting in Carolina Forward polls.
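To make the idea of weighting concrete, here is a minimal sketch of the simplest form of it, sometimes called post-stratification. The education shares below are made-up round numbers for illustration, not Carolina Forward's actual targets: each respondent gets a weight equal to their group's share of the electorate divided by its share of the raw sample, so the weighted sample matches the electorate's mix.

```python
# Hypothetical illustration: weighting a poll sample by education.
# All percentages are invented round numbers, not real targets.
population = {"no_college": 0.60, "college_grad": 0.40}  # electorate mix
sample     = {"no_college": 0.45, "college_grad": 0.55}  # raw respondent mix

# Weight = population share / sample share for each group.
weights = {g: population[g] / sample[g] for g in population}

# Suppose the two groups answer "yes" to some question at different rates.
yes_rate = {"no_college": 0.50, "college_grad": 0.70}

unweighted = sum(sample[g] * yes_rate[g] for g in sample)
weighted   = sum(sample[g] * weights[g] * yes_rate[g] for g in sample)

print(f"Unweighted 'yes': {unweighted:.1%}")  # skewed toward college grads
print(f"Weighted 'yes':   {weighted:.1%}")    # matches the electorate's mix
```

Because the raw sample over-represents college graduates, who answer "yes" more often in this toy example, the unweighted number runs a few points high; weighting pulls it back to what a true cross-section of the electorate would say. Real pollsters weight on several dimensions at once (education, race, sex, age, region), but the principle is the same.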
What’s Polling Good For?
The polls that matter most are, of course, elections themselves. Public opinion can and does go in any number of directions, not infrequently running totally counter to what elected representatives themselves think. Part of what makes gerrymandering so toxic is that it strongly and deliberately distorts the popular will. But polling also helps us understand where public opinion is going. It is one of the very few rigorous, evidence-based ways we have of knowing what a large polity like North Carolina wants or feels. And for this reason, lots of people are curious to find out, as well as to shape, what those opinions are and how they’re reported.
The other most visible opinion pollsters in North Carolina are Elon University (through their partnership with the News & Observer), Meredith College and the John Locke Foundation (which uses Cygnal, a Republican firm). In truth, all of them are reasonably good at what they do. Elon and Meredith each conduct relatively few polls, so their quality is a little difficult to rate, while Cygnal is rated B+ by FiveThirtyEight. The JLF’s question phrasing does frequently suffer from leading language that biases its results (an example: “How concerned are you, if at all, that the Biden Administration will do too much to increase the size and role of government in U.S. society?”), but their sampling methodology is basically sound.
North Carolina is fundamentally a sensible and very moderate state, most of which is open to new ideas and approaches to solving our issues. When people actually understand progressive ideas, instead of caricatures, they like what they hear. There is significantly stronger and deeper voter support in our state for progressive policies than many people assume, as we’ve demonstrated in our polls on raising the minimum wage, automatic voter registration, repealing the ban on municipal broadband and pivoting to renewable energy. And there’s much more to come – stay tuned.