How Can Data and Polls Inform Your Reporting?

Learn how political polls are made, how to report on polls and how to tell if poll results are reliable.


VIDEO: Steve Kornacki, national political correspondent for NBC News, explains when and how to use data and statistics in your work as a journalist covering politics, sports and everything in between.


Public opinion polls help us explain stories in the news. Is a new policy popular? Do people want their government to be more or less active in their lives? Who is most likely to say they’re reluctant to get a Covid-19 vaccine? 

We can answer all those questions using polls. But it’s important to understand how polls work, which ones are most credible and why even good polls have limitations.  

The NBC News Political Unit gives us polling do’s and don’ts.  

DO: Think about a poll’s sample before reporting on it.

Not all polls are created equal.  

Imagine that you post a “survey” on your Instagram account that asks: “Do you play video games often, only occasionally, or never?” That question would only be seen by people who happen to follow you on social media. The people who answer would make up your poll’s sample. But it’s unlikely that your own social media friends are exactly representative of the nation as a whole, because your friends are more likely to be like you. The results of your Instagram survey probably won’t reflect the true breakdown of how often all Americans of all ages and backgrounds play video games.  

Reputable pollsters work hard to find a sample that reflects the entire population they’re interested in polling. Often, they use “random digit dialing” to call telephone numbers generated randomly by a computer program. That way, anyone with a phone number has an equal chance of being contacted. Other methods, such as randomly selecting voters from a list, can also be used to create representative samples.  
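As a loose illustration only (the area codes, numbers and helper names below are hypothetical, not NBC’s actual method), here’s a minimal Python sketch of the random-digit idea: within the covered ranges, every phone number has the same chance of being drawn.

```python
import random

def random_us_number(area_codes):
    """Generate one hypothetical 10-digit U.S. phone number.

    Within the listed area codes, every possible number has an
    equal chance of coming up: the core idea behind random
    digit dialing.
    """
    area = random.choice(area_codes)
    exchange = random.randint(200, 999)  # exchanges can't begin with 0 or 1
    line = random.randint(0, 9999)
    return f"({area}) {exchange:03d}-{line:04d}"

# Hypothetical area codes standing in for the population of interest.
AREA_CODES = ["202", "312", "415"]
sample = [random_us_number(AREA_CODES) for _ in range(1000)]
print(sample[:3])
```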

DO: Look at the actual questions asked in political polls, as well as each survey’s methodology.   

The way a question is asked also matters. Traditionally, the gold standard has been “live-caller” polls, in which a real person asks questions over the telephone. Online polls are also gaining in popularity. But polls that use a recorded or robotic voice to ask questions don’t meet NBC’s methodological standards.  

NBC also often decides not to use political polls designed or sponsored by a partisan organization. And we’d never use a poll with wording designed to sway a voter. For example, campaigns sometimes conduct “push polls” that give biased or one-sided information before asking a question.  

Here’s an example: “If you knew that candidate X’s energy policy would mean increases in gas prices and your taxes, without any real benefit to the environment, would you be more or less likely to support them?”  

If you see a question like that in a poll, it’s almost certainly one you should avoid.  

DON’T: Ignore when political polls are taken.

Political polls are useful for tracking how news events change public opinion. For example, former President Donald Trump’s approval ratings were very stable until the end of his presidency, but they dipped after the Capitol riot on January 6, 2021. If you’re including a poll in your news coverage, make sure that you reference when the poll was taken. Describe why that time frame might be relevant to the story you’re telling.  

DO: Note if a polling result is very different from what other surveys are showing. 

Sometimes, even results from gold-standard political polls don’t feel right. (See the next question for why that sometimes happens.) Let’s say almost every poll of a Senate race shows two candidates neck and neck, but a new survey shows one candidate up by double digits. It’s possible that the survey accurately captured a shift in the race. It may also be what’s known as an outlier, a result that looks very different simply because of statistical chance. It’s OK to report on poll numbers that seem like outliers, but you should note in your coverage when one poll looks very different from other survey results.  

DON’T: Describe a difference of just a few percentage points as a “lead” if it’s within the margin of error.  

Polls are ultimately about statistics, probability and math. A reputable pollster acknowledges that sometimes even a random sample may end up being a little skewed, just because of bad luck. That’s why a good survey will note that the poll has a margin of error based on the number of people in the sample.    

As an example, our NBC News national polls usually have a sample size of 1,000 adults. Based on a standard statistical formula, which depends mainly on the sample size, we calculate that our margin of error for a 1,000-person poll is plus or minus about 3 percentage points.  
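For readers who want the arithmetic, here’s a minimal Python sketch of the standard margin-of-error formula for a proportion. The 1.96 multiplier corresponds to 95% confidence, and 50% support is the worst case (the widest interval).

```python
import math

def margin_of_error(sample_size, p=0.5, z=1.96):
    """95% margin of error for a proportion, in percentage points.

    p=0.5 is the worst case (widest interval); z=1.96 is the
    standard multiplier for 95% confidence.
    """
    return z * math.sqrt(p * (1 - p) / sample_size) * 100

print(round(margin_of_error(1000), 1))  # ~3.1 percentage points
```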

What does that mean?  

Well, let’s say that our poll finds that 48% of adults support candidate X and 52% support candidate Y.  It would be misleading to say that candidate Y is definitely “leading.” 

Here’s why:  

The margin of error shows that we are very confident that the true share of adults who support candidate X is within 3 points of 48% (between 45% and 51%). We’re equally confident that the true share of adults who support candidate Y is within 3 points of 52% (between 49% and 55%).  

We can’t rule out that, statistically, the real state of the race could be 51% for candidate X and 49% for candidate Y.  
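One way to see that, as a rough sketch reusing the 3-point margin of error above: check whether the two candidates’ plausible ranges overlap. (A more formal test would compute a margin of error on the difference itself, but the overlap check mirrors the reasoning here.)

```python
def intervals_overlap(pct_a, pct_b, moe):
    """Return True if the two candidates' plausible ranges overlap,
    meaning the race is too close to call a 'lead'."""
    low_a, high_a = pct_a - moe, pct_a + moe
    low_b, high_b = pct_b - moe, pct_b + moe
    return low_a <= high_b and low_b <= high_a

print(intervals_overlap(48, 52, 3))  # True: 45-51 overlaps 49-55
```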

The bottom line: If a poll result is really, really close, you probably don’t want to say one side is “leading.” Instead, you should characterize it as a close race. 


Author
Carrie Dann

Carrie Dann first joined the NBC News Washington bureau in 2006. She has covered four presidential elections with the network, including as an embedded campaign reporter in 2008 and 2012. She is a longtime co-author of the NBC News political unit’s daily First Read newsletter.