That Ill-Informed Fox News Viewer Poll? Actually It’s Based On Proven Methodology
Yesterday evening, our own Frances Martel wrote a column about the study indicating that Fox News viewers were less informed about news than those who watched other channels – and even than those who watched no news at all. The title of her piece: “Left Rejoices As Poll Of 612 New Jerseyans Declares Fox News Makes People Stupid”.
I used to run political campaigns; as part of that, I commissioned polls. I have a good sense of how polling works, what it’s good for, and what makes a poll valid. Martel’s implication that 612 people is an absurdly low sample size is not only erroneous, it’s a staple of those who want to cast doubt on research for political purposes.
Skepticism is always warranted. It is not, however, valid to take issue with basic math.
Let me lift a good analogy from the Washington Post. To test the temperature of soup, you don’t have to eat the whole bowl. You stir it up and take a taste. Similarly, to get an accurate poll, you don’t need to ask everyone the question – you just need to gather a random sampling of people.
Polling takes a random selection of people and then weights the results, as is well-explained here. Since a random sample will never perfectly match the demographics of the full population, results are recalculated so that each group counts in proportion to its actual share of the population – underrepresented groups count for more, overrepresented ones for less – a process called weighting. Common factors are economic status, race, gender, location and educational history, though some polls weight on additional factors.
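To make the mechanics concrete, here’s a minimal sketch of weighting with made-up numbers (the demographic shares and approval figures below are invented for illustration, not drawn from any actual poll):

```python
# Suppose women are 52% of the population but only 40% of our sample.
# Weighting scales each group so its influence matches its true share.

population_share = {"women": 0.52, "men": 0.48}  # known demographics
sample_share = {"women": 0.40, "men": 0.60}      # who actually answered

# Each group's weight is (true share) / (sample share):
# women get upweighted, men get downweighted.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Hypothetical result: 70% of sampled women approve, 50% of sampled men.
approval = {"women": 0.70, "men": 0.50}

raw = sum(sample_share[g] * approval[g] for g in approval)
weighted = sum(sample_share[g] * weights[g] * approval[g] for g in approval)
print(f"raw: {raw:.1%}, weighted: {weighted:.1%}")
```

The raw number (58%) understates approval because the group that approves more was undersampled; weighting corrects it to 60.4%, which is what a perfectly representative sample would have shown.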
The Fairleigh Dickinson poll released yesterday had a sample size of 612 people, all within the state of New Jersey. 612 people is a very standard size for a poll, particularly one taken within one state. Polls yield diminishing returns as you add people; that is, adding an additional 100 people to a 600-person poll makes much less of a difference than adding 100 to a 100-person poll. Not that you’d want to do a 100-person poll. The margin of error for a poll of that size is 10%, as opposed to 4% for a 600-person poll.
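Those figures aren’t pulled from the air – they fall out of the textbook margin-of-error formula for a proportion at 95% confidence. A quick sketch (this is the standard worst-case calculation, not anything specific to the Fairleigh Dickinson poll):

```python
import math

def margin_of_error(n, z=1.96, p=0.5):
    """95% margin of error for a proportion; p = 0.5 is the worst case."""
    return z * math.sqrt(p * (1 - p) / n)

# Note how slowly the margin shrinks as the sample grows:
for n in (100, 200, 600, 700, 1000):
    print(f"n = {n:4d}: ±{margin_of_error(n):.1%}")
```

At n = 100 the margin is ±9.8% (the 10% above); at n = 600 it’s ±4.0%. Going from 600 to 700 respondents only tightens it to ±3.7% – the diminishing returns in action, since the margin shrinks with the square root of the sample size.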
Let’s look at recent polling for the 2012 race (polling, I might add, that Martel has referred to at least six times in the last month without comment). In Iowa, twenty-seven polls have been commissioned since May of 2011. The average sample size for all of those polls? 528 people.
One savvy commenter on Martel’s post raises a good point: isn’t the fact that this is restricted to one state a bigger problem for applying it nationally? In general, yes – a poll limited to one state can only speak for that state’s population, and New Jersey’s demographics are not the nation’s. If we were polling for President, sampling only in Texas or Illinois would certainly impact the validity of the results. This poll, of course, is testing knowledge as much as opinion, for which I’ll give it some leeway.
But the sample size is beyond question. There’s one final piece of supporting evidence that’s worth mentioning: the long, solid track record of accurate predictions from polls using the same methodology. Nate Silver, the Times‘ polling guru and author of the always great FiveThirtyEight blog, regularly does a post-mortem after big elections, rating how various pollsters did. After last year’s midterms, he ranked eight; in June of 2010, he ranked dozens more. His post-midterm findings speak to how accurate this kind of polling can be. Quinnipiac, the firm that did the best, conducted 21 polls and ended up with an average error of 3.3 percent. And bear in mind, these polls could have been conducted up to three weeks before Election Day. Results like that are hard to argue with.
The polling firm that did worst in Silver’s survey was Rasmussen Reports. Silver delineates Rasmussen’s “cavalier attitude toward polling convention,” mostly adopted in the interest of cost savings, and closes with “the methodological shortcuts that the firm takes may now be causing it to pay a price in terms of the reliability of its polling.” Methodology is everything, and firms that follow standard methodology – including, say, a 600-person sample – see predictive results.
Oh, and any guess which media company regularly relies on Rasmussen polls? Fox News.
One final resource: the National Council on Public Polls has twenty questions journalists should ask about polling. It’s no longer online, but the preceding link goes to a cached version. Worth the read.