
Polls: Frequently Asked Questions


Which Polls Does CNN Put On The Air?

CNN conducts two different polls with different survey organizations and different media partners.

One is the CNN/USA Today/Gallup Poll, which is conducted by the Gallup Organization. The second is the CNN/TIME Poll, which is conducted by Yankelovich Partners, Inc.

Occasionally, CNN will air the results of a poll sponsored by another news organization. When we do so, we believe that those figures are correct based on that poll's previous history and our own polling. But we have no control over the way those polls are conducted and cannot vouch for them in the way we stand by our own polls.


How Many People Are Interviewed?

It varies from poll to poll, but usually Gallup and Yankelovich will interview from 600 to 1,000 people. In almost every case, those people (pollsters call them "respondents") will be at least 18 years old.


How Are Those People Interviewed?

All interviews are conducted by telephone. This gives the respondents a measure of privacy while allowing us to interview as many people as possible in a short amount of time. Since nearly every household in the U.S. has a telephone, this method gives nearly all Americans an equal chance of being selected to participate in our polls.


How Are Those People Selected?

In a technique known as "random-digit dialing," a computer selects completely at random the phone numbers that our interviewers call. This method allows us to reach people with unlisted phone numbers and people who have moved recently, as well as those who are listed in the phone book. It also assures that the interviewer knows nothing about the respondent before the interview takes place.

The key word here is "random." Generating the numbers by computer keeps any human bias out of the selection. (Anyone who works with computers can attest to how randomly a computer can behave.) And as long as the phone numbers are chosen at random, every person in America with a telephone has an equal chance of being selected for one of our polls.
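
For readers curious about the mechanics, here is a minimal sketch of how random-digit dialing could be simulated. The area codes and number ranges are purely illustrative assumptions, not the actual sampling frame Gallup or Yankelovich uses.

    import random

    def random_phone_number(area_codes):
        # Pick an area code at random, then generate the remaining seven
        # digits at random; listed and unlisted numbers are equally likely.
        area = random.choice(area_codes)
        exchange = random.randint(200, 999)   # exchanges don't start with 0 or 1
        line = random.randint(0, 9999)
        return "(%s) %d-%04d" % (area, exchange, line)

    # Illustrative area codes only; a real survey draws from a national frame.
    print([random_phone_number(["212", "404", "312", "713"]) for _ in range(5)])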


Don't You Really Just Interview People Who You Know Will Give You The Answer You Want?

No, we don't. We can't. We only know one thing about our respondents -- their telephone number -- before they are called. We don't know whether they are liberal or conservative, Republican or Democrat, rich or poor. That is one of the chief advantages of the "random-digit dialing" technique over other methods which polling organizations have used in the past.


How Can So Few People Represent The Views Of The Entire U.S. Public?

How the poll respondents are chosen is far more important than how many are chosen. If every person in the United States has an equal chance of being called for one of our polls, it stands to reason that the people we call will represent the views of every person in the United States.

The random "samples" used in polling are based on basic mathematical principles that can be found in any elementary textbook on statistics. Businesses use these same statistical techniques every day. Courts routinely allow studies based on these principles to be admitted as evidence. Even the Census Bureau, who you would normally expect to try to interview every single person in the U.S., will use these principles in a pinch. They are difficult to explain in just a sentence or two, but any book on statistics available in your local library will give you as much information on "sampling" as you would like to know.


Why Doesn't CNN Run Mail-In Surveys Or Call-In Surveys To "900" Telephone Numbers?

Because those polls don't accurately reflect the opinions of the American public. Polls cannot be reliable if the people who participate in them are "self-selected" -- that is, if they have taken the trouble to fill in a form or call a special telephone line. Polls like that only reflect the views of people who feel very strongly about the issue contained in the poll. Often they are conducted by a television or radio show, so they also only reflect the views of people who read or watch that particular show. In addition, because responding takes time and, with "900" numbers, money, the people who do respond tend to be more affluent than the public as a whole.
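
To see why self-selection distorts the results, consider a hypothetical simulation in which 30 percent of the public opposes a proposal but opponents are five times as likely as supporters to bother phoning a call-in line. The numbers are invented purely to illustrate the effect:

    import random

    def call_in_poll(population=100_000, oppose_share=0.30,
                     p_call_oppose=0.05, p_call_support=0.01):
        # Hypothetical call-in poll: opponents are 5x likelier to phone in.
        oppose_calls = support_calls = 0
        for _ in range(population):
            opposes = random.random() < oppose_share
            p_call = p_call_oppose if opposes else p_call_support
            if random.random() < p_call:
                if opposes:
                    oppose_calls += 1
                else:
                    support_calls += 1
        return oppose_calls / (oppose_calls + support_calls)

    print(call_in_poll())   # roughly 0.68 -- far above the true 0.30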


I Haven't Ever Been Called (And Neither Have Any Of My Friends). Doesn't That Mean Your Polls Are Wrong?

Not at all. The samples of people we pick for our polls are valid as long as everyone else in the country has the same chance as you do to be selected. In a nation of about 270 million adults, we certainly won't talk to everyone. But if all 270 million have an equal chance of being picked, it stands to reason that the people who are chosen will generally reflect the views of all the ones who haven't been interviewed -- including you.
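
Some rough arithmetic shows why. With about 270 million adults and roughly 1,000 respondents per poll, the chance of being selected for any single survey is about 1 in 270,000, so even decades of frequent polling leave the vast majority of people uncalled. A quick back-of-the-envelope calculation (the number of polls per year is an assumption chosen for illustration):

    ADULTS = 270_000_000          # the figure used elsewhere in this FAQ
    RESPONDENTS_PER_POLL = 1_000
    POLLS_PER_YEAR = 150          # illustrative assumption
    YEARS = 30

    p_single = RESPONDENTS_PER_POLL / ADULTS
    p_ever = 1 - (1 - p_single) ** (POLLS_PER_YEAR * YEARS)
    print(p_single)   # about 0.0000037 per poll
    print(p_ever)     # still under 2 percent after 30 years of polls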


But Your Polls Don't Match What I Think And What My Friends Think.

We'd be surprised if they did. We try to make sure that we have polled plenty of people just like you, but we also try to make sure that we poll plenty of people who are unlike you. Our polls aren't meant to reflect exclusively the conversations you have at work, the radio stations you listen to, or your family's dinner table discussions. They are meant to reflect a little piece of the conversations every American has at work, radio stations all across the country, and the dinner table conversations of 270 million Americans.

It's understandable to assume that your experiences are typical of what the whole country is thinking. But how many people do you know from the Northeast, or the Deep South, or the Mountain West? How many people do you talk to on a daily basis who are far richer than you, or far poorer? How many of your friends talk politics with people who live in inner cities and on farms, who are retired and still in college, who are divorced with several children and happily married with none? That's why CNN conducts polls: to talk to tremendously different people all across the United States all at the same time.

In fact, in any poll, there is a group of respondents who are like you and your friends. If we just looked at those respondents, you would see poll results that reflected your views. But we want to show how the entire country feels about issues.


What Is The "Sampling Error" Or "Margin Of Error?"

Random samples obviously aren't as accurate as interviewing the entire population. Fortunately, it is easy to measure the biggest difference you should expect between the results of most polls and the results you would get if you asked the same questions of all 270 million adult Americans. That maximum expected difference is called the "sampling error" or "margin of error." In 95 out of every 100 polls, the results will fall within that relatively narrow range.

CNN recognizes that this sampling error exists for every poll result we broadcast. That is why every poll graphic shows the sampling error at the bottom.
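
For a simple random sample, the margin of error shown on the screen can be approximated with a standard formula: about 1.96 times the square root of p(1-p)/n at the 95 percent confidence level, which works out to roughly plus-or-minus 3 points for a poll of 1,000 people. A sketch of that calculation:

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        # Approximate 95% margin of error for a proportion p estimated
        # from a simple random sample of n respondents.
        return z * math.sqrt(p * (1 - p) / n)

    for n in (600, 1000):
        print(n, round(100 * margin_of_error(n), 1))   # about 4.0 and 3.1 points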


OK, But How Should I Interpret The Sampling Error?

Let's say we do a poll in which Candidate A would win 56 percent of the vote if the election were held today. If the sampling error were plus-or-minus three percentage points, it means there is some chance that Candidate A's support could be as high as 59 percent (56 plus 3) if we had asked all 270 million adult Americans. There is also some chance that her support could be as low as 53 percent (56 minus 3) if we had done 270 million interviews.

That does not mean, however, that the chances are equal that Candidate A's support actually is 53 percent, 54 percent, 55 percent, or 56 percent. The likelihood is very low that her support is 53 percent, slightly higher that it is 54 percent, higher still that it is 55 percent, and highest of all that she would actually win 56 percent of the vote. That is why CNN would report that 56 percent figure -- the one with the greatest likelihood of being correct -- while also taking pains to note the sampling error as well. (The same is true at the other end of the scale: the likelihood is very low that she would win 59 percent of the vote, somewhat higher for 58 percent, higher still for 57 percent, and highest of all for 56 percent.)
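
One way to see why values near the reported figure are the most likely is to treat the poll estimate as approximately normally distributed. A rough sketch using that approximation (a plus-or-minus 3-point margin of error implies a standard error of about 1.5 points):

    import math

    def normal_cdf(x, mu, sigma):
        return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

    REPORTED = 56.0          # the poll result, in percent
    STD_ERR = 3.0 / 1.96     # standard error implied by a +/-3-point margin

    # Chance that the true support falls within 1, 2, or 3 points of 56.
    for band in (1, 2, 3):
        prob = (normal_cdf(REPORTED + band, REPORTED, STD_ERR)
                - normal_cdf(REPORTED - band, REPORTED, STD_ERR))
        print(band, round(prob, 2))   # about 0.49, 0.81, 0.95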


What Is A Likely Voter?

As you know, not all Americans who are eligible to vote do so, and voters, as a group, can be different from non-voters. As a result, in the months and weeks just before an election, we will try to identify those respondents who are most likely to vote. We usually refer to those people as "likely voters." Typically, they are registered to vote, say they are definitely planning to do so, and fit other criteria which years of polling have shown are likely predictors of voting behavior.
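
In practice, a likely-voter screen amounts to a set of filters applied to the respondents. The specific criteria below are hypothetical and much simpler than the models the polling firms actually use:

    def is_likely_voter(r):
        # Hypothetical screen: registered, says they will definitely vote,
        # and voted in the previous election.
        return (r["registered"]
                and r["plans_to_vote"] == "definitely"
                and r["voted_last_election"])

    respondents = [
        {"registered": True,  "plans_to_vote": "definitely", "voted_last_election": True},
        {"registered": True,  "plans_to_vote": "maybe",      "voted_last_election": True},
        {"registered": False, "plans_to_vote": "definitely", "voted_last_election": False},
    ]
    print(sum(1 for r in respondents if is_likely_voter(r)))   # 1 likely voter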


What Is Weighting?

After we complete all the interviews for a poll, we compare the respondents to the latest Census Bureau figures for the entire U.S. population. Do we have too many men? Too few senior citizens? Too many people with high incomes or college degrees? Typically there are small discrepancies, which we fix by applying a small mathematical adjustment to each respondent. These adjustments, called "weights," usually change the final results of a poll by about one to two percentage points.
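
Here is a simplified sketch of how such weights might be computed for a single characteristic, sex. The shares and answers are hypothetical; the real procedure balances several characteristics at once against Census figures:

    # Hypothetical poll: 430 men and 570 women interviewed, while Census
    # figures put the adult population at about 48% male and 52% female.
    sample_counts = {"male": 430, "female": 570}
    census_share = {"male": 0.48, "female": 0.52}

    n = sum(sample_counts.values())
    weights = {g: census_share[g] / (sample_counts[g] / n) for g in sample_counts}
    print(weights)   # men weighted up (about 1.12), women weighted down (about 0.91)

    # Effect on a hypothetical question (share approving of the president):
    approve = {"male": 0.40, "female": 0.50}
    unweighted = sum(approve[g] * sample_counts[g] for g in approve) / n
    weighted = sum(approve[g] * census_share[g] for g in approve)
    print(round(unweighted, 3), round(weighted, 3))   # 0.457 before, 0.452 after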


What Is An Oversample?

Occasionally we want to analyze the opinions of a group of people who make up a small portion of the overall population. For example, there are too few African-Americans in a typical poll for us to be able to say how all African-Americans feel about the issues on that poll. To do so, we will make a special effort to contact additional African-American respondents. That technique produces an "oversample" of African-Americans: a sample over and above those we have already interviewed as part of the regular sample of respondents.


What Is An Exit Poll?

An exit poll is a special kind of poll CNN conducts on election day. It is not a typical telephone survey. Instead, specially trained interviewers are stationed at the exits of polling places and interview voters after they have cast their ballots. The major advantage of this method is that we are absolutely certain that we have interviewed people who have actually voted.

CNN conducts these exit polls as part of a consortium with the other networks and the Associated Press. We share the commitment that the other networks have made to Congress that we will never report the results of an exit poll in a state or make any projection about how that state has voted until the majority of the polling places in that state have closed.


Who Writes The Questions?

CNN employees work with our polling partners at TIME magazine and USA Today, as well as professionals at Gallup and Yankelovich, to draw up the questionnaires. We spend hours -- sometimes days -- writing the questions. Often we "pre-test" them by paying for 50 to 100 interviews which are conducted solely to help us improve our questions. The wording of a question, and which questions precede it, can affect the results, and the amount of effort we spend writing fair, unbiased questions reflects that.


What Is The Track Record For CNN's Polling?

Polling overall has become pretty reliable today. Consider this: between Labor Day and Election Day in 1992, more than 300 polls were conducted by various media organizations across the country. Not a single one of them showed George Bush beating Bill Clinton. The last poll CNN conducted before the election showed Clinton with 44 percent, Bush with 37 percent, and Perot with 14 percent. Five percent were undecided. As you will recall, in the actual voting Clinton won 43 percent of the vote, Bush won 38 percent, and Perot 19 percent.

Just before the 1994 midterm elections, the results from the last CNN poll indicated that the Republicans would pick up 48 seats in the House of Representatives, more than enough to take control of that body. The GOP actually gained 52 seats in that election.

One of the reasons why CNN conducts polls is that we are convinced they provide reliable estimates of public opinion. Every two years, elections give us a "reality check" -- a chance to see whether the polls gave an accurate indication, within the margin of error, of the actual outcome of the election.


But I Remember When President Bush Was Beating Bill Clinton By 20 Points. Why Were Those Polls Wrong?

Only a poll taken immediately before Election Day can accurately predict the outcome of an election. But those earlier polls were not wrong. They were an accurate reflection of public attitudes at that time. They might have had some predictive power if nothing had changed between the time the poll was taken and the election. Of course, many things did happen -- the primaries, conventions, and debates, Ross Perot's candidacy, and other events which changed the minds of many, many people.

You must bear in mind that polls can only provide a snapshot of what the public is thinking today. We expect to see the public's attitudes change between now and the next election. That's why we do polls frequently -- to be able to report those changes in public opinion as soon as they occur.


I Saw A Poll On CNN. How Can I Get More Information?

If the poll was a CNN/USA Today/Gallup Poll, you should contact the Gallup Organization at (609) 924-9600 or write them at 47 Hulfish Street, Princeton, N.J. 08542.

If the poll was a CNN/TIME poll, you should contact Yankelovich Partners Inc. at (909) 626-6868 or write them at 250 West 1st Street, Suite 302, Claremont, Calif. 91711. (The name of the firm is pronounced yan-kel-OH-vich, with the accent on the third syllable.)

Please make sure that you know as much information about the poll as possible before you call. The most important pieces of information are the subject matter and the dates that the poll was conducted or aired on CNN.


But The Poll I Want Information On Wasn't A Poll That CNN Sponsored.

You should contact the news organization that sponsored the poll; it will have been mentioned in the story that used the poll you are looking for. CNN does not keep track of or distribute other networks' polls.


I Am Writing A Term Paper/Book Report/Article On A Topic. Can I Get All The Polls CNN Has Done On That Topic?

If you are looking for the results of a specific question from a poll -- or even all the results from a specific poll -- Gallup or Yankelovich can help you very easily. But they are less likely to be able to help you if you are on a "fishing expedition" or trying to find information on a topic without knowing when or whether CNN asked any poll questions on that topic.

You will probably have to do a little research on your own before Gallup or Yankelovich can help you. Your local library may have books on your topic that also discuss poll data. You may also want to use an online computer database such as Nexis. One database devoted exclusively to polling data is maintained by the Roper Center at the University of Connecticut. Call (203) 486-4440 for more information. The Roper Center also publishes a magazine which tracks poll data called The Public Perspective. Other magazines which do the same are The American Enterprise (202) 862-5800 and The Polling Report (202) 237-2000.




