TalkBack Live

Campaign 2000: Do Polls Affect How People Vote?

Aired October 24, 2000 - 3:00 p.m. ET

THIS IS A RUSH TRANSCRIPT. THIS COPY MAY NOT BE IN ITS FINAL FORM AND MAY BE UPDATED.

(BEGIN VIDEO CLIP)

AL GORE, VICE PRESIDENT OF THE UNITED STATES: I don't really put my stock in the polls.

(END VIDEO CLIP)

BOBBIE BATTISTA, HOST: They are everywhere: in newspapers, magazines and all over the networks.

(BEGIN VIDEO CLIP)

TOM BROKAW, NBC ANCHOR: And all the polls indicate this remains a race too close to call.

(END VIDEO CLIP)

(BEGIN VIDEO CLIP)

PETER JENNINGS, ABC ANCHOR: The latest ABC News/"Washington Post" poll shows the race is now even.

(END VIDEO CLIP)

(BEGIN VIDEO CLIP)

DAN RATHER, CBS ANCHOR: Fifty-three percent say they are personally comfortable with Bush.

(END VIDEO CLIP)

(BEGIN VIDEO CLIP)

FRANK NEWPORT, EDITOR IN CHIEF, GALLUP POLL: Now, this weekend, we asked people, "What if Clinton campaigned for Gore? Would that make you more likely or less likely to want to vote for Gore?"

(END VIDEO CLIP)

(BEGIN VIDEO CLIP)

GOV. GEORGE W. BUSH (R-TX), PRESIDENTIAL CANDIDATE: I like our chances, but I can't do it without you.

(END VIDEO CLIP)

BATTISTA: Presidential polling: Who's winning? Who's losing? Who's beating the margin of error? Who cares?

(BEGIN VIDEO CLIP)

GORE: I still think this is an election where you can throw the polls out the window.

(END VIDEO CLIP)

BATTISTA: Why are there polls? How are they taken, and do they influence your vote?

Good afternoon, everyone. We'll get started here in just a moment, but first, I've been told I have to throw to Lou Waters back in the newsroom for some breaking news -- Lou.

(INTERRUPTED BY BREAKING NEWS)

BATTISTA: All right, Lou. Thanks very much.

Well, two weeks to go until the presidential election and the polls are flying fast and furious. One day George W. Bush is ahead, the next it's Al Gore. Some days they're in a dead heat.

Before the show, we asked the audience if any of them have been polled on the election. And, surprising to a lot of us, about 10 people said they had participated in a political poll this season.

Joining us now to explain who does get polled and how polls are taken is Keating Holland, CNN's polling director. Keating, nice to see you. Thanks for joining us.

KEATING HOLLAND, CNN POLLING DIRECTOR: Hi. How are you?

BATTISTA: I have to say the No. 1 complaint that we hear from people when we're talking about polls or politics on this show, and Pamela from Indiana has e-mailed it to us already. "I have to really wonder," she says, "about the accuracy or the validity of polls. I have never, never been contacted nor have any of my neighbors."

So the first question: How do you go about picking the people that you poll?

HOLLAND: It's a process called random digit dialing. The whole point is that everyone in America has an equal chance of being selected. A computer picks telephone numbers at random, literally randomizes the digits. They get called. The only thing we know about these people in advance is their telephone number. We don't know if they're Republicans or Democrats, liberals or conservatives, or anything else.

The whole thing works as long as everyone with a telephone in America, which is about 96, 97 percent of the country, has got an equal chance of getting called. It doesn't mean that everyone's going to get called at any one time. We only interview 750 people every three nights for our tracking poll. There are 210 million Americans. It's not surprising that we would only reach a small fraction of them at any one time.
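
For illustration, here is a minimal sketch of the random-digit-dialing idea Holland describes: phone numbers are generated purely at random, so nothing about the respondent can influence selection. The area codes, sample size, and function name are hypothetical, and real survey samplers layer refinements (working exchanges, callbacks, weighting) on top of this toy version.

import random

def random_digit_sample(n, area_codes):
    """Draw n ten-digit U.S. phone numbers completely at random.

    Every number within the chosen area codes has the same chance of
    being selected, so nothing known about the respondent (party,
    ideology, listed or unlisted number) can bias who gets called.
    """
    numbers = []
    for _ in range(n):
        area = random.choice(area_codes)                      # e.g. "404"
        local = "".join(str(random.randint(0, 9)) for _ in range(7))
        numbers.append(f"({area}) {local[:3]}-{local[3:]}")
    return numbers

# Draw a small illustrative batch of numbers to call.
print(random_digit_sample(5, ["404", "212", "312"]))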

BATTISTA: So then, who formulates the questions that you ask on these polls? And how do you keep those questions from coming off as skewed? Or do you?

HOLLAND: Well, we pay a lot of money, spend a lot of time to pretest questions when we're writing them to make sure that there isn't an unintended bias in them. We'd never intentionally bias a question.

The questions are developed with our polling partners at "USA Today" and Gallup, at "TIME" magazine, Yankelovich Partners when it's a "CNN/TIME" poll. And they pretty much come up from the headlines: what people are interested in, what topics or issues are coming up. And we try to do basically plain, vanilla questions that don't give information that would wind up tilting a respondent in one way or another, just to give them usually a yes or no, approve/disapprove, something like that on a basic issue.

BATTISTA: Some of the people that we talked to in the audience -- Michael, for example, we spoke with you a little bit earlier, and you said you had participated in a poll, and you said there were a lot, a lot of personal questions.

MICHAEL: A lot of personal questions, not only age and demographic questions but what I do on my leisure time, how I feel about just regular issues, just health care issues. But a lot of what I do outside the home that didn't have any bearing on my vote.

BATTISTA: Why are those asked, Keating?

HOLLAND: That doesn't sound like a CNN/"USA Today"/Gallup poll or a poll that was done by any other network. There's plenty of pollsters out there that work for private companies, businesses that are trying to sell you something, public relations polls of one sort or another. It sounds like you probably wound up going into one of those.

We usually ask basic demographic questions which some people find a little off-putting, like income or race. The reason why we do that is to make sure that we don't have too many rich people or too many poor people. The reason we ask age, for example, is to make sure that we don't have too many senior citizens or too many people under the age of 30.

Usually, we find that over the course of an interview, people learn to trust us when we're not slipping them a trick question. And by the end of the questionnaire, when we stick the demographic questions in, they usually trust us enough to give us an honest answer and not refuse, although if people aren't willing to give it to them -- to us, we don't push them. We'll just take a refusal, code it that way, and move on.

BATTISTA: Why do pollsters think that 750 people are representative of the entire country on any given day?

HOLLAND: We can't interview all 210 million Americans. We have to pick a sample. The statistical processes that we use have been used for over 50 years, and they seem to work.

The analogy I often hear is when you go to a doctor for a blood test, he doesn't take out all your blood. He just takes out a tiny little bit of it. Or if you're making soup and trying to find out is the soup too salty, is the soup too spicy, you don't drink the entire pot of soup. You just take a spoonful of it. And we find that taking a spoonful of the American public usually works.

BATTISTA: How are we supposed to interpret it if some of these polls appear to look very, very different? You know what I'm saying?

In others words, do we look at one poll and think it's skewed or inaccurate, or do you do that if a poll that you've taken is quite different from some of the other credible polls that are out there?

HOLLAND: Yes, we'll look at a couple things. Question order. Oftentimes, if you ask other questions before you ask the presidential trial heat, that may skew things one way or another.

The time, the days on which the poll is conducted is often very important. You'll see a poll reported on a Monday or a Tuesday that was actually based on interviews conducted the previous Wednesday and Thursday. Well, with two weeks to go in the campaign, you can see people's attitudes changing day to day. It's important to make sure you've got very fresh information.

The third thing that we look at is the likely voter model, and in fact, that's probably the major reason why polls are varying at this stage: trying to weed out non-voters. Different polls do it different ways. Some do it better than others.
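
As a rough illustration of what "weeding out non-voters" can mean in practice, the sketch below sorts hypothetical respondents into the three reporting groups Holland mentions later in the hour: all adults, registered voters, and likely voters. The field names and the screening rule are invented for the example; actual likely-voter models are proprietary and, as Holland says, vary from poll to poll.

def split_into_groups(respondents):
    """Split survey respondents into three reporting groups:
    all adults, registered voters, and (screened) likely voters."""
    registered = [r for r in respondents if r["registered"]]
    likely = [r for r in registered
              if r["voted_in_1996"] and r["plans_to_vote"]]
    return {"all adults": respondents,
            "registered voters": registered,
            "likely voters": likely}

# Hypothetical respondents; the screen itself is made up for illustration.
respondents = [
    {"registered": True,  "voted_in_1996": True,  "plans_to_vote": True},
    {"registered": True,  "voted_in_1996": False, "plans_to_vote": True},
    {"registered": False, "voted_in_1996": False, "plans_to_vote": False},
]
for name, members in split_into_groups(respondents).items():
    print(name, len(members))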

BATTISTA: Why -- well, I guess you sort of answered that. But why are there so many polls in the last remaining week or two before the election?

HOLLAND: One thing that is important to remember is CNN itself is not doing any more polling than it has done in October of 1992, October of 1996, or really, October of any election year in the past decade. There's a lot of other newer news organizations, media outlets that are out there. Many of them probably feel like they need to do what CNN is doing, do what other major networks are doing.

That probably accounts for a great deal of the fact that there are more polls than there used to be. There are more polls on the statewide level as well. That's probably a good thing, because there were many states in 1996 where we didn't have a single poll the entire election. Now I can't think of a state where we haven't seen at least one or two polls since Labor Day. That probably helps us get a sense of what is going on in each individual state.

And, of course, the Electoral College makes that important in a very close election, as we currently have.

BATTISTA: All right, we've got to take a quick break here. And, as we do -- speaking of polls -- TALKBACK LIVE would like your opinion. So go to cnn.com/talkback and take part in our online non-poll "Viewer Vote." Today's question: Do polls influence your opinion of the candidate?

We will be right back -- talk more about that.

(COMMERCIAL BREAK)

BATTISTA: All right, joining us now: CNN senior political analyst Bill Schneider, and Arianna Huffington, a syndicated columnist and co-founder of Partnership for a Poll-Free America.

Welcome to both of you. Arianna, let me start with you, because you really don't think there's much terribly redeeming about polls. Why is that?

ARIANNA HUFFINGTON, SYNDICATED COLUMNIST: No. Well, first of all, Bobbie, they have become an alternative to political leadership. Political leaders do not utter anything, do not come forward with a proposal without poll-testing it. And the dirty little secret of pollsters is that their response rates are down dramatically.

You know, we heard about how random-digit dialing is at the heart of the polling science. But the truth is that over 60 percent of those polled refuse to participate in an increasing number of polls. And what I would like to ask the polling profession is to start publishing that response rate. We all want to know, as well as the size of the sample, how many people, what percentage of those polled actually responded, because that means that the majority choose to sort of take themselves out of the random-digit dialing process.

And that inevitably has to skew the result.

BATTISTA: Let me have Keating address that, quickly.

HOLLAND: Well, in fact, we're not exactly sure that it skews the results. I just got in my mailbox today "Public Opinion Quarterly," which had a fascinating thing that showed that if you do everything the pollsters can do -- and what they effectively did was double the response rate -- you don't change the poll results at all, or you change them only a point or two.

HUFFINGTON: Then why don't you publish the response rate? Why do you refuse to publish the response rate if it doesn't matter?

HOLLAND: There are half a dozen different numbers that you may call the response rate and someone else may call the response rate. They're available on request.

HUFFINGTON: Well, but I called every major polling company, including your own, for a column I did, and every single polling company refused to release to me the response rate. I asked a very simple question. I did not call you during your dinner hour. And I asked for that simple question that I think the public has the right to know. So you cannot agree...

HOLLAND: If I'm remembering...

HUFFINGTON: You cannot agree among yourselves what the right response rate number is, but, please, start releasing it. We should know what it is.

HOLLAND: Arianna, if I'm remembering that column, Yankelovich did in fact give you that information. And Gallup gave you five or six numbers and allowed you to choose.

HUFFINGTON: Not at all. They did not. Not a single...

HOLLAND: It's not true that you got a stonewall.

HUFFINGTON: I will send that column to you again. Not a single polling company agreed to release the number of the last call.

BATTISTA: I'll let the -- I'll let the two of you settle this off the air, possibly. Let me get Bill into the conversation here.

From a journalistic standpoint, Bill, why are these polls important?

WILLIAM SCHNEIDER, CNN SENIOR POLITICAL ANALYST: We want to find out where the race stands, what is going on out there. And, frankly, my view is that no poll -- even our own polls here at CNN -- and we have two as Keating said -- is going to be absolutely authoritative. The more the better, assuming they're all reliable, reputable. And most of them are.

We get a sense of where things stand. I point out to you that, as of today, there were four national polls of the presidential race. And in every single one of them, without exception, Bush is two points ahead of Gore. Now they're all too close to call. But I think that's a pretty good indication of where the race stands.

BATTISTA: Is there no downside to this? I mean, what about the effect that polls can have on campaigns, on the voters, and ultimately the election?

SCHNEIDER: There is no question they have an effect, for instance, on fund-raising. When a candidate is running behind in the polls, particularly in the primary, they're trying to get a nomination. And they're not doing well in the polls. They find it very difficult to raise money. Of course, if they start doing better in the polls, then they will use -- they will wave the polls around and try to get contributors to give them money: See, it says here I can win. That can be a misuse of polling information. That really does happen.

There are other cases where polls make a difference. I don't think people decide how to vote based on the polls. I think, you know, the winning side can get -- can have a rallying effect. Sometimes the people who are losing get demoralized, although I don't think it kept a lot of Dole voters away from the polls in 1996. The big problem is for third party supporters, like supporters of Ralph Nader. When they look at the polls showing Ralph Nader is getting 3 or 4 percent, they may figure, well, really he has no chance to win. I don't want to waste my vote, so I'm going to vote for someone else.

HUFFINGTON: But, Bobbie, what polls do is they ignore the unlikely voters. Increasingly, they only poll likely voters. And you'll find there's a very interesting piece in today's "Washington Post" about how many poor people and people who feel completely left out of the system really are not going to vote; they feel they are not included in any of the polls.

And somebody from the Zogby polling company actually said, quite frankly, that if somebody tells us they are not planning to vote, they are not going to continue with the polling questionnaire. Well, that means that over 50 percent of eligible voters who are, at the moment, not planning to vote in November -- over 50 percent are left out of this process. I mean, that really has to have a very unfortunate, disturbing impact on American politics.

HOLLAND: If I can interject here: That may be what Zogby does, but that's not what the CNN/"USA Today"/Gallup tracking poll does. We interview everybody and we report results for likely voters, registered voters and all adults.

HUFFINGTON: But don't you think that, inevitably, people who are likely to be living in homeless shelters, students living in dorms, very poor people are not likely to be as represented as middle-class voters or senior citizens?

SCHNEIDER: Well, that's because they don't vote. I mean, that's a very big problem, and I agree with you, Arianna. That's a terrible problem -- barely half of the voting-age population in the United States votes, a shameful record.

But if you're a pollster and you want to find out how the election's going to go, which is what their business is, you better find out how the people most likely to vote are going to vote. If, suddenly, a candidate comes along who brings out a lot of these people who rarely, if ever, vote -- like, say, Jesse Ventura, who brought a lot of younger men to the polls in Minnesota -- then you better be prepared for that; otherwise you're going to get it wrong.

But it's not the polls' fault that these people don't vote.

HUFFINGTON: But, in fact, every single poll, Bill, as you know perfectly well -- both statewide and national -- failed to predict that Jesse Ventura was going to win. That's a very good example of how polls fail to predict anything interesting happening. They can only predict whatever it is that sustains the status quo.

So if anybody has any interest in reforming the system, you should go to my Web site, ariannaonline.com, and take the pledge to hang up on pollsters. We need to bring the response rate down to single digits, a small civil disobedience step, to get the dominance of polling out of our political life.

BATTISTA: Let me go to the audience quickly because Gabriel (ph) in our audience worked for a polling organization for a political candidate, right, Gabriel? And this was in where?

GABRIEL: This was in Massachusetts for a governor's campaign.

BATTISTA: And your comment?

GABRIEL: Actually, I have a question for the polling gentleman. He said, like, his sampling of America was supposed to be a perfect representation of our opinions, but I was just wondering how they come about with the percentage of error in those polls.

HOLLAND: It's not a perfect representation. There is a margin of sampling error that is due to the fact that we're only talking to 700, 800, 1,000 people out of 210 million people.

Go to any statistics textbook; I don't think we've got the 10 minutes that are necessary for me to draw out a bell-shaped curve, et cetera, et cetera. But we know that, within plus or minus three points, two points, four points, depending upon the number of people we interview, that's what would happen. That's the number we would get if we interviewed all 210 million adults.

BATTISTA: Very quickly, Keating: How high a level for the percentage of error is acceptable?

HOLLAND: We have, probably, about 4 percent, 3 1/2 to 4 percent -- excuse me, percentage point margin of error on the current tracking poll. We'll probably get that down to 3 or maybe even 2 percentage points by election eve.
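
Holland's plus-or-minus figures track the standard sampling-error formula for a simple random sample: roughly 1.96 * sqrt(p(1-p)/n) at 95 percent confidence, taken at the worst case p = 0.5. A short worked example follows; the sample sizes are chosen only to echo the ranges he quotes, and real poll designs are not simple random samples, so treat this as an approximation.

import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of sampling error, in percentage points,
    for a simple random sample of n respondents (worst case p = 0.5)."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

for n in (750, 1000, 2400):
    print(f"n = {n:4d}: +/- {margin_of_error(n):.1f} points")
# n =  750: +/- 3.6 points
# n = 1000: +/- 3.1 points
# n = 2400: +/- 2.0 points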

BATTISTA: All right, Keating Holland, thanks very much for joining us. Appreciate the information.

And we have to take a quick break and we'll continue right after this.

(COMMERCIAL BREAK)

BATTISTA: We love that. We're going to start sending those e-mails to each of those people.

A couple of e-mails here. Gloria in Virginia says, "I was polled Monday night. Up front the pollster asked for the male who voted last. Since my phone number is in a white neighborhood, they were trying to get the respondent to be a white male who votes Republican."

Ken in New Mexico says, "Polls work when they're based on random samples and the questions are valid and reliable. People hate polls when they disagree with the data; they love them when they agree."

Bill and Arianna, do either of you -- I'm guessing Arianna certainly would -- have concerns that the polls today are shaping public opinion rather than reflecting it?

HUFFINGTON: That is, perhaps, my greatest concern, Bobbie.

The fact that so many critical issues for our country are left out of the polling questions. For example, the failed drug war, the fact that we have 2 million Americans in jail, the fact that we have over 30 million Americans living in poverty, political corruption -- all these questions do not really make it in the polls; and, unfortunately, as a result our leaders decide what they're going to address based on what gets a high number at the latest CNN or Gallup or NBC poll.

Can you imagine, just to give you an idea -- Abraham Lincoln conducting a poll to decide whether to sign the Emancipation Proclamation or not? It was a deeply unpopular document at the time. I'm sure if Dick Morris were around he would say that would bring your approval rating down.

BATTISTA: Let's not even go there. I can't even picture that.

Go ahead, Bill.

SCHNEIDER: Let me give you a counter-example. If there were no polls, Bill Clinton could never have survived as president because he was about to be driven out of office when the Monica Lewinsky story broke. Many people might have thought, and still think, that would have been a good thing.

But the fact is we couldn't even get Democrats to defend the president because they felt betrayed. It wasn't until we took an opinion poll and found that Americans wanted him to stay in office that, suddenly, the conventional wisdom shifted and people realized that, while Americans thought that he had probably broken the law and he certainly lied to the American people, that was not a reason to drive him out of office. Without polls, there would be no Clinton.

HUFFINGTON: But, Bill, the essence of leadership is building a consensus around what you believe. Let me give you an example, when Ronald Reagan decided to take on the air traffic controllers, you may remember that his pollster, Dick Werthlin (ph), walked into the Oval Office and said, Mr. President, you cannot do that, the public is against you. And he said, we'll have to change their mind, won't we? That's leadership. To take issues that may not be on the agenda and put them there because of your ability to communicate and convince the public, that's no longer happening.

SCHNEIDER: Well...

HUFFINGTON: I -- when was the last time that Gore or Bush said anything that the public, they believed, did not want to hear? I mean, they don't even dare address issues that don't get high approval ratings.

SCHNEIDER: And that's a lot of people -- that's why a lot of people are exasperated with Gore and Bush. And I'll tell you something, Ronald Reagan did plenty of polling; Richard Werthlin, his pollster, was very nicely employed. There is -- the problem is, how do you use polls. I don't think you should ever use polls -- and here I agree with Arianna -- you should never use polls to decide what to do in the sense of what you are for, what you support, what your principles are. It's a bad idea, and the voters will see through that every time. But it's a good idea to use polls to find out how you're doing, what people think of what you are doing, maybe to find out the best way of communicating your message, but never use a poll to decide what you are for.

BATTISTA: Let me go to the audience quickly. Bernard has a question?

BERNARD: Yes. There are so many people that are apathetic and believe that their vote doesn't count, won't the polls as close as they are in this particular election bring out some of those people and make them vote and possibly even get them back into the political process on a long-term basis?

SCHNEIDER: We hope so. The last time we had an election this close was in 1960, and that election set a record for political turnout in the last 50 years; it's the highest turnout we ever saw because it was exciting, and we are hoping that that'll happen this year.

HUFFINGTON: Oh, Bill, please, don't tell me you think this election is exciting? I think the American public has definitely decided that this election is incredibly unexciting. And the reason why we have these huge gyrations in polling results -- you remember, we had George Bush ahead of Gore in double digits; after the convention, "Newsweek" had Gore ahead of Bush by 14 percent -- we have all that going on because the public is not excited by either candidate; even those who've decided whom they are going to vote for are not excited. So as a result, I predict it's going to be the lowest turnout ever. And you can't just get people excited because the race is close; you have to give them some motivation to get excited.

BATTISTA: All right, Arianna, I got to take another break. And Arianna and Bill, I know you both need to run, so thank you very much for joining us today, appreciate it as always.

Still ahead, how polls can send you, as we just said, running to the voting booth and change the outcome of an election. It has happened before. We'll be right back.

(COMMERCIAL BREAK)

BATTISTA: Welcome back.

Joining us now is Hal Bruno, senior political analyst at Politics.com; and Larry Sabato, director of governmental studies at the University of Virginia. He is the author of "The Rise of Political Consultants," which includes a chapter on the impact of polls on presidential elections. Gentlemen, welcome.

HAL BRUNO, POLITICS.COM: Bobbie.

BATTISTA: Hal, if I could get you to weigh in here on the effect of these polls on -- journalistically, number one -- but when you were political director at ABC, how did you use these polls?

BRUNO: Well, we used them very carefully, or we tried to. I have great respect for the pollsters of the national news organizations and especially ABC, where I worked for 20 years. They were very, very good. My complaint about polls was, though, that they were overused, misused and abused, and there was too much reliance by the media in general on polling. And television, in particular, does not do a very good job of explaining polls; the print media does it much better -- they analyze the polls in much more detail.

So the poll itself should not be the story, I've always believed that there should be reporting to go with the poll, and I have always viewed the poll as a useful reporting tool, and that's the way it should be used.

BATTISTA: Let me quote something to you that was in my research. Tom Rosenstiel, who is the director of the Project for Excellence in Journalism, suggests that "journalists have abdicated the responsibility of listening to voters in favor of listening to polls as the primary diviner of meaning in political coverage. We use polls as a crutch, and it is weakening other skills that we have." Would you agree with that?

BRUNO: I think there is some truth to it, but don't forget it's the pollsters who do the listening and pass it on to the journalists. I think you do have to get out and talk to people. I've always believed that reporting was the most important thing.

Let me give you a fast example: I do a weekly state-by-state study of where the electoral vote is going, and we base it upon interviews with the people in politics, as well as with voters, but we try and get a group of people in the state who represent different viewpoints and then we factor into that what we know about polling data from good, reputable polls -- there are a lot of polls we pay no attention to because they have a bad reputation -- and from that, we make a subjective judgment as to which way a certain state may be leaning at this particular time, which in a presidential election is very important, because the name of the game is those 270 electoral votes.

BATTISTA: Larry, historically speaking, when have the polls had an impact on an election one way or the other?

LARRY SABATO, UNIVERSITY OF VIRGINIA: Well, everyone always points to examples like the "Literary Digest" poll of 1936, which wasn't a random sample poll. That particular publication projected Governor Alf Landon of Kansas to beat President Franklin Roosevelt, and of course Roosevelt scored an enormous landslide in 1936, and the "Literary Digest" went out of business. 1948 is another year that everybody remembers, because the pollsters -- Gallup and his colleagues -- stopped polling in September, saying that Harry Truman was too far behind Governor Thomas E. Dewey to possibly win the election. That was a major mistake, because as we all know Harry Truman won that election, and the polls were as much a loser as Governor Dewey was.

So there are many cases of this. I agree with everything Hal Bruno has just said. I don't think that polls create a bandwagon effect. I don't think they create a counterbandwagon effect. I do, though, agree with Hal when he says that sometimes journalists don't use polling as a tool; they use it as a crutch, and it ends up driving the coverage. So the polls go up two points for one candidate, and suddenly, all the network reports that night talk about the surging momentum of candidate x when statistically nothing's changed.

BATTISTA: Yes, go ahead, Hal.

BRUNO: Yes, I'd like to follow up what Larry said.

Oftentimes what happens is at this stage of the election, the polls begin to catch up with political reality. For example, in the early fall right after Labor Day, in certain states, there were wide margins between Bush or Gore: one ahead big in one state, one ahead big in another state. Well, that never was real, and you knew it wasn't real at that time because the voters just simply weren't concentrating.

Now we're in the final weeks of the campaign. The polling that we're getting now is the polling that really counts. And now it's much more realistic.

BATTISTA: I have a question from David in Delaware on this e-mail that says: "Is there any industry standard or police watchdog that covers pollsters?" Or maybe the question is, should there be?

BRUNO: No, I don't think there should be. There isn't. I think that -- I think Larry pointed out what happens to pollsters that go bad. They go out of business or they're terribly embarrassed. In 1936, the poll didn't influence the election, but when the election results came out, it was the end of "The Literary Digest." I think Gallup and the others had egg on their face for a long time after the mistakes they made in '48.

So I think -- I think your performance dictates how much respect the poll is going to get.

BATTISTA: Ken in the audience, comment? Question?

KEN: Yes. Say, for instance, the worst-case scenario is that the poll is not accurate. I still think it's safe to assume that there is one advantage, especially for people in my age group taking the polls, and that is that it makes the person taking the polls actually think about the election. Since the voter turnout rate is so bad these days, perhaps, like the other audience member said, it will make the person who's taking the poll actually think more, become more politically aware, and go out and maybe vote or research the candidates.

BATTISTA: Larry...

SABATO: There's actually some research that supports this young man's suggestion. Being polled makes you more likely to read a newspaper, watch an evening newscast, think about politics, and to turn up and vote. So in that sense, the more polling, the merrier. It actually gets people, at least a small number of people, more involved in politics. Also, in an earlier segment, your analyst, Bill Schneider, mentioned 1960. It really is worth pointing out that one reason why we had over a 61 percent turnout of the electorate in 1960, as opposed to 49 percent in 1996, is because that election was very close, and all the polls showed it to be tight as a tick throughout the general election. People got interested. People got excited. They turned out.

Maybe the same thing will happen this year. We're not going to hit 61 percent. But I'm hoping that we'll at least get over 50 percent again.

BRUNO: Bobbie, could I just add one thing to that?

BATTISTA: Yes.

BRUNO: Polling is maybe a factor. If the polls show a close race, perhaps that does encourage people to come out, but there are other factors that may be even more important. One is the state of the economy.

When you have prosperity, you tend to have low turnout, in all elections. When you have economic hard times, people come out and vote because they're angry and they're scared and worried. And that's the greatest motivation for people to get out and vote.

SABATO: That's true.

BATTISTA: I've got to take a break here at this time. As I do, Roger e-mails us: "Do polls influence my vote? I'll answer this question later after I see what the other people are saying about it."

We'll be back in just a second.

Al Gore visited four states yesterday and traveled a total of 1,964 miles. George W. Bush visited three states and traveled a total of 1,215 miles.

(COMMERCIAL BREAK)

BATTISTA: When asked, "Who would you buy a used car from?" on the Web site Autotrader.com, 56 percent said Al Gore while 43 percent said George W. Bush. When asked what sort of car each would be, the vice president was a Volvo station wagon, while the governor of Texas was a Porsche.

We decided he was more like an SUV maybe than a Porsche. But Joe is in our chat room today, and you've got a good comment coming from in there.

JOE: It was from Chip in Texas. He said: "Polls do not necessarily measure public opinion. They cause public opinion because most people do not want to vote for a losing candidate." So it's obvious that people are watching the polls and want to vote for who's the leading candidate right now.

BATTISTA: We were saying, is that the danger, Larry -- that the polls are mainly used by people who are not keeping up with the political process, who are not that interested in this race, and then suddenly at the last minute they have an attack of civic guilt or something, and they run to the polls and say, well, that's the guy who's ahead, he's the winner, I'll vote for him.

BRUNO: I don't think so, Bobbie. I -- I really don't believe that. I don't believe that people are influenced by the polls and I don't believe that people try and manipulate the polls. The late Mike Royko ran a campaign some years ago that tried to encourage people in the Illinois primary to lie to the pollsters, and it just simply didn't happen.

BATTISTA: Jack's on the phone from Texas. Go ahead, Jack.

JEFF: I'm Jeff.

BATTISTA: Jeff, I'm sorry. Jeff, go ahead.

JEFF: It's OK. I don't really have that much confidence in the polls. I saw a poll basically all week long with Bush at 46 points and Gore was diving to 39 percent. And then the very next day, I saw Bush diving to 46 percent, which is what he was at -- dropping about four points -- and Gore rising to 44 percent. So it seems that it's kind of manipulated. I really don't have that much confidence in the pollsters at all. It seems like they're trying to influence our opinions about who should win.

SABATO: Well, if I can answer that one. And I think Hal would agree with me on this. You are attributing far too much to the people in any busy newsroom. There is simply not room or time enough for these sorts of conspiracies to try to influence your opinion one way or another in that -- in that kind of circumstance.

Now, having said that, it's really important to remember that you shouldn't put too much confidence in any one poll or any one number, because many of the variations from day to day make very little sense. I believe, as your analysts do on CNN, in the poll of polls, where you take all the polls as a group and you do a simple addition and an average, and you probably come closer to the truth than any single poll.
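
The "poll of polls" Sabato describes is just an arithmetic mean of the available surveys, candidate by candidate. Here is a minimal sketch with made-up numbers, chosen only to echo the two-point Bush lead Schneider cited earlier in the hour.

def poll_of_polls(polls):
    """Average each candidate's share across several polls --
    the simple addition-and-average Sabato describes."""
    candidates = polls[0].keys()
    return {c: sum(p[c] for p in polls) / len(polls) for c in candidates}

# Hypothetical national numbers, illustrative only.
polls = [
    {"Bush": 46, "Gore": 44},
    {"Bush": 47, "Gore": 45},
    {"Bush": 45, "Gore": 43},
    {"Bush": 48, "Gore": 46},
]
print(poll_of_polls(polls))   # {'Bush': 46.5, 'Gore': 44.5}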

And finally, let me point out that, as much as I respect the pollsters, and as close as they frequently are to the real result, let's remember that, in 1996, every poll but one, the Zogby International poll, was off in the presidential contest. And every one but the Zogby poll had President Clinton too high, well over 50 percent. One of the well-known national polls had President Clinton winning by 19 percentage points.

He actually won by eight. So your caution is well, well justified.

BATTISTA: Hal, did you ever have to go with a poll that you were not comfortable with, or that you, you know, you looked at it and you said: Gosh, this doesn't seem right?

BRUNO: On rare occasions, yes. There was one time where we did a 50-state poll. And we had very good sampling in the big states, but very low sampling in the small states. And when it came in, it showed things that just didn't make sense.

For example -- this was 1988 -- it showed Dukakis coming close or beating Bush in Mississippi. Well, we knew better than that. That was not going to happen. And the reason that it came in was because the sample was so low in that one particular state. So there will be those abnormalities that occur from time to time. And it shows you that the 50-state poll was probably not a very good thing to do.

And we learned from that. And we didn't do it. I mean, we do make occasional mistakes. But the idea that we make these mistakes -- or that we deliberately try and mislead, as Larry said, we don't have the time or the attention. I would also add we don't have the brains to do it. It takes all of our energy to try and just cover the news.

BATTISTA: Stephen e-mails us. He brings up a good point. He says: "This is the same thing as the networks calling winners in Florida or New York while the voting is still going on in the West." And that's actually exit polling that he is talking about.

BRUNO: Yes, that's totally different.

BATTISTA: It is totally different. And I -- as far as I -- as far as I understand it, I believe all of the networks have decided not to do exit polling. They don't do that anymore. Am I correct?

BRUNO: Well, I hadn't heard that.

(CROSSTALK)

SABATO: ... still do it.

BATTISTA: I thought there was an agreement not to do this now until the polls close out West.

BRUNO: No, Bobbie -- Bobbie, the agreement is this: You don't report the results from the state until the polls have closed in that state. And then the exit poll can come into play. The exit poll is a marvelous tool for analyzing how and why people voted as they did. It's also a very useful tool for tipping you off as to what the -- what kind of a night it's going to be.

Is it going to be close? Or is it going to be a landslide? And it tells you state-by-state. So the exit poll is a guide, in that sense. In the end, if it's close, you've got to use other means of calling the states, such as key precincts, which is actual votes coming in from a demographic model. And sometimes when it's very, very close, you have to wait for the actual raw vote to be counted. And I have seen elections like that. This one is going to be very close.

I wouldn't be surprised if we don't know who the president is until California, Oregon and Washington

(CROSSTALK)

BATTISTA: Is the agreement then that they were not going to project winners until the polls were closed?

BRUNO: No, no, until the polls close in a specific state.

BATTISTA: OK.

BRUNO: Now, don't -- because of the time zones, it rolls from East to West across the country. And in years when you have a one-sided, a landslide election, a child with a crayon and a piece of cardboard can figure out, by the time those big Eastern states have come in, who the winner is going to be, because they have got the electoral votes that are required. I don't think they will be able to do it in this election.

BATTISTA: All right, got to take another break. As we do, we will take a look at the result of our "Online Viewer Vote." The question: Do polls influence your opinion of the candidate? A whopping 94 percent are saying no.

We'll be back in just a moment.

(COMMERCIAL BREAK)

BATTISTA: Quickly, this e-mail that came in to us from Nancy says: "The last time I answered a poll, when I told the caller I was voting Libertarian, he said that was not on his list. And I had to choose between Bush, Gore or independent."

And one more here. Says Mike in New Jersey: "I find polls useful in the following way. If the lead story on the morning news is about a new poll, I know that there is no news worth reporting and I turn it off."

A lot of comedians out there today sending us e-mails. Hal Bruno, thanks very much. Larry Sabato, thank you for joining us as well.

SABATO: Thank you, Bobbie.

BATTISTA: Appreciate it, once again.

And join us again tomorrow for more TALKBACK LIVE at 3:00 Eastern. "STREET SWEEP" is next.

TO ORDER A VIDEO OF THIS TRANSCRIPT, PLEASE CALL 800-CNN-NEWS OR USE OUR SECURE ONLINE ORDER FORM LOCATED AT www.fdch.com
