Today’s show is brought to you by Audible. Please visit audible podcast dot com slash point to get a free audiobook download. This is Point of Inquiry for Friday, December 17th, 2010.
Welcome to Point of Inquiry. I’m Chris Mooney. Point of Inquiry is the radio show and podcast of the Center for Inquiry, a think tank advancing reason, science, and secular values in public affairs and at the grassroots.
At the outset of the show, I want to remind you that Point of Inquiry is sponsored by Audible. Audible is the Web’s leading provider of spoken audio entertainment, information, and educational programming, offering 75,000 books for download to your computer, iPod, or CD. And now Audible wants to give you one of them for free. All you have to do is go to this Web site, audible podcast dot com slash point, for a free audiobook download. This holiday season, you might consider some of the top-selling science titles of the year. Rebecca Skloot’s The Immortal Life of Henrietta Lacks, for instance. Audible has that one. Or think about downloading a book by some of the authors we’ve had on the program this year: Massimo Pigliucci’s Nonsense on Stilts or Deborah Blum’s The Poisoner’s Handbook. Audible has them. They’re all right there on the site. So head on over now, audible podcast dot com slash point, and capitalize on this offer.
Ever been in an argument with someone and felt massively frustrated because nothing you can say seems to change the person’s mind? Maybe that’s what you should expect to happen. Maybe you should get used to it. According to political scientist Brendan Nyhan of the University of Michigan, that’s how our minds work. And it’s not just that, when it comes to politics, people who believe incorrect things tend to be unswervingly convinced that they’re right. Even worse, they sometimes become stronger in that conviction when they’re refuted. When they’re forced to face the facts, they simply won’t do it. It’s a pretty alarming aspect of human nature. But in this episode of Point of Inquiry, Nyhan explains how we know what we do about people’s intransigent clinging to misperceptions, and how we can work to change that.
Brendan Nyhan is a political scientist and Robert Wood Johnson Scholar in Health Policy Research at the University of Michigan. He was previously a coeditor of the political debunking Web site Spinsanity and a coauthor of the New York Times best-selling book All the President’s Spin. You can find him on the Web at brendan-nyhan dot com.
Brendan Nyhan, welcome to Point of Inquiry. Thanks for having me.
It’s great to have you on. You are a political scientist, and I invited you on the show to have you talk a little bit about the latest research relating to people’s misperceptions and why they cling to them. But first, tell us a bit about what you study. Why is it that you look at misperceptions, or people getting things wrong, from a political science standpoint?
Well, in terms of the political science, it’s an area we don’t know that much about. For a long time, pollsters and political scientists have done surveys and shown that the American people don’t know all that much in terms of factual information about how government works, various policy issues, et cetera. But we haven’t looked too closely at the case where people are actually misinformed, as opposed to uninformed. And that’s what my coauthor and I are trying to remedy.
And you previously ran this Web site with two other people called Spinsanity, where you used to debunk lies and spin. There seems to be some connection between what you did then and what you do now. Could you tell us a bit more about that?
Sure. Well, my experience at Spinsanity, which was this nonpartisan fact-checking Web site that I ran with two other people, showed me how difficult it was to convince people that their mistaken beliefs were wrong. In particular, when it came to issues that were partisan or ideological, it was very difficult even to convince our readers, who were a selected group of people interested in these sorts of issues; even convincing them that they were wrong was often very difficult. And when we went out to the wider public, it was even more challenging. And so what I wanted to understand is why that was.
Well, I think everyone’s stunned lately by the amount, the sheer amount, of misinformation that seems to have circulated just in the last two years, just in politics. You know, claims that the president’s a Muslim, that he caused the financial collapse, that he hypnotized people into voting for him, that he wasn’t born in the U.S. and there’s no birth certificate.
How does this stuff start? Why does it keep going?
Well, it’s a good question, and I’m not sure there’s one single answer. Some of these things start at some sort of a grassroots level, and some of them are elite driven. I think what’s been striking to people is the extent to which these things have crossed over into the mainstream and have been endorsed by elites in the last few years. And so people who otherwise ignore these sorts of issues have had to take a closer look. So during the Bush years, there were all sorts of crazy conspiracy theories about 9/11, but they never really crossed over to the mainstream in the way that “Obama is a Muslim” or “Obama is not a citizen” have in the last few years.
Do we know what causes that? I mean, is it political expediency, or is it just the mass number of people who are buying in?
I would say that salience is a big factor. So the fact that Obama was running for president and is now president means, obviously, there’s a demand for reasons to dislike him, a latent demand out there that these rumors help address. And there’s been some work done on rumors and the conditions under which they tend to circulate, and one factor is times of social or economic unrest. The last couple of years certainly fit the bill there, so that might be a factor as well. But, you know, part of this may just be simple human psychology, that these things are maybe more pervasive than we give them credit for. Again, it’s the elite aspect of it in the last few years that’s made it so noticeable.
When we talk about people believing something that’s incorrect, but that obviously humors in some way their internal sense of self, how much is the driver ideology and partisanship, and how much is the driver stovepiped information channels? So people get forwarded emails from people with the same political viewpoint that they have, and then they forward them to more people with the same political viewpoint, and then a rumor starts.
It’s definitely some combination of the two. In a lot of cases, I think it is the interaction of the two. So you’re selective in the information you choose to consume, which tends to slant the news you receive toward your preexisting views. And then even within that filtered news stream, you’re interpreting things in a way that’s consistent with your preexisting views. And I think that’s especially important not just in understanding how these things start, but how people resist information that suggests that they’re wrong, which is really what my research is focused on most specifically.
Then let’s go on to talk about that, because you’ve got this one study that I’ve read and I’ve quoted, I think probably on this show, which is about showing people newspaper corrections. Can you elaborate on that research and what it found?
Sure. So one of the issues that we brought up at Spinsanity was that we thought the media should be more aggressive in fact-checking politicians and prominent political figures when they made misleading claims. So what my coauthor Jason Reifler and I wanted to do was to test exactly what the effects of a more aggressive correction in the news media would be. And what we found was that corrections were very often ineffective for the group that is most susceptible to the misperception in question. And in some cases, they actually made the misperception worse. And that’s what we call a backfire effect.
And who did you test this on? What was the misperception, and how did you try to correct it and then watch that attempt to correct it fail?
So this was with college students. We did two rounds of experiments, four experiments reported in the article. And we gave people mock news articles that included a quotation from a prominent political figure that was based, you know, on a real issue or statement that someone had made. And then we experimentally manipulated whether they saw a paragraph after that saying that the claim was wrong or misleading. And then what we did is we asked them afterward whether they believed in the misperception, and we then disaggregated those responses to see if they differed by whether they got the correction and also by ideology and partisanship.
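[Editor’s note: here is a minimal sketch of the kind of analysis Nyhan describes, randomly assigning a correction and testing whether its effect on belief in the misperception varies with ideology. The data, variable names, and regression model below are hypothetical illustrations, not the study’s actual code, data, or statistical model.]

```python
# Hypothetical sketch of the design described above: subjects are randomly
# assigned to see a corrective paragraph or not, and belief in the
# misperception is modeled as a function of the correction, ideology, and
# their interaction. All data here are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    # 1 = saw the corrective paragraph, 0 = control article
    "correction": rng.integers(0, 2, n),
    # ideology on a -3 (very liberal) to +3 (very conservative) scale
    "ideology": rng.integers(-3, 4, n),
})
# Simulate a backfire pattern: the correction lowers belief among liberals
# but raises it among strong conservatives (interaction outweighs the
# main effect at the conservative end of the scale).
df["belief"] = (
    3.0 + 0.4 * df["ideology"]
    - 0.5 * df["correction"]
    + 0.3 * df["correction"] * df["ideology"]
    + rng.normal(0, 1, n)
)

# The correction:ideology coefficient captures whether the correction's
# effect differs across the ideological spectrum; a positive estimate is
# the signature of the backfire pattern discussed next.
model = smf.ols("belief ~ correction * ideology", data=df).fit()
print(model.summary().tables[1])
```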
And so one of the examples I recall is, does Iraq have weapons of mass destruction? So you quote someone who says it does. I don’t know, maybe it’s Colin Powell or something like that. Then you say, actually, U.N. weapons inspectors have found nothing, or something like that. And then you see where they are politically on the spectrum, and then whether they change their opinion.
That’s right. That’s right. So this was after it was very clear that Iraq did not have weapons of mass destruction immediately before the U.S. invasion. And we provided a quote, it’s actually from President Bush, that suggested that they were there. He had a way of speaking about it after the war that suggested that there was a threat from weapons of mass destruction when, in fact, we learned later that there wasn’t.
What we found in this first study was that conservatives who were exposed to the correction became more likely to think Iraq had weapons of mass destruction rather than less. And what about liberals?
It essentially had no statistically significant effect. At the margin, it might have slightly reduced the misperception among the most liberal subjects; I’d have to go back and look at the article. But the effect of the correction was primarily among conservatives, and it was primarily a resistance, primarily this backfire effect, rather than a misperception-reducing effect.
How much do you think it matters who the messenger is, and how much people associate with that person? Like in this case, it was President Bush. But of course, it could be a Fox News host if you’re a conservative; if you’re a liberal, it could be an MSNBC host. And the fact that you vest authority in that person and you trust them means that if someone corrects them, it’s something you’re likely to reject. I think that plays an important role.
There may be people, when we test some more obscure misperceptions, who really aren’t familiar with the details of the claim, and they may be responding to the correction based on what they think of the person making the original statement. So I think that’s an important factor. It’s definitely something where we want to do more investigation of how much it changes how people respond. But I think the bottom line is that in real politics, these statements are almost always attributed to someone; that’s part of what makes them news.
And so you have to take that into account when you think about what the effects of corrections are.
I think some might say, or the instinct might be out there, that, hey, this is political science, but it’s sort of just what people have said about human nature for centuries, you know, that we cling to things. I mean, you could find this argument in literature. You could find it in George Eliot’s novels or Charles Dickens’s novels: people who just have blinders on, and that’s sort of the tragedy of the characters. In what way is political science, or psychology, doing something new here in underscoring an aspect of human nature that’s clearly been around for a good long time?
Yeah. I think people understand this finding in a way that they maybe often don’t with academic research, because they’ve experienced it in their own lives. So I wouldn’t claim that we’re the first to discover this. I think where we’re adding value is that we’re extending previous work in terms of how partisanship and ideology affect how you respond to information. Instead of looking at people’s opinions, we’re looking at people’s actual factual beliefs directly. So that’s one contribution. And then the second contribution, I think, is that it’s situated in terms of news articles, which is a way that people typically get information about politics. You can get different findings if you attribute the information to, you know, the surveyor, so essentially, if the interviewer says to you, you’re wrong.
That sounds more authoritative than some third-party source. But in the real world, you’re not very often directly told by an interviewer that you’re wrong.
How does all this relate to the group around a law professor named Dan Kahan at Yale, who’s done a lot of really well-publicized research about how people’s cultural values determine who they believe to be an expert in, say, the global warming debate or something like that?
Well, I think it’s closely related. The finding, as I understand it, in much of his work is that people have differing preexisting cultural values, which you can think of as being pretty closely related to ideology, which is what we look at. And then that shapes how they respond to information, who they choose to believe as an expert, and so forth. So I think it’s very consistent with the work that we’ve done. And in some ways, the main difference is the subject matter, looking at things like new technologies or science and technology policy. I’m not sure of the exact details.
But from the standpoint of political scientists like yourself, if there’s a political misperception or a scientific misperception that’s out there and that’s believed for some ideological reason, it’s not like people are going to treat the two kinds of categories differently. I mean, a scientific issue is going to fare just like a more, you know, completely electorally political issue is going to fare.
As long as it lines up with some existing cultural or political cleavage. There are some science issues that people have no opinions about, and they don’t line up well with existing cleavages, and so I think you might see a different pattern of responses. You wouldn’t see the same level of polarization by preexisting views; you might see more deference to experts. But once it starts being mapped onto these preexisting partisan or ideological or cultural worldviews, then you’re going to start to see this pattern come in again.
If we shift topics a little bit: your expertise is actually in health care policy, where of course there’s been, and maybe this is one thing that drew you into this, a ton of crazy misinformation, bad statistics, distortions of the evidence, you know, claims about rationing and especially about death panels. Can you tell us a little bit about your work on that?
Sure. I have an article about misinformation in the health care debate that contrasts Clinton’s experience in the 1993 health care reform debate with Obama’s in 2009, just to show that in both cases misinformation played a major role in those debates, and to illustrate some of the fundamental aspects of misinformation. So I go back and look at survey data from both debates, at the people, the Republicans in this case, who believed the misperceptions about the respective health care plans. So in 1993, it was claimed that Clinton’s plan would not allow your choice of doctors. And in 2009, it was claimed that Obama’s plan would put old people and the disabled into death panels. And in both cases, the people who believed those things were much more likely to think that they had a good understanding of the issue in question. So the thing about misperceptions that’s different from simple ignorance is that it’s not just that you hold the wrong view; it’s that you think you’re right. You have confidence in your belief. And this is something we saw in both of those health care debates, and we can see it in other debates about health care. So I’m working on research about vaccines, and that’s an interesting area where there are very significant misperceptions on both sides of the political aisle. And, you know, it extends into areas like the debate over mammograms and breast cancer, which is another kind of sensitive issue that maps onto preexisting cleavages, where people’s preexisting views seem to drive how they interpret the evidence.
Well, vaccines are an interesting one, because we had Dr. Paul Offit on, who talks about basically all the claims that vaccines cause autism and how they’re not right. And maybe you’ve looked into that. But I’m interested in this point you make about a misperception being different from just something wrong that you vaguely heard. A misperception is something that you strongly believe, and you think there’s evidence for it, and you’ve probably researched it, or at least, you know, poked around, Googled, tried to get yourself versed in the information. Why on earth is it that more engagement with information leads to this strong rejection of the facts?
Well, I’m not sure about the causal direction there. It could be that you have this strong view and then you go search out information to buttress that existing view. So the causal arrow could go in either direction. But I do think this confidence issue is important, because it’s part of what drives the resistance to new information. So if I ask you who the chief justice of the Supreme Court is and you say Antonin Scalia, but you know that you’re just guessing, you don’t actually know the answer, and I tell you, no, it’s John Roberts, you’re pretty likely to agree with me. But if you really think you are correct, at what the psychologists call the metacognitive level, then it’s much harder to overcome that resistance. And that’s what seems to happen with misperceptions more frequently.
So on health care reform, because everybody’s so entrenched, because there are these misperceptions and it’s not just, again, rumors that we vaguely heard, it’s stuff that people have intensely argued about and dug themselves in on. What happens now? We have a law that exists, but we have one side vowing to try to repeal it. What do you see happening?
Well, it’s difficult. One of the things we found with misperceptions is that they’re very difficult to change, even over time. So the Iraq weapons of mass destruction misperception persisted for years after the evidence was clear that they didn’t exist. And my sense is that we’re going to see the same thing with the Obama health care law, because Republicans are still opposing it, and there are still lots of elites who are pushing misleading claims about it. So the flow of information to the public that reinforces claims about death panels and so forth is not going to stop, and people tend to persist in their preexisting views over time. So, you know, there’s been an optimistic claim out there that once people had some experience with the new health care system, they would somehow discover the merits of the system, or they would then learn that the death panels claim was wrong. But I think we have a lot of evidence to suggest we should be skeptical of that claim. There are still lots of people who think Medicare isn’t a government program, more than 40 years after its creation. So I wouldn’t be optimistic in terms of how quickly public opinion is going to turn around on this.
So the question then becomes, you know, what does it take to kill a misperception? I’m just trying to think of an example where people do think we actually achieved it somehow, or at least made a big dent. And I guess the first thing that comes to mind would be smoking. Certainly there was a ton of misinformation and a ton of doubt raised about whether it was really dangerous. Somewhere it seemed like that issue turned, and now everybody really accepts that it is dangerous.
Yeah, I think the strongest factor is a unified information flow from elites. So in the case of tobacco, because of the threat of perjury, lawsuits, and the evidence that was brought forward in the discovery process, even the tobacco executives had to admit under oath that smoking causes cancer, smoking is addictive, and all those other things. So you had this case where every single expert or person involved in the debate said that smoking did all these things, and so there was almost nothing to grab on to to buttress what you wanted to believe. So I think the same, you know, would need to happen on political issues. Certainly among Republicans or conservatives, you don’t hear many people saying that the weapons of mass destruction are still in Syria. And so I think that claim, that myth, has eroded over time. But the difficulty is it’s not really an issue anymore, so people aren’t hearing much about it either way, whereas with smoking, it stayed salient. And so that information flow was not just unified, but it was relatively significant, and so people were being exposed to that information over and over again. So there is hope, but there aren’t a lot of issues where our elites agree. You know, politics is about where they disagree.
One of my core questions, and I’ve asked you this before, and it’s certainly a touchy one, but I think it’s what everyone is sort of thinking in the background: is there any evidence, between liberals versus conservatives or across some other political divide, that the people who fall into those categories process information differently, are more or less likely to change their minds, anything like that, where we can sort of segment and draw a relationship between ideology and, essentially, you know, flexibility or willingness to hear contrary information?
This is a live debate within psychology. There’s a person named John Jost and his coauthors who have written a couple of articles claiming that conservatives are more closed-minded and dogmatic. It’s not a case where I feel like I have strong enough evidence in my own research to make sweeping claims. You know, we’ve only done a small number of experiments, and we want to keep exploring a lot more before we try to make any of these broader claims. And the thing that I always try to point out is that there are lots of examples of liberal misperceptions, resistance to new information, and so forth. You can think of the 9/11 conspiracy claims. You can think of the way people respond to economic information across presidential administrations; Democrats seem to be just as biased about Republican presidents as the converse, and on and on. So I would be hesitant about making any sweeping generalizations like that. But people can certainly look up these papers if they’re interested in that debate.
Well, one that you found was about the stem cell, quote, ban. Right? It wasn’t a real ban. Bush somewhat restricted research, or rather, fairly severely restricted federal funding of research.
But he didn’t ban it outright. It was just that it could still be funded in the private sector and on existing lines. So you found that liberals thought, nevertheless, that he had banned it, which was incorrect?
That’s right. That’s right. So we gave them a correction. This is a claim that was made by John Kerry and John Edwards during the 2004 presidential campaign, and we tested it as part of the study that you were discussing earlier. And we found that liberals resisted the correct information, but we didn’t observe the so-called backfire effect, where their belief in it became stronger.
So would the backfire effect be one of the ways in which the two groups might be different?
Well, it’s certainly a relatively extreme form of what we would call motivated reasoning, where you’re processing information in a manner that is, you know, designed to buttress your preexisting views. So you could think of it as kind of the worst-case scenario. We have observed it with conservatives. We haven’t observed it yet with liberals. But again, we’re very cautious about drawing any sweeping conclusions from that limited set of experiments.
Well, based on all of this, I guess the big question, in terms of what we do, is when. When, based on your research, would you infer that it’s worthwhile to set the factual record straight, as we, you know, rationalists are always inclined to do? And when is it just, you know, you figure it’s not going to do anything, so why am I even arguing? It’s a very tricky question.
Despite my research, I’m still trying to set the record straight whenever I can in my own political writing. And the way that I justify it, or at least rationalize it, and I may be engaging in motivated reasoning myself, is that I tell myself it’s important at the elite level. From everything we know about human psychology, it’s going to be very difficult to undo a myth that’s out there among the public. That’s a very difficult thing. But I do think there are ways in which we can pressure elites to set the record straight, people who are more responsive to facts, who are in institutions where there’s more pressure to be accurate, and so forth. Those sorts of mechanisms are targeted at elites. So encouraging people to set the record straight, and sanctioning people who don’t, those are the mechanisms I think can be effective at the elite level.
So in other words, it’s almost like a kind of shaming. If some TV host, and this happens all the time, or prominent radio personality says something judged to be beyond the pale, then there are all these calls for them to retract, to, you know, say it right, and say it right a bunch of times, and almost be a little bit embarrassed over it. That kind of activity.
Absolutely. Yeah. I’ve called for naming and shaming, and it’s the shaming first and then the shunning afterwards. So if you’re someone who promotes misinformation, we should call you out, identify you, shun you, and then not give you access to large audiences anymore. The problem is the media has almost no institutional memory and a strong bias toward controversy, and as a result keeps having these people on, even as they use the platform to mislead people over and over again.
So what maybe is missing, then, is sort of a culture of accountability for what you’re calling elites, or what we might call pundits, people who do clearly affect the perceptions of large numbers of people. They don’t have to be accurate, and there’s no one keeping a tally.
No, that’s right. And there have been some interesting projects to facilitate this sort of thing on the Internet, and I’m hopeful that people will be more organized about it. So there was someone who was tracking pundit predictions just to show how frequently they were wrong. And you can imagine all sorts of similar things to kind of build up a record of inaccuracy and misinformation, as a way of showing that this is a pattern, that this isn’t just a one-time thing.
I guess it brings me back, though, to the point that maybe we raised earlier: is it segmented information channels that drive all this, or is it just, you know, the deep-seated political ideology of the public? Now we seem to be saying that it’s the segmented information channels, where there isn’t a lot of accountability on different sides of the political spectrum to get things right, because they’re just feeding the frenzy, and they’re the authoritative voices. They’re the ones that people trust. That’s who we should be holding accountable.
Yeah. I mean, they’re certainly an important springboard for making these things well known. So “death panels” went from zero before Sarah Palin said it, and we mean that exact phrasing; there was a preexisting history of this claim about euthanasia. But she says “death panels,” and a bunch of people, either politicians or pundits, pushed the euthanasia and death panels claims in the media, and within a month, you know, something like 80 percent of the American public had heard about it. And especially the people who were biased toward accepting those claims, you know, many of them accepted it and believed it and still do today.
Well, I guess it ends up being somewhat of a dismaying outlook on the communication of information, although I guess, you know, it really is the only realistic one, and we have to be informed by the research. So at the end of the day, if I could just ask you one final question.
You know, when you get up in the morning and you’re studying this stuff and you’re thinking, people are biased, people are biased, they’re not listening to the facts, what do you look forward to as a big kind of change that will make it at least somewhat better?
I have no specific positive trends to point toward, but I do think there’s at least a growing awareness that this is a problem and that we have to figure out better approaches. And we’ve gotten more recognition and awareness of some of the research we’ve done in the last year than we ever have before. So I’m hopeful. There are concerned members of the public and journalists and other people who want to make this better, and that’s at least a starting point.
Well, I agree this stuff is getting more and more attention, and that’s why I wanted to have you on the air, and that really is a good thing. So thanks, Brendan Nyhan, for being with us on Point of Inquiry. Thanks for having me.
I want to thank you for listening to this episode of Point of Inquiry. To get involved in a discussion about Brendan Nyhan’s research, please visit our online forums by going to centerforinquiry dot net slash forums and then clicking on Point of Inquiry. The views expressed on Point of Inquiry aren’t necessarily the views of the Center for Inquiry, nor of its affiliated organizations. Questions and comments on today’s show can be sent to feedback at pointofinquiry dot org.
Point of Inquiry is produced by Adam Isaak in Amherst, New York, and our music is composed by Emmy Award winning Michael Whalen. The show also featured contributions from Debbie Goddard. I’m your host, Chris Mooney.