Nuclear Risk and Reason – David Brenner and David Ropeik

April 11, 2011

When the devastating earthquake and tsunami struck Japan last month, it left behind not only mass destruction, but also a nuclear crisis that was covered 24/7 by the international media.

Since then, we’ve been embroiled in a huge debate about nuclear policy—should there be a “Nuclear Renaissance” in the United States, or should we put it on hold?

A central issue underlying all this is the scientific question of risk. How dangerous is radiation, anyway? Do we overreact to reactors?

To tackle that question, we turned to two different guests. One is among the world’s foremost experts on radiation exposure and its health consequences; the other is a journalist who has written a new book about why we often misperceive risk, to our own detriment.

David Brenner is the director of the Center for Radiological Research at Columbia University. His research focuses on understanding the effects of radiation, at both high and low doses, on living systems, and he has published more than 200 papers in the peer-reviewed scientific literature. Dr. Brenner was the recipient of the 1991 Radiation Research Society Annual Research Award, and the 1992 National Council on Radiation Protection and Measurements Award for Radiation Protection in Medicine.

David Ropeik is an author, consultant, and speaker on risk communication and risk perception, and an instructor in the Harvard University Extension School’s Environmental Management program. He’s the author of the 2010 book How Risky Is It, Really? Why Our Fears Don’t Always Match the Facts.



Today’s show is brought to you by Audible. Please visit audiblepodcast.com/point to get a free audiobook download. This is Point of Inquiry for Monday, April 11th, 2011.

Welcome to Point of Inquiry. I’m Chris Mooney. Point of Inquiry is the radio show and podcast of the Center for Inquiry, a think tank advancing reason, science, and secular values in public affairs and at the grassroots. At the outset of our show, I want to remind you that Point of Inquiry is sponsored by Audible. Audible is the Web’s leading provider of spoken audio entertainment, information, and educational programming.

The site offers thousands of books for download to your computer, iPod, or CD, and it has a special offer for Point of Inquiry listeners: one audiobook download for free. To participate, all you have to do is go to the following website: audiblepodcast.com/point. Again, audiblepodcast.com/point. That’s our own special URL for listeners to this show. Let me make a recommendation: download the book we featured on my last episode, Lawrence Krauss’s Quantum Man: Richard Feynman’s Life in Science. Audible is featuring it right now, and it links up perfectly with what you’ve been listening to. So, again, just head over to audiblepodcast.com/point if you haven’t already, and try the site out.

When the devastating earthquake and tsunami struck Japan last month, it left behind not just mass destruction, but also a nuclear crisis that was covered 24/7 by the international media. Since then, we’ve been embroiled in a huge debate about nuclear policy. Should there be a so-called nuclear renaissance in the United States, or should we put it on hold? A central issue underlying all of this is the scientific question of risk. How dangerous is radiation? Do we overreact to reactors? Listen to this exchange between HLN host Nancy Grace and weatherman Bernie Rayno about whether radiation from Japan is a danger to the U.S.

Bernie, explain how this is going to affect us. 

Well, in the United States, I don’t think there’s going to be a big impact at all. You know, I think that... the radiation is there... Oh, boy. Here we go, Nancy.

Yeah, Nancy... he’s not... You know, I said it the last time and I’m going to continue to say this: it’s not harmful, Nancy. This is not... this is not damaging...

This is not damaging radiation in the United States. It’s one billionth of what is needed to cause any problems. A state of emergency in California...

From what I remember from last week, whenever anything happens, the government says, “Don’t worry, everything’s fine.”

Believe it or not, the show got even worse from there on. Bernie Rayno deserves some kind of award. Of course, there’s also the other extreme. Here’s Ann Coulter on Fox’s The O’Reilly Factor.

And here she is. The column is entitled “A Glowing Report on Radiation.” Glowing, radiation, very, very good. But you are not down on radiation poisoning?

Well, it’s not me. I’m citing a stunning number of physicists, and The New York Times and The Times of London. There’s a growing body of evidence that radiation in excess of what the government says are the minimum amounts we should be exposed to is actually good for you and reduces cases of cancer.

Clearly, some people are going overboard in both directions when it comes to the risk of radiation exposure at various doses. With this show, I wanted to provide some data on the question of radiation risk, and risk in general. To that end, we’re featuring two separate interviews: first with one of the world’s foremost experts on radiation exposure and its health consequences, and second with a journalist who has written a new book about why we often misperceive risk, to our own detriment. So without further ado, let me introduce both of our guests, in the order in which we’ll be hearing from them. David Brenner is director of the Center for Radiological Research at Columbia University. He studies the effects of radiation, at both high and low doses, on living systems, and has published more than 200 papers in the peer-reviewed literature. Dr. Brenner was the recipient of the 1991 Radiation Research Society Annual Research Award and the 1992 National Council on Radiation Protection and Measurements Award for Radiation Protection in Medicine. David Ropeik is an author, consultant, and speaker on risk communication and risk perception. He’s an instructor in the Harvard University Extension School’s Environmental Management program, and he’s the author of the 2010 book How Risky Is It, Really? Why Our Fears Don’t Always Match the Facts. Dr. David Brenner, welcome to Point of Inquiry.

Pleasure to be with you.

You’ve been much in demand since the nuclear disaster began in Japan. How do you feel the press has covered what’s happened, from a technical standpoint?

Well, it’s very much a mixed bag. There have certainly been some excellent reports, and, as one might expect, there have been some really pretty appalling ones that were just doing scaremongering. But overall, I think the quality has been higher than I would have expected.

And, of course, the situation there is still a moving target, although it appears to be improving. But what can we say at this point about the extent of the danger from the release of damaging radiation that’s occurred?

Well, I think you’re quite right. As of the time when we’re recording this talk, things are looking better, but there are still a lot of uncertainties over the next couple of weeks, so I don’t think we’re in any sense out of the woods. But let us suppose that things go reasonably well from now on. We have issues in the short term, in terms of radiation exposure, and then we have issues in the long run, over decades, generations even, in terms of low-dose exposure to the general public. And then, of course, we have the issue of the higher doses that the workers inside the nuclear plants will have received.

On the one hand, you’ve said in the past that there’s no absolutely safe level of radiation to be exposed to. But on the other, I get the sense that you do think some fears are overblown or exaggerated. So what’s the basic science that leads to that kind of conclusion?

Well, I do believe that, in fact, there is no level of exposure to radiation that we can say is absolutely safe, in the sense that the risk is zero. I think the more likely scenario is that the lower the dose, the lower the risk, and the higher the dose, the higher the risk, but the risk never actually becomes zero. So it’s all a question of figuring out how much radiation dose people are getting in any given situation.

Well, some have worried, you know, about radiation reaching the United States, and I mean, it’s reaching other countries from Japan too. What is your sense of that kind of risk?

Well, the wind happily was blowing most of the radioactive materials away from Japan and across the Pacific Ocean, and most of the radioactive materials are actually now in the Pacific Ocean. Some of it reached the west coast of the USA, and some of it has even reached the east coast. But the issue is how much. The fact that we have pretty sensitive instruments and can actually measure the radiation exposure does not mean in any sense that there is a significant health risk, because the dose has become so low as the radioactive material has dispersed across the Pacific Ocean. By the time it reached the west coast of the USA, the amount of radiation was really at absolutely minuscule levels. So, again, it’s not, I think, good science to say “no risk,” but it’s quite fair to say “absolutely minimal risk.” I think we need to bear in mind that what we’re talking about here is a risk of cancer; that’s the long-term concern about radiation exposure. And you and I, all of us, have about a 40 percent chance that at one point in our lives we will get cancer, and the increased cancer risks from this radiation exposure are really not going to make any significant difference to that 40 percent. It will be a tiny, tiny increment on that 40 percent.

Well, there are all these analogies flying around; people compare what’s going on in Japan to other kinds of nuclear radiation releases.

And so I want to see if we can do some comparison to similar situations.

The two situations that in some sense we need to compare with are the Three Mile Island situation in Pennsylvania, where there was a partial meltdown of a nuclear reactor, and the situation at Chernobyl in the old Soviet Union, where there was a fire and a large amount of the core of that reactor was actually emitted into the environment. Those are actually two extremes that we can think about. In the Three Mile Island situation, arguably, there were no health hazards associated with the really pretty small releases of radiation. Now, some people have certainly argued about whether there were any health hazards, and there probably were, but they were small enough that you couldn’t detect them in a study, again because we have this problem that 40 percent of us get cancer anyway; it’s really hard to detect a tiny increase in that 40 percent. So I suspect there probably were some, but I don’t think they were in any way detectable with epidemiological, population-based studies. So that’s one end of the spectrum. The other end of the spectrum is the Chernobyl situation, and it’s very, very different in terms of scale. Chernobyl was perhaps a million times as big as Three Mile Island, so a million times as much release, and there we know there were health consequences. One unfortunate thing is that the population studies that we should have done around Chernobyl haven’t been done, and the reasons for that are various: the breakup of the Soviet Union; different countries, Belarus and Ukraine, having different desires; economics, how much money these things cost. So what we’ve ended up with is studies of only a couple of cancers. We’ve studied thyroid cancer, we’ve studied leukemia, and both of those certainly have shown an increase in risk. But the more common cancers that really should have been studied haven’t been studied. That’s a very unfortunate situation, so we’ve lost an awful lot of information that we might otherwise have gotten about the effects of these incidents. So those are the two extremes, Chernobyl and Three Mile Island. Where does the current Japanese situation fit in? Well, in terms of the amount of radioactivity released, it’s very much closer to Three Mile Island. It’s clearly bigger than Three Mile Island, but it’s nothing like a Chernobyl-scale situation. So it’s 95 percent of the way toward Three Mile Island, if you will.
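A quick illustration of Brenner’s point that small excesses are undetectable against the 40 percent baseline: a standard two-proportion sample-size calculation shows why. This is a minimal sketch; the 0.1-percentage-point excess is a hypothetical round number chosen purely for illustration, not an estimate for Three Mile Island or any real incident.

```python
from math import ceil

# Sketch: how large an epidemiological study would need to be to detect a
# tiny excess cancer risk on top of a ~40% lifetime baseline.
# The 0.1-percentage-point excess below is hypothetical, for illustration only.

Z_ALPHA = 1.96  # two-sided significance level of 0.05
Z_BETA = 0.84   # 80% statistical power

def n_per_group(p_control, p_exposed):
    """People needed in each group (exposed and unexposed) for a standard
    two-proportion comparison to detect the difference."""
    variance = p_control * (1 - p_control) + p_exposed * (1 - p_exposed)
    return ceil((Z_ALPHA + Z_BETA) ** 2 * variance
                / (p_control - p_exposed) ** 2)

# Detecting 40.1% vs. 40.0% lifetime cancer incidence:
print(n_per_group(0.400, 0.401))  # roughly 3.8 million people per group
```

At roughly 3.8 million people per group, an excess of that size is effectively invisible to a population study, which is exactly why Brenner says any Three Mile Island effect, if real, could not be detected epidemiologically.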

Let me ask you a little bit more about Chernobyl. There’s a lively blogosphere debate right now, you’re probably aware of it, about how many people died from, or will die from, Chernobyl, and the estimates range, from what I’m seeing, from something like 6,000 to one million. What can we believe?

Well, I have to put myself down as an agnostic in this situation, and I’ll tell you the background to that. The first part of the background is what we just discussed: the actual epidemiological studies have not been done, for a variety of non-scientific reasons, so we have to rely on estimates of risk. Basically, we can make estimates for people in this region, shall we say, getting these doses; people further away getting lower doses; people still further away getting still lower doses. And you can do that, essentially, for the whole population of the world. Everybody got some radiation exposure due to Chernobyl; the further away you are, the less it was. But if you start counting, the further away you are, the more people there are, because ultimately you would be counting all the people in Europe, all the people in the world. So if you do risk estimates where the number of people exposed is actually the population of the world, even though the individual risks are minute, you multiply those minute individual risks by an enormous number of people, the population of Europe, shall we say, and you’ll end up with a significant number of cancers.

Now, is that fair? Is it fair to take a tiny, tiny risk and multiply it by the number of people exposed to that risk and say, well, that’s the number of cancers you expect? The answer is: we don’t know. There’s a lively, solid scientific debate, I will say, about that question: is it reasonable to apply a tiny, tiny risk to a very, very large population? If you do that, you end up with a very large number of predicted cancers from Chernobyl. If you say, well, for people who got a minuscule dose the risk is zero, then you end up with a very much smaller estimate of the overall cancer burden from Chernobyl. So people get worked up about that argument, but it rests ultimately on a scientific question: is it reasonable to count the risks of a very large number of people being exposed to extremely low doses of radiation? And that’s a basic science question which has not yet really been answered. You can’t answer it with population studies, for the reasons we discussed; you just can’t measure population effects when they’re so small, in such a large population. The only way we can really answer it is with basic science, trying to understand all the processes that go into radiation-induced cancer and trying to make sense of what a very, very low dose of radiation does. That’s the way the question will ultimately be resolved, and I don’t think it has been resolved yet. And it’s a really important question, because the same issues are going to apply to the situation in Japan. The doses are lower, but the questions are essentially the same: we have a very large population exposed to extremely low doses. What does that mean in terms of the ultimate number of cancers produced?

Well, let me push this a little bit further. My understanding is that the World Health Organization studied Chernobyl and put a low-end estimate on the number of deaths, so they were essentially ruling out these extremely low doses to extremely large numbers. Was that a valid thing to do?

Valid is a tricky word. Was it an appropriate thing to do? That’s a hard question.

The best science that we have, I would suggest, just cannot rule out the possibility that we should really include everybody who was exposed to extremely low doses, and if you do that, you end up with quite large population cancer burdens. That being said, it doesn’t mean that the individual risk to anybody was high. The distinction here is between individual risk, the risk that any one person gets from a tiny dose of radiation, and population risk, the number of cancers that might be produced in a whole population. They’re different concepts: population risk involves the individual risk and the number of people exposed, while individual risk is just individual risk. Making that distinction is absolutely critical, and it’s one which gets lost in the flurry of debate.

But it’s even trickier than that, because it implies, on the one hand, hey, I’m in California and something happened in Japan, so I individually don’t need to worry very much; but at the same time, it gives ammunition to those who will say later, well, it killed this ungodly number of people, which will scare people in the future.

Indeed, you’ve hit the nail very much on the head. But I mean, it is fair that one should look at risk from both of these aspects. It’s important to know what people’s individual risks are, but it’s also important to understand what the consequences for a very large population would be. The analogy which I sometimes give is the lottery. You buy a ticket for a lottery; your chances of winning are minute. But lots of people are buying tickets, so the ultimate result is that somebody will win the lottery. Just because your individual chance of winning the lottery is tiny, it doesn’t follow that nobody will win the lottery; in fact, very much the opposite, somebody will. So that’s actually the argument for counting everybody: even though the individual risks are tiny, what a tiny individual risk means is that if a very, very large number of people are exposed to that tiny risk, some of them will end up with the consequence we’re talking about, in this case cancer. And it’s central to policy planning. If you want to think about the consequences of a large-scale radiological release like this one, that argument is completely central to whether perhaps nobody will get cancer from the Fukushima incident or a significant number will. And the bottom line is that we don’t have enough science to really come down on one side or the other. I would say it’s timely to point out that the one agency in this country, in the U.S., which is supporting basic science research on low-dose risks is facing being chopped because of the current financial negotiations going on in Washington. That’s the Department of Energy’s low-dose program, which is specifically there to look at the basic science behind these risk estimates and to try to bring some hard science to this key question of whether we should be thinking about extremely low-dose risks or not. Probably not a good time to be cutting that program.
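To make the individual-versus-population distinction concrete, here is a minimal sketch of the collective-dose arithmetic Brenner describes. The dose bands, population sizes, and risk coefficient are all hypothetical round numbers, not data from Chernobyl or Fukushima; the zero threshold corresponds to the “count everybody” (linear no-threshold) position, and the nonzero threshold to the “minuscule dose means zero risk” position.

```python
# Sketch of the collective-dose arithmetic Brenner describes. Every number
# here is a hypothetical round figure for illustration, not real data.

# (average dose in millisieverts, number of people exposed)
dose_bands = [
    (100.0, 1_000),          # near the release site
    (10.0, 100_000),         # the surrounding region
    (0.1, 100_000_000),      # a whole continent
    (0.001, 5_000_000_000),  # most of the world
]

RISK_PER_MSV = 5e-6  # hypothetical excess lifetime cancer risk per mSv

def expected_cancers(bands, threshold_msv=0.0):
    """Expected excess cancers: individual risk times people exposed, summed,
    ignoring any band whose dose falls below the threshold."""
    return sum(dose * RISK_PER_MSV * people
               for dose, people in bands
               if dose >= threshold_msv)

# "Count everybody" (the linear no-threshold position):
print(expected_cancers(dose_bands))                     # 80.5
# "Minuscule doses carry zero risk" (threshold at 1 mSv):
print(expected_cancers(dose_bands, threshold_msv=1.0))  # 5.5
```

Note that most of the predicted burden comes from the bands where each person’s individual risk is vanishingly small (5e-9 in the last band), which is exactly why the two positions give such different totals, and why the basic-science question Brenner raises matters for policy.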

That makes sense to me. And I’m glad you shifted this a little bit toward policy, because another thing I’m certainly hearing a lot is, you know, everyone’s so worried about nuclear power, but hey, the emissions from coal kill a lot of people every year, and that number is bigger. Would you agree with that, based on what you’ve said?

Well, let me comment on coal in a moment. But, I mean, when you consider any use of radiation in society, you need to balance the risks and the benefits. It’s true for a CT scan in a hospital, it’s true for the new scanners in airports, and it’s also true for nuclear power: one has to think about the benefits and think about the risks and see whether that balance makes sense. And the benefits of nuclear power, of course, are being able to get away from coal or oil or other things which have problems associated with them. But there are risks; that’s clear. We’ve seen the risks in Japan. So how do we make that risk-benefit balance for nuclear power? I think the way we do it is to do our very best to understand, in a numerical way, in a quantitative way, what the risks really are. The debate around nuclear power is typically from the extremes, from “nuclear power is deadly, it’s a killer” to “nuclear power is entirely safe,” and neither of those points of view is right. We need to try to have a science-based discussion, and there haven’t been too many of those. But if we’re really to understand where we’re going to go in the nuclear power world, we need to have that science-based discussion of what the real risks are, because we’re certainly at a turning point, a fork in the road, if you will, for nuclear power. One of the lessons I think we certainly should be learning from the Japanese situation is that we need to do something about our aging fleet of nuclear reactors. The Fukushima unit one reactor was of 1971 vintage, and we have those same reactors in this country, 1970s vintage, and they don’t have the safeguards that more modern reactors do. That’s what happened when there was a total station blackout: all the power inside the plant and all power outside the plant went out, and the reactor couldn’t cope with that situation. And we know the consequences. More modern designs really don’t have those issues, or they’re very much reduced, in that they can continue to cool even with no power of any sort; they have, for example, gravity-driven cooling. So we’re clearly going to have to do something about our old reactors. We’re either going to have to move away from nuclear power to a considerable extent, or we’re going to have to replace those reactors with modern, safer reactors. And making that decision should be science-based; it shouldn’t be based on rhetoric.

It seems one conclusion, then, would be that when we think about risk, we really ought to try to tweak the risk, and maybe this is obvious, tweak it by changing how safe the reactors are, rather than, you know, trying to count it up afterwards if something goes wrong.

Well, we need to do both. I mean, yes, you need to design safer reactors, but nobody is going to design a, quote, perfectly safe reactor; there’s nothing in this world which is perfectly safe. So one is always going to have to consider the possibility of an accident, and again, what the consequences of that would be, I think, has to be built into the argument about whether nuclear power is the right way to go forward in this country.

At the end of the day, where do you come out? You know, what’s your feeling on the future of nuclear power? Is it still part of the whole picture, and does it have to be?

Well, I think I’m not qualified to comment, really, on where nuclear power should go. What I can say is that in order to make a reasonable decision about where we want to go in nuclear energy, we need to understand much better what the nature of the risks would be if there were an accident, and we’ve discussed that at length today. I don’t think right now we have enough science to really draw those conclusions, and the fact that we don’t have enough science leads to the polar opposites coming into play: nuclear power is disastrous; nuclear power is perfectly safe. We really do need to produce more science, so that we have good risk estimates that everybody can buy into.

Okay. Well, on that note, Dr. Brenner, I want to thank you for being with us on Point of Inquiry.

Pleasure talking with you. 

We’ve just heard a very nuanced take on the science of radiation risk, one that does not lend support to viewpoints on either ideological extreme. Obviously, though, not everybody reacts to this issue with the thoughtfulness of Dr. Brenner. So how do we decide what we think about risk? And why do we so often get it wrong? For that, we now turn to our second interview. David Ropeik, welcome to Point of Inquiry. 

Good to talk to you.

You and I have chatted a bit about nuclear power. But more generally, why do we misjudge or miscalculate the risk of various things out there in the world?

Well, seeing it that way actually sounds more pejorative than the way I think about it. Misjudge? We do, in the sense that our judgments, our views, our feelings sometimes don’t match the evidence. People are more afraid of vaccines than they need to be, or fluoride, and less afraid of the sun or obesity than they ought to be. Yes, there’s a perception gap, as I put it in my book, between our feelings and the facts. But calling it a misjudgment or irrationality, or the lots of other ways it’s referred to, sounds more pejorative and negative, like people are dumb or ignorant or getting it wrong, than I think is fair. And that’s because this is naturally how we do it. We use partial information. We never have all the information we need, or all the time to go get it, or all the smarts to understand it all. So we take this partial set of information and we make sense of it quickly, using a bunch of evolved mental shortcuts, if you will, and instincts and feelings, kinds of filters, to categorize the information very quickly as to whether it’s threatening or not, because that’s a pretty high imperative; we want to get that decision made fast. So risk to us is subjective. Risk to scientists may be probabilistic, hard, knowable. But to you and me, and to the scientists when they take their lab coats off, it’s subjective, a feeling. So to call that wrong is kind of dismissive of people’s feelings, even though, yes, there’s a gap sometimes between what they feel and what the facts say.

I got you. And actually, I’m not in the camp that says they’re all idiots and it can’t be helped; sometimes people are too dismissive of the public, I feel. And yet, at the same time, it certainly is the case that excess fear itself, and I think you say this in the book, can cause harm.

Oh, there’s no question that the perception gap is a huge risk in and of itself. When we get risk wrong, by which I mean when our feelings about the risk don’t match the evidence in some substantial way, we can do dumb stuff, like if we’re too afraid to fly after 9/11 and we decide to go to Las Vegas in a car and hit a tree. It’s our fear that killed us, because statistically it’s way more likely that we’ll be killed driving than flying, even right after 9/11; well, especially then. But it works the other way, too. We can be not afraid enough. I weigh several pounds more than I ought to, and if I were as worried about obesity as I am, for example, about antibiotic resistance and the sun and other things, I probably would address it, and weighing more is dangerous for me, too. We can fear fear itself too much or too little, which is why I wrote the book and why we have these discussions, because understanding why we get risk wrong, what these natural feelings are that lead to the perception gap, is a first step towards thinking about risks more carefully and making healthier choices.

What about this, something I think about a lot: the television media. I mean, you could have a science journalist who does understand something about how to assess risk. But if you have some kind of crisis like what happened in Japan, and you have 24/7 coverage, even if you have people trying to be scientifically responsible, if you’re just getting blaring attention to one thing, doesn’t that in and of itself cause people to misalign their perceptions?

Oh, you’re absolutely right. And by the way, I was a television journalist, so mea culpa, I did this, but I’ve also been a science writer for The Boston Globe and other places, and it’s not just television. There’s no question that what the public knows about the world beyond their own personal experience largely comes from the information media; that’s a broad thing these days. And if the information media first start with breathless alarmism, oh my God, the radiation cloud is coming from Japan, and then cautiously say, but it’s not a risk, well, the breathless alarmism is what catches people’s attention and sends people running for their iodine pills. So it’s absolutely true that the media magnify our perception gaps. They pay more attention to the risks that are scary, because as a reporter I wanted to be the lead story, and my bosses wanted to sell tomorrow’s newscast; for both reasons, the attention-getting or dramatic or frightening aspects of a story would get played up, and the ameliorating or neutral or reassuring ones would get played down or left out, not covered at all. And we ended up with a Chicken Little view of the world. We have a Chicken Little view of the world, in my opinion. It’s a scarier place, according to the media, than I have found it to be, living my 60 years so far. Yeah, the media make it worse.

Another factor that I’m conscious of is that the discourse of, let’s call it the discourse of risk and reason, can itself be really political. I mean, telling people not to worry, depending upon how you do it, you might want to sound like you’re being scientific, but it could be ideological. In global warming, the people who don’t want us to worry call those of us who do worry alarmists.

Oh, you’ve captured something that’s really fundamental here, Chris, and it’s wonderful that you did. The views that we hold on risk, which is what we’re talking about at the moment, but on politics and other things too, although we argue them rationally, based on the facts, are in fact reflective of our selective interpretation of the facts, so that our views end up agreeing with those with whom we identify. Because as social animals, our survival depends on the strength of the group and the group’s acceptance of us as a member in good standing, and the more we get along with the group and support its views, the stronger it gets and the more it accepts us. So we can look at the same facts on climate change or nuclear radiation or whatnot and argue them until we’re blue in the face as though we’re being rational and fact-based, but goodness, at the end of the day it’s the same data; we’re just interpreting it differently because of these underlying drivers that, subconsciously to most of us, shape our views into the positions that come out in the end. And survival being as important as it is, once we’ve made up our minds about things like risk and danger and what threatens us, we’re really reluctant to open our minds to another view, which might suggest, A, we’re wrong; B, our tribe is wrong; and C, our view being wrong means we might be in danger. So getting people to change their minds about risk is really hard. And one other point, if I may. You wisely point out that there’s a big ideological battle between people who are supposedly thoughtful and fact-based and rational, and people who are emotional and, pejoratively, stupid and wrong and ignorant. And what I say in the book is that’s an ideological battle that’s used by the right and the left over environmental issues or economic issues and whatnot. The reality, from the social sciences, is that the human animal is both affective, that is, uses emotion, and uses facts, and we need both of them to be rational, to conform our behavior to social norms. Arguing that it should be one or the other, this post-Enlightenment, Cartesian argument that reason is supreme, that our cognitive brain is so powerful that we can think our way through anything, is hubris. As Ambrose Bierce once put it, the brain is the organ with which we think we think. We feel, too. And saying it should be one or the other just misses a body of evidence that it’s both.

Well, this point you’ve made about affect, which I guess is the more technical term for emotion, a term my shrink used once, and also, you know, the subconscious: it suggests that sometimes we’re reacting to these things before we even know we’re doing it. My question is, yeah, I know we’re not the Cartesian rational animal that they wanted us to be, given how we evolved two hundred thousand years ago. But is it the case that the emotion actually leads the reasoning down the right path?

More often than not, it does, surprisingly, although with risk that’s getting to be less and less true. The reason is that the characteristics of a risky situation that ring our alarm bells applied more consistently to simpler risks: the dark, bad guys with clubs, snakes, and not risks that involve as many trade-offs, as many scientific complications. We didn’t live in societies that were this large, where our safety depended as much on government. So trust, which always mattered to social animals, trust in government, is now playing more of a role in a lot of modern risk issues. So far we’ve done pretty well; we’ve survived with this system, haven’t we? But the more complicated, modern, technological, science-based risks are now calling for a more cognitive, if you will, rational, fact-based analysis, and we’re not very good at it. What we are good at, at least, is the science of knowing we’re not very good at it. We can see that the gap between fear of vaccines and the evidence is potentially dangerous to us: in places where vaccination rates go down, nearly eradicated diseases go back up. We can see that nuclear radiation has pros and cons, but that compared to coal it’s a no-brainer, and that more coal and fewer nukes is bad for our health. And we can see why nuclear scares us, for a bunch of reasons. It’s invisible, and that leaves us powerless, and that’s scary. It’s manmade, as opposed to natural radiation from the sun, and manmade risks are scarier. And it’s imposed on us, and we don’t like a risk that’s imposed on us, whereas when we go to the doctor and get a chest X-ray, no problem, right? So it has a bunch of characteristics that coal doesn’t have, and it scares us. We can see, as smart, thinking beings, that that’s not good for our health, and we can start to use our self-awareness of getting risk wrong as a way of trying to overcome the emotion-based system that we have. But it’ll only carry us so far because, as I said, we’re a mix of fact and feeling.

Now, I got you on that. I worry, too, that modern decision making, technocratic decision making on issues of environmental risk, which combines science, uncertainty, and politics, is not something that you would necessarily expect us to evaluate well subconsciously and emotionally.

They’re fraught issues, absolutely, and they evoke very powerful subconscious worldviews.

I think it’s worth noting, for example, that the powerful movement for the precautionary principle, let’s be uber-precautionary about things that we don’t know about and not let them into the marketplace until we’ve tested them or looked for alternatives, which makes sense, is usually or initially invoked only on environmental issues. It’s not invoked for trains, airplanes, guns, and other things. That’s, in my opinion, because the people in the environmental movement have a particular concern about things that have been let out in the past. Okay, I’m not arguing for or against precaution, but what I am pointing out is that the argument for precaution is coming from some deeper worldview about environmental threats. We can’t overcome this. We are affective beings. We’re hard-wired. There’s brain science, the first chapter in my book is about this, that ensures that affect and instinct and feeling, all these subconscious first reactions to risk, happen before we even think about things. A little early, before the data even gets to the outer layer of the brain, the cortex, where we think about it, the feelings part has already kicked in and said, warning, danger. And then in the ongoing reaction, there’s much more power from the affective, instinctive, feeling parts of the neural system than there is from the thinking one. Are we doomed? Andy Revkin has a great name for this: we have an “inconvenient mind.” And that’s the bad news. The good news is we can see its dangers, and we are smart enough, I hope, in recognizing that, to start to realize that the risk of the perception gap is a big risk that we have to deal with all by itself.

And on that note, I’ll just ask you a closing question. What do you tell people to do if they want their perception of risk to come more into line with what we know? Are there some simple steps that can be taken?

Yeah, I think so. The first one is, I use the metaphor of a seatbelt. When we drive, most of us put on a seatbelt, knowing that we’re entering into a risky circumstance and the seatbelt reduces our danger. We have a tool; we don’t have to think about it, most of us, we just do it. In the same way, we should recognize that the kind of instinctive, I’ll call it that, instinctive system that we use to respond to risk can sometimes lead to judgments that feel right but don’t match the facts, in dangerous ways. That’s like going out driving in a dangerous circumstance. The knowledge, first, that sometimes our instincts lead us to get risk wrong, and second, of specifically what the characteristics of a risky situation are that lead us to get risk wrong, which I’ve laid out in my book, and there’s lots of other writing on this, that knowledge is like a seatbelt when you’re going out to make a risk decision. If you know that you get it wrong, and you know why, you can protect yourself a little bit from the perception gap, from getting risk wrong, and make healthier choices. So know that you do it, know why, and you can do it, that is, judge risk in a healthier way that’s right for you.

Well, that sounds great. And on that note, David Ropeik, it’s been great to have you on Point of Inquiry.

I’m glad to have contributed. Thanks. I hope it helps. 

We started this show with some extreme media moments. I hope we’ve ended in a very different place, with introspection, not just about the nuclear issue, but about why we get these kinds of things wrong across the board. Why does that matter? Because when it comes to scary but also highly technical issues exploding into the media, there’s always going to be a next time. I don’t know what it will be after radiation, or when that moment will arrive. But I know that it will. And I know this also: as journalists, as citizens, we can calm down, we can think, we can do better, even with these most difficult, complex, and emotional of issues. On that note, I want to thank you for listening to this episode of Point of Inquiry. To get involved in a discussion about the risks of nuclear radiation and our nuclear future, please visit our online forums by going to centerforinquiry.net/forums and then clicking on Point of Inquiry. The views expressed on Point of Inquiry aren’t necessarily the views of the Center for Inquiry, nor of its affiliated organizations. Questions and comments on this show can be sent to feedback@pointofinquiry.org.

Point of Inquiry is produced by Adam Isaac in Amherst, New York. Our music is composed by Emmy Award winner Michael Whalan. Today’s show also featured contributions from Debbie Goddard. I’m your host, Chris Mooney.