J.D. Trout – The Empathy Gap

September 04, 2009

J.D. Trout is a professor of philosophy at Loyola University Chicago, and an adjunct professor at the Parmly Sensory Sciences Institute. He writes on the nature of scientific and intellectual progress, as well as on the contribution that social science can make to human well-being. He is the author of Measuring the Intentional World, and co-author of Epistemology and the Psychology of Human Judgment. His most recent book is The Empathy Gap: Building Bridges to the Good Life and the Good Society.

In this interview with D.J. Grothe, J.D. Trout draws distinctions between empathy and sympathy. He talks about the “empathy gap,” a set of natural, evolved limits on empathy, and how these limits negatively affect society, such as when people experience difficulties trying to empathize with others who are religiously, culturally, or psychologically different from themselves. He talks about how the results of empathy can actually be crippling for an individual. He talks about how we should use new research in the social sciences to overcome the empathy gap. He explores whether new social science questions the basic capitalistic assumptions of the American Dream and also calls into question basic philosophical concepts, such as free will.

He explains how new social science research supports the Enlightenment outlook. He details a number of well-researched cognitive biases that lead people to make bad decisions, such as base-rate neglect, the overconfidence bias, the omission bias, the hindsight bias, and the availability bias, among others. He shares his skepticism that education about cognitive biases, or the adoption of “inside strategies,” can diminish the negative effects of such biases. He proposes that society adopt “outside strategies,” in which the government or other institutions adopt policies designed to help the public overcome their cognitive biases, because he argues individuals will not be very successful on their own in counteracting them. And he explores to what extent these kinds of institutional or governmental strategies and policies are “social engineering.”





This is Point of Inquiry for Friday, September 4th, 2009.

Welcome to Point of Inquiry. I’m DJ Grothe. Point of Inquiry is the radio show and the podcast of the Center for Inquiry, a think tank advancing reason, science, and secular values in public affairs and at the grassroots. Before we get to this week’s guest, I want to invite our listeners to join Paul Kurtz and Lawrence Krauss and others this November 12th through 20th on CFI’s Western Caribbean cruise. It’s kind of a critical thinking and science and secularism and humanism cruise of the Western Caribbean. I’ve been on these cruises; they’re a lot of fun. More than that, there’s a great sense of community and a kind of celebration of science and reason.

So if you want more information, if this sounds like it’s up your alley, you can get more info at pointofinquiry.org.

My guest this week is Professor J.D. Trout. He’s a professor of philosophy and a professor in the Parmly Sensory Sciences Institute at Loyola University in Chicago. He’s held fellowships from the National Science Foundation, the Mellon Foundation, and the National Endowment for the Humanities. His previous books include Epistemology and the Psychology of Human Judgment, Measuring the Intentional World, and The Theory of Knowledge. He joins me on the show to talk about The Empathy Gap: Building Bridges to the Good Life and the Good Society. Welcome to the show, Professor J.D. Trout.

Thanks, glad to be here. The Empathy Gap: Building Bridges to the Good Life and the Good Society. What’s the difference between empathy and sympathy, Professor Trout?

Well, psychologists usually distinguish between empathy and sympathy by pointing out that sympathy is a kind of self-focused emotion. So when you look at another person in need, when you feel sympathy for them, you might be imagining their pain, but you’re really focused on the impact that that imagining has on you. Empathy is usually described as a more forward-looking and outward-looking emotion that inspires you to act or gives you an impetus to act.

And you’re saying in the book that we have these evolved capacities to both sympathize with others and feel their pain, and also empathize with them. But sometimes it backfires. We tend to care mostly for people who are like us, people who are close to us, proximity-wise, not so much for people unlike us or who are far away. That’s what you’re calling the empathy gap.

That’s right. And that’s an idea that was developed by a psychologist called George Loewenstein. And he looked at, as many others have since, the gap that exists between your present self and your future self, for example, or individuals in a rational state not being able to predict how they would react if they were in an emotional state. And in the book, it appears as a notion that individuals experience when they try to put themselves in the place of people who are very different from them.

So people who are religiously or culturally or psychologically very different, or say the rich empathizing with the poor, the educated empathizing with the uneducated, something like that.

That’s right. And while there’s a kind of biological basis for empathy as well, one feature of empathy is that unless you actually feel able to help someone, the result of empathy can be very crippling.

Right. It’s kind of debilitating to think about all the starving poor in the world and feel powerless to change it. So it kind of messes your mood up.

Yeah. And it actually has clinical consequences. So in Britain, there’s a kind of clinical condition that is sometimes diagnosed in hospice care workers and military hospital workers and so on, who can’t help the people in the long term. 

So the main thrust of your book is, as I’m reading it, and I should say, it’s one of the best books I’ve read. 

Oh, thanks very much. Loved the book. 

The main argument in it is that we can use social science to overcome this empathy gap. You say it’s not only possible, but we actually should. 

Oh, yeah. I mean, I think the time has come for policy that’s science-based, and that policymaking by intuition and impressionistic judgment has had its run.

These arguments in your book, and we’ll get to them in depth, seem to use science to question basic assumptions of American capitalism, or let’s say the American Dream: that the well-off, the happy, deserve their lot; that if the poor or the needy just worked harder, pulled themselves up by their bootstraps or something, they wouldn’t be in such bad situations. In other words, it’s their fault that they’re not well off. Those of us who are better off don’t have a big obligation to share the wealth or to make other people’s lives better, because they kind of deserve their lot.

Right. There’s a kind of canonical view in America about how to distribute responsibility for your social position. 

And we’re all pretty familiar with this American explanation for the way in which the goods in society settle out, and it has to do with ideals of hard work and pulling yourself up by your own bootstraps. So we’re pretty familiar with that image.

The undermining of that image is a very remote consequence of a lot of the psychological research on judgment and decision making, but it does tend to undermine that view. Along with that view comes a view about free will that suggests that people who need to pull themselves up by their bootstraps can just do it by an act of the will.

You’re saying that it’s a lot more complicated than that, though. The cognitive science, the cutting-edge stuff, suggests that there’s not some little homunculus inside of us that is in charge and not influenced by these horrible situations that face some people in society.

Yes, that’s right. I mean, on the basis of psychological science, it’s pretty clear that whether or not people make good decisions has a lot to do with the choices that are available to them, and that sometimes their decisions can improve by doing little more than limiting the number of choices available to them. But you limit them to the better choices. 

Mm hmm. Professor, the science in your book is, again, cutting-edge cognitive science and social science research. But the philosophical underpinnings are straight out of the Enlightenment, folks like David Hume and Adam Smith, who was a moral philosopher, I’d like to point out, not this kind of diehard, you know, free-market economist. So the question is, which came first for you? Were you persuaded by the Enlightenment philosophy before you found the science to back it up? Or were you a scientist who kind of said, wow, you know, these two strands of the Western intellectual tradition are now having a conversation and kind of agreeing with each other?

Well, oddly, I think they happened at the same time, or they happened more or less, at least logically, independently. My scientific background is mainly in the area of speech perception. So I used to do experimental work in spoken language processing, and that gave me the ability to read a lot of cognitive science research on judgment and decision making and be able to understand what the statistics actually showed. And it also meshed nicely with the picture that I had independently held, based on a bunch of philosophical suspicions about the overuse of thought experiments in political philosophy and so on.

So they sort of happened together.

But you’re not setting out to use science to vindicate the Enlightenment or something like that? 

No, I think there’s independent evidence for the positive aspects of an Enlightenment outlook. You know, just look at, for example, the way in which medical advances have increased lifespan and increased the quality of people’s lives while they’re living them.

The advantages of having a solidly grounded science are pretty clear. 

I’d like to let our listeners know about your blog, The Greater Good, that’s at Psychology Today.

It is. Yeah. We’ll post a link on our Web site. Great.

Professor, let’s get into some of the specifics. You not only go into the kinds of cognitive biases that lead people to make bad choices, but you get to specific policy proposals, how to fix these inequities, the empathy gap. First, let’s talk about some of these cognitive biases, these errors in reasoning. Explain to me what base rate neglect is. You talk about it in the book.

Yeah. Base rate neglect is the tendency that people have to underestimate the relative frequency of an event in a population. So one way to look at it is you get on a plane and you sit down next to somebody and you strike up a conversation, and they say, I’m going to go see my aunt for her birthday, which is November 4th. And you go, wow, that’s really strange, because my grandmother’s birthday is on November 4th. What are the chances? Well, I don’t know, probably one in two. People underestimate the relative frequency with which they run into people who have the same birthday as the birthday of somebody that they know, or even their own birthday.

Right, if you ask a group of 100 people, you know, it’s not one out of 100, in other words, or one out of 365. It’s a different numbers game.

That’s right. You know, it’s relatively probable that throughout any given day you will meet somebody like that. The question is whether you will talk with them and discover it.
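To put some rough numbers on that intuition, here is a minimal sketch, my own illustration rather than anything from the interview or the book, assuming birthdays are uniformly spread over 365 days and that the birthdays you already know are all distinct; the figures of 30 known birthdays and 20 strangers are purely hypothetical.

```python
# Hypothetical illustration (not from the interview or the book):
# how likely is a "shared birthday" coincidence like the one above?

def p_match(known_birthdays: int, people_met: int) -> float:
    """Probability that at least one stranger you meet shares a birthday
    with one of the (distinct) birthdays you already know."""
    p_miss_one = (365 - known_birthdays) / 365   # one stranger misses them all
    return 1 - p_miss_one ** people_met          # complement: at least one hit

if __name__ == "__main__":
    # Say you know roughly 30 birthdays and chat with 20 people in a day.
    print(f"{p_match(30, 20):.2f}")  # ~0.82, far higher than intuition suggests
```

The exact inputs do not matter; the point is that the base rate of such coincidences is much higher than the one-in-365 feel they have in the moment.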

Right. And this base rate neglect, this kind of discounting of the frequency of a certain event among a larger collection of events, it also fuels our inflated self-confidence that we ourselves are going to succeed in the future or do well at certain things in the future. It’s like the myth that even the assembly line worker can become the CEO if he just works hard enough. You’re saying there’s a cognitive bias involved in all of that sort of stuff?

That’s right. And certainly in the case of overconfidence, which is, you know, very robust and one of the most widely studied biases, most individuals are overconfident about their talents and skills and so on. That doesn’t mean that some people don’t have low confidence. They do. But it might actually be that even the people who have low confidence should have lower confidence when you compare it to the actual probability that they’re correct on a series of tasks. So if you think about the earlier bias, the fallacy that we were talking about, the base rate neglect case, it gets its defense from limitations on our own cognitive architecture. We can’t be tracking the probabilities of all these events as we walk through the day, so we tend to focus on some small subset of them. And the availability bias arises when we use certain kinds of heuristics to identify the important ones. So the ones that are psychologically salient are especially accessible or available to our memory. The ones that are easy to visualize, for example, are the ones that we remember. Now, they may not be the most representative, but it’s sure easier to remember something that’s vivid than to track a bunch of dreary probabilities. Mm hmm.

You’re suggesting that most people have an overinflated view of themselves, but that research coming out of cognitive science and social science, the stuff that you’re into, seems to jut up against other trends in, you know, the psychological sciences these days, like the, I don’t want to call it the positive thinking movement, but, you know, positive psychology, right, where we should esteem ourselves highly. It’s not just about self-esteem, but we should think highly of our own capacities because that belief itself will increase our capacities.

Yeah, I mean, that can be the case, and certainly some people have suggested that the explanation for overconfidence is the maintenance of self-esteem. And while that may be true, it’s not entirely clear at this point. There is also a benefit to believing about yourself what’s true, not just what you want to believe about yourself, because there may come a time when you have to make a very sober judgment about how likely it is that you’re correct.

You know, for example, about whether you should go to the doctor to take care of this lump. Or, in our evolutionary history, you know, if I’m on the Serengeti and I overestimate my ability to outrun that tiger or something, well, you know, that’s not going to be good for me, right?

Oh, you know, I mean, there’s a certain amount of resources that you expend when you make an effort based on the belief that you can do it. Now, I’m not plumping for the idea that we become pessimistic about our own abilities. And there’s only limited evidence that we can control this overconfidence using what I call inside strategies in the book. But I am suggesting that it’s worth being calibrated, and as judgment and decision making people use that term, it just means bringing your level of confidence into contact with the actual probability that you’re correct.
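As a side note, here is a rough sketch, my own toy example rather than anything from the book, of what “calibrated” means in that judgment-and-decision-making sense: group a person’s answers by how confident they said they were, then compare each group’s stated confidence with how often they were actually right.

```python
# Hypothetical sketch (not from the interview or the book): calibration
# compares stated confidence with observed accuracy.
from collections import defaultdict

def calibration_table(judgments):
    """judgments: iterable of (stated_confidence, was_correct) pairs,
    with confidence in [0, 1]. Returns {confidence bucket: accuracy}."""
    buckets = defaultdict(list)
    for confidence, correct in judgments:
        buckets[round(confidence, 1)].append(correct)
    return {b: sum(v) / len(v) for b, v in sorted(buckets.items())}

# Toy data: answers given at "90% sure" that are right only 60% of the
# time show up as overconfidence (0.9 bucket -> 0.6 accuracy).
sample = [(0.9, True), (0.9, False), (0.9, False), (0.9, True), (0.9, True),
          (0.6, True), (0.6, False)]
print(calibration_table(sample))  # {0.6: 0.5, 0.9: 0.6}
```

A well-calibrated judge would show each bucket’s accuracy roughly matching its stated confidence.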

How does this push that you’re making, to better gel a person’s confidence in his or her own capacities with their actual capacities, what’s that have to do with the empathy gap? Should we be looking at people who aren’t doing well and stop saying to them, oh, you just need to buck up, you just need to have the right mental attitude? Or is there something in between that you’re saying there?

Well, there are two separate things. You can cast this as a kind of personal observation that individuals oftentimes make. So their exercise in putting themselves in the position of another person is to say, well, if I were in that person’s position, I would do X.

Right. I wouldn’t be homeless because I would work harder. 

Right. If I were homeless, I wouldn’t starve, because I’d figure out a way to catch pigeons and eat them. I’d be clever. You know, people will come up with certain kinds of explanations for how they could cleverly avoid some kind of circumstance, and that could be a result of their own overconfidence. But I raised the point, and you can develop it, in part because it has policy implications. Overconfidence is one of the features that allows a politician to stand up on the House floor and announce to other people what the consequences are going to be if there’s a gun control law passed, or if a needle exchange program is approved, or if a health care provision gets included in a bill. And many of these sorts of provisions end up in a bill in part because there’s good research indicating the kinds of choices that people would make if they were given certain options, whereas many of the people on the House floor who are making these announcements have no advanced degree and no special expertise on the issue. And yet their overconfidence, as well as the power of tradition, allows them to hold forth on issues and present themselves as experts on the topic. Mm hmm.

What do the social scientists call the availability bias? We’re doing just a quick survey of some of the cognitive biases you talk about in the book, and then we’ll talk about some of the public policy implications.

Yes, the availability bias is the result of a heuristic, or a rule of thumb, that people use to process information. And it causes us to rely on what’s easiest to recall, and sometimes that’s what is easiest to visualize or what happened most recently. So it causes us to suppose that what we can remember or what we can visualize is actually what’s most representative. So when people look out onto the social landscape and they think about what is worth changing in the policy arena, they focus on a lot of the problems that seem especially salient, whether or not it’s a leading cause of death or suffering or anything like that. So, you know, it’s very common for school shootings to get a lot of attention. But as a matter of fact, you know, more people die from lightning strikes per year.

Now, in the case of school shootings, there might be an additional reason, of course, because these are young people who were being killed in schools and people have a special attachment to that idea of that potential being snuffed out. But the basic idea is there that, you know, people oftentimes look at what’s most salient, you know, so they think about homicide as being a major problem in society. And it is. But suicide outpaces it. Hmm. 

There are many other of these cognitive biases you cover in the book. Let’s touch on just one more, and that’s the omission bias. In all of these, as you’re talking about them, they all seem very related. They kind of build on one another.

That’s right. And in part, that’s because many of them have a common source. They result from our cognitive boundedness. The omission bias arises a lot in our moral reasoning. So a lot of times we don’t recognize that there are certain bad consequences that come from our inaction rather than our action. So, you know, the sort of classic case is people who have to take their children in to get inoculated against some kind of disease.

They’re actually doing something to bring about a certain state of affairs to improve the health of their child and make them more resistant. But if their child happened to fall in that very small but unlucky proportion of the population that dies as a side effect or becomes severely disabled as a result, then those individuals feel very guilty for having gotten their children inoculated. 

They don’t just say it was the luck of the draw.

They say, I did this, I did something. They reason counterfactually about the situation: had I not done it, then my child would be fine. And oftentimes people might feel better about being involved in a bad consequence that results from omission rather than commission. But they don’t really track those, so many omissions aren’t nearly as available. So that’s how the biases are connected.

Do you really have optimism that through education about all these sorts of cognitive biases, say we teach everyone the way the brain’s wired to kind of make it hard on us to make good decisions, well, if that’s done, we’ll all just make better choices, and that will lead to more flourishing? In other words, can we fix this, or is this just an exercise in scientists figuring out what’s going on?

Frankly, I think at the individual level, education only matters to the extent that it inspires people to develop policies that control biases. I don’t think individuals are ever going to be very resourceful at counteracting the biases that they’re prone to, and that’s, to use the terminology in the book, an inside strategy. There is some evidence in the case of the hindsight bias and the overconfidence bias, and the hindsight bias is just when you think that an effect is more predictable than it actually is, simply because it’s in the past and you know how it turned out. You can moderate people’s hindsight bias and overconfidence bias a little with certain kinds of exercises that an individual might do. And one exercise that’s been tried is the consider-the-opposite exercise. So you consider an alternative hypothesis, like, how might this situation have turned out otherwise? And if you think about very specific ways that the situation might have turned out otherwise, you become a little less certain that the effect was inevitable. The problem is that people can’t really lead a normal life while running these exercises, even these mental exercises, at every opportunity.

If you stop before every decision to think about the cognitive biases that might be leading you to make that decision, you’re going to be kind of stunted, or stuck in this dread of too many variables to think about.

Right. I mean, you could ultimately be paralyzed with indecision, or you’d be a social outcast because your conversations would be so awkward and halting, you know. So where it actually has some good effect is in controlled environments, like in jury settings, where a sheet could be handed out and instructions could be delivered.

But in order to try to teach people about overconfidence and then correct overconfidence with an educational program, it would really require a kind of social intervention that almost everyone would dislike, you know, to enter public schools and put children on a debiasing program of a sort that is both ridiculous and troublesome.

Right. Something out of science fiction or something. 

Right. Right. And so, you know, recognizing that, when you craft policy, you should begin by taking people as they are. Hmm. You develop these outside strategies, strategies that create an environment in which these biases won’t be tempted in the first place.

And this is the other big part of your book, where you talk about, not your term for it, but social engineering. You talk about kind of organizing society in a way that helps people overcome these biases without them necessarily even knowing it.

That’s right. And, you know, one point to be made at the outset, in recognition of the fact that social engineering has an eerie sound to it, is that we’re always engaged in, we’re always subjects in, grand social experiments, whether or not we want to be and whether or not the experiment is done well. So, you know, we were in a supply-side experiment in the ’80s. Oftentimes we suffer the effects of such experiments for years to come. But here is a case where, in surgical ways, on particular topics like finance and retirement and medical choices and so on, you could actually engineer circumstances, or formulate circumstances, where options are proposed to people and they can hear the clear signal out of the noise.

So these are things like automatic savings programs for people, or...?

That’s right. And not only does making the program automatic actually help people achieve their goal, but also, on the basis of psychological research, we’re pretty confident, and we hope that confidence is calibrated, that people adapt to the lower income in their monthly paycheck when they participate in an automatic savings program. So they never feel it.

So we’re touching on all sorts of subjects, economics, and we’re talking kind of big-picture economic issues, you know, capitalism versus, if not other ways to organize ourselves, other ways to moderate capitalism, let’s say, kind of this naked free-market stuff. But, you know, we’ve only scratched the surface. I’d like to have you back on the show so we can continue the conversation, just because I know you have a tight schedule this morning.

But I want to get into the actual policy proposals that you get into in the book. How’s that sound? Sure, that’d be great. Well, with that, thank you very much for joining me, Professor J.D. Trout, on Point of Inquiry. Thanks for having me.

The world is under assault today by religious extremists who invoke their particular notion of God to try and control what others think and do. One magazine is dedicated to keeping you up to date with analysis that cuts through the noise and has the surprising courage to appear politically incorrect. That magazine is Free Inquiry, the world’s leading journal of secular humanist opinion and commentary. Subscribe to Free Inquiry today. One year, six controversial issues, for $19.95. Call 1-800-458-1366, or visit us on the web at secularhumanism.org.

Thank you for listening to this episode of Point of Inquiry. For updates throughout the week, find me on Facebook and on Twitter. If you want to get involved with an online conversation about today’s show, do so at our Web site, pointofinquiry.org. Views expressed on today’s show aren’t necessarily the views of the Center for Inquiry, nor its affiliated organizations. Questions and comments on today’s show can be sent to feedback@pointofinquiry.org.

Point of Inquiry is produced by Thomas Donnelly and recorded from St. Louis, Missouri. Point of Inquiry’s music is composed for us by Emmy Award-winning Michael Whalen. Contributors to today’s show included Sarah Jordan and Debbie Goddard. I’m your host, DJ Grothe.

 


DJ Grothe

D.J. Grothe is on the Board of Directors for the Institute for Science and Human Values, and is a speaker on various topics that touch on the intersection of education, science, and belief. He is a former president of the James Randi Educational Foundation, a former Director of Outreach Programs for the Center for Inquiry, and a former associate editor of Free Inquiry magazine. He previously hosted the weekly radio show and podcast Point of Inquiry, exploring the implications of the scientific outlook with leading thinkers.