Interview:

ROBERT TRIVERS


Professor of anthropology and biological sciences at Rutgers University, Dr. Trivers was awarded the prestigious Crafoord Prize in biosciences in 2007 by The Royal Swedish Academy of Sciences. The academy chose Dr. Trivers “for his fundamental analysis of social evolution, conflict, and cooperation.” World renowned among scientists, Dr. Trivers’ work is frequently cited in books by contemporary science icons Steven Pinker, E. O. Wilson, Richard Dawkins, and many others.


Dr. Trivers is brilliant, tough, and has made numerous contributions to human knowledge. The interview primarily covers his important work, The Folly of Fools: The Logic of Deceit and Self-Deception in Human Life. If you want to better understand human nature, and the modus operandi of DNA vehicles in general, please read Dr. Trivers’ book.

(See excerpts from the book above in Book Excerpts 2)



Interview:

 

I’m a little nervous about this interview because you already know I’m full of it, by definition.

 

(Laughs) I wouldn’t have said that.

 

 I would have understood if you had. Can you talk about deception in nature?

  

Deception is a deep and widespread feature of nature at all levels. Viruses are fooling us. HIV is constantly changing to fool us, just like the camouflage of an octopus. When an octopus is put in front of a background where its camouflage doesn’t work, it will shuffle through images of itself about every three seconds, so that just as the predator recognizes it, it has morphed into a new form. It has about 60 or 70 camouflage images it can try.

 

That’s amazing.

         

It is amazing. And it’s kind of parallel in logic to what the HIV strain does. So deception, camouflage, mimicry between different species or within a species is pervasive. Self-deception, which is really what my book is about, is less studied in other species.


Why do we self-deceive?

         

The general argument of the book is that the ability to self-deceive improves the ability to deceive others. But self-deception has a cost. It puts you out of touch with reality, and that can turn around and bite you in various ways.

 

Regarding self-deception, you write about people picking out their photo that was altered to be 20% better looking.

         

Yes, that’s one of the best examples of self-deception. In general, people place themselves in the top half of various categories more than fifty percent of the time. So maybe 70% of people will say they’re better than average in looks. Of course we know that can’t be possible. But is that just the mouth talking, or do they really think they’re better looking? That’s where the work of Epley and Whitchurch is so nice. They created an experiment where they take a picture of you and morph it to look 20% better, and again to look 20% uglier. The next step is to have you pick out your face from a set of twelve photos of people of your age and sex. Your photo is one of the twelve. The difference is, some of the time the photo is the real you, sometimes it’s the uglier you, sometimes it’s the better-looking you. The question is, which “you” do you see first? The answer is that you see the better-looking you first, then the real you, and last and slowest, the ugly version.

 

So it’s more than the mouth blabbing. People self-deceive about their looks.

         

They do.

 

You also write about overconfidence.

         

Yes, and the sad thing there is that overconfidence and knowledge are poorly correlated. For example, it’s been noted in the field of medicine that compared to newer physicians, senior physicians are both more likely to be wrong and more confident that they are right. This has turned out to be a real problem because, of course, senior physicians have the most power. Hospitals used to have a problem with major infection in surgery from the hands of the surgeon. This was well known—the importance of reducing bacteria levels in surgery—but the problem was you couldn’t get the surgeons to wash their hands. The nurse would say, “Doctor, I think you ought to wash your hands.” And the surgeons would respond, “What are you talking about, get me the scalpel,” or something. A system was instituted to empower the nurses. A special line was installed that allowed direct communication with the head surgeon. All the nurse had to do was to call the head surgeon and have him or her say, “Wash your hands!” The rate of infections plummeted by almost half an order of magnitude—a huge positive effect. (Ed. Note: See Pronovost and Vohr 2010.) It was just a matter of having the “lowly” person’s opinion count. They are viewing things less through the lens of ego than the more powerful person, and often more accurately because of that.


There is also some evidence in the judicial system with regard to eyewitnesses and overconfidence. Some witnesses are more likely to be mistaken, and, simultaneously, more confident that they’re right. Confident people make better witnesses. They’re more believable to the jury. So let’s say you and I are both witnesses to a crime. I’m uncertain of exactly what I saw and I acknowledge my uncertainty. You’re certain, but you’re wrong! Your certainty is what is noted by the jury because they don’t have any independent information as to what’s right or wrong. So if one witness is hesitant and the other confident, the confident one sways the jury.

 

Sure. Surgeons not washing their hands fits with what you wrote about power, that people in positions of power are almost instantly corrupted by that power.


This is the new social psychology that I like, where they get away from all this question-and-answer stuff and actually do something clever. So in this case, they use a kind of con: a device where you give the organism some stimulus, called a prime, which causes a response. You measure people’s behavior and perceptions right after administering the prime to see its effect.


The prime is almost unbelievably simple. They bring you and me into a lab, and they have you remember a situation in which you were in power. They have you elaborate: what did you feel, what did you do, etc. They want you to feel the sense of power. Then they let you determine how many M&Ms the other group members get. With me they do the opposite. They have me recall a situation where I felt powerless, then ask the same questions, how did I feel, etc. Then they have me write down how many M&Ms I hope to get.


The priming has the following effects. If I’m asked to write an E on my forehead, there are two different ways I can write it. I can write it so I can read it, or I can write it backwards so others can read it. Sure enough, the more powerful you feel, the more likely you are to write the E so it represents your own vantage point. You can read it, but it’s backwards to others. Conversely, the weaker you feel, the more likely you are to write the E so others can read it. But writing E’s on foreheads ain’t what life’s about.

         

So what are the real consequences? The real consequences are that if you’re made to feel more powerful, your ability to recognize the various facial expressions is diminished. The powerless read facial expressions more accurately—and they remember them better too. So power is inducing a kind of social blindness in which you’re seeing reality less clearly. That’s part of an egocentric, on-top-of-the-world approach. The powerless person, on the other hand, has got to pay attention to people’s facial expressions because those other people have power over them.

 

In the book, you cite studies that have found language, ethnic, and religious diversity to be partly a function of parasite load.


Yes. Basically the theory is that when there are more parasites in a region, that is, a greater parasite load, the people outside the group represent a dimension of threat because they are more apt to have parasites that members of your group are unfamiliar with and may lack immunity to. It’s not coincidental, I think, that out-group members are often described as parasite-ridden—covered with flies, they stink, etc. So the theory is that with a higher parasite load you’re going to be more inward-turning, more ethnocentric, and less likely to want to interact with neighbors. There’s more fragmentation, so languages and religions are more split up. The linguistic data are pretty strong. There are many more languages in areas with greater parasite loads. Religious diversity goes along with that division. I find that a lot of colleagues who aren’t used to thinking in terms of parasite load are skeptical that an underlying variable like this could have these ramifying social effects. But I’ve studied the literature produced by these people, paper after paper, and the correlations seem to me to be very strong.

 

You write about free will and consciousness in the book. Can you expand on that a bit?

         

Regarding free will and consciousness, I’ve always thought that a lot of the controversy comes from how you define the words. What’s free will? I don’t exactly know what people mean by free will, but if they mean that we have the capacity to look at our past behavior and readjust it, as I’m sure we’ve been selected to do, then I have no problem with that definition of free will.

         

The key thing about the neurophysiological evidence is that the conscious mind is running behind reality. When an event occurs, it takes a half second before it fully registers in consciousness. Events register unconsciously in a twentieth of that time. During lectures I will take out a pen and throw it across the room to make a point. We know that the impulse to do that started a good second to as much as eight seconds before I actually threw it. It looks like I’m making a point by throwing the pen, but in fact I was already planning that unconsciously. Consciousness is more like an observer after the fact, but it appears to have veto power. For example, you’ll start an action, it’ll be a second or so before your conscious mind is aware that you are now thinking of throwing the pen, and you then have a period of half a second where you can veto the action.

         

I think in general people don’t understand consciousness, and assume that one of our species’ big advantages is that our conscious mind is in charge, that we have significant control over events, and so on. The fact of the matter is that our conscious mind is not sitting there running things. The bigger part of the brain is running things and it transmits all kinds of biased information to the conscious mind, presumably in service of trying to fool others.

 

Is there anything else you want to touch on?

         

One of the deficiencies in my book is that I never provide a good catalog of the costs of self-deception. I was painfully aware of that when I published it. There should have been a taxonomy of the costs of self-deception. What are the social costs of self-deception? For example, if you’re the only one who doesn’t realize you’re a jerk but everyone else thinks you are, your self-deception costs you socially.


Self-deception also increases one’s vulnerability to con artists and so on. The con artist is reading the mark’s self-deceptions and playing them like a musical instrument. The immune system effects of self-deception are fascinating and subject to alternative interpretations, but basically, following the work of Jamie Pennebaker, I argue against the suppression of trauma. Is it a good idea to hide, to suppress trauma? The answer is no. The more you hide trauma from other people, the more you suffer in immune costs; your immune system is weakened.

I’m often asked if cultures vary in their degrees of self-deception. Well, by logic they must, but we don’t have much evidence. One thing that I write about on my Psychology Today blog (see my website, RobertTrivers.com) is a study across 15 countries showing that the more wealth inequality there is across a society, the more self-inflation occurs. Let’s say wealth is equal across a country and everyone has the same amount. Well, what’s the benefit of self-inflating in the economic sphere? Not much. But if there’s a great differential in wealth, convincing people that you’re better than you are may yield benefits. There’s a reason to self-inflate.

 

Are you an atheist?


When I’m asked on a scientific basis if I’m an atheist, I say no. The only scientifically defensible position is to be an agnostic, because you can’t prove either half of the equation. Richard Dawkins is very skillful on the topic of religion and says some very funny things. A favorite of mine is: “We are all atheists about most of the gods that humanity has ever believed in. Some of us just go one god further.” Believers have all the other gods of the world nailed down except their own, which they then cling to.


I certainly don’t believe in intercessory prayer. The idea that god is willing to change all these universal laws of nature on a minute-to-minute basis due to begging behavior by humans is, I think, utter nonsense.

         

As I mention in the book, the Templeton Foundation funded a large, double-blind study at six different institutions, including Harvard Medical School, on the effectiveness of intercessory prayer. They found that being prayed for had no beneficial effect whatsoever. However, being told that you were being prayed for had a negative effect. It’s what they call not a placebo effect but a nocebo effect.

         

On every single measure for how well you were doing the month after the operation, being told you were being prayed for made you worse. So if you’re in bed and people come in and tell you, “We’re all praying for you,” you think, “Oh man, this is even worse than I thought.” Or, is it that instinctively you know that being prayed for isn’t doing anything for you? When they visit, why don’t they say, “I’m feeding your dog,” or “I cancelled your subscription to the New York Times,” you know, practical things, rather than prayer?