How We Know What Isn’t So by Thomas Gilovich

There is a plague of illogical reasoning today. Mr. Gilovich says this is because “. . . there are inherent biases in the data upon which we base our beliefs, biases that must be recognized and overcome if we are to arrive at sound judgments and valid beliefs.” The cost of these biases is real and severe. We are naturally prone to wrong thinking, and this book shows us how to combat it.

Points Mr. Gilovich made:

1. Seeing Order in Randomness – We have a natural tendency to see order in data, even when the data is totally random and irregular, and even when we have no personal reason to see order. When we remember past events, our memory plays tricks on us by emphasizing any possible patterns and forgetting the irregularities that might refute them. For instance, basketball players often think that if they make one basket, they are more likely to make the next – that baskets come in streaks when a player is “hot.” However, objective statistical studies show that, if anything, the opposite is true.

This natural tendency to misconstrue random events is called the “clustering illusion.” Chance events often seem to have some order to them, but when the law of averages is applied objectively, this order disappears. This error is compounded when our imagination tries to create theories for why there should be order. Because of this, we need to be careful when we draw conclusions based on a sequence we see in some data.
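The book itself doesn’t include simulations, but the clustering illusion is easy to demonstrate. Below is a minimal Python sketch (the trial count, sequence length, and streak threshold are my own illustrative choices, not Mr. Gilovich’s): it flips a fair coin twenty times, records the longest streak of identical outcomes, and repeats many times. Even though every flip is independent, streaks of four or five in a row show up routinely – exactly the kind of “order” our minds latch onto.

```python
# Illustration only (not from the book): even a perfectly fair coin
# produces "streaks" that look like meaningful patterns.
import random

def longest_run(flips):
    """Return the length of the longest run of identical outcomes."""
    best = current = 1
    for prev, nxt in zip(flips, flips[1:]):
        current = current + 1 if nxt == prev else 1
        best = max(best, current)
    return best

random.seed(0)                  # reproducible example
trials = 10_000
flips_per_trial = 20            # roughly one game's worth of shot attempts

runs = [longest_run([random.choice("HT") for _ in range(flips_per_trial)])
        for _ in range(trials)]

print(sum(runs) / trials)                   # average longest streak (around 4-5)
print(sum(r >= 5 for r in runs) / trials)   # share of sequences with a streak of 5 or more
```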

2. Tendency to Look for Confirmation – We have a natural tendency to look for “yes” instead of “no.” If we have an idea, we tend to look for evidence that confirms our idea, not evidence that will disprove it. This is true even if we have no personal attachment to the idea.

Some researchers believe this tendency results from our need to take an extra neurological step when we try to understand negative or disconfirming evidence, as contrasted with positive or confirming evidence. Every negative proposition may need to be translated into a positive one for us to understand it. Therefore, we subconsciously look for easy positives and not difficult negatives. This tendency makes for bad objectivity and bad science. If we want to do good science, we need to train ourselves to look for negative evidence that contradicts our ideas.

3. Hidden Data – When we search for evidence, there is often data that we unintentionally overlook. For instance, if we get a bad first impression of a person, we may avoid them, and by avoiding them, they never have a chance to show us a better side of their personality. But if we get a good impression, we may decide to get to know the person better, and thereby gather more positive data, and falsely confirm in our mind that first impressions are meaningful. Often the way we collect data filters out important types of data, and this causes us to confirm our wrong ideas. We need to pay attention to how we may be seeing only a distorted side of an issue.

4. Mental Corner-Cutting – We all cut corners with our minds. We often use mental strategies – inductive generalizations – to understand the world around us more quickly and easily. These strategies are very useful, but they come at a cost: they can cause systematic errors in our thinking.

5. Objectivity is Not Always Useful – We shouldn’t expect everyone to reevaluate their beliefs every time a new piece of evidence comes along. “Well-supported beliefs and theories have earned a bit of inertia. . .” However, we should draw a distinction between a belief that is well supported by evidence over time, and a belief that only has traditional or popular support. Some scientists believe the complex mental processes that give us biases and preconceived notions are some of the same processes that make us intelligent beings – superior to computers or animals. Our biases are useful, but dangerous.

6. Reinterpreting Evidence – When people are presented with ambiguous information, they interpret it to support their established beliefs. When people are presented with unambiguous information that contradicts their beliefs, they tend to pay close attention to it, scrutinize it, and either invent a way of discounting it as unreliable, or redefine it to be less damaging than it really is.

For instance, gamblers tend to remember their losses very well – better than their winnings – but they remember those losses as “near” wins that provide clues on how to win next time. Gamblers aren’t the only ones to do this. Christians do too, as do scientists, presidential candidates, and insurance agents.

7. Removing Biases – Science is basically the systematic attempt to remove biases as we search for truth. Nobel laureate Linus Pauling said that to be a good scientist, “. . . you need to have a lot of ideas, and then you have to throw away the bad ones.”

8. Remembering Selective Evidence – Charles Darwin once said that he “. . . followed a golden rule, namely that whenever a new observation or thought came across me, which was opposed to my general results, to make a memorandum of it without fail and at once; for I had found by experience that such facts and thoughts were far more apt to escape from the memory than favorable ones.”

In fact, this does not always occur. People do not necessarily remember only the evidence that supports their beliefs. Rather, they remember events that cause them pain or difficulty, events that they predicted, or events that otherwise drew their attention. They forget events that follow the normal course of things.

For example, some people think that they always end up needing things that they threw away. But this is only because they remember the things they threw away and later needed, while forgetting the many more times they threw something away and never needed it again. Another example is how people often say they wake up and their digital clock reads something like 1:23 or 12:12. This seems to be more than a coincidence. However, they are simply forgetting the many more times they’ve woken up and the clock read 3:54 or 10:17. Certain types of events stick in our memory. We need to be careful that our selective memories do not bias our thinking.

9. The Wish to Believe and the Lake Wobegon Effect – The vast majority of people think of themselves as above average in qualities that they think are important. This is called the “Lake Wobegon Effect” after the fictitious community where “all the women are strong, all the men are good-looking, and all the children are above average.”

For instance, a survey of high-school seniors found that 70% of them thought they were above average in leadership ability, and 60% thought they were in the top 10% of likeable people. And 94% of college professors think they are better than their colleagues.

One way that people try to confirm their beliefs is to search for evidence until they find something that supports them. They may do a very detailed, in-depth study of something, but they do not stop when they uncover evidence against their beliefs; they continue on and stop only when they’ve found enough evidence on their side to relieve their conscience.

When looking at evidence that supports what we believe, we only ask that it leave the door open for our beliefs. But when we find evidence that contradicts what we believe, we hold it to a higher standard and ask that it prove its findings beyond a reasonable doubt.

For example, people who believe in a particularly stringent health diet may look around for evidence that their diet is working, while people who eat more permissively find solace in studies that say it doesn’t matter what we eat. Conservatives tend to read conservative periodicals and not liberal ones, and therefore they are only exposed to evidence that bolsters their convictions. Liberals do the same. What we need is to search in an even-handed way for both supporting and contradicting evidence, and to weigh the two objectively.

Beliefs are like clothes: sometimes we go for what’s on sale.

10. Telling Stories – Much of what we know about our world we heard from others. But second-hand information is often simplified and “cleaned up” as it is told. As we relate stories, we often exaggerate them, or make them happen to a friend instead of an unknown person, or try to make the story more understandable. We do this subconsciously because we want our audience to be entertained or impressed.

As a result, we need to evaluate the stories we hear by: (1) considering the source of the message, (2) putting more credence in actual statements of fact than in predictions, (3) scaling estimates down by accepting the less drastic of two numbers offered to us, and (4) not allowing our personal feelings toward someone to deceive us into thinking that they are an example of a widespread phenomenon.

11. Correction from Others – Our friends and acquaintances can bring an objective perspective to our habits and beliefs. For instance, young children are good at correcting silly behaviors in each other, such as a funny way of walking, eating with your mouth open, or the belief that chalk is made from dried bird droppings. But as we get older, we tend to associate with people who agree with us or share our habits, and therefore we no longer receive these useful corrections. As a result, if we adopt a defective belief, we may never receive the correction we need.

12. Strategies – If we humans have innate tendencies to reason wrongly, how can we combat them? We can train our minds to compensate for these shortcomings. (1) We should be aware of how our minds try to see order even when there is none. (2) We should be aware of how our minds remember and forget things in very biased ways. (3) We should actively search for data that we may have missed, and especially for data that contradicts our theories or beliefs. (4) We should ask ourselves: how would someone who disagrees with me look at this data? (5) We should remember that the stories we hear may come from an unreliable source, or may be exaggerated by the storyteller to make a point.

As we understand more about our erroneous beliefs, we can put more faith in the beliefs we have validated.

Conclusion

These observations apply to the conservative Christian community as much as to the rest of the world. We desperately need leaders who will look at their own beliefs with the same critical eye that they turn on the “liberal media.” I’ve never found a book like this one in the conservative community, and this makes me ashamed. Mr. Gilovich is not a Christian. Why can’t I find Christian leaders who will take a stand for self-criticism the way Mr. Gilovich does in the secular world?

I’m thinking right now of several leaders in the conservative Christian community – leaders who have done a great deal of good for strengthening the Biblical family and culture, but who have torn down sacred standards of reasoning in the process. Let’s not try to promote good ideas at the expense of our standards of reasoning. For instance, it’s hard to get creationists to admit the evidence that contradicts creationism. They like to think that all the evidence is in their favor. It’s hard for me to admit this myself. But if creationists were more public about the problems with their theories, more people would be impressed with their objectivity and reliability.

The challenge I have for myself is to become more aware of how I am reasoning, and be honest enough to acknowledge the errors I may discover there.
