
How We Know What Isn't So by Thomas Gilovich

Discussion in 'Books' started by Anthony G Williams, Aug 1, 2008.


    A break from fiction this week, to consider Gilovich's important book, subtitled 'The Fallibility of Human Reason in Everyday Life'. As this suggests, it is a study of why we tend to believe certain things (with lots of examples of popular misconceptions from everyday life), despite the lack of evidence for them or, in many cases, solid evidence that they are not true. A couple of points to clarify before anyone starts getting defensive: the book doesn't belittle people for what they believe, it just analyses the basis for such beliefs; and little is said about religion.

    A common problem is the misunderstanding of statistics, especially probability theory, which can sometimes produce counter-intuitive results. A well-known example of this is the answer to the question "how many people do you need in a room in order to get a 50% chance that two of them will have the same birthday?" The answer is 23; and what's more, with only 35 people the probability rises to over 80%. Most people find this amazing (I did too, despite some limited experience of probability theory). One reason for misunderstanding statistics is the clustering tendency of random events. If you toss a coin and it comes up heads five times in succession, you might think this is remarkable, but it is in fact inevitable if you keep tossing the coin long enough: the "50/50" rule only holds on average over a long series of tosses.
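    For anyone who wants to check the birthday figure rather than take it on trust, here is a short Python sketch (mine, not from the book). It computes the exact probability by multiplying out the chance that each successive person's birthday misses all the earlier ones, assuming 365 equally likely birthdays and ignoring leap years:

    ```python
    def shared_birthday_prob(n: int) -> float:
        """Probability that at least two of n people share a birthday."""
        p_all_distinct = 1.0
        for k in range(n):
            # The (k+1)-th person must avoid the k birthdays already taken.
            p_all_distinct *= (365 - k) / 365
        return 1 - p_all_distinct

    # 23 is the smallest group with a better-than-even chance.
    print(round(shared_birthday_prob(23), 3))  # ~0.507
    print(round(shared_birthday_prob(35), 3))  # ~0.814
    ```

    The intuition-defying part is that what matters is the number of *pairs* of people, which grows much faster than the number of people: 23 people already form 253 pairs.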

    As a result of this lack of understanding, people experience coincidences which are well within normal probabilities and wrongly believe that something remarkable has happened, or even that they can't really be coincidences at all but must have some greater significance. This is exacerbated by the fact that humans have an inbuilt tendency to seek patterns in events, to the extent of seeing them where they don't exist.

    A lack of contrary information can lead to unwarranted beliefs. For example, a selection board which interviews candidates for an academic or training scheme may believe that they are doing a good job, because the majority of their choices perform well. But they have no way of knowing how well the people they rejected would have performed, given the chance. In fact, research into the selection process has shown that "decisions based on objective criteria alone are at least as effective as those influenced by subjective impressions formed in an interview".

    We are often misled by information we receive second hand, because of the tendency to "sharpen and level", as the author puts it. By this he means that in relaying a news item, for instance, we tend to emphasise the points which we consider to be important (or which we believe) and downplay or omit other aspects. So if a carefully-written report comes to a tentative conclusion which we agree with, but wraps this around with qualifications and caveats, we tend just to relay the conclusions, making the results appear far more definite than the report's authors intended. As people "sharpen" different aspects of information to suit their beliefs, so we get a rapid polarisation of opinions on controversial issues. Even worse, some organisations deliberately "sharpen and level" because they want to turn public opinion in their favour [popular news media and politicians are of course notorious for presenting such selectively slanted information, especially during election campaigns, but so do many organisations with agendas]. Most "urban legends" probably develop as a result of an extreme version of this, with the key points pulled out and exaggerated.

    This sharpening effect is exacerbated by the fact that if we hold certain beliefs, we are likely to discuss them only with people who agree with us, and only to read supportive publications. Our beliefs are thereby rarely challenged but instead are constantly reinforced, so we tend to end up with the view that our beliefs are naturally and obviously right. Anyone who disagrees with them must therefore be entirely mistaken and possibly downright stupid if not malevolent. This polarisation is obvious today in politics and in debates about other controversial issues. In reality, of course, situations are rarely as polarised as this: we exaggerate differences.

    A major reason for many misplaced beliefs is that notable events stick in our minds, whereas we are much less likely to remember when something did not happen. This can distort our understanding of the likelihood of particular events. For example, it is commonly believed that a previously infertile couple is much more likely to conceive after they have adopted a child. A careful analysis of a mass of birth and adoption statistics shows that there is no truth in this at all; there is no such effect. People believe that there is because if a couple does conceive after adoption it is a notable event likely to be commented on and remembered. Conversely, no-one remembers the couples who did not conceive after adoption, or those who eventually conceived without adoption (who may well not have publicised their fertility problems).

    A related issue is that if we hold certain beliefs, we are much more likely to seize on and remember any events which appear to confirm those beliefs, while dismissing and quickly forgetting any contrary evidence. Even if we do spend time examining contrary evidence, it is usually only to attack it aggressively and try to find fault with it, while we accept at face value anything which appears to support our beliefs.

    A major explanation for our beliefs is that we tend to believe what we would like to be true. An obvious example is life after death. It would be wonderful if our personalities and intelligence survived in some way after death, which accounts for some of the most powerfully-held human beliefs: most people really want to believe this. More generally, there is a yearning for order and purpose in life, a wish to believe that there is more to it than meets the eye. Many find the concept that we are here (individually and collectively) only by random chance in a vast and uncaring universe simply unacceptable. They feel that it makes them, and life itself, pointless and worthless, so they instinctively reject it, leading them to dismiss, for instance, the overwhelming evidence for evolution in favour of beliefs which have no evidential support at all.

    A belief in extra-sensory perception is also widespread (and a very common theme of SFF) but, as the author points out, no evidence for it has ever survived any objective analysis. Some promoters of the idea claim that trying to measure it prevents it from working, which sceptics might regard as a self-serving way of avoiding the need to provide any proof. There are various reasons for a belief in ESP, including a long history of plausible fraudsters and a very biased coverage in relevant news media, books and magazines (the vast majority of which uncritically support the idea), but the basic reason is probably that it's something that we would love to be true – for us to have such impressive and useful powers. I suspect that a belief in an alien origin of UFOs falls into the same category.

    A similar example concerns alternative medicine in general, and faith healing in particular. For people (especially if seriously ill) who have not been helped by conventional medicine, there is a powerful motivation to believe anyone who offers a potential cure. Examples of "cures" are seized upon as proof, ignoring the fact that the body has a potent self-repairing system and that many ailments clear up by themselves given time. Alternative medicine also often relies on plausible (but false) similarities. The classic case is the enthusiasm in some parts of the world for medicines incorporating ground-up rhino horn to use as a kind of alternative Viagra – simply because it's long and hard and stands up all the time. More controversially (because it concerns our culture's popular beliefs rather than another's) the author points out that homeopathy falls into the same category; there is no validated evidence that it works, and no logical reason why it should [it makes the rhino horn notion look relatively sensible].

    We have a remarkable capacity for self-delusion when it suits us. A survey of one million US high school seniors showed that 70% believed that they were above average in leadership ability, and only 2% that they were below average [much the same results occur in surveys which invite people to rate their own driving ability].

    A final point: the perception of human fallibility in understanding is not exactly new. The book includes a couple of quotes from Francis Bacon, the 16th/17th century philosopher:

    "The human understanding supposes a greater degree of order and equality in things than it really finds; and although many things in nature be sui generis and most irregular, will yet invent parallels and conjugates and relatives where no such thing is." Which is to say, in simpler modern language, that we tend to see patterns and relationships where none exist.


    "…all superstition is much the same whether it be that of astrology, dreams, omens, retributive judgment, or the like…[in that] the deluded believers observe events which are fulfilled, but neglect or pass over their failure, though it be much more common."

    In this review I have only had space to provide a very superficial summary of a few highlights, but Gilovich's book is packed full of examples and detailed explanations, so if this kind of thing intrigues you, go and find a copy!

    (This entry is cross-posted from my science-fiction & fantasy blog.)
