AP Psychology

Module 4 - The Need for Psychological Science

FOCUS QUESTION: How do hindsight bias, overconfidence, and the tendency to perceive order in random events illustrate why science-based answers are more valid than those based on intuition and common sense?

Some people suppose that psychology merely documents and dresses in jargon what people already know: “So what else is new - you get paid for using fancy methods to prove what everyone knows?” Others place their faith in human intuition: “Buried deep within each and every one of us, there is an instinctive, heart-felt awareness that provides – if we allow it to – the most reliable guide,” offered Prince Charles (2000).

Prince Charles has much company, judging from the long list of pop psychology books on “intuitive managing,” “intuitive trading,” and “intuitive healing.” Today’s psychological science does document a vast intuitive mind. As we will see, our thinking, memory, and attitudes operate on two levels - conscious and unconscious - with the larger part operating automatically, off-screen. Like jumbo jets, we fly mostly on autopilot.

So, are we smart to listen to the whispers of our inner wisdom, to simply trust “the force within”? Or should we more often be subjecting our intuitive hunches to skeptical scrutiny?

This much seems certain: We often underestimate intuition’s perils. My geographical intuition tells me that Reno is east of Los Angeles, that Rome is south of New York, that Atlanta is east of Detroit. But I am wrong, wrong, and wrong.

Modules to come will show that experiments have found people greatly overestimating their lie detection accuracy, their eyewitness recollections, their interviewee assessments, their risk predictions, and their stock-picking talents. As a Nobel Prize-winning physicist explained, “The first principle is that you must not fool yourself - and you are the easiest person to fool” (Feynman, 1997).

Indeed, observed novelist Madeleine L’Engle, “The naked intellect is an extraordinarily inaccurate instrument” (1973). Three phenomena - hindsight bias, judgmental overconfidence, and our tendency to perceive patterns in random events - illustrate why we cannot rely solely on intuition and common sense.

Did We Know It All Along? Hindsight Bias

Consider how easy it is to draw the bull’s eye after the arrow strikes. After the stock market drops, people say it was “due for a correction.” After the football game, we credit the coach if a “gutsy play” wins the game, and fault the coach for the “stupid play” if it doesn’t. After a war or an election, its outcome usually seems obvious. Although history may therefore seem like a series of inevitable events, the actual future is seldom foreseen. No one’s diary recorded, “Today the Hundred Years War began.”

This hindsight bias (also known as the I-knew-it-all-along phenomenon) is easy to demonstrate: Give half the members of a group some purported psychological finding, and give the other half an opposite result. Tell the first group, “Psychologists have found that separation weakens romantic attraction. As the saying goes, ‘Out of sight, out of mind.’” Ask them to imagine why this might be true. Most people can, and nearly all will then view this true finding as unsurprising.

Tell the second group the opposite, “Psychologists have found that separation strengthens romantic attraction. As the saying goes, ‘Absence makes the heart grow fonder.’” People given this untrue result can also easily imagine it, and most will also see it as unsurprising. When two opposite findings both seem like common sense, there is a problem.

Such errors in our recollections and explanations show why we need psychological research. Just asking people how and why they felt or acted as they did can sometimes be misleading - not because common sense is usually wrong, but because common sense more easily describes what has happened than what will happen. As physicist Niels Bohr reportedly said, “Prediction is very difficult, especially about the future.”

Some 100 studies have observed hindsight bias in various countries and among both children and adults (Blank et al., 2007). Nevertheless, our intuition is often right. As Yogi Berra once said, “You can observe a lot by watching.” (We have Berra to thank for other gems, such as “Nobody ever comes here – it’s too crowded,” and “If the people don’t want to come out to the ballpark, nobody’s gonna stop ‘em.”) Because we’re all behavior watchers, it would be surprising if many of psychology’s findings had not been foreseen. Many people believe that love breeds happiness, and they are right (we have what Module 40 calls a deep “need to belong”). Indeed, note Daniel Gilbert, Brett Pelham, and Douglas Krull (2003), “good ideas in psychology usually have an oddly familiar quality, and the moment we encounter them we feel certain that we once came close to thinking the same thing ourselves and simply failed to write it down.” Good ideas are like good inventions; once created, they seem obvious. (Why did it take so long for someone to invent suitcases on wheels and Post-it Notes?)

But sometimes our intuition, informed by countless casual observations, has it wrong. In later modules we will see how research has overturned popular ideas - that familiarity breeds contempt, that dreams predict the future, and that most of us use only 10 percent of our brain. (See also TABLE 4.1.) We will also see how it has surprised us with discoveries about how the brain’s chemical messengers control our moods and memories, about other animals’ abilities, and about the effects of stress on our capacity to fight disease.

Overconfidence

We humans tend to think we know more than we do. Asked how sure we are of our answers to factual questions (Is Boston north or south of Paris?), we tend to be more confident than correct. Or consider these three anagrams, which Richard Goranson (1978) asked people to unscramble:

WREAT ---> WATER

ETRYN ---> ENTRY

GRABE ---> BARGE

About how many seconds do you think it would have taken you to unscramble each of these? Did hindsight influence you? Knowing the answers tends to make us overconfident - surely the solution would take only 10 seconds or so, when in reality the average problem solver spends 3 minutes, as you also might, given a similar anagram without the solution: OCHSA.

Are we any better at predicting social behavior? University of Pennsylvania psychologist Philip Tetlock (1998, 2005) collected more than 27,000 expert predictions of world events, such as the future of South Africa or whether Quebec would separate from Canada. His repeated finding: These predictions, which experts made with 80 percent confidence on average, were right less than 40 percent of the time. Nevertheless, even those who erred maintained their confidence by noting they were “almost right”: “The Quebecois separatists almost won the secessionist referendum.”

Perceiving Order in Random Events

In our natural eagerness to make sense of our world - what poet Wallace Stevens called our “rage for order” - we are prone to perceive patterns. People see a face on the moon, hear Satanic messages in music, perceive the Virgin Mary’s image on a grilled cheese sandwich. Even in random data we often find order, because - here’s a curious fact of life - random sequences often don’t look random (Falk et al., 2009; Nickerson, 2002, 2005). Consider a random coin flip: If someone flipped a coin six times, which of the following sequences of heads (H) and tails (T) would be most likely: HHHTTT or HTTHTH or HHHHHH?

Daniel Kahneman and Amos Tversky (1972) found that most people believe HTTHTH would be the most likely random sequence. Actually, all three are equally likely (or, you might say, equally unlikely). A poker hand of 10 through ace, all of hearts, would seem extraordinary; actually, it would be no more or less likely than any other specific hand of cards (FIGURE 4.1).
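The equal-likelihood point can be checked with a few lines of arithmetic. This short sketch (in Python; it is an illustration, not part of the original text) computes the probability of each specific six-toss sequence and contrasts it with the probability of the broader category “exactly three heads, in any order”:

```python
from fractions import Fraction
from math import comb

# Each toss of a fair coin is independent, with probability 1/2 per outcome,
# so every *specific* six-toss sequence has the same probability: (1/2)^6.
p_specific = Fraction(1, 2) ** 6
for seq in ("HHHTTT", "HTTHTH", "HHHHHH"):
    print(seq, "->", p_specific)  # 1/64 for each

# What differs is the size of the category: 20 of the 64 possible sequences
# contain exactly three heads, while only one is all heads. HTTHTH merely
# *looks* more random because its category is larger.
print(comb(6, 3), "of", 2 ** 6, "sequences have exactly three heads")
```

The intuition confuses a specific sequence with the category it belongs to: mixed-looking outcomes are common as a class, yet each individual ordering is exactly as rare as HHHHHH.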

In actual random sequences, patterns and streaks (such as repeating digits) occur more often than people expect (Oskarsson et al., 2009). To demonstrate this phenomenon for myself, I flipped a coin 51 times, with these results:

Looking over the sequence, patterns jump out: Tosses 10 to 22 provided an almost perfect pattern of pairs of tails followed by pairs of heads. On tosses 30 to 38 I had a “cold hand,” with only one head in nine tosses. But my fortunes immediately reversed with a “hot hand” - seven heads out of the next nine tosses. Similar streaks happen, about as often as one would expect in random sequences, in basketball shooting, baseball hitting, and mutual fund stock pickers’ selections (Gilovich et al., 1985; Malkiel, 2007; Myers, 2002). These sequences often don’t look random and so are overinterpreted. (“When you’re hot, you’re hot!”)

What explains these streaky patterns? Was I exercising some sort of paranormal control over my coin? Did I snap out of my tails funk and get in a heads groove? No such explanations are needed, for these are the sorts of streaks found in any random data. Comparing each toss to the next, 23 of the 50 comparisons yielded a changed result - just the sort of near 50-50 result we expect from coin tossing. Despite seeming patterns, the outcome of one toss gives no clue to the outcome of the next.
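The coin-toss demonstration is easy to replicate. Here is a minimal simulation (the seed and trial count are arbitrary choices of this sketch, not figures from the text) showing that a fair 51-toss sequence typically changes outcome on about half of its 50 adjacent comparisons, even while streaks appear:

```python
import random
from itertools import groupby

random.seed(0)  # arbitrary seed so the run is repeatable


def flip_session(n=51):
    """Flip a fair coin n times, returning a string of H's and T's."""
    return "".join(random.choice("HT") for _ in range(n))


tosses = flip_session()
changes = sum(a != b for a, b in zip(tosses, tosses[1:]))
longest_streak = max(len(list(run)) for _, run in groupby(tosses))
print(changes, "of 50 comparisons changed; longest streak:", longest_streak)

# Averaged over many sessions, the change count hovers near 25 (50 percent),
# matching the near 50-50 result described above - streaks and all.
avg = sum(sum(a != b for a, b in zip(t, t[1:]))
          for t in (flip_session() for _ in range(10_000))) / 10_000
print(round(avg, 1))  # close to 25
```

Running this a few times makes the module’s point vividly: every session contains streaks that look meaningful, yet the alternation rate stays near chance.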

However, some happenings seem so extraordinary that we struggle to conceive an ordinary, chance-related explanation (as applies to our coin tosses). In such cases, statisticians often are less mystified. When Evelyn Marie Adams won the New Jersey lottery twice, newspapers reported the odds of her feat as 1 in 17 trillion. Bizarre? Actually, 1 in 17 trillion is indeed the chance that a given person who buys a single ticket for two New Jersey lotteries will win both times. And given the millions of people who buy U.S. state lottery tickets, statisticians Stephen Samuels and George McCabe (1989) reported, it was “practically a sure thing” that someday, somewhere, someone would hit a state jackpot twice. Indeed, said fellow statisticians Persi Diaconis and Frederick Mosteller (1989), “with a large enough sample, any outrageous thing is likely to happen.” An event that happens to but 1 in 1 billion people every day occurs about 7 times a day, 2500 times a year.
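The closing arithmetic is quick to verify. This sketch assumes a world population of about 7 billion (an assumption implied by the text’s “about 7 times a day,” not stated in it):

```python
# Expected daily and yearly occurrences of a one-in-a-billion daily event,
# assuming a world population of about 7 billion (an assumption of this
# sketch; the text does not state the population figure it uses).
population = 7_000_000_000
odds = 1_000_000_000  # the event strikes 1 person in a billion each day

per_day = population // odds   # expected events per day
per_year = per_day * 365       # expected events per year

print(per_day, per_year)  # 7 per day, 2555 (roughly 2500) per year
```

With enough people and enough days, even billion-to-one coincidences become everyday occurrences - the statisticians’ point exactly.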

The point to remember: Hindsight bias, overconfidence, and our tendency to perceive patterns in random events often lead us to overestimate our intuition. But scientific inquiry can help us sift reality from illusion.

The Scientific Attitude: Curious, Skeptical, and Humble

FOCUS QUESTION: How do the scientific attitude’s three main components relate to critical thinking?

Underlying all science is, first, a hard-headed curiosity, a passion to explore and understand without misleading or being misled. Some questions (Is there life after death?) are beyond science. Answering them in any way requires a leap of faith. With many other ideas (Can some people demonstrate ESP?), the proof is in the pudding. Let the facts speak for themselves.

Magician James Randi has used this empirical approach when testing those claiming to see auras around people’s bodies:

Randi: Do you see an aura around my head?

Aura-seer: Yes, indeed.

Randi: Can you still see the aura if I put this magazine in front of my face?

Aura-seer: Of course.

Randi: Then if I were to step behind a wall barely taller than I am, you could determine my location from the aura visible above my head, right?

Randi told me that no aura-seer has agreed to take this simple test.

No matter how sensible-seeming or wild an idea, the smart thinker asks: Does it work? When put to the test, can its predictions be confirmed? Subjected to such scrutiny, crazy-sounding ideas sometimes find support. During the 1700s, scientists scoffed at the notion that meteorites had extraterrestrial origins. When two Yale scientists challenged the conventional opinion, Thomas Jefferson jeered, “Gentlemen, I would rather believe that those two Yankee professors would lie than to believe that stones fell from Heaven.” Sometimes scientific inquiry turns jeers into cheers.

More often, science becomes society’s garbage disposal, sending crazy-sounding ideas to the waste heap, atop previous claims of perpetual motion machines, miracle cancer cures, and out-of-body travels into centuries past. To sift reality from fantasy, sense from nonsense, therefore requires a scientific attitude: being skeptical but not cynical, open but not gullible.

“To believe with certainty,” says a Polish proverb, “we must begin by doubting.” As scientists, psychologists approach the world of behavior with a curious skepticism, persistently asking two questions: What do you mean? How do you know?

When ideas compete, skeptical testing can reveal which ones best match the facts. Do parental behaviors determine children’s sexual orientation? Can astrologers predict your future based on the position of the planets at your birth? Is electroconvulsive therapy (delivering an electric shock to the brain) an effective treatment for severe depression? As we will see, putting such claims to the test has led psychological scientists to answer No to the first two questions and Yes to the third.

Putting a scientific attitude into practice requires not only curiosity and skepticism but also humility - an awareness of our own vulnerability to error and an openness to surprises and new perspectives. In the last analysis, what matters is not my opinion or yours, but the truths nature reveals in response to our questioning. If people or other animals don’t behave as our ideas predict, then so much the worse for our ideas. This humble attitude was expressed in one of psychology’s early mottos: “The rat is always right.”

Historians of science tell us that these three attitudes - curiosity, skepticism, and humility - helped make modern science possible. Some deeply religious people today may view science, including psychological science, as a threat. Yet, many of the leaders of the scientific revolution, including Copernicus and Newton, were deeply religious people acting on the idea that “in order to love and honor God, it is necessary to fully appreciate the wonders of his handiwork” (Stark, 2003a,b).

Of course, scientists, like anyone else, can have big egos and may cling to their preconceptions. Nevertheless, the ideal of curious, skeptical, humble scrutiny of competing ideas unifies psychologists as a community as they check and recheck one another’s findings and conclusions.

Critical Thinking

The scientific attitude prepares us to think smarter. Smart thinking, called critical thinking, examines assumptions, appraises the source, discerns hidden values, evaluates evidence, and assesses conclusions. Whether reading a news report or listening to a conversation, critical thinkers ask questions. Like scientists, they wonder: How do they know that? What is this person’s agenda? Is the conclusion based on anecdote and gut feelings, or on evidence? Does the evidence justify a cause-effect conclusion? What alternative explanations are possible?

Critical thinking, informed by science, helps clear the colored lenses of our biases. Consider: Does climate change threaten our future, and, if so, is it human-caused? In 2009, climate-action advocates interpreted an Australian heat wave and dust storms as evidence of climate change. In 2010, climate-change skeptics perceived North American bitter cold and East Coast blizzards as discounting global warming. Rather than having their understanding of climate change swayed by today’s weather, or by their own political views, critical thinkers say, “Show me the evidence.” Over time, is the Earth actually warming? Are the polar ice caps melting? Are vegetation patterns changing? And is human activity spewing gases that would lead us to expect such changes? When contemplating such issues, critical thinkers will consider the credibility of sources. They will look at the evidence (“Do the facts support them, or are they just makin’ stuff up?”). They will recognize multiple perspectives. And they will expose themselves to news sources that challenge their preconceived ideas.

Has psychology’s critical inquiry been open to surprising findings? The answer, as ensuing modules illustrate, is plainly Yes. Believe it or not, massive losses of brain tissue early in life may have minimal long-term effects (see Module 12). Within days, newborns can recognize their mother’s odor and voice (see Module 45). After brain damage, a person may be able to learn new skills yet be unaware of such learning (see Modules 31-33). Diverse groups - men and women, old and young, rich and middle class, those with disabilities and without - report roughly comparable levels of personal happiness (see Module 83).

And has critical inquiry convincingly debunked popular presumptions? The answer, as ensuing modules also illustrate, is again Yes. The evidence indicates that sleepwalkers are not acting out their dreams (see Module 24). Our past experiences are not all recorded verbatim in our brains; with brain stimulation or hypnosis, one cannot simply “hit the replay button” and relive long-buried or repressed memories (see Module 33). Most people do not suffer from unrealistically low self-esteem, and high self-esteem is not all good (see Module 59). Opposites do not generally attract (see Module 79). In each of these instances and more, what has been learned is not what is widely believed.

Before You Move On

ASK YOURSELF: How might critical thinking help us assess someone’s interpretations of people’s dreams or their claims to communicate with the dead?

TEST YOURSELF: How does the scientific attitude contribute to critical thinking?