AP Psychology

Module 35 - Solving Problems and Making Decisions

LEARNING OBJECTIVES:

Problem Solving: Strategies and Obstacles

FOCUS QUESTIONS: What cognitive strategies assist our problem solving, and what obstacles hinder it?

One tribute to our rationality is our problem-solving skill. What's the best route around this traffic jam? How should we handle a friend's criticism? How can we get in the house without our keys?

Some problems we solve through trial and error. Thomas Edison tried thousands of light bulb filaments before stumbling upon one that worked. For other problems, we use algorithms, step-by-step procedures that guarantee a solution. But step-by-step algorithms can be laborious and exasperating. To find a word using the 10 letters in SPLOYOCHYG, for example, you could try each letter in each of the 10 positions - 907,200 permutations in all. Rather than give you a computing brain the size of a beach ball, nature resorts to heuristics, simpler thinking strategies. Thus, you might reduce the number of options in the SPLOYOCHYG example by grouping letters that often appear together (CH and GY) and excluding rare letter combinations (such as two Y's together). By using heuristics and then applying trial and error, you may hit on the answer. Have you guessed it?
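
To make the algorithm-versus-heuristic contrast concrete, here is a minimal Python sketch (not from the text; the pruning rules are illustrative assumptions). The brute-force enumeration reproduces the 907,200 figure; the heuristic filter shrinks the pool of candidates but carries no guarantee.

```python
from itertools import permutations

letters = "SPLOYOCHYG"

# Algorithm: exhaustively enumerate every distinct ordering of the 10 letters.
# Because O and Y each appear twice, there are 10! / (2! * 2!) = 907,200
# distinct orderings (rather than 10! = 3,628,800) - guaranteed to contain
# the hidden word, but laborious to search.
orderings = {"".join(p) for p in permutations(letters)}
print(len(orderings))  # 907200

# Heuristic: prune with simple plausibility rules (keep C and H together,
# reject a doubled Y). Far fewer candidates remain for trial and error,
# but the shortcut itself offers no guarantee of success.
pruned = [w for w in orderings if "CH" in w and "YY" not in w]
print(len(pruned))
```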

Sometimes, no problem-solving strategy seems to be at work at all, and we arrive at a solution to a problem with insight. Teams of researchers have identified brain activity associated with sudden flashes of insight (Kounios & Beeman, 2009; Sandkühler & Bhattacharya, 2008). They gave people a problem: Think of a word that will form a compound word or phrase with each of three other words in a set (such as pine, crab, and sauce), and press a button to sound a bell when you know the answer. (If you need a hint: The word is a fruit.) EEGs or fMRIs (functional MRIs) revealed the problem solver's brain activity.

In the first experiment, about half the solutions were by a sudden Aha! insight. Before the Aha! moment, the problem solvers' frontal lobes (which are involved in focusing attention) were active, and there was a burst of activity in the right temporal lobe, just above the ear (see FIGURE 35.1).

We are also not the only creatures to display insight, as psychologist Wolfgang Kohler (1925) demonstrated in an experiment with Sultan, a chimpanzee. Kohler placed a piece of fruit and a long stick outside Sultan's cage. Inside the cage, he placed a short stick, which Sultan grabbed, using it to try to reach the fruit. After several failed attempts, he dropped the stick and seemed to survey the situation. Then suddenly, as if thinking "Aha!" Sultan jumped up and seized the short stick again.

This time, he used it to pull in the longer stick - which he then used to reach the fruit. What is more, apes will even exhibit foresight, by storing a tool they can use to retrieve food the next day (Mulcahy & Call, 2006). Insight strikes suddenly, with no prior sense of "getting warmer" or feeling close to a solution (Knoblich & Oellinger, 2006; Metcalfe, 1986). When the answer pops into mind (apple!), we feel a happy sense of satisfaction. The joy of a joke may similarly lie in our sudden comprehension of an unexpected ending or a double meaning: "You don't need a parachute to skydive. You only need a parachute to skydive twice."

Inventive as we are, other cognitive tendencies may lead us astray. For example, we more eagerly seek out and favor evidence verifying our ideas than evidence refuting them (Klayman & Ha, 1987; Skov & Sherman, 1986). Peter Wason (1960) demonstrated this tendency, known as confirmation bias, by giving British university students the three-number sequence 2-4-6 and asking them to guess the rule he had used to devise the series. (The rule was simple: any three ascending numbers.) Before submitting answers, students generated their own three-number sets and Wason told them whether their sets conformed to his rule. Once certain they had the rule, they could announce it. The result? Seldom right but never in doubt. Most students formed a wrong idea ("Maybe it's counting by twos") and then searched only for confirming evidence (by testing 6-8-10, 100-102-104, and so forth).

"Ordinary people," said Wason (1981), "evade facts, become inconsistent, or systematically defend themselves against the threat of new information relevant to the issue." Thus, once people form a belief - that vaccines cause autism spectrum disorder, that President Barack Obama is a Kenyan-born Muslim, that gun control does (or does not) save lives - they prefer belief-confirming information. The results can be momentous, The U.S. war against Iraq was launched on the belief that Saddam Hussein possessed weapons of mass destruction (WMD) that posed an immediate threat. When that assumption turned out to be false, the bipartisan U.S. Senate Select Committee on Intelligence (2004) identified confirmation bias as partly to blame: Administration analysts "had a tendency to accept information which supported [their presumptions] ... more readily than information which contradicted" them. Sources denying such weapons were deemed"either lying or not knowledgeable about Iraq's problems," while those sources who reported ongoing WMD activities were seen as "having provided valuable information."

Once we incorrectly represent a problem, it's hard to restructure how we approach it. If the solution to the matchstick problem in FIGURE 35.2 eludes you, you may be experiencing fixation - an inability to see a problem from a fresh perspective. (For the solution, turn the page to see FIGURE 35.3.)

A prime example of fixation is mental set, our tendency to approach a problem with the mind-set of what has worked for us previously. Indeed, solutions that worked in the past often do work on new problems. Consider:

Given the sequence O-T-T-F-?-?-?, what are the final three letters? Most people have difficulty recognizing that the three final letters are F(ive), S(ix), and S(even). But solving this problem may make the next one easier: Given the sequence J-F-M-A-?-?-?, what are the final three letters? (If you don't get this one, ask yourself what month it is.)

As a perceptual set predisposes what we perceive, a mental set predisposes how we think; sometimes this can be an obstacle to problem solving, as when our mental set from our past experiences with matchsticks predisposes us to arrange them in two dimensions.

Forming Good and Bad Decisions and Judgments

FOCUS QUESTIONS: What is intuition, and how can the representativeness and availability heuristics, overconfidence, belief perseverance, and framing influence our decisions and judgments?

When making each day's hundreds of judgments and decisions (Is it worth the bother to take a jacket? Can I trust this person? Should I shoot the basketball or pass to the player who's hot?), we seldom take the time and effort to reason systematically. We just follow our intuition, our fast, automatic, unreasoned feelings and thoughts. After interviewing policy makers in government, business, and education, social psychologist Irving Janis (1986) concluded that they "often do not use a reflective problem-solving approach. How do they usually arrive at their decisions? If you ask, they are likely to tell you ... they do it mostly by the seat of their pants."

When we need to act quickly, the mental shortcuts we call heuristics enable snap judgments. Thanks to our mind's automatic information processing, intuitive judgments are instantaneous and usually effective. However, research by cognitive psychologists Amos Tversky and Daniel Kahneman (1974) on the representativeness and availability heuristics showed how these generally helpful shortcuts can lead even the smartest people into dumb decisions.

The Representativeness Heuristic

To judge the likelihood of things in terms of how well they represent particular prototypes is to use the representativeness heuristic. To illustrate, consider:

A stranger tells you about a person who is short, slim, and likes to read poetry, and then asks you to guess whether this person is more likely to be a professor of classics at an Ivy League university or a truck driver (adapted from Nisbett & Ross, 1980). Which would be the better guess?

Did you answer "professor"? Many people do, because the description seems more representative of Ivy League scholars than of truck drivers. The representativeness heuristic enabled you to make a snap judgment. But it also led you to ignore other relevant information. When I help people think through this question, the conversation goes something like this:

  Question: First, let's figure out how many professors fit the description. How many Ivy League universities do you suppose there are?
  Answer: Oh, about 10, I suppose.
  Question: How many classics professors would you guess there are at each?
  Answer: Maybe 4.
  Question: Okay, that's 40 Ivy League classics professors. What fraction of these are short and slim?
  Answer: Let's say half.
  Question: And, of these 20, how many like to read poetry?
  Answer: I'd say half - 10 professors.
  Question: Okay, now let's figure how many truck drivers fit the description. How many truck drivers do you suppose there are?
  Answer: Maybe 400,000.
  Question: What fraction are short and slim?
  Answer: Not many - perhaps 1 in 8.
  Question: Of these 50,000, what percentage like to read poetry?
  Answer: Truck drivers who like poetry? Maybe 1 in 100 - oh, oh, I get it - that leaves 500 short, slim, poetry-reading truck drivers.
  Comment: Yup. So, even if we accept your stereotype that the description is more representative of classics professors than of truck drivers, the odds are 50 to 1 that this person is a truck driver.
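
The base-rate arithmetic in this exchange is easy to verify; here is a minimal Python sketch using only the dialogue's assumed figures (none of them offered as real data):

```python
# Base-rate check using the dialogue's assumed numbers (illustrative only).
professors = 10 * 4 * 0.5 * 0.5                # 10 schools x 4 classicists, half short/slim, half poetry readers -> 10.0
truck_drivers = 400_000 * (1 / 8) * (1 / 100)  # short/slim -> 50,000; poetry readers -> 500.0

print(professors, truck_drivers)   # 10.0 500.0
print(truck_drivers / professors)  # 50.0 -> odds of 50 to 1 in favor of "truck driver"
```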

The representativeness heuristic influences many of our daily decisions. To judge the likelihood of something, we intuitively compare it with our mental representation of that category - of, say, what truck drivers are like. If the two match, that fact usually overrides other considerations of statistics or logic.

The Availability Heuristic

The availability heuristic operates when we estimate the likelihood of events based on how mentally available they are. Casinos entice us to gamble by signaling even small wins with bells and lights - making them vividly memorable - while keeping big losses soundlessly invisible.

The availability heuristic can lead us astray in our judgments of other people, too. Anything that makes information "pop" into mind - its vividness, recency, or distinctiveness - can make it seem commonplace. If someone from a particular ethnic or religious group commits a terrorist act, as happened on September 11, 2001, when Islamic extremists killed nearly 3000 people in the United States in coordinated terrorist attacks, our readily available memory of the dramatic event may shape our impression of the whole group.

Even during that horrific year, terrorist acts claimed comparatively few lives. Yet when the statistical reality of greater dangers (see FIGURE 35.4) was pitted against a single vivid case, the memorable case won, as emotion-laden images of terror exacerbated our fears (Sunstein, 2007).

We often fear the wrong things. We fear flying because we replay in our heads some air disaster. We fear letting our children walk to school because we replay in our heads tapes of abducted and brutalized children. We fear swimming in ocean waters because we replay Jaws in our heads. Even just passing by a person who sneezes and coughs heightens our perceptions of various health risks (Lee et al., 2010). And so, thanks to these readily available images, we come to fear relatively rare events. (See "Thinking Critically About: The Fear Factor - Why We Fear the Wrong Things" below.)

Meanwhile, the lack of comparably available images of global climate change - which some scientists regard as a future "Armageddon in slow motion" - has left most people little concerned (Pew, 2007). The vividness of a recent local cold day reduces their concern about long-term global warming and overwhelms less memorable scientific data (Li et al., 2011). Dramatic outcomes make us gasp; probabilities we hardly grasp. As of 2013, some 60 nations - including Canada, many in Europe, and the United States - have, however, sought to harness the positive power of vivid, memorable images by putting eye-catching warnings and graphic photos on cigarette packages (Riordan, 2013). This campaign may work, where others have failed. As psychologist Paul Slovic (2007) points out, we reason emotionally and neglect probabilities. We overfeel and underthink. In one experiment, donations to a starving 7-year-old child were greater when her image was not accompanied by statistical information about the millions of needy African children like her (Small et al., 2007). "If I look at the mass, I will never act," Mother Teresa reportedly said. "If I look at the one, I will." "The more who die, the less we care," noted Slovic (2010).

Thinking Critically About

The Fear Factor - Why We Fear the Wrong Things

After the 9/11 attacks, many people feared flying more than driving. In a 2006 Gallup survey, only 40 percent of Americans reported being "not afraid at all" to fly. Yet from 2005 to 2007, Americans were - mile for mile - 170 times more likely to die in an automobile or pickup truck crash than on a scheduled flight (National Safety Council, 2010). In 2009 alone, 33,808 Americans were killed in motor vehicle accidents - that's 650 dead people each week. Meanwhile, in 2009 (as in 2007 and 2008), zero died from airline accidents on scheduled flights.

In a late 2001 essay, I calculated that if - because of 9/11 - we flew 20 percent less and instead drove half those unflown miles, about 800 more people would die in the year after 9/11 (Myers, 2001). German psychologist Gerd Gigerenzer (2004, 2006) later checked this estimate against actual accident data. (Why didn't I think of that?) U.S. traffic deaths did indeed increase significantly in the last three months of 2001 (see FIGURE 35.5). By the end of 2002, Gigerenzer estimated, 1600 Americans had "lost their lives on the road by trying to avoid the risk of flying." Despite our greater fear of flying, flying's greatest danger is, for most people, the drive to the airport.

Why do we fear the wrong things? Why do we judge terrorism to be a greater risk than accidents? Psychologists have identified four influences that feed fear and cause us to ignore higher risks.

  1. We fear what our ancestral history has prepared us to fear. Human emotions were road tested in the Stone Age. Our old brain prepares us to fear yesterday's risks: snakes, lizards, and spiders (which combined now kill a tiny fraction of the number killed by modern-day threats, such as cars and cigarettes). Yesterday's risks also prepare us to fear confinement and heights, and therefore flying.
  2. We fear what we cannot control. Driving we control; flying we do not.
  3. We fear what is immediate. The dangers of flying are mostly telescoped into the moments of takeoff and landing. The dangers of driving are diffused across many moments to come, each trivially dangerous.

  4. We fear what is most readily available in memory. Thanks to the availability heuristic, powerful, vivid images, like that of United Flight 175 slicing into the World Trade Center, feed our judgments of risk. Thousands of safe car trips have extinguished our anxieties about driving. Similarly, we remember (and fear) widespread disasters (hurricanes, tornadoes, earthquakes) that kill people dramatically, in bunches. But we fear too little the less dramatic threats that claim lives quietly, one by one, continuing into the distant future. Bill Gates has noted that each year a half-million children worldwide die from rotavirus. This is the equivalent of four 747s full of children every day, and we hear nothing of it (Glass, 2004).
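
The 747 comparison can be checked with rough arithmetic; a small sketch (the per-plane capacity it implies is compared against an assumed, round jumbo-jet figure, which is not given in the text):

```python
# Rough arithmetic behind the "four 747s a day" comparison (plane capacity is an assumption).
annual_deaths = 500_000                  # children dying of rotavirus each year
per_day = annual_deaths / 365            # about 1,370 children per day
per_plane = per_day / 4                  # about 342 per plane - roughly one full 747
print(round(per_day), round(per_plane))  # 1370 342
```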

The news, and our own memorable experiences, can make us disproportionately fearful of infinitesimal risks. As one risk analyst explained, "If it's in the news, don't worry about it. The very definition of news is 'something that hardly ever happens'" (Schneier, 2007). Despite people's fear of dying in a terrorist attack on an airplane, the last decade produced one terrorist attempt for every 10.4 million flights - less than one-twentieth the chance of any one of us being struck by lightning (Silver, 2009).

The point to remember: It is perfectly normal to fear purposeful violence from those who wish us harm. When terrorists strike again, we will all recoil in horror. But smart thinkers will check their fears against the facts and resist those who aim to create a culture of fear. By so doing, we take away the terrorists' most omnipresent weapon: exaggerated fear.

Overconfidence

Sometimes our judgments and decisions go awry simply because we are more confident than correct. Across various tasks, people overestimate their performance (Metcalfe, 1998). If 60 percent of people correctly answer a factual question, such as "Is absinthe a liqueur or a precious stone?" they will typically average 75 percent confidence (Fischhoff et al., 1977). (It's a licorice-flavored liqueur.) This tendency to overestimate the accuracy of our knowledge and judgments is overconfidence.
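
The mismatch described above can be read as a calibration gap between confidence and accuracy; here is a hypothetical sketch with made-up responses (not Fischhoff's data):

```python
# Calibration check with made-up data: compare stated confidence with actual accuracy.
confidence = [0.75, 0.80, 0.70, 0.75, 0.75]    # self-rated confidence for five answers
correct = [True, False, True, True, False]     # whether each answer was actually right

mean_confidence = sum(confidence) / len(confidence)  # 0.75
accuracy = sum(correct) / len(correct)               # 0.60
print(f"confidence {mean_confidence:.0%} vs. accuracy {accuracy:.0%}")
# -> confidence 75% vs. accuracy 60%; the 15-point gap is overconfidence
```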

It was an overconfident BP that, before its exploded drilling platform spewed oil into the Gulf of Mexico, downplayed safety concerns, and then downplayed the spill's magnitude (Mohr et al., 2010; Urbina, 2010). It is overconfidence that drives stockbrokers and investment managers to market their ability to outperform stock market averages, despite overwhelming evidence to the contrary (Malkiel, 2004). A purchase of stock X, recommended by a broker who judges this to be the time to buy, is usually balanced by a sale made by someone who judges this to be the time to sell. Despite their confidence, buyer and seller cannot both be right.

History is full of leaders who were more confident than correct. And classrooms are full of overconfident students who expect to finish assignments and write papers ahead of schedule (Buehler et al., 1994). In fact, the projects generally take about twice the number of days predicted.

Anticipating how much we will accomplish, we also overestimate our future leisure time (Zauberman & Lynch, 2005). Believing we will have more time next month than we do today, we happily accept invitations and assignments, only to discover we're just as busy when the day rolls around. Failing to appreciate our potential for error and believing we will have more money next year, we take out loans or buy on credit. Despite our painful underestimates, we remain overly confident of our next prediction.

Overconfidence can have adaptive value. People who err on the side of overconfidence live more happily. They make tough decisions more easily, and they seem more credible than others (Baumeister, 1989; Taylor, 1989). Moreover, given prompt and clear feedback, as weather forecasters receive after each day's predictions, people can learn to be more realistic about the accuracy of their judgments (Fischhoff, 1982). The wisdom to know when we know a thing and when we do not is born of experience.

Belief Perseverance

Our overconfidence in our judgments is startling; equally startling is our tendency to cling to our beliefs in the face of contrary evidence. Belief perseverance often fuels social conflict, as it did in a classic study of people with opposing views of capital punishment (Lord et al., 1979). Each side studied two supposedly new research findings, one supporting and the other refuting the claim that the death penalty deters crime. Each side was more impressed by the study supporting its own beliefs, and each readily disputed the other study. Thus, showing the pro- and anti-capital-punishment groups the same mixed evidence actually increased their disagreement.

If you want to rein in the belief perseverance phenomenon, a simple remedy exists: Consider the opposite. When the same researchers repeated the capital-punishment study, they asked some participants to be "as objective and unbiased as possible" (Lord et al., 1984). The plea did nothing to reduce biased evaluations of evidence. They asked another group to consider "whether you would have made the same high or low evaluations had exactly the same study produced results on the other side of the issue." Having imagined and pondered opposite findings, these people became much less biased in their evaluations of the evidence.

The more we come to appreciate why our beliefs might be true, the more tightly we cling to them. Once we have explained to ourselves why we believe a child is "gifted" or has a "specific learning disorder," or why candidate X or Y will be a better commander-in-chief, or why company Z makes a product worth owning, we tend to ignore evidence undermining our belief. Prejudice persists. As we will see in Unit XIV, once beliefs form and are justified, it takes more compelling evidence to change them than it did to create them.

The Effects of Framing

Framing, the way we present an issue, sways our decisions and judgments. Imagine two surgeons explaining a surgery risk. One tells patients that 10 percent of people die during this surgery. The other tells patients that 90 percent will survive. The information is the same. The effect is not. In surveys, both patients and physicians said the risk seems greater when they hear that 10 percent will die (Marteau, 1989; McNeil et al., 1988; Rothman & Salovey, 1997). Similarly, 9 in 10 college students rated a condom as effective if told it had a supposed "95 percent success rate" in stopping the HIV virus. Only 4 in 10 judged it effective when told it had a "5 percent failure rate" (Linville et al., 1992). To scare people, frame risks as numbers, not percentages. People told that a chemical exposure is projected to kill 10 of every 10 million people (imagine 10 dead people!) feel more frightened than if told the fatality risk is an infinitesimal .000001 (Kraus et al., 1992).
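
Each pair of frames above describes the same quantity; only the presentation differs. A small sketch using the figures from the examples (the wording of the printed strings is illustrative):

```python
# Same quantities, different frames (figures taken from the examples above).
deaths, population = 10, 10_000_000
print(deaths / population)                         # 1e-06, i.e., the "infinitesimal .000001"
print(f"{deaths} of every {population:,} people")  # 10 of every 10,000,000 people

mortality = 0.10
print(f"{mortality:.0%} die during this surgery")   # 10% die during this surgery
print(f"{1 - mortality:.0%} survive this surgery")  # 90% survive this surgery
```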

Framing can be a powerful persuasion tool. Carefully posed options can nudge people toward decisions that could benefit them or society as a whole (Thaler & Sunstein, 2008).

The point to remember: Those who understand the power of framing can use it to influence our decisions.

The Perils and Powers of Intuition

FOCUS QUESTIONS: How do smart thinkers use intuition?

We have seen how our irrational thinking can plague our efforts to see problems clearly, make wise decisions, form valid judgments, and reason logically. Moreover, these perils of intuition feed gut fears and prejudices. And they persist even when people are offered extra pay for thinking smart, even when they are asked to justify their answers, and even when they are expert physicians or clinicians (Shafir & LeBoeuf, 2002). So, are our heads indeed filled with straw?

Good news: Cognitive scientists are also revealing intuition's powers. Here is a summary of some of the high points:

The bottom line: Intuition can be perilous, especially when we overfeel and underthink, as we do when judging risks. Today's psychological science reminds us to check our intuitions against reality, but also enhances our appreciation for intuition. Our two-track mind makes sweet harmony as smart, critical thinking listens to the creative whispers of our vast unseen mind, and then evaluates evidence, tests conclusions, and plans for the future.

Before You Move On

ASK YOURSELF: People's perceptions of risk, often biased by vivid images from movies or the news, are surprisingly unrelated to actual risks. (People may hide in the basement during thunderstorms but fail to buckle their seat belts in the car.) What are the things you fear? Are some of those fears out of proportion to statistical risk? Are you failing, in other areas of your life, to take reasonable precautions?

TEST YOURSELF: The availability heuristic is a quick-and-easy but sometimes misleading guide to judging reality. What is the availability heuristic?