7.4 Balancing Intuitive, Heuristic, and Analytic Decision Making
Consistent with our previous discussion of skill-, rule-, and knowledge-based performance, how people make decisions depends on the situation. People tend to make decisions in one of three ways: intuitive skill-based processing, heuristic rule-based processing, and analytical knowledge-based processing. Making decisions as described by the normative models is an example of analytic decision making, and using satisficing heuristics is an example of rule-based decision making. Intuitive decision making occurs when people recognize the required response without thinking. As we learned in the context of Figure 7.2, people with a high degree of expertise often approach decision making in a fairly automatic pattern-matching style, just as Amy did with her first diagnosis. Recognition-primed decision making (RPD) describes this process in detail [374]. In most instances, experts simply recognize a pattern of cues and recall a single course of action, which is then implemented.

In spite of the prevalence of rapid pattern-recognition decisions, there are cases where decision makers will use analytical methods, such as when the decision maker is unsure of the appropriate course of action. The decision maker resolves the uncertainty by imagining the consequences of adopting a course of action: a mental simulation, where the decision maker thinks, “If I do this, what is likely to happen?” [375]. Mental simulation can help assess the alternative, action, or plan under consideration [376]. In this process, the decision maker plays out possible solutions based on information from the environment and his or her mental model. Mental simulation shows which options are the most promising and also generates expectations for other cues not previously considered [377]. Also, if uncertainty exists and time is adequate, decision makers will spend time evaluating the current situation assessment, modifying the retrieved action plan, or generating alternative actions [356].

Experts adapt their decision-making strategy to the situation. Table 7.2 summarizes some of the factors that lead to intuitive rule-based decision making and those that lead to analytical knowledge-based decision making. These characteristics of the person, task, and technology influence the use of heuristics, as well as the prevalence of biases that sometimes accompany those heuristics, which we discuss in detail in the next section.
7.4.1 Vulnerabilities of Heuristics: Biases

Cognitive heuristics are rules of thumb that provide easy ways of making decisions. Heuristics are usually very powerful and efficient [378], but they do not always guarantee the best solution [354, 379]. Unfortunately, because they represent simplifications, heuristics occasionally lead to systematic flaws and errors. These systematic flaws represent deviations from the normative model and are sometimes referred to as biases. Experts tend to avoid these biases because they draw from a large set of experiences and are vigilant to small changes in the pattern of cues that might suggest a heuristic is inappropriate. To the extent that a situation departs from these experiences, even experts will fall prey to the biases associated with various heuristics. Although the list of heuristics is large (as many as 37 [380]), the following presents some of the most notorious ones.

Acquire and Integrate Cues: Heuristics and Biases. The first stage of the decision process begins with attending to information and integrating it to understand the situation or form a situation assessment (e.g., to support Stage 2).
1. Attention to a limited number of cues. Due to working memory limitations, people can use only a relatively small number of cues to develop a picture of the world or system. This is one reason why configural displays that visually integrate several variables or factors into one display are useful (see Chapter 8 for a description).

2. Anchoring and cue primacy. When people receive cues over a period of time, there are certain trends or biases in the use of that information. The first few cues receive greater weight than subsequent information: cue primacy [381]. This often leads people to “anchor” on initial evidence and is therefore sometimes called the anchoring heuristic [354], characterizing the familiar phenomenon that first impressions are lasting. Amy anchored on the cues supporting her initial diagnosis and gave little processing to the additional information available in the phone call from the patient 24 hours later. Importantly, when assessing a dynamic, changing situation, the anchoring bias can be truly detrimental because older information becomes progressively less reliable, even though the older information was, by definition, the first encountered and hence served as the anchor. The order of information has an effect because people use the information to construct plausible stories or mental models of the world or system. These models differ depending on which information is used first [382]. The key point is that information processed early is often most influential.

3. Cue salience. Perceptually salient cues are more likely to capture attention and be given more weight [383, 11]; see also Chapter 6. As you would expect, salient cues in displays are things such as information at the top of a display, the loudest alarm, the largest display, the loudest and most confident-sounding voice in the room, and so forth. Unfortunately, the most salient cue is not necessarily the most diagnostic, and sometimes very subtle cues, such as the faint discoloration observed by Amy, are not given much weight.

4. Overweighting of unreliable cues. Not all cues are equally reliable. In a trial, for example, some witnesses will always tell the truth, others might have faulty memories, and still others might intentionally lie. However, when integrating cues, people often simplify the process by treating all cues as if they were equally valid and reliable. The result is that people tend to give too much weight to unreliable information [384, 385]. A small numerical sketch of cue weighting follows this list.
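To make these cue-weighting effects concrete, the following sketch compares three ways of combining the same set of cues: weighting each cue by its reliability, treating all cues as equally valid, and overweighting the cues received first. The cue values, reliabilities, and weighting schemes are hypothetical and are meant only as an illustration, not as a model from the text.

```python
# Hypothetical illustration of cue weighting (values and weights are assumptions).
# Each cue is an estimate of the same quantity, listed in the order received.
cues = [10.0, 9.0, 3.0, 9.5]          # cue values
reliability = [0.2, 0.3, 0.9, 0.8]    # how trustworthy each cue actually is

def weighted_estimate(values, weights):
    """Weighted average of cue values."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Normative-style integration: weight each cue by its reliability.
reliability_weighted = weighted_estimate(cues, reliability)

# Overweighting unreliable cues: treat every cue as equally valid.
equal_weighted = weighted_estimate(cues, [1.0] * len(cues))

# Anchoring / cue primacy: early cues get much more weight than later ones.
primacy_weights = [1.0 / (i + 1) for i in range(len(cues))]  # 1, 1/2, 1/3, ...
primacy_weighted = weighted_estimate(cues, primacy_weights)

print(f"Reliability-weighted estimate: {reliability_weighted:.2f}")
print(f"Equal-weighted estimate:       {equal_weighted:.2f}")
print(f"Primacy-weighted estimate:     {primacy_weighted:.2f}")
```

Under these assumed numbers, both the equal-weight and the primacy-weighted estimates drift toward the early, unreliable cues, which is exactly the distortion that the anchoring and overweighting biases describe.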
Interpret and Assess: Heuristics and Biases. After a limited set of cues is processed in working memory, the decision maker generates and interprets the information, often by retrieving similar situations from long-term memory. These similar situations represent hypotheses about how the current situation relates to past situations. There are a number of heuristics and biases that affect this process:

1. Availability. The availability heuristic reflects people’s tendency to make certain types of judgments or assessments, for example, estimates of frequency, by assessing how easily the state or event is brought to mind [386, 387, 388]. People more easily retrieve hypotheses that have been considered recently and hence are more available to memory. The implication is that although people try to generate the most likely hypotheses, the reality is that if something comes to mind relatively easily, they assume it is common and therefore a good hypothesis. As an example, if a physician readily thinks of a hypothesis, such as acute appendicitis, he or she will assume it is relatively common, leading to the judgment that it is a likely cause of the current set of symptoms. Unusual illnesses tend not to be the first things that come to a physician’s mind. Amy did not think of the less likely condition. In actuality, availability to memory may not be a reliable basis for estimating frequency.

2. Representativeness. Sometimes people diagnose a situation because the pattern of cues “looks like” or is representative of the prototypical example of that situation. This is the representativeness heuristic [353, 389], and it usually works well; however, the heuristic can bias decisions when the perceived situation is slightly different from the prototypical example even though the pattern of cues is similar or representative.

3. Overconfidence. People are often biased in their confidence with respect to the hypotheses they have brought into working memory [390, 351], believing that they are correct more often than they actually are, reflecting the more general tendency for overconfidence in metacognitive processes, as described in Chapter 6 [391]. Such overconfidence appears to grow when judgments predict the future (rather than the current state) and when predictions become more difficult [11]. As a consequence, people are less likely to seek out evidence for alternative hypotheses or to prepare for the circumstances in which they may be wrong. Less skilled people are more likely to overestimate their ability, even when they understand their relative ability [392].

4. Cognitive tunneling. As we have noted earlier in the context of anchoring, once a hypothesis has been generated or chosen, people tend to underutilize subsequent cues. We remain stuck on our initial hypothesis, a process introduced in the previous chapter as cognitive tunneling [393]. Examples of cognitive tunneling abound in complex systems [394]. Consider the example of the Three Mile Island disaster, in which a relief valve failed and caused some of the displays to indicate a rise in the level of coolant [395]. Operators mistakenly thought that emergency coolant flow should be reduced and persisted in holding this hypothesis for over two hours. Only when a supervisor arrived with a fresh perspective did the course of action get reversed. Notice that cognitive tunneling is different from cue primacy, which occurs when the decision maker is first generating hypotheses.
Cognitive tunneling can sometimes be avoided by looking at the functionality of objects in terms beyond their normal use. The nearly catastrophic situation in a moon mission, well captured by the movie Apollo 13, demonstrated the ability of people to move beyond this type of functional fixedness. Recall that the astronauts were stranded without an adequate air purifier system. To solve this problem, the ground control crew assembled all of the “usable” objects known to be on board the spacecraft (tubes, articles of clothing, etc.). Then they brainstormed freely with the objects in various configurations until they had assembled a system that worked.

5. Simplicity seeking and choice aversion. Presenting people with more alternatives can make the decision harder and the result less satisfying [396]. More choice also results in fewer people participating in a retirement plan, and, for those who do participate, more choices lead to less diversification because people distribute investments across the range of investment funds rather than across the underlying asset classes (e.g., stocks and bonds) [397].

6. Confirmation bias. Closely related to cognitive tunneling are the biases that arise when people consider additional cues to evaluate working hypotheses. People tend to seek out only confirming information and not disconfirming information, even when the disconfirming evidence can be more diagnostic [398, 399]. Amy did not carefully look for what might have been the disconfirming evidence in the patient’s medical record. In a similar vein, people tend to underweight, or fail to remember, disconfirming evidence [400, 11] and fail to use the absence of important cues as diagnostic information. The confirmation bias is exaggerated under conditions of high stress and high mental workload [393, 401, 402]. Figure 7.7 shows confirmation bias at work. The sketch following this list illustrates why disconfirming evidence can be the more diagnostic kind.
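To see why a disconfirming cue can carry more information than a confirming one, the following sketch updates belief in a working hypothesis with Bayes’ rule. The prior and the cue likelihoods are hypothetical numbers chosen for illustration; they are not drawn from the text or from any study cited here.

```python
# Hypothetical illustration (assumed numbers): Bayesian updating shows why a
# disconfirming cue can be more diagnostic than a confirming one.

def update(prior, p_cue_given_h, p_cue_given_not_h):
    """Posterior probability of the hypothesis after observing the cue."""
    numerator = p_cue_given_h * prior
    return numerator / (numerator + p_cue_given_not_h * (1.0 - prior))

prior = 0.70  # decision maker's current belief in the working hypothesis

# A confirming cue that is common under the hypothesis but also fairly common
# otherwise (weakly diagnostic: likelihood ratio 0.9 / 0.6 = 1.5).
after_confirming = update(prior, p_cue_given_h=0.9, p_cue_given_not_h=0.6)

# A cue that is rare if the hypothesis is true but common if it is false;
# observing it is strongly diagnostic against the hypothesis (ratio 0.1 / 0.7).
after_disconfirming = update(prior, p_cue_given_h=0.1, p_cue_given_not_h=0.7)

print(f"Belief after weak confirming cue:      {after_confirming:.2f}")    # ~0.78
print(f"Belief after strong disconfirming cue: {after_disconfirming:.2f}") # ~0.25
```

Under these assumed numbers, the disconfirming cue moves belief far more than the confirming one, which is why seeking only confirmation can leave a wrong hypothesis largely intact.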
Plan and Choose: Heuristics and Biases. Choice of action is also subject to a variety of heuristics or biases. Some are based on basic memory processes that we have already discussed.

1. Planning bias. Closely related to overconfidence is the planning bias [403, 404]. In planning on a large scale, like the time required to complete a major construction project such as the Denver International Airport, or on a small scale, like the time required to write a quality paper before the deadline, people seem to assume that the best-case scenario will unfold and are cognitively blind to the unexpected delaying events that can befall them, or at least underestimate both the probability of those events and their time costs.

2. Retrieve a small number of actions. Long-term memory may provide many possible action plans, but people are limited in the number they can retrieve and keep in working memory. People tend to adopt a single course of action and fail to consider the full range of alternatives, even when time is available. Working-memory limits make it difficult to consider many alternatives simultaneously, and people tend to neglect cues after identifying a promising alternative.

3. Availability of actions. In retrieving possible courses of action from long-term memory, people retrieve the most “available” actions, just as they tend to do with hypotheses. In general, the availability of items from memory is a function of recency, frequency, and how strongly they are associated with the hypothesis or situation assessment that has been selected through the use of if-then rules. In high-risk professions like aviation, emergency checklists are often used to ensure that actions are considered, even if they are not frequently performed [405].

4. Availability of possible outcomes. Other types of availability effects also occur, including the generation and retrieval of associated outcomes. As discussed, when more than one possible action is retrieved, the decision maker must select one based on how well the action will yield desirable outcomes. Each action often has more than one associated consequence, and these consequences are probabilistic. As an example, will a worker adhere to a safety procedure and wear a hardhat, or ignore the procedure and go without one? Wearing the hardhat has some probability of saving the worker from death due to a falling object. The worker’s estimate of this probability will influence the decision to wear the hardhat. That estimate will not be objective and based on statistics, but is more likely to be based on the availability of instances in memory. The worker has likely seen many workers not wearing a hardhat who have not suffered any negative effects, and so he or she is likely to think the probability of being injured by falling objects is lower than it actually is. Thus, the availability heuristic will bias retrieval of some outcomes and not others. Chapter 16 describes how warnings can be created to counteract this bias by showing the potential consequences of not complying, thus making the consequences more available. The decision maker is extremely unlikely to retrieve all possible outcomes for an action, particularly under stress [406]. Thus, selection of action suffers from the same cognitive limitations as other decision activities we have discussed (retrieval biases and working-memory limitations). Because of these cognitive limitations, selection of action tends to follow a satisficing model: if an alternative action passes certain criteria, it is selected; if the action does not work, another is considered.
Again, this bias is much more likely to affect the performance of novices than experts [346].

5. Hindsight bias. After someone is injured because he or she did not wear a hardhat, people are quick to criticize because it was such an obvious mistake. The tendency for people to think “they knew it all along” is called the hindsight bias [407, 408]. This process is evident in the “Monday morning quarterback” phenomenon, where people believe they would not have made the obvious mistakes of the losing quarterback. More importantly, hindsight bias often plagues accident investigators who, with the benefit of hindsight and the very available (to their memory) example of a bad outcome, inappropriately blame operators for committing errors that are obvious only in hindsight [409].

6. Framing bias. The framing bias is the influence of the framing or presentation of a decision on a person’s judgment [410]. According to the normative utility theory model, the way the problem is presented should have no effect on the judgment, but this is not how people behave. For example, when people are asked the price they would pay for a pound of ground meat that is 10% fat or 90% lean, they will tend to pay 8.2 cents per pound more for the option presented as 90% lean, even though the two are equivalent [411]. Likewise, students feel that they are performing better if they are told that they answered 80% of the questions on an exam correctly than if they are told that they answered 20% of the questions incorrectly. People also tend to view a treatment as more lethal if its risks are expressed as a 20% mortality rate than if they are expressed as saving 80% of lives, and they are thereby less likely to choose the treatment when it is expressed in terms of mortality [412]. Thus, the direction of a choice can be influenced by the extent to which it is framed as a gain or a loss. The framing bias is also clearly expressed when the choice is between a risky option and a sure thing. To provide an example of the contrast between negative and positive frames, suppose the risky versus sure-thing choice is between positive outcomes: accept a sure gift of $100, or take a risky gamble with 50-50 odds of winning $200 or nothing at all. In these circumstances, people tend to be risk averse, more often choosing to take the $100 rather than the risky chance at $200, even though the two options have the same expected value, as the sketch below illustrates.
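The following sketch works through the gamble comparison. The dollar amounts come from the example above; the square-root utility function is a hypothetical choice used only to illustrate how a concave (risk-averse) valuation of gains favors the sure thing even when expected values are identical.

```python
import math

# Positive-frame example from the text: a sure $100 versus a 50-50 gamble.
sure_gift = 100.0
gamble_outcomes = [(0.5, 200.0), (0.5, 0.0)]  # (probability, payoff)

# Expected values are identical, so a purely normative comparison is indifferent.
ev_sure = sure_gift
ev_gamble = sum(p * x for p, x in gamble_outcomes)
print(f"Expected value: sure = ${ev_sure:.0f}, gamble = ${ev_gamble:.0f}")

# A concave utility function (diminishing returns for gains) is one common way
# to model risk aversion; sqrt is an assumed, illustrative choice.
def utility(x):
    return math.sqrt(x)

eu_sure = utility(sure_gift)
eu_gamble = sum(p * utility(x) for p, x in gamble_outcomes)
print(f"Expected utility: sure = {eu_sure:.2f}, gamble = {eu_gamble:.2f}")
# sqrt(100) = 10.00 beats 0.5*sqrt(200) + 0.5*sqrt(0) = 7.07,
# consistent with people preferring the sure gain.
```

The square-root function is only one illustrative choice; any concave utility function yields the same qualitative preference for the sure gain over the equal-expected-value gamble.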