
 7.8 Metacognition

Throughout this chapter we have cited the importance of metacognition: thinking about one's own thinking and cognitive processes. Metacognition influences the decision-making process by guiding how people adapt to the particular decision situation. Here we highlight five of the most critical elements of metacognition for macrocognition.

1. Knowing what you don’t know. That is, being aware that your decision processes, or those needed to maintain adequate situation awareness, are hampered by important cues that are missing and that, if obtained, could substantially improve situation awareness and assessment.

2. The decision to “purchase” further information. This can be seen as a decision within the decision. Purchasing may involve a financial cost, such as the cost of an additional medical test required to reduce uncertainty about a diagnosis. It may also involve a time cost, such as the added time required, before declaring a hurricane evacuation, to obtain more reliable information about the forecast hurricane track. In these cases, metacognition is revealed in the ability to balance the cost of the purchase against the value of the added information [476] (a small numerical sketch of this trade-off follows this list). The metacognitive skills here also clearly involve keeping track of the passage of time in dynamic environments, to know when a decision may need to be executed even without full information.

3. Calibrating confidence in what you know. As we have described above, overconfidence is frequently manifest in human cognition [351]. When one is overconfident in one’s knowledge, there will be both a failure to seek additional information to reduce uncertainty and a failure to plan for contingencies in case the situation assessment turns out to be wrong (the second sketch following this list shows a simple calibration check).

4. Choosing the decision strategy adaptively. As we have seen above, there is a variety of decision strategies to choose from: using heuristics, holistic processing, System 1, and recognition-primed decisions (RPD), or deploying the more elaborate, effort-demanding analytic strategies of System 2. The expert has many of these in her toolkit, but metacognitive skills are necessary to decide which to employ when, as Amy did in our earlier example by deciding to switch from an RPD pattern match to a more time-consuming analytical strategy when the former failed.

5. Processing feedback to improve the toolkit. Element 4 relates to a single instance of a decision: in Amy’s case, the diagnosis and choice of treatment for one patient. However, metacognition can and should also be used to process the outcomes of a series of decisions, to recognize from negative outcomes that the current strategies may be wanting, and to learn to change the rules by which different strategies are deployed, just as a student performing poorly on a series of tests may decide to alter his or her study habits. Deploying such metacognitive skills obviously requires some effort to obtain and process feedback on decision outcomes, something we saw earlier is relatively difficult to do in decision making.
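The trade-off in element 2 can be made concrete with a small expected-value calculation. The sketch below is illustrative only: the probabilities, costs, and the assumption of a perfectly diagnostic test are invented for the example, not taken from the text.

```python
# Illustrative sketch: is it worth "purchasing" more information?
# All numbers are hypothetical and the test is assumed to be perfectly diagnostic.

p_disease = 0.3            # current belief that the disease is present
cost_treat_healthy = 1000  # cost of treating a patient who is actually healthy
cost_miss_disease = 5000   # cost of withholding treatment from a sick patient
cost_test = 300            # financial/time cost of the additional test

# Expected cost of the best action taken now, without the test.
treat_now = (1 - p_disease) * cost_treat_healthy   # treat regardless
withhold_now = p_disease * cost_miss_disease       # withhold regardless
best_without_test = min(treat_now, withhold_now)

# If the (perfect) test is purchased first, the correct action is always taken
# afterwards, so only the cost of the test remains.
best_with_test = cost_test

value_of_information = best_without_test - best_with_test
print(f"Expected cost without the test: {best_without_test:.0f}")
print(f"Expected cost with the test:    {best_with_test:.0f}")
print(f"Value of purchasing the information: {value_of_information:.0f}")
# A positive value favors purchasing; a negative value (or an expired deadline
# in a dynamic environment) favors deciding with the information already in hand.
```

Element 3, calibration of confidence, can be checked whenever confidence judgments and outcomes are logged together. A minimal sketch with made-up records; a stated confidence well above the observed hit rate is the overconfidence described above.

```python
# Minimal calibration check on hypothetical (stated confidence, correct?) records.
records = [
    (0.9, True), (0.9, False), (0.9, True), (0.9, False),  # "90% sure" judgments
    (0.6, True), (0.6, False), (0.6, True), (0.6, True),   # "60% sure" judgments
]

by_level = {}
for confidence, correct in records:
    by_level.setdefault(confidence, []).append(correct)

for confidence, outcomes in sorted(by_level.items()):
    hit_rate = sum(outcomes) / len(outcomes)
    print(f"stated {confidence:.0%}  observed {hit_rate:.0%}  "
          f"gap {confidence - hit_rate:+.0%}")
```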

 

7.8.1 Principles for Improving Metacognition

As with other elements of macrocognition, metacognition can be improved by some combination of changing the person (through training or experience) and changing the task (through task redesign and technology).

1. Ease information retrieval. Requiring people to manually retrieve or select information is more effortful than simply requiring them to scan to a different part of the visual field [138, 477], a characteristic that works against multilevel menus and decluttering tools that require people to select the level of decluttering they want. Pop-up messages and other automation features that infer a person’s information needs can satisfy those needs and relieve the effort of accessing the information [478].

2. Highlight benefits and minimize effort of engaging decision aids. Designers must understand the effort costs generated by potentially powerful features in interfaces. Such costs may be expressed in terms of the cognitive effort required to learn the feature or the mental and physical effort and time cost required to load or program the feature. Many people are disinclined to invest such effort even if the anticipated gains in productivity are high, and so the feature will go unused.

3. Manage cognitive depletion. An extended series of demanding decisions can incline people toward an intuitive approach, even when an analytic one would be more effective; people tend to make the easy or default decision as they become fatigued. Coaching people on this tendency might help them take rest breaks, plan complicated decisions early rather than late in the day, and avoid systems that introduce unnecessary decisions. As an example, Figure 7.9 shows how cognitive depletion changed the rulings of Israeli judges making parole decisions [479]. The timeline starts at the beginning of the day, and each open circle represents the first decision after a break. The pattern cannot be explained by obvious confounding factors such as the gravity of the offense or time served. Similar effects are seen in other domains, such as physicians prescribing more antibiotics as they become cognitively depleted over the day [480]. (A small tabulation sketch of this kind of analysis follows this list of principles.)

4. Training metacognition. Training can improve metacognition by teaching people to: (1) consider cues needed to develop situation awareness, (2) check situation assessments or explanations for completeness and consistency with cues, (3) analyze data that conflict with the situation assessment, and (4) recognize when too much conflict exists between the assessment and the cues. Training metacognition also needs to consider when it is appropriate to rely on the automation and when it is not [435].
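To see the pattern of Figure 7.9 in one's own decision data, the favorable-decision rate can be tabulated against the number of cases handled since the last break. The sketch below uses an invented decision log; the field layout is an assumption for illustration, not the procedure of the original study.

```python
# Hypothetical decision log: (cases handled since last break, ruling favorable?)
from collections import defaultdict

decision_log = [
    (0, True), (1, True), (2, True), (3, False), (4, False), (5, False),
    (0, True), (1, True), (2, False), (3, False), (4, False),
]

counts = defaultdict(lambda: [0, 0])   # position since break -> [favorable, total]
for since_break, favorable in decision_log:
    counts[since_break][1] += 1
    counts[since_break][0] += int(favorable)

for since_break in sorted(counts):
    favorable, total = counts[since_break]
    print(f"{since_break} cases since break: {favorable}/{total} favorable "
          f"({favorable / total:.0%})")
# A rate that falls with position since the last break is the depletion
# signature; the open circles in Figure 7.9 (first case after a break) reset it.
```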

 

Figure 7.9 Effect of cognitive depletion on rulings in favor of prisoners. (Adapted from Danziger, Levav, and Avnaim-Pesso (2011), Extraneous factors in judicial decisions. Proceedings of the National Academy of Sciences, 108(17), Figure 1, p. 6890 [479].)

 

 

7.9 Summary

We discussed decision making and the factors that make it more and less effective. Normative mathematical models of utility theory describe how people should compare alternatives and make the “best” decision. However, limited cognitive resources, time pressure, and unpredictable changes often make this approach unworkable, and people use simplifying heuristics, which make decisions easier but also lead to systematic biases. In many situations people have years of experience that enable them to refine their decision heuristics and avoid many biases. Decision makers also adapt their decision making by moving from skill- and rule-based decisions to knowledge-based decisions according to the degree of risk, time pressure, and experience. This adaptive process must be considered when improving decision making through task redesign, choice architecture, decision-support systems, or training.
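The normative benchmark mentioned above is simply a probability-weighted sum of outcome utilities. A minimal sketch, with invented probabilities and utilities for two candidate courses of action:

```python
# Expected-utility comparison of two alternatives (all numbers hypothetical).
alternatives = {
    "evacuate now":  [(0.7, -10), (0.3, -10)],    # (probability, utility) pairs
    "wait for data": [(0.7,   0), (0.3, -100)],   # storm misses vs. storm hits
}

for name, outcomes in alternatives.items():
    expected_utility = sum(p * u for p, u in outcomes)
    print(f"{name}: expected utility = {expected_utility:.1f}")
# Utility theory prescribes choosing the higher expected utility; the heuristics
# discussed in this chapter trade some of that optimality for speed and effort.
```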

Techniques to shape decision making discussed in this chapter offer surprisingly powerful ways to affect decisions, so the ethical dimensions of these choices should be carefully considered. As an example, should a default setting be designed to provide people with the option that aligns with their preference, what is best for them, what is likely to maximize profits, or what might be best for society [18]? The concepts in this chapter have important implications for safety and human error, discussed in Chapter 16. In many ways the decision-support systems described in this chapter can be considered displays or automation: Chapter 11 addresses automation, and we turn to displays in the next chapter.

7.7 Planning and Scheduling


The cognitive processes of planning and scheduling are closely related to those discussed in the previous section, because informed problem solving and troubleshooting often involve careful planning of future tests and activities. However, troubleshooting and diagnosis generally imply that something is “wrong” and needs to be fixed. Planning and scheduling do not carry this implication: planning may be invoked in the absence of problem solving, as when a routine schedule of activities is generated. Planning often accompanies decision making, to implement the course of action decided upon.

In many dynamic systems, the future can be broken down into two separate components: the predicted state of the system being controlled and the ideal or command state that should be obtained. Thus, a factory manager may have a predicted output that can be obtained over the next few hours (given the workers and equipment available) and a target output requested by external demands (i.e., the factory’s client). When systems cannot change their state or productive output easily, we say they are sluggish, or have “high inertia.” For such sluggish systems, longer range planning becomes extremely important to guarantee that future production matches future demands, because sudden changes in demand cannot be met by rapid changes in system output. Examples of such sluggish systems in need of planning are the factory whose equipment takes time to be brought online, the airspace in which aircraft cannot be instantly moved to new locations, or any physical system with high inertia, like a supertanker or a train.

In time-critical operations, effective planning depends vitally upon anticipating events in the world that might derail plan implementation. Unfortunately, people are not very good at envisioning such events [351], nor the time required to address them; hence the planning bias, discussed earlier in the chapter, is prevalent. You will recognize the importance to planning of two concepts discussed earlier in this chapter. First, level 3 situation awareness is another way of expressing an accurate estimate of future state and future demands. Second, skilled operators often employ a mental model of the dynamic system, run through a mental simulation, in order to infer the future state from the current state [375]. Mental simulation imposes heavy demands on cognitive resources. If these resources have been depleted or are diverted to other tasks, then prediction and planning may be poor, or not done at all, leaving the operator unprepared for the future.

7.7.1 Principles for Improving Planning and Scheduling

Human limits in planning and scheduling are often addressed with automation. Operations research offers many approaches to designing the best plan given certain assumptions. Unfortunately, reality often violates these assumptions, and people must intervene.

1. Create contingency plans and plan to re-plan. In general, people tend to avoid complex planning over long time horizons [468], a tendency driven both by a desire to conserve the cognitive resources demanded by high working memory load and by the fact that, in an uncertain world, accurate planning is impossible: plans may need to be revised or abandoned altogether as the world evolves differently from what was predicted. Re-planning is therefore essential. Unfortunately, people sometimes fail to re-plan, creating what is known as a plan continuation error [469, 470], a form of behavior that has much in common with cognitive tunneling, the confirmation bias, and the sunk cost bias. Contingency plans, and planning to re-plan, can counter these tendencies.

2. Create predictive displays. As with problem solving and troubleshooting, a variety of automation tools have been proposed to reduce these cognitive demands in planning [471]. Most effective are predictive displays that offer visual representations of the likely future, reducing the need for working memory [472]. We discuss these in the next chapter. Also potentially useful are computer-based planning aids that can either recommend plans [473] or allow fast-time simulation of the consequences of candidate plans, so the operator can try them out and choose a successful one [474]. Air traffic controllers can benefit from such a planning aid, known as the User Request Evaluation Tool (URET), to try out different routes to avoid aircraft conflicts [475].
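For the sluggish, high-inertia systems described above, a predictive display or planning aid is essentially a fast-time simulation of a simple dynamic model run ahead of real time. A minimal sketch, assuming a first-order lag model of the plant; the time constant, horizon, and commanded output levels are invented for illustration.

```python
# Fast-time projection for a sluggish (high-inertia) system.
# First-order lag: output drifts toward the commanded value with time constant tau.
def project(output_now, command, tau=30.0, dt=1.0, horizon=120):
    """Return the predicted output trajectory over `horizon` time steps."""
    trajectory = []
    output = output_now
    for _ in range(horizon):
        output += (command - output) * dt / tau   # sluggish response
        trajectory.append(output)
    return trajectory

# Try out two candidate plans (commanded production levels) before committing.
for command in (80, 100):
    predicted = project(output_now=40, command=command)
    print(f"command {command}: projected output {predicted[59]:.1f} at step 60, "
          f"{predicted[119]:.1f} at step 120")
# A predictive display would show these trajectories so the operator can pick
# the plan whose projected output matches the anticipated demand, rather than
# mentally simulating the lag.
```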

7.6 Problem Solving and Troubleshooting

Many of the decision tasks studied in human factors require diagnosis, the process of inferring the underlying or “true” state of a system. Examples of inferential diagnosis include medical diagnosis, fault diagnosis of a mechanical or electrical system, inference of weather conditions based on measurement values or displays, and so on. Sometimes this diagnosis is of the current state, and sometimes it is of a predicted or forecast state, as in weather forecasting or economic projections.

The cognitive processes of problem solving and troubleshooting are closely linked because they have many overlapping elements. Both start with a difference between an initial “state” and a final “goal state” and typically require a number of cognitive operations to reach the latter. The identity of those operations is often not immediately apparent to the person engaged in problem-solving behavior. Troubleshooting is often embedded within problem solving in that it is sometimes necessary to understand the identity of a problem before solving it. Thus, we may need to understand why our car engine does not start (troubleshooting) before trying to implement a solution (problem solving). Although troubleshooting may often be a step within a problem-solving sequence, problem solving may occur without troubleshooting if the problem is solved through trial and error or if a solution is encountered through serendipity. While both problem solving and troubleshooting involve attaining a state of knowledge, both also typically involve performing specific actions: troubleshooting usually requires a series of tests whose outcomes are used to diagnose the problem, whereas problem solving usually involves actions to implement the solution. Both are iterative processes of perceptual, cognitive, and response-related activities.

Both problem solving and troubleshooting impose heavy cognitive demands, which limit human performance [461, 462]. Many of these limits are manifest in the heuristics and biases discussed earlier in the chapter in the context of decision making. In troubleshooting, for example, people usually maintain no more than two or three active hypotheses in working memory as to the possible source of a problem [463]. More than this number overloads the limited capacity of working memory, since each hypothesis is complex enough to form more than a single chunk. Furthermore, when testing hypotheses, there is a tendency to focus on only one hypothesis at a time to confirm or reject it. Thus, in troubleshooting our car we will probably assume one problem and perform tests to confirm that it is the problem.

Naturally, troubleshooting success depends on attending to the appropriate cues and test outcomes. This dependency makes troubleshooting susceptible to attention and perceptual biases. The operator may attend selectively to very salient outcomes (bottom-up processing) or to outcomes that are anticipated (top-down processing). Regarding the first of these potential biases, it is important to realize that the least salient stimulus or event is the nonevent: people do not easily notice the absence of something [433]. Yet the absence of a symptom can often be a very valuable and diagnostic tool in troubleshooting, eliminating faulty hypotheses of what might be wrong. For example, the fact that a particular warning light is not on could eliminate from consideration a number of competing hypotheses.
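The diagnostic value of an absent symptom can be made explicit with a small belief update over a handful of candidate faults. The sketch below is purely illustrative: the faults, priors, and likelihoods are invented, and it simply shows how observing that a warning light is off shifts belief away from hypotheses that would have produced it.

```python
# Hypothetical troubleshooting example: three candidate faults, one observation.
# "prior" is the initial belief; "p_light_on" is the probability that the
# warning light is on if that fault is present.
faults = {
    "dead battery":    {"prior": 0.5, "p_light_on": 0.9},
    "bad starter":     {"prior": 0.3, "p_light_on": 0.1},
    "empty fuel tank": {"prior": 0.2, "p_light_on": 0.0},
}

# Observation: the warning light is OFF (the easily missed "nonevent").
unnormalized = {name: f["prior"] * (1 - f["p_light_on"]) for name, f in faults.items()}
total = sum(unnormalized.values())
posterior = {name: weight / total for name, weight in unnormalized.items()}

for name in faults:
    print(f"{name}: prior {faults[name]['prior']:.2f} -> posterior {posterior[name]:.2f}")
# The absent symptom sharply downgrades "dead battery" even though it started as
# the most likely hypothesis -- exactly the kind of evidence people tend to miss.
```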

 

7.6.1 Principles for Improving Problem Solving and Troubleshooting

The systematic errors associated with troubleshooting suggest several design principles.

1. Present alternate hypotheses. An important bias in troubleshooting, resulting from top-down or expectancy-driven processing, is often referred to as cognitive tunneling, or confirmation bias [464, 407]. In troubleshooting, this is the tendency to stay fixated on a particular hypothesis (the one chosen for testing), look for cues that confirm it (top-down expectancy guiding attention allocation), and interpret ambiguous evidence as supportive (top-down expectancy guiding perception). In problem solving, the corresponding phenomenon is to become fixated on a particular solution and stay with it even when it appears not to be working. Decision aids can challenge the person’s hypothesis and highlight disconfirming evidence.

2. Create displays that can act as an external mental model. These cognitive biases are more likely to appear when two features characterize the system under investigation. First, high system complexity (the number of system components and their degree of coupling or links) makes troubleshooting more difficult [465]. Complex systems are more likely to produce incorrect or “buggy” mental models [466], which can hinder the selection of appropriate tests or the correct interpretation of test outcomes. Second, intermittent failures of a given system component turn out to be particularly difficult to troubleshoot [462]. A display that shows the underlying system structure, such as the flow through the network of pipes in a refinery, can remove the burden of remembering that information.

3. Create systems that encourage alternate hypotheses. People generate a limited number of hypotheses because of working memory limitations [390]; typically somewhere between one and four hypotheses are brought in for evaluation. Because of this, people often fail to consider all relevant hypotheses [351]. Under time stress, decision makers often consider only a single hypothesis [467]. This process degrades the quality of novice decision makers’ choices far more than experts’, because the first option considered by an expert is likely to be reasonable, whereas a novice’s is not. Systems that make it easy for people to suggest many alternate hypotheses make it more likely that a complete set of hypotheses will be considered.

7.5 Situation Awareness


The diagnosis error made by Amy, the medical specialist in our vignette, can be examined more thoroughly using the concept of situation awareness (SA). Situation awareness characterizes people’s awareness and understanding of dynamic changes in their environment [447, 448, 449]. A pilot loses SA when he or she suffers a catastrophic controlled flight into terrain [450, 229], and, as we shall see in Chapter 16, control room operators at the Three Mile Island nuclear power plant lost SA when they believed the water level in the plant to be too high rather than too low, a misdiagnosis that led to a catastrophic release of radioactive material [395].

SA is “the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future” [383](p. 36). These three levels, perception (and selective attention), understanding, and prediction, must be applied to a specific situation. Thus, a person cannot be said to have SA without specifying what that awareness is (or should be) about. A car driver might have good awareness of navigational information and time (where I am and how much time it will take me to drive to my destination), but poor awareness of the vehicle ahead that is merging onto the highway. Improving situation awareness for navigation and for the merging vehicle would require very different designs. Note that SA neither defines nor incorporates action; action concerns the decisions made from one’s awareness or assessment of the situation.

Many elements of microcognition support SA and were covered in the previous chapter. Selective attention is necessary for the first level, while the second level, understanding, depends very much upon both working memory and long-term memory. The third level, projection and prediction, has not yet been discussed but will be considered in more detail in the planning and scheduling section. In addition, mental models guide SA development by defining what information people pursue and how that information is interpreted. For example, Amy’s mental model of the operating room procedures might guide her to ask a nurse for the estimated completion time of the perforated viscus procedure. She asks only about this procedure because her mental model of the other procedures gives her a good sense of when they will be done, so she needs information only about the procedure with an uncertain completion time.

As noted above, situation awareness is not the same as performance. One can have good performance (a lucky decision outcome that was correct) without good awareness. Correspondingly, the pilot of an out-of-control aircraft may have very good situation awareness of the loss of stability, but be unable to perform the actions necessary to recover.

 

7.5.1 Measuring Situation Awareness

The importance of SA is often realized after an accident, by inferring that a loss of SA was partially responsible. In controlled-flight-into-terrain accidents it is almost always assumed that the pilot lost awareness of the aircraft’s altitude over the terrain [450]. However, “measuring” SA after the fact by assuming its absence is not the same as measuring how well a particular system or operator maintains SA in the absence of an unexpected event [451].

A popular technique for SA measurement is the situation awareness global assessment technique (SAGAT) [452], in which the operator is briefly interrupted in the performance of a dynamic task and asked questions about it; for example, asking a driver to identify the location of other road traffic [453], asking an anesthesiologist about the patient’s state [454], or asking a pilot to identify the direction to the nearest hazardous terrain [455]. Sometimes the display is blanked after the question, to ensure that the information is retrieved from memory. One can then assess the accuracy of the answers to such questions. Alternatively, one can assess the time required to retrieve the correct answer from a display that remains visible, a technique called SPAM (situation present assessment method) [456]. SA can also be measured by subjective evaluation (“rate your SA on a scale of 1 to 10” [457]), which has been embodied in a well-used measurement tool called SART (situation awareness rating technique) [458]. A concern about the validity of such self-rating techniques, however, is that people are not always aware of what they are not aware of; this issue of metacognition is addressed at the end of this chapter.

SA can be an important tool for accident analysis, for understanding when its loss was a contributing factor [450]. To the extent that accidents may be caused by SA loss, an added implication is that systems should be designed and, when appropriate, certified to support SA. This becomes important when federal regulators are responsible for certification, as is the case with new aircraft or nuclear power plants.

Although situation awareness is most commonly applied to individuals, distributed situation awareness merits consideration when multiple people work together [459]. Distributed situation awareness refers to the SA that the members of a team jointly hold. Distributed SA, like the concept of a team mental model, can guide design when the focus shifts from individual to team performance. We cover these issues in more depth in Chapter 18.
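When SAGAT and SPAM probes are logged, scoring them reduces to an accuracy rate and a response-time summary. A minimal sketch with invented probe records; the record layout is an assumption for illustration, not the published procedures.

```python
# Hypothetical probe records: (technique, answer correct?, response time in s).
probes = [
    ("SAGAT", True, None), ("SAGAT", False, None), ("SAGAT", True, None),
    ("SPAM", True, 2.1), ("SPAM", True, 3.4), ("SPAM", False, 5.0),
]

sagat_correct = [correct for tech, correct, _ in probes if tech == "SAGAT"]
print(f"SAGAT accuracy: {sum(sagat_correct) / len(sagat_correct):.0%}")

# SPAM scores the latency of correct answers while the display remains visible.
spam_times = [rt for tech, correct, rt in probes if tech == "SPAM" and correct]
print(f"SPAM mean time for correct answers: {sum(spam_times) / len(spam_times):.1f} s")
```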

7.5.2 Principles for Improving Situation Awareness

Specific principles that follow from these considerations and from a recent review [449] include:

1. Create displays that help people notice changes (level 1 SA). Particularly in multitasking situations with dynamic systems, displays should highlight changes to make them easy for people to notice. Chapter 8 addresses issues of display layout to support SA.

2. Make the situation easy to understand (level 2 SA). Present information about the state of the system relative to the person’s goals, rather than requiring them to interpret and mentally combine and transform information. This might also mean bringing together several display elements that might otherwise be placed in different locations.

3. Keep the operator somewhat “in the loop.” This issue is addressed in more detail in Chapter 11 (Automation). The critical concept introduced here is related to the generation effect: people are more likely to remember actions, and the consequences of actions, if they themselves generated the action than if they watched another agent generate the same action. Manufacturers of self-driving cars are struggling to find ways of keeping the driver somewhat in the loop (e.g., hands on the wheel), even as the automation steers the car, in order to preserve SA should the automation fail.

4. Help people project the state of the system into the future (level 3 SA). This is particularly important when the system responds slowly, like a supertanker, an industrial oven, or the air traffic system. Here, create a display that shows the future state, such as the predictive displays we discuss in Chapter 8. This relieves the person of mentally simulating and projecting future states.

5. Organize information around goals. Rather than arbitrary or technology-oriented placement of information, displays should cluster information according to the goals the person is trying to achieve.

6. Display to broaden attention. Recognizing that SA may be most critical for dealing with unexpected situations, displays should avoid narrowing people’s attention to a limited array of information that is specific to a particular task or limited to routine situations. Supporting SA when unexpected things happen typically means adding information to the display; this information must be carefully integrated to avoid clutter.

7. Train for SA. When training for SA, it is important to realize that training for routine performance may conflict with training to maintain SA [460]. The former focuses on the information needed for the task as it was intended to be performed; the latter should focus on what is often a broader scope of selective attention, to remain aware of the state of the world should the system fail. Many of the biases relevant to diagnosis, discussed above, are paralleled by biases in situation awareness, for example the confirmation bias and anchoring. Hence debiasing training can be effective here as well.

End of Section 7.4


As we will discuss in the following chapter, sources of information that need to be integrated in diagnosis should be made available simultaneously (not sequentially, to mitigate anchoring) and in close display proximity, so that all can be accessed with minimal effort. Emergent features of object displays can sometimes facilitate the integration process in diagnosis [442, 443, 444].

Automation and decision support tools. Finally, automation and expert systems offer promise in supporting human decision making. This is described in much more detail in Chapter 11, but to provide a link here, such support can be roughly categorized into front-end (diagnosis and situation assessment) and back-end (treatment, choice, and course-of-action recommendations) support. This dichotomy is well illustrated in the two major classes of medical decision aids [445, 446]. Because automation is so closely bound to decision support tools and expert-system decision advisors, we postpone further discussion of this topic until Chapter 11, where the entire chapter is devoted to human-automation interaction.