7.5 Situation Awareness
The diagnosis error made by the medical specialist Amy in our vignette can be examined more thoroughly using the concept of situation awareness (SA). Situation awareness, or SA, characterizes people’s awareness and understanding of dynamic changes in their environment [447, 448, 449]. A pilot loses SA whenever he or she suffers a catastrophic controlled-flight-into-terrain accident [450, 229], and as we shall see later in Chapter 16, control room operators at the Three Mile Island nuclear power plant lost SA when they believed the water level in the plant to be too high rather than too low, a misdiagnosis that led to a catastrophic release of radioactive material [395]. SA is “the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future” [383](p. 36). These three levels, perception (and selective attention), understanding, and prediction, must be applied to a specific situation. Thus, a person cannot be said to have SA without specifying what that awareness is (or should be) about. A car driver might have good awareness of navigational information and time (where I am and how much time it will take me to drive to my destination), but poor awareness of the vehicle ahead that is merging onto the highway. Improving situation awareness for navigation and for the merging vehicle would require very different designs. Note that SA neither defines nor incorporates action; action concerns the decisions made from one’s awareness or assessment of the situation. Many elements of microcognition support SA and were covered in the previous chapter. Selective attention is necessary for the first level, while the second level of understanding depends very much upon both working memory and long-term memory. The third level, projection and prediction, has not yet been discussed but will be considered in more detail in the planning and scheduling section.
In addition, mental models guide SA development by defining what information people pursue and how they interpret that information. For example, Amy’s mental model of the operating room procedures might guide her to ask a nurse for the estimated completion time of the perforated viscus procedure. She asks only about this procedure because her mental model of the other procedures gives her a good sense of when they would be done, so she needs information only about the procedure with an uncertain completion time. As noted above, situation awareness is not the same as performance. One can have good performance (a lucky decision outcome that was correct) without good awareness. Conversely, the pilot of an out-of-control aircraft may have very good situation awareness of the loss of stability, but be unable to perform the necessary actions to recover.
7.5.1 Measuring Situation Awareness
The importance of SA can often be realized after an accident by inferring that the loss of SA was partially responsible. In controlled-flight-into-terrain accidents it is almost always assumed that the pilot lost awareness of the aircraft’s altitude over the terrain [450]. However, “measuring” SA after the fact by assuming its absence is not the same as measuring how well a particular system or operator maintains SA in the absence of an unexpected event [451]. A popular technique for SA measurement is the SA global assessment technique (SAGAT) [452], in which the operator is briefly interrupted in the performance of a dynamic task and asked questions about it; for example, asking a driver to identify the location of other road traffic [453], asking an anesthesiologist about the patient’s state [454], or asking a pilot to identify the direction to the nearest hazardous terrain [455]. Sometimes the display is blanked after the question, to assure that the information is stored in memory. One can then assess the accuracy of answering such questions. Alternatively, one can assess the time required to retrieve the correct answer from a display that remains visible, in a technique called SPAM (Situation Present Assessment Method) [456]. SA can sometimes be measured by a subjective evaluation (“rate your SA on a scale of 1 to 10” [457]), which has been embodied in a well-used measurement tool called SART (situation awareness rating technique) [458]. However, a concern about the validity of such self-rating techniques is that people are not always aware of what they are not aware of. This issue of metacognition is addressed at the end of this chapter. SA can be an important tool for accident analysis, understanding when its loss was a contributing factor [450]. To the extent that accidents may be caused by SA loss, an added implication is that systems should be designed and, when appropriate, certified to support SA.
This becomes important when federal regulators are responsible for certification, as is the case with new aircraft or nuclear power plants. Although situation awareness is most commonly applied to individuals, distributed situation awareness merits consideration when multiple people work together [459]. Distributed situation awareness refers to the SA that the members of a team jointly hold. Distributed SA, like the concept of a team mental model, can guide design when the focus shifts from individual to team performance. We cover these issues in more depth in Chapter 18.
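The accuracy scoring at the heart of SAGAT can be sketched computationally. In the minimal sketch below, the names (Probe, sagat_scores) and the example probes are illustrative assumptions, not part of any published tool: the simulation is paused, the operator’s recalled answers are compared against the frozen ground truth, and accuracy is tallied separately for each of the three SA levels.

```python
from dataclasses import dataclass

@dataclass
class Probe:
    """One SAGAT query, frozen at the moment the simulation is paused."""
    question: str
    ground_truth: str   # the true system state at the pause
    answer: str         # the operator's recalled answer
    level: int          # SA level probed: 1=perception, 2=comprehension, 3=projection

def sagat_scores(probes):
    """Return the fraction of correct answers for each SA level probed."""
    correct, total = {}, {}
    for p in probes:
        total[p.level] = total.get(p.level, 0) + 1
        correct[p.level] = correct.get(p.level, 0) + (p.answer == p.ground_truth)
    return {level: correct[level] / total[level] for level in total}

# Illustrative probes drawn from the domains mentioned above.
probes = [
    Probe("Nearest traffic location?", "left lane, 30 m behind",
          "left lane, 30 m behind", 1),
    Probe("Is the patient's blood pressure stable?", "falling", "stable", 2),
    Probe("Where will the aircraft be in 60 s?", "over ridge", "over ridge", 3),
]
print(sagat_scores(probes))  # {1: 1.0, 2: 0.0, 3: 1.0}
```

A SPAM-style variant would instead leave the display visible and record the response time to each query rather than recall accuracy.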
7.5.2 Principles for Improving Situation Awareness
Specific principles that follow from these considerations and from a recent review include [449]:

1. Create displays that help people notice changes (level 1 SA). Particularly in multitasking situations with dynamic systems, displays should highlight changes to make them easy for people to notice. Chapter 8 addresses issues of display layout to support SA.

2. Make the situation easy to understand (level 2 SA). Present information about the state of the system relative to the person’s goals rather than requiring that they interpret, mentally combine, and transform information. This might also mean bringing together several display elements that might otherwise be placed in different locations.

3. Keep the operator somewhat “in the loop”. This issue will be addressed in more detail in Chapter 11 (Automation). The critical concept introduced here is related to the generation effect: people are more likely to remember actions, and the consequences of actions, if they themselves have generated the action than if they were watching another agent generate the same action. Manufacturers of self-driving cars are struggling to find ways of keeping the driver somewhat in the loop (e.g., hands on the wheel), even as automation steers the car, in order to preserve SA should the automation fail.

4. Help people project the state of the system into the future (level 3 SA). This is particularly important when the system responds slowly, like a supertanker, industrial oven, or air traffic system. Here, create a display that shows the future state, such as the predictive displays we discuss in Chapter 8. This relieves the person of mentally simulating and projecting future states.

5. Organize information around goals. Rather than arbitrary or technology-oriented placement of information, displays should cluster information according to the goals the person is trying to achieve.

6. Display to broaden attention. Recognizing that SA may be most critical for dealing with unexpected situations, displays should avoid narrowing people’s attention to a limited array of information that is specific to a particular task or limited to routine situations. Supporting SA when unexpected things happen typically means adding information to the display. This information must be carefully integrated to avoid problems of clutter.

7. Train for SA. When training for SA, it is important to realize that training for routine performance may conflict with training to maintain SA [460]. The former focuses on the information needed for the task as it was intended to be performed; the latter should focus on what is often a broader scope of selective attention, to remain aware of the state of the world should the system fail. Many of the biases relevant to diagnosis, discussed above, are paralleled by biases in situation awareness: for example, confirmation bias or anchoring. Hence, debiasing training can be effective here.
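The projection that a predictive display performs for the operator (level 3 SA, principle 4 above) often amounts to simple kinematic extrapolation of the current state. The sketch below is an illustrative assumption about how such a predictor symbol might be computed; the function name and the supertanker numbers are invented for the example, not taken from any particular system.

```python
def project_state(position, velocity, acceleration, horizon_s):
    """Extrapolate a slowly responding system's state `horizon_s` seconds
    ahead, using current velocity and acceleration. Displaying this
    projected value relieves the operator of mentally simulating it."""
    return position + velocity * horizon_s + 0.5 * acceleration * horizon_s ** 2

# A supertanker drifting starboard at 0.5 m/s and accelerating at
# 0.01 m/s^2: show the bridge crew where the ship will be in 60 s.
predicted = project_state(position=0.0, velocity=0.5,
                          acceleration=0.01, horizon_s=60.0)
print(round(predicted, 1))  # 48.0
```

The value of such a display grows with the system’s lag: the slower the response, the larger the gap between where the system is and where it will be, and the harder the mental projection the display replaces.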