Situation awareness

Situational awareness or situation awareness (SA) is the understanding of an environment, its elements, and how it changes with respect to time or other factors. Situational awareness is important for effective decision making in many environments. It is formally defined as:

“the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future”. [1]

An alternative definition is that situation awareness is adaptive, externally-directed consciousness that has as its products knowledge about a dynamic task environment and directed action within that environment. [2]

Situation awareness has been recognized as a critical foundation for successful decision-making across a broad range of situations, many of which involve the protection of human life and property, including law enforcement, aviation, air traffic control, ship navigation, [3] health care, [4] emergency response, military command and control operations, transmission system operators, self defense, [5] and offshore oil and nuclear power plant management. [6]

Inadequate situation awareness has been identified as one of the primary causal factors in accidents attributed to human error. [7] [8] [9] [10] According to Endsley's situation awareness theory, when people encounter a dangerous situation, they need an appropriate and precise decision-making process that includes pattern recognition and matching, and the formation of sophisticated schemata and archetypal knowledge that aid correct decision making. [11]

The formal definition of SA is often described as three ascending levels:

  1. Perception of the elements in the environment,
  2. Comprehension or understanding of the situation, and
  3. Projection of future status. [12]

People with the highest levels of SA have not only perceived the relevant information for their goals and decisions, but are also able to integrate that information to understand its meaning or significance, and are able to project likely or possible future scenarios. These higher levels of SA are critical for proactive decision making in demanding environments.

Three facets of SA have been the focus in research: SA states, SA systems, and SA processes. SA states refers to the actual level of awareness people have of the situation. SA systems refers to technologies that are developed to support SA in many environments. SA processes refers to the updating of SA states, and what guides the moment-to-moment change of SA. [13]

History

Although the term itself is fairly recent, the concept has roots in the history of military theory—it is recognizable in Sun Tzu's The Art of War, for example. [14] The term can be traced to World War I, where it was recognized as a crucial skill for crews in military aircraft. [15]

There is evidence that the term situational awareness was first employed at the Douglas Aircraft Company during human factors engineering research while developing vertical and horizontal situation displays and evaluating digital-control placement for the next generation of commercial aircraft. Research programs in flight-crew computer interaction [16] and mental workload measurement [17] built on the concept of awareness measurement from a series of experiments that measured contingency awareness during learning, [18] [19] and later extended to mental workload and fatigue. [20]

Situation awareness appears in the technical literature as early as 1983, when describing the benefits of a prototype touch-screen navigation display. [21] During the early 1980s, integrated “vertical-situation” and “horizontal-situation” displays were being developed for commercial aircraft to replace multiple electro-mechanical instruments. Integrated situation displays combined the information from several instruments enabling more efficient access to critical flight parameters, thereby improving situational awareness and reducing pilot workload.

The term was first defined formally by Endsley in 1988. [22] Before being widely adopted by human factors scientists in the 1990s, the term is said to have been used by United States Air Force (USAF) fighter aircrew returning from war in Korea and Vietnam. [23] They identified having good SA as the decisive factor in air combat engagements—the "ace factor". [24] Survival in a dogfight was typically a matter of observing the opponent's current move and anticipating his next move a fraction of a second before he could observe and anticipate it himself.

USAF pilots also came to equate SA with the "observe" and "orient" phases of the famous observe-orient-decide-act loop (OODA loop), or Boyd cycle, as described by the USAF war theorist Col. John Boyd. In combat, the winning strategy is to "get inside" your opponent's OODA loop, not just by making one's own decisions quicker, but also by having better SA than one's opponent, and even changing the situation in ways that the opponent cannot monitor or even comprehend. Losing one's own SA, in contrast, equates to being "out of the loop".

Clearly, SA has far-reaching applications, as it is necessary for individuals and teams to function effectively in their environment. Thus, SA research has gone far beyond the field of aviation to work conducted in a wide variety of environments. SA is being studied in such diverse areas as air traffic control, nuclear power plant operation, emergency response, maritime operations, space, oil and gas drilling, vehicle operation, and health care (e.g. anesthesiology and nursing). [25] [26] [27] [28] [29] [30] [31]

Theoretical model

Endsley's Cognitive Model of SA

Endsley's model of SA. This is a synthesis of versions she has given in several sources, notably in 1995 [32] and 2000. [33]

The most widely cited and accepted model of SA, developed by Dr. Mica Endsley, [25] has been shown to be largely supported by research findings. [34] Lee, Cassano-Pinche, and Vicente found that Endsley's model of SA received 50% more citations following its publication than any other paper in Human Factors over the 30-year period of their review. [35]

Endsley's model describes the cognitive processes and mechanisms that people use to assess situations and develop SA, and the task and environmental factors that also affect their ability to acquire SA. It describes in detail the three levels of SA formation: perception, comprehension, and projection.

Perception (Level 1 SA): The first step in achieving SA is to perceive the status, attributes, and dynamics of relevant elements in the environment. Thus, Level 1 SA, the most basic level of SA, involves the processes of monitoring, cue detection, and simple recognition, which lead to an awareness of multiple situational elements (objects, events, people, systems, environmental factors) and their current states (locations, conditions, modes, actions).

Comprehension (Level 2 SA): The next step in SA formation involves a synthesis of disjointed Level 1 SA elements through the processes of pattern recognition, interpretation, and evaluation. Level 2 SA requires integrating this information to understand how it will impact upon the individual's goals and objectives. This includes developing a comprehensive picture of the world, or of that portion of the world of concern to the individual.

Projection (Level 3 SA): The third and highest level of SA involves the ability to project the future actions of the elements in the environment. Level 3 SA is achieved through knowledge of the status and dynamics of the elements and comprehension of the situation (Levels 1 and 2 SA), and then extrapolating this information forward in time to determine how it will affect future states of the operational environment.

Endsley's model shows how SA "provides the primary basis for subsequent decision making and performance in the operation of complex, dynamic systems". [36] Although alone it cannot guarantee successful decision making, SA does support the necessary input processes (e.g., cue recognition, situation assessment, prediction) upon which good decisions are based. [37]

SA also involves both a temporal and a spatial component. Time is an important concept in SA, as SA is a dynamic construct, changing at a tempo dictated by the actions of individuals, task characteristics, and the surrounding environment. As new inputs enter the system, the individual incorporates them into this mental representation, making changes as necessary in plans and actions in order to achieve the desired goals.

SA also involves spatial knowledge about the activities and events occurring in a specific location of interest to the individual. Thus, the concept of SA includes perception, comprehension, and projection of situational information, as well as temporal and spatial components.

Endsley's model of SA illustrates several variables that can influence the development and maintenance of SA, including individual, task, and environmental factors.

In summary, the model describes several key factors underlying the cognitive processes involved in SA, [38] and also points to a number of features of the task and environment that affect SA.

Experience and training have a significant impact on people's ability to develop SA, due to their role in building mental models that reduce processing demands and help people better prioritize their goals. [40] In addition, it has been found that individuals vary in their ability to acquire SA; thus, simply providing the same system and training will not ensure similar SA across different individuals. Research has shown that a number of factors make some people better at SA than others, including differences in spatial abilities and multi-tasking skills. [41]

Criticisms of SA

Criticisms of the SA construct and the model have generally been regarded as unfounded or as having been addressed. [42] [43] [44] The Endsley model is very detailed in describing the exact cognitive processes involved in SA. A narrative literature review of SA, performance, and other human factors constructs states that SA “... is valuable in understanding and predicting human-system performance in complex systems.” [42]

Nevertheless, there are several criticisms of SA. One criticism is the danger of circularity with SA: “How does one know that SA was lost? Because the human responded inappropriately. Why did the human respond inappropriately? Because SA was lost.” [45] Building on the circularity concern, others have deemed SA a folk model on the basis that it is frequently overgeneralized and immune to falsification. [46] [47] A response to these criticisms argues that measures of SA are “... falsifiable in terms of their usefulness in prediction.” [42]

A recent review and meta-analysis of SA measures showed they were highly correlated with or predictive of performance, which initially appears to provide strong quantitative evidence refuting criticisms of SA. [44] However, the inclusion criteria in this meta-analysis [44] were limited to positive correlations reaching desirable levels of statistical significance. [48] That is, hypothesis-supporting results were included while less desirable results that contradicted the hypothesis were excluded. The justification was "Not all measures of SA are relevant to performance." [44] This is an example of circular analysis, or double-dipping, [49] where the dataset being analyzed is selected based on the outcome of analyzing that same dataset.

Because only more desirable effects were included, the results of this meta-analysis were predetermined – predictive measures of SA were predictive. [48] Further, there were inflated estimates of mean effect sizes compared to an analysis that did not select results using statistical significance. [48] Determining the relevance of SA based on the desirability of outcomes and analyzing only supporting results is a circular conceptualization of SA and revives concerns about the falsifiability of SA. [48]

Related concepts

Several cognitive processes related to situation awareness are briefly described in this section. The matrix below illustrates the relationship among some of these concepts. [50] Note that situation awareness and situational assessment are more commonly discussed in information fusion and in complex domains such as aviation and military operations, and relate more to achieving immediate tactical objectives. [51] [52] [53] Sensemaking and achieving understanding are more commonly found in industry and the organizational psychology literature, and often relate to achieving long-term strategic objectives.

There are also biological mediators of situational awareness, most notably hormones such as testosterone, and neurotransmitters such as dopamine and norepinephrine. [54]

Objective                   Process                  Outcome
Tactical (short-term)       situational assessment   situation awareness
Strategic (long-term)       sensemaking              understanding
Scientific (longer-term)    analysis                 prediction

Situational understanding

Situation awareness is sometimes confused with the term "situational understanding." In the context of military command and control applications, situational understanding refers to the "product of applying analysis and judgment to the unit's situation awareness to determine the relationships of the factors present and form logical conclusions concerning threats to the force or mission accomplishment, opportunities for mission accomplishment, and gaps in information". [55] Situational understanding is the same as Level 2 SA in the Endsley model—the comprehension of the meaning of perceived information, integrated together and interpreted in terms of the individual's goals. It is the "so what" of the data that is perceived.

Situational assessment

In brief, situation awareness is viewed as "a state of knowledge," and situational assessment as "the processes" used to achieve that knowledge. Endsley argues that "it is important to distinguish the term situation awareness, as a state of knowledge, from the processes used to achieve that state. [1] These processes, which may vary widely among individuals and contexts, will be referred to as situational assessment or the process of achieving, acquiring, or maintaining SA." Note that SA is not only produced by the processes of situational assessment, it also drives those same processes in a recurrent fashion. For example, one's current awareness can determine what one pays attention to next and how one interprets the information perceived. [56]

Mental models

Accurate mental models are one of the prerequisites for achieving SA. [22] [57] [58] A mental model can be described as a set of well-defined, highly organized yet dynamic knowledge structures developed over time from experience. [59] [60] The volume of available data inherent in complex operational environments can overwhelm the capability of novice decision makers to attend, process, and integrate this information efficiently, resulting in information overload and negatively impacting their SA. [61] In contrast, experienced decision makers assess and interpret the current situation (Level 1 and 2 SA) and select an appropriate action based on conceptual patterns stored in their long-term memory as "mental models". [62] [63] Cues in the environment activate these mental models, which in turn guide their decision making process.

Sensemaking

Klein, Moon, and Hoffman distinguish between situation awareness and sensemaking as follows:

...situation awareness is about the knowledge state that's achieved—either knowledge of current data elements, or inferences drawn from these data, or predictions that can be made using these inferences. In contrast, sensemaking is about the process of achieving these kinds of outcomes, the strategies, and the barriers encountered. [64]

In brief, sensemaking is viewed more as "a motivated, continuous effort to understand connections (which can be among people, places, and events) in order to anticipate their trajectories and act effectively", [65] rather than the state of knowledge underlying situation awareness. Endsley points out that as an effortful process, sensemaking actually describes a subset of the processes used to maintain situation awareness. [66] [43] In the vast majority of cases, SA is instantaneous and effortless, proceeding from pattern recognition of key factors in the environment—"The speed of operations in activities such as sports, driving, flying and air traffic control practically prohibits such conscious deliberation in the majority of cases, but rather reserves it for the exceptions." Endsley also points out that sensemaking is backward focused, forming reasons for past events, while situation awareness is typically forward looking, projecting what is likely to happen in order to inform effective decision processes. [66] [43]

In team operations

In many systems and organizations, people work not just as individuals, but as members of a team. Thus, it is necessary to consider the SA of not just individual team members, but also the SA of the team as a whole. To begin to understand what is needed for SA within teams, it is first necessary to clearly define what constitutes a team. A team is not just any group of individuals; rather teams have a few defining characteristics. A team is:

a distinguishable set of two or more people who interact dynamically, interdependently and adaptively toward a common and valued goal/objective/mission, who have each been assigned specific roles or functions to perform, and who have a limited life span of membership.

Salas et al. (1992) [67]

Team SA

Team SA is defined as "the degree to which every team member possesses the SA required for his or her responsibilities". [38] The success or failure of a team depends on the success or failure of each of its team members. If any one of the team members has poor SA, it can lead to a critical error in performance that can undermine the success of the entire team. By this definition, each team member needs to have a high level of SA on those factors that are relevant for his or her job. It is not sufficient for one member of the team to be aware of critical information if the team member who needs that information is not aware. Therefore, team members need to be successful in communicating information between them (including how they are interpreting or projecting changes in the situation to form level 2 and 3 SA) or in each independently being able to get the information they need.

In a team, each member has a subgoal pertinent to his/her specific role that feeds into the overall team goal. Associated with each member's subgoal are a set of SA elements about which he/she is concerned. As the members of a team are essentially interdependent in meeting the overall team goal, some overlap between each member's subgoal and their SA requirements will be present. It is this subset of information that constitutes much of team coordination. That coordination may occur as a verbal exchange, a duplication of displayed information, or by some other means. [68]

Shared SA

Shared situation awareness can be defined as "the degree to which team members possess the same SA on shared SA requirements". [69] [70] As implied by this definition, there are information requirements that are relevant to multiple team members. A major part of teamwork involves the area where these SA requirements overlap—the shared SA requirements that exist as a function of the essential interdependency of the team members. In a poorly functioning team, two or more members may have different assessments of these shared SA requirements and thus behave in an uncoordinated or even counter-productive fashion. Yet in a smoothly functioning team, each member shares a common understanding of what is happening on those SA elements that are common—shared SA. Thus, shared SA refers to the degree to which people have a common understanding of information that lies in the overlap of the team members' SA requirements. Not all information needs to be shared. Clearly, each team member is aware of much that is not pertinent to the others on the team. Sharing every detail of each person's job would create information overload, forcing members to sort through excess information to find what they need. [71] [72] Only information relevant to the SA requirements of each team member needs to be shared.
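The idea that shared SA lives in the overlap of team members' information requirements can be sketched as a simple set computation. This is an illustrative formalization only; the roles and information elements below are invented, and real SA requirements analyses are far richer.

```python
# Illustrative sketch: shared SA requirements as the pairwise overlap of
# each member's information needs. Role and element names are invented.

def shared_requirements(member_needs):
    """Return the elements required by two or more team members."""
    shared = set()
    members = list(member_needs.values())
    for i, needs in enumerate(members):
        for other in members[i + 1:]:
            shared |= needs & other  # pairwise intersection of needs
    return shared

needs = {
    "pilot":      {"altitude", "heading", "fuel", "weather"},
    "copilot":    {"altitude", "heading", "radio_freq", "weather"},
    "controller": {"altitude", "heading", "traffic"},
}
overlap = shared_requirements(needs)
# overlap == {'altitude', 'heading', 'weather'}: only these elements
# require a common picture; 'fuel' or 'traffic' stay with one role.
```

On this toy view, team coordination effort concentrates on the elements in `overlap`, consistent with the point above that sharing every detail would only create overload.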

Team SA model

The situation awareness of the team as a whole, therefore, depends upon both a high level of SA among individual team members for the aspects of the situation necessary for their jobs, and a high level of shared SA between team members, providing an accurate common operating picture of those aspects of the situation common to the needs of each member. [73] Endsley and Jones [57] [73] describe a model of team situation awareness as a means of conceptualizing how teams develop high levels of shared SA across members. Each of these four factors—requirements, devices, mechanisms, and processes—acts to help build team and shared SA.

  1. Team SA requirements – the degree to which the team members know which information needs to be shared, including their higher level assessments and projections (which are usually not otherwise available to fellow team members), and information on team members' task status and current capabilities.
  2. Team SA devices – the devices available for sharing this information, which can include direct communication (both verbal and non-verbal), shared displays (e.g., visual or audio displays, or tactile devices), or a shared environment. As non-verbal communication, such as gestures and display of local artifacts, and a shared environment are usually not available in distributed teams, this places far more emphasis on verbal communication and communication technologies for creating shared information displays.
  3. Team SA mechanisms – the degree to which team members possess mechanisms, such as shared mental models, which support their ability to interpret information in the same way and make accurate projections regarding each other's actions. The possession of shared mental models can greatly facilitate communication and coordination in team settings.
  4. Team SA processes – the degree to which team members engage in effective processes for sharing SA information which may include a group norm of questioning assumptions, checking each other for conflicting information or perceptions, setting up coordination and prioritization of tasks, and establishing contingency planning among others.

In time critical decision-making processes

In time-critical decision-making, swift and effective choices are imperative for navigating urgent situations. The ability to analyze information rapidly, prioritize key factors, and execute decisions promptly becomes paramount, and time constraints force a balance between thorough deliberation and the need for quick action. The decision-maker must rely on a combination of experience, intuition, and available data to make informed choices under pressure, prioritizing critical elements, assessing potential outcomes, and weighing both immediate and long-term consequences.

Clear communication is essential to ensure that decisions are swiftly conveyed to relevant stakeholders and executed; collaborative effort, streamlined processes, and well-defined protocols all enhance the efficiency of decision-making in time-sensitive situations. Adaptability is equally vital, as unforeseen developments may require strategies to be recalibrated in real time. Technological advancements, data-driven insights, and simulation exercises can further improve decision-making outcomes in high-pressure situations.

Ultimately, successful time-critical decision-making combines expertise, preparedness, effective communication, and a willingness to adapt, ensuring that the chosen course of action matches the urgency of the situation while minimizing the risk of error.

Measurement

While the SA construct has been widely researched, the multivariate nature of SA poses a considerable challenge to its quantification and measurement. In general, techniques vary between direct measurement of SA (e.g., objective real-time probes or subjective questionnaires assessing perceived SA) and methods that infer SA from operator behavior or performance. Direct measures are typically considered to be "product-oriented" in that these techniques assess an SA outcome; inferred measures are considered to be "process-oriented," focusing on the underlying processes or mechanisms required to achieve SA. [74] These SA measurement approaches are further described next.

Objective measures

Objective measures directly assess SA by comparing an individual's perceptions of the situation or environment to some "ground truth" reality. Specifically, objective measures collect data from the individual on his or her perceptions of the situation and compare them to what is actually happening to score the accuracy of their SA at a given moment in time. Thus, this type of assessment provides a direct measure of SA and does not require operators or observers to make judgments about situational knowledge on the basis of incomplete information. Objective measures can be gathered in one of three ways: real-time as the task is completed (e.g., "real-time probes" presented as open questions embedded as verbal communications during the task [75] ), during an interruption in task performance (e.g., situation awareness global assessment technique (SAGAT), [32] or the WOMBAT situational awareness and stress tolerance test mostly used in aviation since the late 1980s and often called HUPEX in Europe), or post-test following completion of the task.
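As an illustration only (not an official SAGAT implementation), freeze-probe scoring of the kind described above can be sketched as comparing an operator's probe answers against simulation ground truth, with optional per-query tolerances. All query names, values, and tolerances here are invented.

```python
# Hypothetical SAGAT-style scoring sketch: compare an operator's answers
# to ground truth recorded at the moment the simulation was frozen.

def score_sa_probes(responses, ground_truth, tolerances=None):
    """Return the fraction of probe answers matching ground truth.

    responses / ground_truth: dicts mapping query id -> value.
    tolerances: optional dict of numeric tolerances per query
    (e.g., an altitude answer within 100 ft still counts as accurate).
    """
    tolerances = tolerances or {}
    correct = 0
    for query, truth in ground_truth.items():
        answer = responses.get(query)
        if answer is None:
            continue  # unanswered probes score as incorrect
        tol = tolerances.get(query)
        if tol is not None:
            correct += abs(answer - truth) <= tol
        else:
            correct += answer == truth
    return correct / len(ground_truth)

# Example: three probes administered during a simulation freeze.
truth = {"altitude_ft": 12000, "nearest_traffic": "north", "autopilot_on": True}
answers = {"altitude_ft": 11950, "nearest_traffic": "north", "autopilot_on": False}
score = score_sa_probes(answers, truth, tolerances={"altitude_ft": 100})
# score == 2/3: altitude within tolerance, traffic correct, autopilot wrong
```

The key property this sketch captures is that the score reflects agreement with objective reality at a given moment, not the operator's confidence or an observer's judgment.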

Subjective measures

Subjective measures directly assess SA by asking individuals to rate their own or the observed SA of individuals on an anchored scale (e.g., participant situation awareness questionnaire; [76] the situation awareness rating technique [77] ). Subjective measures of SA are attractive in that they are relatively straightforward and easy to administer. However, several limitations should be noted. Individuals making subjective assessments of their own SA are often unaware of information they do not know (the unknown unknowns). Subjective measures also tend to be global in nature, and, as such, do not fully exploit the multivariate nature of SA to provide the detailed diagnostics available with objective measures. Nevertheless, self-ratings may be useful in that they can provide an assessment of operators' degree of confidence in their SA and their own performance. Measuring how SA is perceived by the operator may provide information as important as the operator's actual SA, since errors in perceived SA quality (over-confidence or under-confidence in SA) may have just as harmful an effect on an individual's or team's decision-making as errors in their actual SA. [78]

Subjective estimates of an individual's SA may also be made by experienced observers (e.g., peers, commanders, or trained external experts). These observer ratings may be somewhat superior to self-ratings of SA because more information about the true state of the environment is usually available to the observer than to the operator, who may be focused on performing the task (i.e., trained observers may have more complete knowledge of the situation). However, observers have only limited knowledge about the operator's concept of the situation and cannot have complete insight into the mental state of the individual being evaluated. Thus, observers are forced to rely more on operators' observable actions and verbalizations in order to infer their level of SA. In this case, such actions and verbalizations are best assessed using performance and behavioral measures of SA, as described next.

Performance and behavioral measures

Performance measures infer SA from the end result (i.e., task performance outcomes), based on the assumption that better performance indicates better SA. Common performance metrics include quantity of output or productivity level, time to perform the task or respond to an event, and the accuracy of the response or, conversely, the number of errors committed. The main advantage of performance measures is that they can be collected objectively and without disrupting task performance. However, although evidence exists to suggest a positive relation between SA and performance, this connection is probabilistic and not always direct and unequivocal. [25] In other words, good SA does not always lead to good performance and poor SA does not always lead to poor performance. [79] Thus, performance measures should be used in conjunction with other measures of SA that directly assess this construct.

Behavioral measures also infer SA from the actions that individuals choose to take, based on the assumption that good actions will follow from good SA and vice versa. Behavioral measures rely primarily on observer ratings, and are, thus, somewhat subjective in nature. To address this limitation, observers can be asked to evaluate the degree to which individuals are carrying out actions and exhibiting behaviors that would be expected to promote the achievement of higher levels of SA. This approach removes some of the subjectivity associated with judging an individual's internal state of knowledge by letting observers rate SA indicators that are more readily observable.

Process indices

Process indices examine how individuals process information in their environment, such as by analyzing communication patterns between team members or using eye tracking devices. Team communication (particularly verbal communication) supports the knowledge building and information processing that leads to SA construction. [57] Thus, since SA may be distributed via communication, computational linguistics and machine learning techniques can be combined with natural language analytical techniques (e.g., Latent semantic analysis) to create models that draw on the verbal expressions of the team to predict SA and task performance. [81] [82] Although evidence exists to support the utility of communication analysis for predicting team SA, [83] time constraints and technological limitations (e.g., cost and availability of speech recording systems and speech-to-text translation software) may make this approach less practical and viable in time-pressured, fast-paced operations.
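A toy sketch of the communication-analysis idea above: compare a team member's recent utterances to a reference description of the current situation using cosine similarity over word counts, a deliberately crude stand-in for techniques such as latent semantic analysis. The transcripts and the interpretation of a low score are invented for illustration.

```python
# Toy communication-analysis sketch: cosine similarity of word-count
# vectors as a crude proxy for semantic overlap between utterances.
from collections import Counter
import math

def cosine_similarity(text_a, text_b):
    """Cosine similarity between two texts as bag-of-words vectors."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Compare a member's utterance to a reference description of the
# situation; persistently low similarity might flag divergent awareness.
reference = "contact bearing north closing fast request intercept"
utterance = "new contact north closing fast moving to intercept"
similarity = cosine_similarity(reference, utterance)
```

Real communication-based SA prediction, as the cited work notes, uses far richer semantic models; this sketch only shows the shape of the pipeline (transcribe, vectorize, compare).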

Psycho-physiological measures also serve as process indices of operator SA by providing an assessment of the relationship between human performance and correlated changes in the operator's physiology. [84] In other words, cognitive activity is associated with changes in the operator's physiological states. For example, the operator's overall functional state (as assessed using psycho-physiological measures, such as electroencephalography data, eyeblinks, and cardiac activity) may provide an indication as to whether the operator is sleep fatigued at one end of the continuum, or mentally overloaded at the other end. [85] Other psycho-physiological measures, such as event-related potentials, event-related desynchronization, transient heart rate, and electrodermal activity, may be useful for evaluating an operator's perception of critical environmental cues, that is, to determine if the operator has detected and perceived a task-relevant stimulus. [85] In addition, it is also possible to use psycho-physiological measures to monitor operators' environmental expectancies, that is, their physiological responses to upcoming events, as a measure of their current level of SA. [85]

Multi-faceted approach to measurement

The multivariate nature of SA significantly complicates its quantification and measurement, as it is conceivable that a metric may only tap into one aspect of the operator's SA. Further, studies have shown that different types of SA measures do not always correlate strongly with each other. [lower-alpha 3] Accordingly, rather than rely on a single approach or metric, valid and reliable measurement of SA should utilize a battery of distinct yet related measures that complement each other. [86] Such a multi-faceted approach to SA measurement capitalizes on the strengths of each measure while minimizing the limitations inherent in each.
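A multi-measure battery might be combined into a composite index in roughly the following way; the measure names, normalization ranges, and equal weighting are purely illustrative, since real batteries must be validated empirically:

```python
# Toy composite over a battery of SA measures; ranges and equal
# weighting are invented for illustration only.
def normalize(x, lo, hi):
    """Map x from [lo, hi] onto [0, 1]."""
    return (x - lo) / (hi - lo)

def sa_battery_score(sagat_pct_correct, subjective_rating, probe_rt_ms):
    """Average three normalized component scores into one index."""
    components = [
        normalize(sagat_pct_correct, 0, 100),            # query-based measure
        normalize(subjective_rating, 1, 7),              # 1-7 self-rating
        1 - normalize(min(probe_rt_ms, 2000), 0, 2000),  # faster probe response scores higher
    ]
    return sum(components) / len(components)
```

Averaging complementary measures dampens the bias any single metric carries, which is the motivation for the battery approach described above.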

Limitations

Situation awareness is limited by sensory input and available attention, by the individual's knowledge and experience, and by their ability to analyse the available information effectively. Attention is a limited resource, and may be reduced by distraction and task loading. Comprehension of the situation and projection of future status depend heavily on relevant knowledge, understanding, and experience in similar environments. Team SA is less limited by these factors, as there is a wider knowledge and experience base, but it is limited by the effectiveness of communication within the team. [87]

Training

Building on Endsley's paradigm and combining a cognitive resource management model [88] with neurofeedback techniques, the Spanish pedagogist María Gabriela López García (2010) developed and implemented a new SA training pattern. [89] The first organization to adopt this pattern designed by López García was the Spanish Air Force, where she has trained EF-18 fighter pilots and Canadair firefighting crews. [90]

This situation awareness training aims to prevent the loss of SA and to give pilots the cognitive resources to operate below the maximum workload they can withstand. This not only lowers the probability of incidents and accidents due to human factors, but also keeps hours of operation at optimum efficiency, extending the operating life of systems and operators. [91]

On-the-job examples

Emergency medical call-outs

In first aid medical training provided by the American Red Cross, being aware of the situation within the area of influence as one approaches an individual requiring medical assistance is the first aspect responders are taught to consider. [92] Examining the area for potential hazards, including those that may have caused the injuries being treated, helps ensure that responders do not themselves become injured and require treatment as well.

Situation awareness for first responders in medical situations also includes evaluating and understanding what happened, [93] both to avoid injury to responders and to provide information, via radio, to other rescue agencies that may need to know the situation prior to their arrival on the scene.

In a medical context, situation awareness is applied to avoid further injury to already-injured individuals, to avoid injury to medical responders, and to inform other potential responders of hazardous conditions prior to their arrival.

Vehicle driving and aviation

A loss of situational awareness has contributed to many transportation accidents, including the 1991 Los Angeles Airport runway collision [94] and the 2015 Philadelphia train derailment. [95]

Search and rescue

Within the search and rescue context, situational awareness is applied primarily to avoid injury to search crews; awareness of the environment, the lay of the land, and the many other factors of influence within one's surroundings also assists in locating injured or missing individuals. [96] Public safety agencies increasingly use situational awareness applications such as the Android Tactical Assault Kit on mobile devices, and even on robots, to improve situational awareness. [97]

Forestry crosscut saw / chainsaw

In the United States Forest Service, the use of chainsaws and crosscut saws requires training and certification. [98] A great deal of that training presents situational awareness as encompassing not only environmental awareness but also self-awareness, [99] which includes being aware of one's own emotional attitude, tiredness, and even caloric intake.

Situational awareness in the forest context also includes evaluating the environment and the potential safety hazards within a saw crew's area of influence. As a sawyer approaches a task, the ground, wind, cloud cover, hillsides, and many other factors are examined proactively as part of a trained sawyer's ingrained routine.

Dead or diseased trees within reach of the saw crew are evaluated, as are the strength and direction of the wind. The lay of tree sections to be bucked, or the lean of a tree to be felled, is assessed in terms of where the tree will fall or move when cut, where the other members of the saw team are located and how they are moving, and whether any hikers within the area of influence are moving or stationary.

Law enforcement

Law enforcement training includes developing situational awareness of what is going on around the police officer before, during, and after interactions with the general public, [100] while also remaining fully aware of what is happening in areas outside the focus of the officer's immediate task.

Cybersecurity threat operations

In cybersecurity threat operations, situational awareness is the ability to perceive threat activity and vulnerability in context, so that data, information, knowledge, and wisdom can be actively defended from compromise. Situational awareness is achieved by developing and using solutions that consume data and information from many different sources. Technology and algorithms are then used to apply knowledge and wisdom in order to discern patterns of behavior that point to possible, probable, and real threats.

Situational awareness for cybersecurity threat operations teams appears in the form of a condensed, enriched, often graphical, prioritized, and easily searchable view of systems that are inside or related to security areas of responsibility (such as corporate networks or those used for national security interests). Different studies have analyzed the perception of security and privacy in the context of eHealth, [101] network security, [102] or using collaborative approaches to improve the awareness of users. [103] There are also research efforts to automate the processing of communication network information in order to obtain or improve cyber-situational awareness. [104]
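A minimal sketch of the kind of aggregation and prioritization such a condensed view relies on, with invented event fields and an arbitrary scoring rule:

```python
from dataclasses import dataclass

@dataclass
class ThreatEvent:
    source: str     # e.g. an IDS, endpoint agent, or netflow collector
    asset: str
    severity: int   # 1 (low) .. 5 (critical)
    indicator: str

def prioritize(events, critical_assets):
    """Rank events so that activity on critical assets surfaces first;
    the additive scoring rule here is an arbitrary illustration."""
    def score(e):
        return e.severity + (3 if e.asset in critical_assets else 0)
    return sorted(events, key=score, reverse=True)

# Hypothetical events fused from multiple sources.
events = [
    ThreatEvent("ids", "web-01", 2, "port scan"),
    ThreatEvent("endpoint", "dc-01", 4, "credential dumping"),
    ThreatEvent("netflow", "lab-07", 5, "beaconing"),
]
ranked = prioritize(events, critical_assets={"dc-01"})
```

Even this toy ranking shows the core idea: context (which asset is critical) reorders raw severity into an operationally prioritized view.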

Situation awareness-based agent transparency model

As the capabilities of technological agents increase, it becomes more important that their actions and underlying rationale become transparent. In the military realm, agent transparency has been investigated as unmanned vehicles are employed more frequently. In 2014, researchers at the U.S. Army Research Laboratory reported the Situation Awareness-based Agent Transparency (SAT) model, designed to increase transparency through user interface design. When it comes to automation, six barriers have been determined to discourage "human trust in autonomous systems, with 'low observability, predictability, directability and auditability' and 'low mutual understanding of common goals' being among the key issues." [105] The researchers designed three levels of situational awareness transparency based on Endsley's theory of perception, comprehension, and projection; the greater the level, they claimed, the more information the agent conveys to the user. [106]
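The three SAT levels (plus the uncertainty level examined in later studies) can be sketched as a filter over what an agent discloses; the field names and example state below are hypothetical, not taken from the SAT papers:

```python
from enum import IntEnum

class SATLevel(IntEnum):
    """Levels in the SAT model, following Endsley's
    perception/comprehension/projection hierarchy; level 4 is the
    uncertainty extension studied in later work."""
    PERCEPTION = 1     # agent's goals, plans, current actions
    COMPREHENSION = 2  # its reasoning and constraints
    PROJECTION = 3     # its predicted outcomes
    UNCERTAINTY = 4    # its confidence in those predictions

# Hypothetical agent-state field names, one per SAT level.
_FIELDS = ["perception", "comprehension", "projection", "uncertainty"]

def transparency_report(agent_state, level: SATLevel):
    """Disclose the agent-state fields at or below the given level."""
    return {k: agent_state[k] for k in _FIELDS[:level]}
```

Raising the level widens the slice of agent state shown to the operator, which is the model's central design knob.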

A 2018 publication from the U.S. Army Research Laboratory evaluated how varying transparency levels in the SAT model affect operator workload and a human's understanding of when it is necessary to intervene in the agent's decision making. The researchers refer to this supervisory judgement as calibration. The group split their SAT model research into two efforts: the Human Agent Transparency for Multi-UxV Management (IMPACT) project and the Autonomous Squad Member (ASM) project. [105]

In the unmanned-vehicle studies, scientists provided the three standard levels of SAT plus a fourth level that included the agent's level of uncertainty in its decisions. The stated goal of this research was to determine how modifying the level of SAT affected user performance, situation awareness, and confidence in the agent. The scientists stated that their experimental results showed that increased agent transparency improved operator performance and human confidence in the agent without a significant effect on workload. When the agent communicated its level of uncertainty in the assigned task, participants displayed more trust in the agent. [107]

The ASM research was conducted by providing a simulation game in which the participant had to complete a training course with an ASM, a ground robot that communicates with infantry. The participants had to multitask, evaluating potential threats while monitoring the ASM's communications on the interface. According to that research, experimental results demonstrated that the greatest confidence calibration occurred when the agent communicated information at all three levels of SAT. [107] The group of scientists from the U.S. Army Research Laboratory developed transparency visualization concepts in which the agents communicate their plans, motivations, and projected outcomes through icons. The agent has been reported to be able to relate its resource usage, reasoning, predicted resource loss, progress towards task completion, and so on. [105] Unlike in the IMPACT research, however, no increase in trust was observed when the agent informed the user of its level of uncertainty in decision making. [107]

Strategies for acquiring situational awareness

Crowdsourcing

Crowdsourcing, made possible by the rise of social media and ubiquitous mobile access, has the potential to considerably enhance the situation awareness both of responsible authorities and of citizens themselves in emergency and crisis situations by using "citizens as sensors". [108] [109] [110] [111] [112] [113] [114] [115] For instance, analysis of content posted on online social media such as Facebook and Twitter using data mining, machine learning, and natural language processing techniques may provide situational information. [115] A crowdsourcing approach to sensing, particularly in crisis situations, has been referred to as crowdsensing. [116] Crowdmapping is a subtype of crowdsourcing [117] [118] in which crowd-generated inputs, such as captured communications and social media feeds, are combined with geographic data to create a digital map that is as up to date as possible; [119] [120] [121] [122] such a map can improve situational awareness during an incident and be used to support incident response. [123]
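A minimal sketch of the "citizens as sensors" idea: a keyword filter over hypothetical geotagged posts, grouped into coarse map cells in the spirit of crowdmapping (the vocabulary and grid resolution are invented for illustration):

```python
# Toy "citizens as sensors" filter over hypothetical geotagged posts;
# the crisis vocabulary and 0.1-degree grid are invented placeholders.
CRISIS_TERMS = {"flood", "fire", "evacuate", "injured", "trapped"}

def situational_posts(posts):
    """Keep posts that mention crisis vocabulary, grouped by a coarse
    latitude/longitude grid cell (a crude crowdmap)."""
    hits = {}
    for text, (lat, lon) in posts:
        if CRISIS_TERMS & set(text.lower().split()):
            cell = (round(lat, 1), round(lon, 1))
            hits.setdefault(cell, []).append(text)
    return hits

posts = [
    ("Main street is starting to flood near the bridge", (51.51, -0.12)),
    ("Great coffee this morning!", (51.52, -0.11)),
]
```

Real systems replace the keyword match with trained NLP classifiers, but the pipeline shape (filter, geolocate, aggregate onto a map) is the same.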

Cloud-based geographic information system display of structured data

A cloud-based Geographic Information System (GIS) with a display of structured data uses cloud computing technology to store, manage, analyze, and visualize geographic data in a structured format. Compared with traditional on-premises GIS systems, this approach offers advantages in accessibility, scalability, and collaboration.

Its key components include: cloud-based infrastructure; the geographic information system itself; structured data storage; data analysis and processing; visualization tools; collaborative features; real-time updates; and integration with other cloud services.

Overall, a cloud-based GIS with structured data display provides a dynamic and efficient platform for managing geographic information, making it accessible, scalable, and collaborative for a wide range of applications, from urban planning and environmental monitoring to business analytics and disaster response.
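As a concrete example of structured geographic data, the sketch below builds a GeoJSON-style point feature and a simple bounding-box filter of the kind a GIS query layer performs; the record's property names and coordinates are illustrative:

```python
# A GeoJSON-style point feature of the kind a cloud GIS stores;
# property names and coordinates are illustrative.
incident = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-118.41, 33.94]},
    "properties": {"kind": "road_closure", "status": "active"},
}

def within_bbox(feature, west, south, east, north):
    """Simple spatial query: is a point feature inside a bounding box?"""
    lon, lat = feature["geometry"]["coordinates"]
    return west <= lon <= east and south <= lat <= north
```

Storing features in a structured, queryable form like this is what lets a shared cloud map answer "what is happening inside this area?" for every collaborator at once.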

Military training methods

There are two training scenarios designed to increase the situational awareness skills of military professionals and of first responders in police and emergency services. The first, Kim's Game, is common in Marine Corps sniper school and in police academies. The name derives from Rudyard Kipling's novel Kim, in which the game forms part of a spy school lesson. The game involves a tray with various items, such as spoons, pencils, bullets, and other items the soldiers would be familiar with. The participants are given one minute to view all of the items before they are covered with a blanket; each participant then individually lists the items they saw, and the one with the most correct answers wins. The same game is played in scouting and girl guide groups to teach children quick memorisation skills.

The second method is a more practical military application of Kim's Game. It uses a field area (jungle, bush, or forest) about five meters wide by ten meters deep in which various items, some camouflaged and some not, are placed on the ground and in the trees at eye level. Again, these are items familiar to the soldiers undergoing the exercise. The participants are given ten minutes to view the area from one position and take mental note of the items they see. Once the ten minutes are up, each soldier performs repetitions of exercises such as burpees, designed to simulate the stress of a physically demanding environment, and then lists the items they saw. The points are tallied at the end to find the winner.
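Kim's Game scoring reduces to a set intersection, as this small sketch shows (spoons, pencils, and bullets come from the description above; the remaining items are invented examples):

```python
def kims_game_score(shown, recalled):
    """Score one round of Kim's Game: one point per correctly recalled
    item; items named but not on the tray score nothing (one common
    scoring rule -- variants exist)."""
    return len(set(shown) & set(recalled))

# Example tray: first three items from the text, rest invented.
tray = {"spoon", "pencil", "bullet", "compass", "map"}
```

A participant recalling `{"spoon", "map", "radio"}` would score 2: two hits, with the stray guess ignored.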

Notes

  1. For a detailed discussion on SA measurement, see:
    • Endsley, M.R.; Garland, D.J., eds. (2000). Situation awareness analysis and measurement. Mahwah, NJ: Lawrence Erlbaum Associates.
    • Fracker, M.L. (1991a). Measures of situation awareness: An experimental evaluation (Report No. AL-TR-1991-0127). Wright-Patterson Air Force Base, OH: Armstrong Laboratories.
    • Fracker, M.L. (1991b). Measures of situation awareness: Review and future directions (Report No. AL-TR-1991-0128). Wright-Patterson Air Force Base, OH: Armstrong Laboratories.
  2. See, for example, the situation awareness behaviorally anchored rating scale [80] [76]
  3. cf.:
    • Durso, F.T., Truitt, T.R., Hackworth, C.A., Crutchfield, J.M., Nikolic, D., Moertl, P.M., Ohrt, D., & Manning, C.A. (1995). Expertise and chess: A pilot study comparing situation awareness methodologies. In D.J. Garland & M.R. Endsley (Eds.), Experimental analysis and measurement of situation awareness (pp. 295–303). Daytona Beach, FL: Embry-Riddle Aeronautical University Press.
    • Endsley, Mica R.; Selcon, Stephen J.; Hardiman, Thomas D.; Croft, Darryl G. (1998). "A Comparative Analysis of Sagat and Sart for Evaluations of Situation Awareness" (PDF). Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 42 (1). Santa Monica, CA: SAGE Publications: 82–86. doi:10.1177/154193129804200119. ISSN   2169-5067. S2CID   38430173. Archived from the original (PDF) on 2007-09-28.
    • Vidulich, M.A. (2000). Testing the sensitivity of situation awareness metrics in interface evaluations. In M.R. Endsley & D.J. Garland, (Eds.), Situation awareness analysis and measurement (pp. 227–246). Mahwah, NJ: Lawrence Erlbaum Associates.

References

  1. Endsley 1995b, p. 36.
  2. Smith & Hancock 1995, p. 36.
  3. Nullmeyer, R.T., Stella, D., Montijo, G.A., & Harden, S.W. (2005). Human factors in Air Force flight mishaps: Implications for change. Proceedings of the 27th Annual Interservice/Industry Training, Simulation, and Education Conference (paper no. 2260). Arlington, VA: National Training Systems Association.
  4. Schulz, CM; Endsley, MR; Kochs, EF; Gelb, AW; Wagner, KJ (Mar 2013). "Situation Awareness in Anesthesia - Concept and Research". Anesthesiology. 118 (3): 729–42. doi: 10.1097/aln.0b013e318280a40f . PMID   23291626.
  5. Blandford, A.; Wong, W. (2004). "Situation awareness in emergency medical dispatch". International Journal of Human–Computer Studies. 61 (4): 421–452. doi:10.1016/j.ijhcs.2003.12.012. Archived from the original on 2023-10-31. Retrieved 2020-09-13.; Gorman, Jamie C.; Cooke, Nancy J.; Winner, Jennifer L. (2006). "Measuring team situation awareness in decentralized command and control environments". Ergonomics. 49 (12–13): 1312–1325. doi:10.1080/00140130600612788. PMID   17008258. S2CID   10879373.
  6. Flin, R. & O'Connor, P. (2001). Applying crew resource management in offshore oil platforms. In E. Salas, C.A. Bowers, & E. Edens (Eds.), Improving teamwork in organization: Applications of resource management training (pp. 217–233). Hillsdale, NJ: Erlbaum.
  7. Hartel, C.E.J., Smith, K., & Prince, C. (1991, April). Defining aircrew coordination: Searching mishaps for meaning. Paper presented at the 6th International Symposium on Aviation Psychology, Columbus, OH.
  8. Merket, D.C., Bergondy, M., & Cuevas-Mesa, H. (1997, March). Making sense out of teamwork errors in complex environments. Paper presented at the 18th Annual Industrial/Organizational-Organizational Behavior Graduate Student Conference, Roanoke, VA.
  9. Endsley, M. R. (1995). A taxonomy of situation awareness errors. In R. Fuller, N. Johnston & N. McDonald (Eds.), Human factors in aviation operations (pp. 287-292). Aldershot, England: Avebury Aviation, Ashgate Publishing Ltd.
  10. Jones, D. G., & Endsley, M. R. (1996). Sources of situation awareness errors in aviation. Aviation, Space and Environmental Medicine, 67(6), 507-512.
  11. Construction safety and health hazard awareness in Web of Science and Weibo between 1991 and 2021, Safety Science, 152, August 2022, 105790
  12. Endsley, Mica; Jones, Debra (2016-04-19). Designing for Situation Awareness (Second ed.). CRC Press. p. 13. ISBN   978-1-4200-6358-5.
  13. Lundberg, Jonas (16 February 2015). "Situation awareness systems, states and processes: a holistic framework". Theoretical Issues in Ergonomics Science. 16 (5). Informa UK Limited: 447–473. doi: 10.1080/1463922x.2015.1008601 . ISSN   1463-922X. S2CID   109500777.
  14. Sun Tzu The Art of War Chapter X, 地形
  15. Press, M. (1986). Situation awareness: Let's get serious about the clue-bird. Unpublished manuscript.
  16. Biferno, M.A. "Flight Crew Computer Interaction", Douglas Aircraft Company, Internal Research and Development. Long Beach, CA.
  17. Biferno, M.A., "Mental Workload Measurement", Douglas Aircraft Company, Internal Research and Development, Long Beach, CA.
  18. Dawson, M.E., Biferno, M.A. (1973). "Concurrent measurement of awareness and electrodermal classical conditioning", Journal of Experimental Psychology , 101, 55-62.
  19. Biferno, M.A.; Dawson, M.E. (1977). "The onset of contingency awareness and electrodermal classical conditioning: An analysis of temporal relationships during acquisition and extinction". Psychophysiology . 14 (2): 164–171. doi:10.1111/j.1469-8986.1977.tb03370.x. PMID   847068.
  20. Biferno, M.A. (1985). "Relationship between event-related potential components and ratings of workload and fatigue", NASA-Ames, Moffett Field, CA, NASA contract report 177354.
  21. Biferno, M. A. & Stanley, D. L. (1983). The Touch-Sensitive Control/Display Unit: A promising Computer Interface. Technical Paper 831532, Aerospace Congress & Exposition, Long Beach, CA: Society of Automotive Engineers.
  22. Endsley, M. R. (1988). Design and evaluation for situation awareness enhancement. Proceedings of the Human Factors Society 32nd Annual Meeting (pp. 97-101). Santa Monica, CA: Human Factors Society.
  23. Watts, B.D. (2004) [1996, as McNair Paper no. 52]. "Chapter 9: 'Situation awareness' in air-to-air combat and friction". Clausewitzian Friction and Future War, McNair Paper no. 68. Institute of National Strategic Studies, National Defense University.
  24. Spick, M. (1988). The Ace Factor: Air Combat and the Role of Situational Awareness. Annapolis, MD: Naval Institute Press.
  25. Endsley 1995b.
  26. Gaba, D.M.; Howard, S.K.; Small, S.D. (1995). "Situation awareness in anesthesiology". Human Factors. 37 (1): 20–31. doi:10.1518/001872095779049435. PMID 7790008. S2CID 11883940.
  27. Collier, S.G. & Follesf, K. (1995). SACRI: A measure of situation awareness for nuclear power plant control rooms. Proceedings of an International Conference: Experimental Analysis and Measurement of Situation Awareness (pp. 115–122). Daytona Beach, FL.
  28. Bolstad, C.A. (2000). Age-related factors affecting the perception of essential information during risky driving situations. Paper presented at the Human Performance Situation Awareness and Automation: User-Centered Design for the New Millennium Conference, Savannah, GA.
  29. Sollenberger, R.L., & Stein, E.S. (1995). A simulation study of air traffic controllers' situation awareness. Proceedings of an International Conference: Experimental Analysis and Measurement of Situation Awareness (pp. 211–217). Daytona Beach, FL.
  30. Silva Gomes V, Cardoso Júnior MM (2024). "The effect of sleepiness in situation awareness: A scoping review". Work (Reading, Mass.). 78 (3): 641–655. doi:10.3233/WOR-230115. PMID   38277325.
  31. Alqarrain Y, Roudsari A, Courtney KL, Tanaka J (2024). "Improving Situation Awareness to Advance Patient Outcomes: A Systematic Literature Review". Computers, Informatics, Nursing (CIN). 42 (4): 277–288. doi:10.1097/CIN.0000000000001112. PMID   38376409.
  32. Endsley 1995a.
  33. Endsley et al. 2000.
  34. Wickens, C. D. (2008). Situation awareness: Review of Mica Endsley's 1995 articles on situation awareness theory and measurement. Human Factors, 50(3), 397-403.
  35. Lee, J. D., Cassano-Pinche´, A., & Vicente, K. J. (2005). Biometric analysis of Human Factors (1970-2000): A quantitative description of scientific impact. Human Factors, 47(4), 753-766.
  36. Endsley 1995a, p. 65.
  37. Artman, H (2000). "Team situation assessment and information distribution". Ergonomics. 43 (8): 1111–1128. doi:10.1080/00140130050084905. PMID   10975176. S2CID   33132381.
  38. 1 2 Endsley, M. R. (1995). Toward a theory of situation awareness in dynamic systems. Human Factors, 37(1), 32-64.
  39. Endsley, M. R., & Kiris, E. O. (1995). The out-of-the-loop performance problem and level of control in automation. Human Factors, 37(2), 381-394.
  40. Endsley, M. R. (2018). Expertise and situation awareness. In K. A. Ericsson, R. R. Hoffman, A. Kozbelt & A. M. Williams (Eds.), Cambridge Handbook of Expertise and Expert Performance (2nd ed., pp. 714-744). Cambridge, UK: Cambridge University Press.
  41. Endsley, M. R., & Bolstad, C. A. (1994). Individual differences in pilot situation awareness. International Journal of Aviation Psychology, 4(3), 241-264.
  42. Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2008). Situation awareness, mental workload and trust in automation: viable empirically supported cognitive engineering constructs. Journal of Cognitive Engineering and Decision Making, 2(2), 140-160.
  43. Endsley, M. R. (2015). Situation awareness misconceptions and misunderstandings. Journal of Cognitive Engineering and Decision Making, 9(1), 4-32.
  44. Endsley, M. R. (2021). A systematic review and meta-analysis of direct objective measures of situation awareness: A comparison of SAGAT and SPAM. Human Factors, 63(1), 124-150.
  45. Flach, J. M. (1995). Situation awareness: Proceed with caution. Human Factors, 37(1), 149-157.
  46. Dekker, Sidney; Hollnagel, Erik (2004-05-01). "Human factors and folk models". Cognition, Technology & Work. 6 (2): 79–86. doi:10.1007/s10111-003-0136-9. ISSN 1435-5558.
  47. Dekker, Sidney W. A. (2015-05-01). "The danger of losing situation awareness". Cognition, Technology & Work. 17 (2): 159–161. doi:10.1007/s10111-015-0320-8. ISSN 1435-5566.
  48. Bakdash, Jonathan Z.; Marusich, Laura R.; Kenworthy, Jared B.; Twedt, Elyssa; Zaroukian, Erin G. (2020-12-22). "Statistical Significance Filtering Overestimates Effects and Impedes Falsification: A Critique of Endsley (2019)". Frontiers in Psychology. 11. doi:10.3389/fpsyg.2020.609647. ISSN 1664-1078. PMC 7783317. PMID 33414750.
  49. Kriegeskorte, Nikolaus; Simmons, W. Kyle; Bellgowan, Patrick S. F.; Baker, Chris I. (2009). "Circular analysis in systems neuroscience: the dangers of double dipping". Nature Neuroscience. 12 (5): 535–540. doi:10.1038/nn.2303. ISSN 1546-1726. PMC 2841687. PMID 19396166.
  50. S.M. Fiore, personal communication, November 6, 2007
  51. Blasch, E., Bosse, E., and Lambert, D. A., High-Level Information Fusion Management and Systems Design, Artech House, Norwood, MA, 2012.
  52. Boddhu, Sanjay K., et al. (2012). "Increasing situational awareness using smartphones." SPIE Defense, Security, and Sensing. International Society for Optics and Photonics, 2012.
  53. Sanjay Kumar Boddhu, Matt McCartney, Oliver Ceccopieri, et al., "A collaborative smartphone sensing platform for detecting and tracking hostile drones", Proceedings of SPIE Vol. 8742, 874211 (2013)
  54. McNeill, Jeffrey A.; Morgan, C. A. (2010). "Cognitive and Decision Making in Extreme Environments". In Kennedy, Carrie Hill; Moore, Jeffrey (eds.). Military Neuropsychology. New York: Springer. p. 370. ISBN 978-0-82610-449-6.
  55. Dostal, B.C. (2007). "Enhancing situational understanding through the employment of unmanned aerial vehicles". Interim Brigade Combat Team Newsletter. No. 1–18 (Army Transformation Taking Shape...). Archived from the original on 18 November 2007. Retrieved 7 November 2007.
  56. Endsley, M.R. (2000). "Theoretical underpinnings of situation awareness: A critical review". In M.R. Endsley; D.J. Garland (eds.). Situation awareness analysis and measurement. Mahwah, NJ: LEA.
  57. Endsley & Jones 1997.
  58. Sarter, N.B.; Woods, D.D. (1991). "Situation awareness: A critical but ill-defined phenomenon". International Journal of Aviation Psychology. 1: 45–57. doi:10.1207/s15327108ijap0101_4.
  59. Glaser, R. (1989). Expertise and learning: How do we think about instructional processes now that we have discovered knowledge structures? In D. Klahr & K. Kotovsky (Eds.), Complex information processing: The impact of Herbert A. Simon (pp. 269–282). Hillsdale, NJ: LEA.
  60. Kozlowski, S.W.J. (1998). Training and developing adaptive teams: Theory, principles, and research. In J.A. Cannon-Bowers, & E. Salas, (Eds.), Making decisions under stress: Implications for individual and team training (pp. 115–153). Washington, DC: American Psychological Association.
  61. Endsley, M.R. (1997). The role of situation awareness in naturalistic decision making. In Zsambok, C.E. & G. Klein (Eds.), Naturalistic decision making (pp. 269–283). Mahwah, NJ: LEA.
  62. Endsley, 1995
  63. Serfaty, D., MacMillan, J., Entin, E.E., & Entin, E.B. (1997). The decision-making expertise of battle commanders. In C.E. Zsambok & G. Klein (Eds.), Naturalistic decision making (pp. 233–246). Mahwah, NJ: LEA.
  64. Klein, Moon & Hoffman 2006.
  65. Klein, Moon & Hoffman 2006, p. 71.
  66. Endsley, M.R. (2004). "Situation awareness: Progress and directions". In S. Banbury; S. Tremblay (eds.). A cognitive approach to situation awareness: Theory and application. Aldershot, UK: Ashgate Publishing. pp. 317–341.
  67. Salas, E., Dickinson, T.L., Converse, S., & Tannenbaum, S.I. (1992). Toward an understanding of team performance and training. In R.W. Swezey & E. Salas (Eds.), Teams: their training and performance (pp. 3–29). Norwood, NJ: Ablex.
  68. Endsley, M. R., & Jones, W. M. (2001). A model of inter- and intrateam situation awareness: Implications for design, training and measurement. In M. McNeese, E. Salas & M. Endsley (Eds.), New trends in cooperative activities: Understanding system dynamics in complex environments (pp. 46-67). Santa Monica, CA: Human Factors and Ergonomics Society.
  69. Endsley & Jones 1997, p. 47.
  70. Endsley & Jones 2001, p. 48.
  71. Bolstad, C. A., & Endsley, M. R. (1999). Shared mental models and shared displays: An empirical evaluation of team performance. Proceedings of the 43rd Annual Meeting of the Human Factors and Ergonomics Society (pp. 213-217). Santa Monica, CA: Human Factors and Ergonomics Society.
  72. Bolstad, C. A., & Endsley, M. R. (2000). The effect of task load and shared displays on team situation awareness. Proceedings of the 14th Triennial Congress of the International Ergonomics Association and the 44th Annual Meeting of the Human Factors and Ergonomics Society (pp. 189-192). Santa Monica, CA: Human Factors and Ergonomics Society.
  73. Endsley & Jones 2001.
  74. Graham, S.E. & Matthews, M.D. (2000). Modeling and measuring situation awareness. In J.H. Hiller & R.L. Wampler (Eds.), Workshop on assessing and measuring training performance effectiveness (Tech. Rep. 1116) (pp. 14–24). Alexandria, VA: U.S. Army Research Institute for the Behavioral and Social Sciences.
  75. Jones, D.G.; Endsley, M.R. (2000). Examining the validity of real-time probes as a metric of situation awareness (PDF). Proceedings of the 14th Triennial Congress of the International Ergonomics Association and the 44th Annual Meeting of the Human Factors and Ergonomics Society. Santa Monica, CA: Human Factors and Ergonomics Society. Archived from the original (PDF) on 2007-09-28.
  76. Strater, L.D., Endsley, M.R., Pleban, R.J., & Matthews, M.D. (2001). Measures of platoon leader situation awareness in virtual decision making exercises (No. Research Report 1770). Alexandria, VA: Army Research Institute.
  77. Taylor, R.M. (1989). Situational awareness rating technique (SART): The development of a tool for aircrew systems design. Proceedings of the AGARD AMP Symposium on Situational Awareness in Aerospace Operations, CP478. Neuilly-sur-Seine: NATO AGARD.
  78. Endsley, M.R. (1998). A comparative analysis of SAGAT and SART for evaluations of situation awareness. In Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting (pp. 82–86). Santa Monica, CA: The Human Factors and Ergonomics Society.
  79. Endsley, M.R. (1990). Predictive utility of an objective measure of situation awareness. Proceedings of the Human Factors Society 34th Annual Meeting (pp. 41–45). Santa Monica, CA: Human Factors Society.
  80. Matthews, M.D., Pleban, R.J., Endsley, M.R., & Strater, L.G. (2000). Measures of infantry situation awareness for a virtual MOUT environment. Proceedings of the Human Performance, Situation Awareness and Automation: User-Centered Design for the New Millennium. Savannah, GA: SA Technologies, Inc.
  81. Bolstad, C.A., Cuevas H.M., Gonzalez, C., & Schneider, M. (2005). Modeling shared situation awareness. Proceedings of the 14th Conference on Behavior Representation in Modeling and Simulation (BRIMS). Los Angeles, CA.
  82. Bolstad, C.A., Foltz, P., Franzke, M., Cuevas, H.M., Rosenstein, M., & Costello, A.M. (2007). Predicting situation awareness from team communications. Proceedings of the 51st Annual Meeting of the Human Factors and Ergonomics Society. Santa Monica, CA: HFES.
  83. Foltz, P.W., Bolstad, C.A., Cuevas, H.M., Franzke, M., Rosenstein, M., & Costello, A.M. (in press). Measuring situation awareness through automated communication analysis. To appear in M. Letsky, N. Warner, S.M. Fiore, & C. Smith (Eds.), Macrocognition in teams. Aldershot, England: Ashgate.
  84. e.g., French, H.T., Clark, E., Pomeroy, D., Seymour, M., & Clarke, C.R. (2007). Psycho-physiological Measures of Situation Awareness. In M. Cook, J. Noyes & Y. Masakowski (eds.), Decision Making in Complex Environments. London: Ashgate. ISBN 0-7546-4950-4.
  85. Wilson, G.F. (2000). Strategies for psychophysiological assessment of situation awareness. In M.R. Endsley & D.J. Garland (Eds.), Situation awareness analysis and measurement (pp. 175–188). Mahwah, NJ: Lawrence Erlbaum Associates.
  86. e.g., Harwood, K., Barnett, B., & Wickens, C.D. (1988). Situational awareness: A conceptual and methodological framework. In F.E. McIntire (Ed.), Proceedings of the 11th Biennial Psychology in the Department of Defense Symposium (pp. 23–27). Colorado Springs, CO: U.S. Air Force Academy.
  87. Lock, Gareth (25 August 2021). "Why divers miss the obvious". www.youtube.com. DAN Southern Africa. Archived from the original on 2021-11-14. Retrieved 28 August 2021.
  88. Simmon, D.A. (1998). Boeing 757 CFIT Accident at Cali, Colombia, becomes focus of lessons learned. Flight Safety Digest, 17, 1-31.
  89. Revista Aviador (official Spanish Commercial Pilots Association magazine), July–August 2011, no. 61, pp. 38-39.
  90. Revista de Aeronáutica y Astronáutica (official Spanish Air Force magazine), May 2012 issue, pp. 436-439.
  91. Rasmussen, Jens, et al. Cognitive Systems Engineering.
  92. First Aid, Protect Yourself, American Red Cross. Retrieved 1 August 2013.
  93. First Aid, Understanding What Happened. Retrieved 1 August 2013.
  94. Accident Report NTSB/AAR-91/08, PB91-910409: Runway Collision of USAir Flight 1493, Boeing 737 and Skywest Flight 5569 Fairchild Metroliner Los Angeles International Airport Los Angeles, California, February 1, 1991 (Report). National Transportation Safety Board. Oct 22, 1991. Archived from the original on August 19, 2013. Retrieved February 8, 2024.
  95. Accident Report NTSB/RAR-16/02, PB2016-103218: Derailment of Amtrak Passenger Train 188, Philadelphia, Pennsylvania, May 12, 2015 (PDF) (Report). National Transportation Safety Board. May 17, 2016. Archived (PDF) from the original on September 9, 2020. Retrieved August 29, 2020.
  96. Riley, Jennifer & Endsley, Mica. (2004). The Hunt for Situation Awareness: Human-Robot Interaction in Search and Rescue. Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 48. 10.1177/154193120404800389.
  97. Holland, T. M. (16 October 2019). "ATAK improves situational awareness for California fire department". Insights. Samsung. Archived from the original on 19 October 2019. Retrieved 19 October 2019.
  98. "Chain Saw and Crosscut Saw Training Course". US Forest Service. Archived from the original on 26 September 2013. Retrieved 1 August 2013.
  99. "Chapter 2, Page 7, Situational Awareness" (PDF). US Forest Service. Archived (PDF) from the original on 31 May 2012. Retrieved 1 August 2013.
  100. "Improving Situational Awareness". Police Chief Magazine. Archived from the original on 2014-01-08. Retrieved 1 August 2013.
  101. Bellekens, Xavier; Hamilton, Andrew; Seeam, Preetila; Nieradzinska, Kamila; Franssen, Quentin; Seeam, Amar (2016). "Pervasive eHealth services a security and privacy risk awareness survey". 2016 International Conference on Cyber Situational Awareness, Data Analytics and Assessment (CyberSA) (PDF). pp. 1–4. doi:10.1109/CyberSA.2016.7503293. ISBN 978-1-5090-0703-5. S2CID 14502409. Archived (PDF) from the original on 2020-09-19. Retrieved 2020-09-13.
  102. Best, Daniel M.; Bohn, Shawn; Love, Douglas; Wynne, Adam; Pike, William A. (2010). "Real-time visualization of network behaviors for situational awareness". Proceedings of the Seventh International Symposium on Visualization for Cyber Security. pp. 79–90. doi:10.1145/1850795.1850805. ISBN 9781450300131. S2CID 8520455.
  103. Mathews, Mary; Halvorsen, Paul; Joshi, Anupam; Finin, Tim (2012). "A Collaborative Approach to Situational Awareness for CyberSecurity". Proceedings of the 8th IEEE International Conference on Collaborative Computing: Networking, Applications and Worksharing. doi:10.4108/icst.collaboratecom.2012.250794. ISBN 978-1-936968-36-7. S2CID 14135227.
  104. Sikos, Leslie; Stumptner, Markus; Mayer, Wolfgang; Howard, Catherine; Voigt, Shaun; Philp, Dean (2018). Automated Reasoning over Provenance-Aware Communication Network Knowledge in Support of Cyber-Situational Awareness. Lecture Notes in Computer Science, vol. 11062. Cham: Springer. pp. 132–143. doi:10.1007/978-3-319-99247-1_12. ISBN 978-3-319-99246-4. Archived from the original on 2021-07-04. Retrieved 2020-09-13.
  105. "Army scientists improve human-agent teaming by making AI agents more transparent". US Army Research Laboratory. Archived from the original on 2018-08-13. Retrieved 2018-08-15.
  106. Boyce, Michael; Chen, Joyce; Selkowitz, Andrew; Lakhmani, Shan (May 2015). "Agent Transparency for an Autonomous Squad Member" (PDF). Archived (PDF) from the original on 2018-08-15. Retrieved 2018-07-28.
  107. Chen, Jessie Y. C.; Lakhmani, Shan G.; Stowers, Kimberly; Selkowitz, Anthony R.; Wright, Julia L.; Barnes, Michael (2018-02-23). "Situation awareness-based agent transparency and human-autonomy teaming effectiveness". Theoretical Issues in Ergonomics Science. 19 (3): 259–282. doi:10.1080/1463922x.2017.1315750. ISSN 1463-922X. S2CID 115436644.
  108. "CrowdSA - Crowdsourced Situation Awareness for Crisis Management". cis.jku.at. Archived from the original on 10 January 2017. Retrieved 9 January 2017.
  109. "Situation Awareness and Relief System During Disaster Events" (PDF). International Journal of Research in Science & Engineering. Archived (PDF) from the original on 10 January 2017. Retrieved 9 January 2017.
  110. "Crowdsourcing public safety: Building community resilience by enhancing citizen situation awareness capability". RISE:2017, Northeastern University. Archived from the original on 10 January 2017. Retrieved 9 January 2017.
  111. Shepard, Steven (2014-07-06). Telecommunications Crash Course, Third Edition. McGraw Hill Professional. ISBN 9780071797115. Archived from the original on 2021-07-04. Retrieved 9 January 2017.
  112. Poblet, Marta; García-Cuesta, Esteban; Casanovas, Pompeu (2014). "Crowdsourcing Tools for Disaster Management: A Review of Platforms and Methods". AI Approaches to the Complexity of Legal Systems (PDF). Lecture Notes in Computer Science. Vol. 8929. pp. 261–274. doi:10.1007/978-3-662-45960-7_19. ISBN 978-3-662-45959-1. ISSN 0302-9743. Archived (PDF) from the original on 10 January 2017. Retrieved 9 January 2017.
  113. Chu, E. T. H.; Chen, S. W.; Li, J. W. S. (2012). Crowdsourcing Information for Enhanced Disaster Situation Awareness and Emergency Preparedness and Response (PDF). 23rd International CODATA Conference. Archived (PDF) from the original on 10 January 2017. Retrieved 9 January 2017.
  114. Arkian, Hamid Reza; Diyanat, Abolfazl; Pourkhalili, Atefe (2017). "MIST: Fog-based data analytics scheme with cost-efficient resource provisioning for IoT crowdsensing applications". Journal of Network and Computer Applications. 82: 152–165. doi:10.1016/j.jnca.2017.01.012.
  115. Basu, Moumita; Bandyopadhyay, Somprakash; Ghosh, Saptarshi (2016). "Post Disaster Situation Awareness and Decision Support Through Interactive Crowdsourcing". Procedia Engineering. 159: 167–173. doi:10.1016/j.proeng.2016.08.151.
  116. Haddawy, Peter; Frommberger, Lutz; Kauppinen, Tomi; De Felice, Giorgio; Charkratpahu, Prae; Saengpao, Sirawaratt; Kanchanakitsakul, Phanumas (1 January 2015). "Situation awareness in crowdsensing for disease surveillance in crisis situations". Proceedings of the Seventh International Conference on Information and Communication Technologies and Development (PDF). pp. 38:1–38:5. doi:10.1145/2737856.2737879. ISBN 9781450331630. S2CID 3026308. Archived from the original (PDF) on 5 April 2016. Retrieved 9 January 2017.
  117. Aitamurto, Tanja (8 May 2015). "Crowdsourcing as a Knowledge-Search Method in Digital Journalism". Digital Journalism. 4 (2): 280–297. doi:10.1080/21670811.2015.1034807. ISSN 2167-0811. S2CID 156243124. Archived from the original on 24 May 2024. Retrieved 6 January 2017.
  118. Aitamurto, Tanja (1 October 2015). "Motivation Factors in Crowdsourced Journalism: Social Impact, Social Change, and Peer Learning". Archived from the original on 24 May 2024. Retrieved 6 January 2017.
  119. Sutter, John D. "Ushahidi: How to 'crowdmap' a disaster". CNN. Archived from the original on 17 July 2022. Retrieved 6 January 2017.
  120. The Impact of Crowdsourcing on Organisational Practices: The Case of Crowdmapping. ISBN 978-3-00-050284-2. Archived from the original on 7 January 2017. Retrieved 6 January 2017.
  121. Wood, Mark (27 June 2016). Crowdsourced counter-surveillance: Examining the subversion of random breath testing stations by social media facilitated crowdsourcing. Rethinking Cybercrime 2016: UCLAN Cybercrime Research Unit.
  122. "Concepts to Know: Crowdmapping". Kimo Quaintance. 4 September 2011. Archived from the original on 17 July 2022. Retrieved 6 January 2017.
  123. "Chemical Hazards and Poisons Report" (PDF). Public Health England. Archived (PDF) from the original on 7 March 2018. Retrieved 6 January 2017.

Sources

Further reading