Situational awareness or situation awareness (SA) is the understanding of an environment, its elements, and how it changes with respect to time or other factors. Situational awareness is important for effective decision making in many environments. It is formally defined as:
“the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future”. [1]
An alternative definition is that situation awareness is adaptive, externally-directed consciousness that has as its products knowledge about a dynamic task environment and directed action within that environment. [2]
Situation awareness has been recognized as a critical foundation for successful decision-making across a broad range of situations, many of which involve the protection of human life and property, including law enforcement, aviation, air traffic control, ship navigation, [3] health care, [4] emergency response, military command and control operations, transmission system operators, self defense, [5] and offshore oil and nuclear power plant management. [6]
Inadequate situation awareness has been identified as one of the primary causal factors in accidents attributed to human error. [7] [8] [9] [10] According to Endsley's situation awareness theory, when a person encounters a dangerous situation, they need an appropriate and precise decision-making process that includes pattern recognition and matching, and the formation of sophisticated schemata and archetypal knowledge that aid correct decision making. [11]
The formal definition of SA is often described as three ascending levels: perception of the elements in the environment, comprehension of the situation, and projection of future status.
People with the highest levels of SA have not only perceived the relevant information for their goals and decisions, but are also able to integrate that information to understand its meaning or significance, and are able to project likely or possible future scenarios. These higher levels of SA are critical for proactive decision making in demanding environments.
Three facets of SA have been the focus in research: SA states, SA systems, and SA processes. SA states refers to the actual level of awareness people have of the situation. SA systems refers to technologies that are developed to support SA in many environments. SA processes refers to the updating of SA states, and what guides the moment-to-moment change of SA. [13]
Although the term itself is fairly recent, the concept has roots in the history of military theory—it is recognizable in Sun Tzu's The Art of War , for example. [14] The term can be traced to World War I, where it was recognized as a crucial skill for crews in military aircraft. [15]
There is evidence that the term situational awareness was first employed at the Douglas Aircraft Company during human factors engineering research while developing vertical and horizontal situation displays and evaluating digital-control placement for the next generation of commercial aircraft. Research programs in flight-crew computer interaction [16] and mental workload measurement [17] built on the concept of awareness measurement from a series of experiments that measured contingency awareness during learning, [18] [19] and later extended to mental workload and fatigue. [20]
Situation awareness appears in the technical literature as early as 1983, when describing the benefits of a prototype touch-screen navigation display. [21] During the early 1980s, integrated “vertical-situation” and “horizontal-situation” displays were being developed for commercial aircraft to replace multiple electro-mechanical instruments. Integrated situation displays combined the information from several instruments enabling more efficient access to critical flight parameters, thereby improving situational awareness and reducing pilot workload.
The term was first defined formally by Endsley in 1988. [22] Before being widely adopted by human factors scientists in the 1990s, the term is said to have been used by United States Air Force (USAF) fighter aircrew returning from war in Korea and Vietnam. [23] They identified having good SA as the decisive factor in air combat engagements—the "ace factor". [24] Survival in a dogfight was typically a matter of observing the opponent's current move and anticipating his next move a fraction of a second before he could observe and anticipate it himself.
USAF pilots also came to equate SA with the "observe" and "orient" phases of the famous observe-orient-decide-act loop (OODA loop), or Boyd cycle, as described by the USAF war theorist Col. John Boyd. In combat, the winning strategy is to "get inside" your opponent's OODA loop, not just by making one's own decisions quicker, but also by having better SA than one's opponent, and even changing the situation in ways that the opponent cannot monitor or even comprehend. Losing one's own SA, in contrast, equates to being "out of the loop".
Clearly, SA has far reaching applications, as it is necessary for individuals and teams to function effectively in their environment. Thus, SA has gone far beyond the field of aviation to work being conducted in a wide variety of environments. SA is being studied in such diverse areas as air traffic control, nuclear power plant operation, emergency response, maritime operations, space, oil and gas drilling, vehicle operation, and health care (e.g. anesthesiology and nursing). [25] [26] [27] [28] [29] [30] [31]
The most widely cited and accepted model of SA was developed by Dr. Mica Endsley, [25] and it has been shown to be largely supported by research findings. [34] Lee, Cassano-Pinche, and Vicente found that Endsley's model of SA received 50% more citations following its publication than any other paper in Human Factors over the 30-year period of their review. [35]
Endsley's model describes the cognitive processes and mechanisms that are used by people to assess situations to develop SA, and the task and environmental factors that also affect their ability to get SA. It describes in detail the three levels of SA formation: perception, comprehension, and projection.
Perception (Level 1 SA): The first step in achieving SA is to perceive the status, attributes, and dynamics of relevant elements in the environment. Thus, Level 1 SA, the most basic level of SA, involves the processes of monitoring, cue detection, and simple recognition, which lead to an awareness of multiple situational elements (objects, events, people, systems, environmental factors) and their current states (locations, conditions, modes, actions).
Comprehension (Level 2 SA): The next step in SA formation involves a synthesis of disjointed Level 1 SA elements through the processes of pattern recognition, interpretation, and evaluation. Level 2 SA requires integrating this information to understand how it will impact upon the individual's goals and objectives. This includes developing a comprehensive picture of the world, or of that portion of the world of concern to the individual.
Projection (Level 3 SA): The third and highest level of SA involves the ability to project the future actions of the elements in the environment. Level 3 SA is achieved through knowledge of the status and dynamics of the elements and comprehension of the situation (Levels 1 and 2 SA), and then extrapolating this information forward in time to determine how it will affect future states of the operational environment.
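The three levels can also be illustrated with a minimal code sketch. The sketch below is not part of Endsley's model; it simply shows, under assumed element and goal structures (the names, thresholds, and linear projection rule are all hypothetical), how a perception step feeds a comprehension step, which in turn feeds a projection step.

```python
from dataclasses import dataclass
from typing import Dict, List

# Hypothetical element type for illustration: an observed entity and its current state.
@dataclass
class Element:
    name: str                 # e.g. "aircraft_42"
    state: Dict[str, float]   # e.g. {"altitude": 9500, "climb_rate": -8.0}

def perceive(raw_observations: List[Element]) -> List[Element]:
    """Level 1 SA: detect and monitor relevant elements and their current states."""
    return [e for e in raw_observations if e.state]  # keep only elements with observable state

def comprehend(elements: List[Element], goals: Dict[str, float]) -> Dict[str, str]:
    """Level 2 SA: integrate perceived elements and judge their significance to goals."""
    meaning = {}
    for e in elements:
        # Toy interpretation rule: flag any element whose altitude is below a goal threshold.
        if e.state.get("altitude", float("inf")) < goals.get("min_safe_altitude", 0):
            meaning[e.name] = "below safe altitude"
        else:
            meaning[e.name] = "nominal"
    return meaning

def project(elements: List[Element], horizon_s: float) -> Dict[str, float]:
    """Level 3 SA: extrapolate current states forward in time (toy linear projection)."""
    return {e.name: e.state.get("altitude", 0.0) + e.state.get("climb_rate", 0.0) * horizon_s
            for e in elements}

elements = perceive([Element("aircraft_42", {"altitude": 9500, "climb_rate": -8.0})])
print(comprehend(elements, {"min_safe_altitude": 10000}))  # {'aircraft_42': 'below safe altitude'}
print(project(elements, horizon_s=60))                     # {'aircraft_42': 9020.0}
```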
Endsley's model shows how SA "provides the primary basis for subsequent decision making and performance in the operation of complex, dynamic systems". [36] Although alone it cannot guarantee successful decision making, SA does support the necessary input processes (e.g., cue recognition, situation assessment, prediction) upon which good decisions are based. [37]
SA also involves both a temporal and a spatial component. Time is an important concept in SA, as SA is a dynamic construct, changing at a tempo dictated by the actions of individuals, task characteristics, and the surrounding environment. As new inputs enter the system, the individual incorporates them into this mental representation, making changes as necessary in plans and actions in order to achieve the desired goals.
SA also involves spatial knowledge about the activities and events occurring in a specific location of interest to the individual. Thus, the concept of SA includes perception, comprehension, and projection of situational information, as well as temporal and spatial components.
Endsley's model of SA illustrates several variables that can influence the development and maintenance of SA, including individual, task, and environmental factors.
In summary, the model consists of several key factors that describe the cognitive processes involved in SA: [38]
The model also points to a number of features of the task and environment that affect SA:
Experience and training have a significant impact on people's ability to develop SA, due to its impact on the development of mental models that reduce processing demands and help people to better prioritize their goals. [40] In addition, it has been found that individuals vary in their ability to acquire SA; thus, simply providing the same system and training will not ensure similar SA across different individuals. Research has shown that there are a number of factors that make some people better at SA than others including differences in spatial abilities and multi-tasking skills. [41]
Criticisms of the SA construct and the model have generally been viewed as unfounded and have been addressed. [42] [43] [44] The Endsley model is very detailed in describing the cognitive processes involved in SA. A narrative literature review of SA, performance, and other human factors constructs states that SA "... is valuable in understanding and predicting human-system performance in complex systems." [42]
Nevertheless, there are several criticisms of SA. One criticism is the danger of circularity: "How does one know that SA was lost? Because the human responded inappropriately. Why did the human respond inappropriately? Because SA was lost." [45] Building on the circularity concern, others have deemed SA a folk model on the basis that it is frequently overgeneralized and immune to falsification. [46] [47] A response to these criticisms argues that measures of SA are "... falsifiable in terms of their usefulness in prediction." [42]
A recent review and meta-analysis of SA measures showed they were highly correlated with, or predictive of, performance, which initially appears to provide strong quantitative evidence refuting criticisms of SA. [44] However, the inclusion criteria in this meta-analysis [44] were limited to positive correlations reaching desirable levels of statistical significance. [48] That is, results supporting the hypothesis were included while less desirable results contradicting the hypothesis were excluded. The justification was that "Not all measures of SA are relevant to performance." [44] This is an example of circular analysis or double-dipping, [49] where the dataset being analyzed is selected based on the outcome of analyzing that same dataset.
Because only more desirable effects were included, the results of this meta-analysis were predetermined – predictive measures of SA were predictive. [48] Further, there were inflated estimates of mean effect sizes compared to an analysis that did not select results using statistical significance. [48] Determining the relevance of SA based on the desirability of outcomes and analyzing only supporting results is a circular conceptualization of SA and revives concerns about the falsifiability of SA. [48]
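The selection effect described above can be illustrated with a short simulation. The sketch below is purely illustrative (the true correlation, sample size, and significance threshold are assumed values, not figures from the meta-analysis): averaging only those simulated studies whose correlation reaches statistical significance yields an inflated mean effect size compared with averaging all studies.

```python
# Simulate many studies with the same true correlation, then compare the mean effect
# across all studies with the mean across only the "significant" ones.
import numpy as np

rng = np.random.default_rng(0)
true_r, n, studies = 0.3, 30, 5000
critical_r = 0.361  # two-tailed critical correlation for n = 30 at alpha = 0.05

observed = []
for _ in range(studies):
    # Draw a sample from a bivariate normal with correlation true_r and record the sample r.
    x, y = rng.multivariate_normal([0, 0], [[1, true_r], [true_r, 1]], size=n).T
    observed.append(np.corrcoef(x, y)[0, 1])

observed = np.array(observed)
significant = observed[observed > critical_r]  # keep only positive, statistically significant results
print(f"mean of all studies:      {observed.mean():.2f}")     # close to the true 0.30
print(f"mean of significant-only: {significant.mean():.2f}")  # noticeably larger
```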
Several cognitive processes related to situation awareness are briefly described in this section. The matrix shown below attempts to illustrate the relationship among some of these concepts. [50] Note that situation awareness and situational assessment are more commonly discussed in information fusion complex domains such as aviation and military operations and relate more to achieving immediate tactical objectives. [51] [52] [53] Sensemaking and achieving understanding are more commonly found in industry and the organizational psychology literature and often relate to achieving long-term strategic objectives.
There are also biological mediators of situational awareness, most notably hormones such as testosterone, and neurotransmitters such as dopamine and norepinephrine. [54]
| Objective | Process | Outcome |
|---|---|---|
| Tactical (short-term) | situational assessment | situation awareness |
| Strategic (long-term) | sensemaking | understanding |
| Scientific (longer-term) | analysis | prediction |
Situation awareness is sometimes confused with the term "situational understanding." In the context of military command and control applications, situational understanding refers to the "product of applying analysis and judgment to the unit's situation awareness to determine the relationships of the factors present and form logical conclusions concerning threats to the force or mission accomplishment, opportunities for mission accomplishment, and gaps in information". [55] Situational understanding is the same as Level 2 SA in the Endsley model—the comprehension of the meaning of the information as integrated with each other and in terms of the individual's goals. It is the "so what" of the data that is perceived.
In brief, situation awareness is viewed as "a state of knowledge," and situational assessment as "the processes" used to achieve that knowledge. Endsley argues that "it is important to distinguish the term situation awareness, as a state of knowledge, from the processes used to achieve that state. [1] These processes, which may vary widely among individuals and contexts, will be referred to as situational assessment or the process of achieving, acquiring, or maintaining SA." Note that SA is not only produced by the processes of situational assessment, it also drives those same processes in a recurrent fashion. For example, one's current awareness can determine what one pays attention to next and how one interprets the information perceived. [56]
Accurate mental models are one of the prerequisites for achieving SA. [22] [57] [58] A mental model can be described as a set of well-defined, highly organized yet dynamic knowledge structures developed over time from experience. [59] [60] The volume of available data inherent in complex operational environments can overwhelm the capability of novice decision makers to attend, process, and integrate this information efficiently, resulting in information overload and negatively impacting their SA. [61] In contrast, experienced decision makers assess and interpret the current situation (Level 1 and 2 SA) and select an appropriate action based on conceptual patterns stored in their long-term memory as "mental models". [62] [63] Cues in the environment activate these mental models, which in turn guide their decision making process.
Klein, Moon, and Hoffman distinguish between situation awareness and sensemaking as follows:
...situation awareness is about the knowledge state that's achieved—either knowledge of current data elements, or inferences drawn from these data, or predictions that can be made using these inferences. In contrast, sensemaking is about the process of achieving these kinds of outcomes, the strategies, and the barriers encountered. [64]
In brief, sensemaking is viewed more as "a motivated, continuous effort to understand connections (which can be among people, places, and events) in order to anticipate their trajectories and act effectively", [65] rather than the state of knowledge underlying situation awareness. Endsley points out that, as an effortful process, sensemaking actually comprises a subset of the processes used to maintain situation awareness. [66] [43] In the vast majority of cases, SA is instantaneous and effortless, proceeding from pattern recognition of key factors in the environment: "The speed of operations in activities such as sports, driving, flying and air traffic control practically prohibits such conscious deliberation in the majority of cases, but rather reserves it for the exceptions." Endsley also points out that sensemaking is backward looking, forming reasons for past events, while situation awareness is typically forward looking, projecting what is likely to happen in order to inform effective decision processes. [66] [43]
In many systems and organizations, people work not just as individuals, but as members of a team. Thus, it is necessary to consider the SA of not just individual team members, but also the SA of the team as a whole. To begin to understand what is needed for SA within teams, it is first necessary to clearly define what constitutes a team. A team is not just any group of individuals; rather teams have a few defining characteristics. A team is:
a distinguishable set of two or more people who interact dynamically, interdependently and adaptively toward a common and valued goal/objective/mission, who have each been assigned specific roles or functions to perform, and who have a limited life span of membership.
— Salas et al. (1992) [67]
Team SA is defined as "the degree to which every team member possesses the SA required for his or her responsibilities". [38] The success or failure of a team depends on the success or failure of each of its team members. If any one of the team members has poor SA, it can lead to a critical error in performance that can undermine the success of the entire team. By this definition, each team member needs to have a high level of SA on those factors that are relevant for his or her job. It is not sufficient for one member of the team to be aware of critical information if the team member who needs that information is not aware. Therefore, team members need to be successful in communicating information between them (including how they are interpreting or projecting changes in the situation to form level 2 and 3 SA) or in each independently being able to get the information they need.
In a team, each member has a subgoal pertinent to his/her specific role that feeds into the overall team goal. Associated with each member's subgoal are a set of SA elements about which he/she is concerned. As the members of a team are essentially interdependent in meeting the overall team goal, some overlap between each member's subgoal and their SA requirements will be present. It is this subset of information that constitutes much of team coordination. That coordination may occur as a verbal exchange, a duplication of displayed information, or by some other means. [68]
Shared situation awareness can be defined as "the degree to which team members possess the same SA on shared SA requirements". [69] [70] As implied by this definition, there are information requirements that are relevant to multiple team members. A major part of teamwork involves the area where these SA requirements overlap—the shared SA requirements that exist as a function of the essential interdependency of the team members. In a poorly functioning team, two or more members may have different assessments on these shared SA requirements and thus behave in an uncoordinated or even counter-productive fashion. Yet in a smoothly functioning team, each team member shares a common understanding of what is happening on those SA elements that are common—shared SA. Thus, shared SA refers to the degree to which people have a common understanding of the information that falls in the overlap of the team members' SA requirements. Not all information needs to be shared. Clearly, each team member is aware of much that is not pertinent to the others on the team. Sharing every detail of each person's job would create information overload for others to sort through to get needed information. [71] [72] Only the information relevant to the SA requirements of each team member needs to be shared.
The situation awareness of the team as a whole, therefore, depends both on a high level of SA among individual team members for the aspects of the situation necessary for their jobs, and on a high level of shared SA between team members, providing an accurate common operating picture of those aspects of the situation common to the needs of each member. [73] Endsley and Jones [57] [73] describe a model of team situation awareness as a means of conceptualizing how teams develop high levels of shared SA across members. Each of the four factors in that model (requirements, devices, mechanisms, and processes) acts to help build team and shared SA.
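These definitions lend themselves to simple illustrative metrics. The sketch below is a hypothetical scoring scheme, not a published instrument: it treats team SA as each member's accuracy on their own SA requirements, and shared SA as agreement between members on the requirements they have in common.

```python
from typing import Dict, Set

# member_assessments: each member's answers on SA elements; ground_truth: actual states.
# requirements: the SA elements each member is responsible for (hypothetical structure).

def team_sa(member_assessments: Dict[str, Dict[str, str]],
            requirements: Dict[str, Set[str]],
            ground_truth: Dict[str, str]) -> Dict[str, float]:
    """Team SA: the degree to which each member has accurate SA on their own requirements."""
    scores = {}
    for member, required in requirements.items():
        answers = member_assessments.get(member, {})
        correct = sum(1 for el in required if answers.get(el) == ground_truth.get(el))
        scores[member] = correct / len(required) if required else 1.0
    return scores

def shared_sa(member_assessments: Dict[str, Dict[str, str]],
              requirements: Dict[str, Set[str]]) -> float:
    """Shared SA: agreement between members on the elements where their requirements overlap."""
    members = list(requirements)
    agreements, comparisons = 0, 0
    for i, a in enumerate(members):
        for b in members[i + 1:]:
            for el in requirements[a] & requirements[b]:
                comparisons += 1
                if member_assessments[a].get(el) == member_assessments[b].get(el):
                    agreements += 1
    return agreements / comparisons if comparisons else 1.0
```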
In time-critical decision-making processes, swift and effective choices are imperative to address and navigate urgent situations. In such scenarios, the ability to analyze information rapidly, prioritize key factors, and execute decisions promptly becomes paramount. Time constraints often necessitate a balance between thorough deliberation and the need for quick action.
The decision-maker must rely on a combination of experience, intuition, and available data to make informed choices under pressure. Prioritizing critical elements, assessing potential outcomes, and considering the immediate and long-term consequences are crucial aspects of effective time-critical decision-making.
Furthermore, clear communication is essential to ensure that decisions are swiftly conveyed to relevant stakeholders and executed seamlessly. Collaborative efforts, streamlined processes, and well-defined protocols can enhance the efficiency of decision-making in time-sensitive situations.
Adaptability and the ability to recalibrate strategies in real-time are vital attributes in time-critical scenarios, as unforeseen developments may require rapid adjustments to the initial decision. Embracing technological advancements and data-driven insights, and incorporating simulation exercises, can also contribute to better decision-making outcomes in high-pressure situations.
Ultimately, successful time-critical decision-making involves a combination of expertise, preparedness, effective communication, and a willingness to adapt, ensuring that the chosen course of action aligns with the urgency of the situation while minimizing the risk of errors.
While the SA construct has been widely researched, the multivariate nature of SA poses a considerable challenge to its quantification and measurement. In general, techniques vary in terms of direct measurement of SA (e.g., objective real-time probes or subjective questionnaires assessing perceived SA) or methods that infer SA based on operator behavior or performance. Direct measures are typically considered to be "product-oriented" in that these techniques assess an SA outcome; inferred measures are considered to be "process-oriented," focusing on the underlying processes or mechanisms required to achieve SA. [74] These SA measurement approaches are further described next.
Objective measures directly assess SA by comparing an individual's perceptions of the situation or environment to some "ground truth" reality. Specifically, objective measures collect data from the individual on his or her perceptions of the situation and compare them to what is actually happening to score the accuracy of their SA at a given moment in time. Thus, this type of assessment provides a direct measure of SA and does not require operators or observers to make judgments about situational knowledge on the basis of incomplete information. Objective measures can be gathered in one of three ways: real-time as the task is completed (e.g., "real-time probes" presented as open questions embedded as verbal communications during the task [75] ), during an interruption in task performance (e.g., situation awareness global assessment technique (SAGAT), [32] or the WOMBAT situational awareness and stress tolerance test mostly used in aviation since the late 1980s and often called HUPEX in Europe), or post-test following completion of the task.
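A freeze-probe technique such as SAGAT compares the operator's answers against ground truth at the moment of the freeze. The following sketch shows one hypothetical way such scoring could be implemented; the query names, tolerance rule, and values are assumptions rather than the published SAGAT procedure.

```python
from typing import Dict, Union

Answer = Union[str, float]

def score_probe(responses: Dict[str, Answer],
                ground_truth: Dict[str, Answer],
                tolerance: float = 0.05) -> float:
    """Score one freeze-probe: the fraction of queries matching ground truth.

    Numeric answers count as correct when within a relative tolerance;
    categorical answers must match exactly. (Illustrative scoring rule only.)
    """
    correct = 0
    for query, truth in ground_truth.items():
        answer = responses.get(query)
        if answer is None:
            continue
        if isinstance(truth, (int, float)) and isinstance(answer, (int, float)):
            if truth == 0:
                correct += answer == 0
            else:
                correct += abs(answer - truth) / abs(truth) <= tolerance
        else:
            correct += answer == truth
    return correct / len(ground_truth) if ground_truth else 0.0

# Example: a pilot's answers during a simulation freeze (hypothetical values).
truth = {"ownship_altitude_ft": 9500, "nearest_traffic_bearing_deg": 270, "weather_ahead": "icing"}
answers = {"ownship_altitude_ft": 9400, "nearest_traffic_bearing_deg": 250, "weather_ahead": "icing"}
print(score_probe(answers, truth))  # 2 of 3 queries within tolerance -> ~0.67
```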
Subjective measures directly assess SA by asking individuals to rate their own or the observed SA of individuals on an anchored scale (e.g., participant situation awareness questionnaire; [76] the situation awareness rating technique [77] ). Subjective measures of SA are attractive in that they are relatively straightforward and easy to administer. However, several limitations should be noted. Individuals making subjective assessments of their own SA are often unaware of information they do not know (the unknown unknowns). Subjective measures also tend to be global in nature, and, as such, do not fully exploit the multivariate nature of SA to provide the detailed diagnostics available with objective measures. Nevertheless, self-ratings may be useful in that they can provide an assessment of operators' degree of confidence in their SA and their own performance. Measuring how SA is perceived by the operator may provide information as important as the operator's actual SA, since errors in perceived SA quality (over-confidence or under-confidence in SA) may have just as harmful an effect on an individual's or team's decision-making as errors in their actual SA. [78]
Subjective estimates of an individual's SA may also be made by experienced observers (e.g., peers, commanders, or trained external experts). These observer ratings may be somewhat superior to self-ratings of SA because more information about the true state of the environment is usually available to the observer than to the operator, who may be focused on performing the task (i.e., trained observers may have more complete knowledge of the situation). However, observers have only limited knowledge about the operator's concept of the situation and cannot have complete insight into the mental state of the individual being evaluated. Thus, observers are forced to rely more on operators' observable actions and verbalizations in order to infer their level of SA. In this case, such actions and verbalizations are best assessed using performance and behavioral measures of SA, as described next.
Performance measures infer SA from the end result (i.e., task performance outcomes), based on the assumption that better performance indicates better SA. Common performance metrics include quantity of output or productivity level, time to perform the task or respond to an event, and the accuracy of the response or, conversely, the number of errors committed. The main advantage of performance measures is that these can be collected objectively and without disrupting task performance. However, although evidence exists to suggest a positive relation between SA and performance, this connection is probabilistic and not always direct and unequivocal. [25] In other words, good SA does not always lead to good performance and poor SA does not always lead to poor performance. [79] Thus, performance measures should be used in conjunction with other measures of SA that directly assess this construct.
Behavioral measures also infer SA from the actions that individuals choose to take, based on the assumption that good actions will follow from good SA and vice versa. Behavioral measures rely primarily on observer ratings, and are, thus, somewhat subjective in nature. To address this limitation, observers can be asked to evaluate the degree to which individuals are carrying out actions and exhibiting behaviors that would be expected to promote the achievement of higher levels of SA. This approach removes some of the subjectivity associated with making judgments about an individual's internal state of knowledge by allowing observers to make judgments about SA indicators that are more readily observable.
Process indices examine how individuals process information in their environment, such as by analyzing communication patterns between team members or using eye tracking devices. Team communication (particularly verbal communication) supports the knowledge building and information processing that leads to SA construction. [57] Thus, since SA may be distributed via communication, computational linguistics and machine learning techniques can be combined with natural language analytical techniques (e.g., Latent semantic analysis) to create models that draw on the verbal expressions of the team to predict SA and task performance. [81] [82] Although evidence exists to support the utility of communication analysis for predicting team SA, [83] time constraints and technological limitations (e.g., cost and availability of speech recording systems and speech-to-text translation software) may make this approach less practical and viable in time-pressured, fast-paced operations.
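As a simplified illustration of this idea, the sketch below predicts an SA score from communication transcripts using TF-IDF features and ridge regression; the cited studies use latent semantic analysis and richer models, so this stands in only as a minimal, assumption-laden example (the transcripts and scores are invented).

```python
# A simplified sketch of predicting team SA scores from communication transcripts.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Hypothetical training data: one transcript per mission segment, paired with an SA score
# (e.g., from a freeze-probe measure) collected for the same segment.
transcripts = [
    "confirm contact bearing two seven zero descending",
    "say again altitude, I do not have the traffic",
    "traffic in sight, maintaining separation, weather ahead reported icing",
]
sa_scores = [0.8, 0.4, 0.9]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), Ridge(alpha=1.0))
model.fit(transcripts, sa_scores)

# Predict SA for a new, unseen communication segment.
print(model.predict(["unable to locate traffic, request vector"]))
```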
Psycho-physiological measures also serve as process indices of operator SA by providing an assessment of the relationship between human performance and correlated changes in the operator's physiology. [84] In other words, cognitive activity is associated with changes in the operator's physiological states. For example, the operator's overall functional state (as assessed using psycho-physiological measures, such as electroencephalography data, eyeblinks, and cardiac activity) may provide an indication as to whether the operator is sleep fatigued at one end of the continuum, or mentally overloaded at the other end. [85] Other psycho-physiological measures, such as event-related potentials, event-related desynchronization, transient heart rate, and electrodermal activity, may be useful for evaluating an operator's perception of critical environmental cues, that is, to determine if the operator has detected and perceived a task-relevant stimulus. [85] In addition, it is also possible to use psycho-physiological measures to monitor operators' environmental expectancies, that is, their physiological responses to upcoming events, as a measure of their current level of SA. [85]
The multivariate nature of SA significantly complicates its quantification and measurement, as it is conceivable that a metric may only tap into one aspect of the operator's SA. Further, studies have shown that different types of SA measures do not always correlate strongly with each other. Accordingly, rather than rely on a single approach or metric, valid and reliable measurement of SA should utilize a battery of distinct yet related measures that complement each other. [86] Such a multi-faceted approach to SA measurement capitalizes on the strengths of each measure while minimizing the limitations inherent in each.
Situation awareness is limited by sensory input and available attention, by the individual's knowledge and experience, and by their ability to analyse the available information effectively. Attention is a limited resource, and may be reduced by distraction and task loading. Comprehension of the situation and projection of future status depend heavily on relevant knowledge, understanding, and experience in similar environments. Team SA is less limited by these factors, as there is a wider knowledge and experience base, but it is limited by the effectiveness of communication within the team. [87]
Following Endsley's paradigm and combining a cognitive resource management model [88] with neurofeedback techniques, the Spanish pedagogue María Gabriela López García (2010) developed and implemented a new SA training pattern. [89] The first organization to implement this pattern designed by López García was the Spanish Air Force (SPAF). She has trained EF-18 fighter pilots and crews of Canadair firefighting aircraft. [90]
This situation awareness training aims to prevent the loss of SA and to give pilots the cognitive resources to always operate below the maximum workload they can withstand. This not only lowers the probability of incidents and accidents due to human factors, but also keeps hours of operation at optimum efficiency, extending the operating life of both systems and operators. [91]
In first aid medical training provided by the American Red Cross, awareness of the situation within the area of influence as one approaches an individual requiring medical assistance is the first aspect for responders to consider. [92] Examining the area and being aware of potential hazards, including the hazards that may have caused the injuries being treated, is an effort to ensure that responders do not themselves become injured and require treatment as well.
Situation awareness for first responders in medical situations also includes evaluating and understanding what happened [93] to avoid injury of responders and also to provide information to other rescue agencies which may need to know what the situation is via radio prior to their arrival on the scene.
In a medical context, situation awareness is applied to avoid further injury to already-injured individuals, to avoid injury to medical responders, and to inform other potential responders of hazardous conditions prior to their arrival.
A loss in situational awareness has led to many transportation accidents, including the 1991 Los Angeles Airport runway collision [94] and the 2015 Philadelphia train derailment. [95]
Within the search and rescue context, situational awareness is applied primarily to avoid injury to search crews through awareness of the environment and the lay of the land; attention to the many other factors of influence within one's surroundings also assists in locating injured or missing individuals. [96] Public safety agencies are increasingly using situational awareness applications such as the Android Tactical Assault Kit on mobile devices, and even robots, to improve situational awareness. [97]
In the United States Forest Service, the use of chainsaws and crosscut saws requires training and certification. [98] A great deal of that training presents situational awareness as an approach to environmental awareness as well as self-awareness, [99] which includes being aware of one's own emotional attitude, tiredness, and even caloric intake.
Situational awareness in the forest context also includes evaluating the environment and the potential safety hazards within a saw crew's area of influence. As a sawyer approaches a task, the ground, wind, cloud cover, hillsides, and many other factors are examined and are considered proactively as part of trained sawyers' ingrained training.
Dead or diseased trees within the reach of the saw crew are evaluated, as are the strength and direction of the wind. The lay of tree sections to be bucked, or the lean of a tree to be felled, is evaluated in the context of where the tree will fall or move when cut, where the other members of the saw team are located and how they are moving, and whether hikers are within the area of influence and whether they are moving or stationary.
Law enforcement training includes being situationally aware of what is going on around the police officer before, during, and after interactions with the general public [100] while also being fully aware of what is happening around the officer in areas not currently the focus of an officer's immediate task.
In cybersecurity, situational awareness for threat operations is the ability to perceive threat activity and vulnerabilities in context so that data, information, knowledge, and wisdom can be actively defended from compromise. Situational awareness is achieved by developing and using solutions that often consume data and information from many different sources. Technology and algorithms are then used to apply knowledge and wisdom in order to discern patterns of behavior that point to possible, probable, and real threats.
Situational awareness for cybersecurity threat operations teams appears in the form of a condensed, enriched, often graphical, prioritized, and easily searchable view of systems that are inside or related to security areas of responsibility (such as corporate networks or those used for national security interests). Different studies have analyzed the perception of security and privacy in the context of eHealth, [101] network security, [102] or using collaborative approaches to improve the awareness of users. [103] There are also research efforts to automate the processing of communication network information in order to obtain or improve cyber-situational awareness. [104]
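A hedged sketch of this kind of pipeline is shown below: hypothetical normalized events from several sources are scored and sorted so that events touching critical assets surface first. The event schema and scoring rule are illustrative assumptions, not a standard.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Set

# Hypothetical normalized security event; field names are illustrative, not a standard schema.
@dataclass
class SecurityEvent:
    source: str          # e.g. "ids", "endpoint", "netflow"
    asset: str           # affected system
    category: str        # e.g. "malware", "exfiltration", "scan"
    severity: int        # 1 (low) .. 5 (critical)
    timestamp: datetime = field(default_factory=datetime.utcnow)

def prioritize(events: List[SecurityEvent], critical_assets: Set[str]) -> List[SecurityEvent]:
    """Produce a condensed, prioritized view: boost events touching critical assets,
    then sort by the boosted score so analysts see probable threats first."""
    def score(e: SecurityEvent) -> float:
        boost = 2.0 if e.asset in critical_assets else 1.0
        return e.severity * boost
    return sorted(events, key=score, reverse=True)

events = [
    SecurityEvent("ids", "web-01", "scan", 2),
    SecurityEvent("endpoint", "db-01", "malware", 4),
    SecurityEvent("netflow", "db-01", "exfiltration", 3),
]
for e in prioritize(events, critical_assets={"db-01"}):
    print(e.asset, e.category, e.severity)
```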
As the capabilities of technological agents increase, it becomes more important that their actions and underlying rationale become transparent. In the military realm, agent transparency has been investigated as unmanned vehicles are employed more frequently. In 2014, researchers at the U.S. Army Research Laboratory reported the Situation Awareness-based Agent Transparency (SAT) model, designed to increase transparency through user interface design. With respect to automation, six barriers have been identified that discourage "human trust in autonomous systems, with 'low observability, predictability, directability and auditability' and 'low mutual understanding of common goals' being among the key issues." [105] The researchers at the U.S. Army Research Laboratory designed three levels of situational awareness transparency based on Endsley's theory of perception, comprehension, and projection. The greater the level of situational awareness transparency, they claimed, the more information the agent conveys to the user. [106]
A 2018 publication from the U.S. Army Research Laboratory evaluated how varying transparency levels in the SAT model affect operator workload and a human's understanding of when it is necessary to intervene in the agent's decision making. The researchers refer to this supervisory judgement as calibration. The group split their SAT model research into two efforts: the Intelligent Agent Transparency in Human-Agent Teaming for Multi-UxV Management (IMPACT) project and the Autonomous Squad Member (ASM) project. [105]
Scientists provided the three standard levels of SAT, in addition to a fourth level that included the agent's level of uncertainty in its decisions, in unmanned vehicles. The stated goal of this research was to determine how modifying levels of SAT affected user performance, situation awareness, and confidence in the agent. The scientists stated that their experimental results support the conclusion that increased agent transparency improved operator performance and human confidence in the agent without a significant effect on workload. When the agent communicated its level of uncertainty in the assigned task, participants in the experiment displayed more trust in the agent. [107]
The ASM research was conducted using a simulation game in which the participant had to complete a training course with an ASM, a ground robot that communicates with infantry. The participants had to multitask, evaluating potential threats while monitoring the ASM's communications on the interface. According to that research, experimental results demonstrated that the greatest confidence calibration occurred when the agent communicated information at all three levels of SAT. [107] The group of scientists from the U.S. Army Research Laboratory developed transparency visualization concepts in which the agents can communicate their plans, motivations, and projected outcomes through icons. The agent was reported to be able to relate its resource usage, reasoning, predicted resource loss, progress towards task completion, and so on. [105] Unlike in the IMPACT research, when the agent informed the user of its level of uncertainty in decision making, no increase in trust was observed. [107]
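One way to picture the SAT levels is as progressively richer agent reports. The sketch below is an illustrative rendering only (the field names and the level-4 uncertainty layout are assumptions, not the published SAT specification): each higher transparency level adds information to what the agent conveys.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative structure only: field names are assumptions, not the published SAT specification.
@dataclass
class AgentReport:
    plan: str                            # SAT level 1: what the agent is doing
    reasoning: str                       # SAT level 2: why it is doing it
    projected_outcome: str               # SAT level 3: what it expects to happen
    uncertainty: Optional[float] = None  # optional uncertainty extension (0..1)

def render(report: AgentReport, level: int) -> str:
    """Render only the information up to the requested transparency level."""
    lines = [f"Plan: {report.plan}"]
    if level >= 2:
        lines.append(f"Reasoning: {report.reasoning}")
    if level >= 3:
        lines.append(f"Projected outcome: {report.projected_outcome}")
    if level >= 4 and report.uncertainty is not None:
        lines.append(f"Confidence: {1 - report.uncertainty:.0%}")
    return "\n".join(lines)

report = AgentReport(
    plan="Reroute UxV-2 around contact area",
    reasoning="Current route crosses a zone flagged as hostile",
    projected_outcome="Arrival delayed by 6 minutes, mission objective preserved",
    uncertainty=0.2,
)
print(render(report, level=4))
```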
Crowdsourcing, made possible by the rise of social media and ubiquitous mobile access, has the potential to considerably enhance the situation awareness of both responsible authorities and citizens themselves in emergency and crisis situations by employing or using "citizens as sensors". [108] [109] [110] [111] [112] [113] [114] [115] For instance, analysis of content posted on online social media such as Facebook and Twitter using data mining, machine learning, and natural language processing techniques may provide situational information. [115] A crowdsourcing approach to sensing, particularly in crisis situations, has been referred to as crowdsensing. [116] Crowdmapping is a subtype of crowdsourcing [117] [118] in which crowd-generated inputs, such as captured communications and social media feeds, are aggregated and combined with geographic data to create a digital map that is as up to date as possible; [119] [120] [121] [122] such a map can improve situational awareness during an incident and be used to support incident response. [123]
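As a minimal illustration of crowdmapping, the sketch below bins hypothetical geotagged reports into a coarse grid so that cells with many reports stand out; real crowdmapping platforms add filtering, verification, and proper basemaps.

```python
from collections import Counter
from typing import Iterable, Tuple

# Hypothetical geotagged report: (latitude, longitude, text). Binning into a coarse grid
# is one simple way to turn crowd-generated inputs into a map-like summary.
Report = Tuple[float, float, str]

def crowdmap(reports: Iterable[Report], cell_deg: float = 0.01) -> Counter:
    """Count reports per grid cell; denser cells suggest where an incident is unfolding."""
    counts = Counter()
    for lat, lon, _text in reports:
        cell = (round(lat / cell_deg) * cell_deg, round(lon / cell_deg) * cell_deg)
        counts[cell] += 1
    return counts

reports = [
    (40.7130, -74.0060, "smoke visible near the bridge"),
    (40.7132, -74.0055, "road blocked, emergency vehicles arriving"),
    (40.7580, -73.9855, "all quiet here"),
]
for cell, n in crowdmap(reports).most_common():
    print(cell, n)
```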
A Cloud-based Geographic Information System (GIS) with a display of structured data refers to a system that utilizes cloud computing technology to store, manage, analyze, and visualize geographic data in a structured format. This approach offers several advantages, including accessibility, scalability, and collaboration, compared to traditional on-premises GIS systems.
The key components include a cloud-based infrastructure, the geographic information system (GIS) itself, structured data storage, data analysis and processing, visualization tools, collaborative features, real-time updates, and integration with other cloud services.
Overall, a cloud-based GIS with structured data display provides a dynamic and efficient platform for managing geographic information, making it accessible, scalable, and collaborative for a wide range of applications, from urban planning and environmental monitoring to business analytics and disaster response.
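A toy example of the structured-data aspect is sketched below, using a local SQLite table as a stand-in for a cloud-hosted store; the schema and bounding-box query are illustrative assumptions, and a real cloud GIS would supply spatial indexing, APIs, and visualization.

```python
import sqlite3

# A local SQLite table standing in for a cloud-hosted structured store; in a real
# cloud GIS the storage, indexing, and spatial functions are provided by the platform.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE incidents (
        id INTEGER PRIMARY KEY,
        category TEXT,
        lat REAL,
        lon REAL,
        reported_at TEXT
    )
""")
conn.executemany(
    "INSERT INTO incidents (category, lat, lon, reported_at) VALUES (?, ?, ?, ?)",
    [
        ("flood", 29.76, -95.37, "2024-05-01T10:00Z"),
        ("fire", 29.74, -95.40, "2024-05-01T10:05Z"),
        ("fire", 30.27, -97.74, "2024-05-01T10:07Z"),
    ],
)

# Structured query: all incidents inside a bounding box, newest first (a crude spatial filter).
rows = conn.execute(
    "SELECT category, lat, lon, reported_at FROM incidents "
    "WHERE lat BETWEEN ? AND ? AND lon BETWEEN ? AND ? ORDER BY reported_at DESC",
    (29.5, 30.0, -95.5, -95.0),
).fetchall()
for row in rows:
    print(row)
```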
There are two training scenarios designed to increase the situational awareness skills of military professionals and first responders in police and emergency services. The first, Kim's Game, is common in the Marine Corps sniper school and in police academies. The name is derived from the novel Kim, in which the game is part of a spy school lesson. The game involves a tray with various items such as spoons, pencils, bullets, and any other items the soldiers would be familiar with. The participants are given one minute to view all of these items before they are covered with a blanket. The participants then individually list the items that they saw, and the one with the most correct answers wins the game. The same game is played in scouting and girl guide groups to teach children quick memorisation skills.
The second method is a more practical military application of Kim's Game. It uses a field area (jungle, bush, or forest) about five meters wide and ten meters deep in which various items, some camouflaged and some not, are placed on the ground and in the trees at eye level. Again, these items are ones that are familiar to the soldiers undergoing the exercise. The participants are given ten minutes to view the area from one place and take mental note of the items they see. Once the ten minutes are up, each soldier is required to complete repetitions of certain exercises, such as burpees, designed to simulate the stress of a physically demanding environment. Once the participant completes the exercise, they list the items they saw. The points are tallied at the end to find the winner.
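Scoring either variant reduces to comparing the items recalled against the items actually shown. The sketch below is a hypothetical scoring helper; the item names and the no-penalty rule are assumptions, and real variants differ.

```python
from typing import Iterable, Set

def kims_game_score(shown: Set[str], recalled: Iterable[str]) -> int:
    """Score a Kim's Game round: one point per correctly recalled item, none for guesses.
    (Illustrative rule only; some variants penalise incorrect answers.)"""
    return len(shown & set(recalled))

shown = {"spoon", "pencil", "bullet", "compass", "map", "battery"}
players = {
    "recruit_a": ["spoon", "pencil", "map", "knife"],        # "knife" was not on the tray
    "recruit_b": ["compass", "map", "battery", "bullet"],
}
scores = {name: kims_game_score(shown, items) for name, items in players.items()}
print(max(scores, key=scores.get), scores)  # the participant with the most correct items wins
```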
Usability can be described as the capacity of a system to provide a condition for its users to perform tasks safely, effectively, and efficiently while enjoying the experience. In software engineering, usability is the degree to which software can be used by specified consumers to achieve quantified objectives with effectiveness, efficiency, and satisfaction in a quantified context of use.
Task analysis is a fundamental tool of human factors engineering. It entails analyzing how a task is accomplished, including a detailed description of both manual and mental activities, task and element durations, task frequency, task allocation, task complexity, environmental conditions, necessary clothing and equipment, and any other unique factors involved in or required for one or more people to perform a given task.
GOMS is a specialized human information processor model for human-computer interaction observation that describes a user's cognitive structure in terms of four components. In the book The Psychology of Human-Computer Interaction, written in 1983 by Stuart K. Card, Thomas P. Moran, and Allen Newell, the authors introduce "a set of Goals, a set of Operators, a set of Methods for achieving the goals, and a set of Selection rules for choosing among competing methods for goals." GOMS is widely used by usability specialists for computer system designers because it produces quantitative and qualitative predictions of how people will use a proposed system.
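The four components can be made concrete with a small data-structure sketch. The example below is illustrative only: the method names, operator lists, and time estimates are invented, and the selection rule is a toy stand-in for the context-dependent rules GOMS analysts write.

```python
from dataclasses import dataclass
from typing import Dict, List

# A toy representation of the four GOMS components (Goals, Operators, Methods, Selection rules).
@dataclass
class Method:
    name: str
    operators: List[str]      # elementary perceptual, motor, or cognitive acts
    estimated_time_s: float   # rough execution-time estimate for the method

@dataclass
class Goal:
    name: str
    methods: List[Method]

def select_method(goal: Goal, context: Dict[str, bool]) -> Method:
    """A simple selection rule: prefer the keyboard method when the user's hands
    are on the keyboard, otherwise pick the fastest available method."""
    if context.get("hands_on_keyboard"):
        for m in goal.methods:
            if "keyboard" in m.name:
                return m
    return min(goal.methods, key=lambda m: m.estimated_time_s)

delete_word = Goal("delete-word", [
    Method("mouse-select-then-delete", ["point", "click", "drag", "press DEL"], 3.4),
    Method("keyboard-shortcut", ["press CTRL+BACKSPACE"], 1.2),
])
chosen = select_method(delete_word, {"hands_on_keyboard": True})
print(chosen.name, chosen.estimated_time_s)  # predicted method and time for this goal
```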
The term workload can refer to several different yet related entities.
Ecological interface design (EID) is an approach to interface design that was introduced specifically for complex sociotechnical, real-time, and dynamic systems. It has been applied in a variety of domains including process control, aviation, and medicine.
Cognitive ergonomics is a scientific discipline that studies, evaluates, and designs tasks, jobs, products, environments, and systems and how they interact with humans and their cognitive abilities. It is defined by the International Ergonomics Association as "concerned with mental processes, such as perception, memory, reasoning, and motor response, as they affect interactions among humans and other elements of a system." Cognitive ergonomics is concerned with how work is done in the mind, meaning that the quality of work depends on a person's understanding of situations; situations may include the goals, means, and constraints of work. Relevant topics include mental workload, decision-making, skilled performance, human-computer interaction, human reliability, work stress, and training as these may relate to human-system design. Cognitive ergonomics studies cognition in work and operational settings in order to optimize human well-being and system performance. It is a subset of the larger field of human factors and ergonomics.
Competence is the set of demonstrable characteristics and skills that enable and improve the efficiency or performance of a job. Competency is a series of knowledge, abilities, skills, experiences and behaviors, which leads to effective performance in an individual's activities. Competency is measurable and can be developed through training.
Operational risk management (ORM) is defined as a continual recurring process that includes risk assessment, risk decision making, and the implementation of risk controls, resulting in the acceptance, mitigation, or avoidance of risk.
Human error assessment and reduction technique (HEART) is a technique used in the field of human reliability assessment (HRA), for the purposes of evaluating the probability of a human error occurring throughout the completion of a specific task. From such analyses measures can then be taken to reduce the likelihood of errors occurring within a system and therefore lead to an improvement in the overall levels of safety. There exist three primary reasons for conducting an HRA: error identification, error quantification, and error reduction. As there exist a number of techniques used for such purposes, they can be split into one of two classifications: first-generation techniques and second generation techniques. First generation techniques work on the basis of the simple dichotomy of 'fits/doesn't fit' in the matching of the error situation in context with related error identification and quantification and second generation techniques are more theory based in their assessment and quantification of errors. HRA techniques have been used in a range of industries including healthcare, engineering, nuclear, transportation, and business sectors. Each technique has varying uses within different disciplines.
Influence Diagrams Approach (IDA) is a technique used in the field of human reliability assessment (HRA) for evaluating the probability of a human error occurring throughout the completion of a specific task. As with HEART, such analyses can inform measures to reduce the likelihood of errors within a system and thereby improve overall levels of safety; HRA techniques of this kind have been utilised in a range of industries including healthcare, engineering, nuclear, transportation, and business, with each technique having varying uses within different disciplines.
Industrial engineering is an engineering profession that is concerned with the optimization of complex processes, systems, or organizations by developing, improving and implementing integrated systems of people, money, knowledge, information and equipment. Industrial engineering is central to manufacturing operations.
Dynamic decision-making (DDM) is interdependent decision-making that takes place in an environment that changes over time, either due to the previous actions of the decision maker or due to events outside of the decision maker's control. In this sense, dynamic decisions, unlike simple and conventional one-time decisions, are typically more complex and occur in real time. Research on DDM involves observing the extent to which people are able to use their experience to control a particular complex system, including the types of experience that lead to better decisions over time.
Adaptive collaborative control is the decision-making approach used in hybrid models consisting of finite-state machines with functional models as subcomponents to simulate the behavior of systems formed through the partnerships of multiple agents for the execution of tasks and the development of work products. The term "collaborative control" originated from work developed in the late 1990s and early 2000s by Fong, Thorpe, and Baur (1999). According to Fong et al., in order for robots to function in collaborative control, they must be self-reliant, aware, and adaptive. In the literature, the adjective "adaptive" is not always shown, but it is an important element of collaborative control. The adaptation of traditional applications of control theory in teleoperation initially sought to reduce the sovereignty of "humans as controllers/robots as tools" and instead had humans and robots working as peers, collaborating to perform tasks and achieve common goals. Early implementations of adaptive collaborative control centered on vehicle teleoperation. Recent uses of adaptive collaborative control cover training, analysis, and engineering applications in teleoperation between humans and multiple robots, multiple robots collaborating among themselves, unmanned vehicle control, and fault-tolerant controller design.
Human factors are the physical or cognitive properties of individuals, or social behavior which is specific to humans, and which influence functioning of technological systems as well as human-environment equilibria. The safety of underwater diving operations can be improved by reducing the frequency of human error and the consequences when it does occur. Human error can be defined as an individual's deviation from acceptable or desirable practice which culminates in undesirable or unexpected results. Human factors include both the non-technical skills that enhance safety and the non-technical factors that contribute to undesirable incidents that put the diver at risk.
[Safety is] An active, adaptive process which involves making sense of the task in the context of the environment to successfully achieve explicit and implied goals, with the expectation that no harm or damage will occur. – G. Lock, 2022
Dive safety is primarily a function of four factors: the environment, equipment, individual diver performance and dive team performance. The water is a harsh and alien environment which can impose severe physical and psychological stress on a diver. The remaining factors must be controlled and coordinated so the diver can overcome the stresses imposed by the underwater environment and work safely. Diving equipment is crucial because it provides life support to the diver, but the majority of dive accidents are caused by individual diver panic and an associated degradation of the individual diver's performance. – M.A. Blumenberg, 1996
Ergonomics, also known as human factors or human factors engineering (HFE), is the application of psychological and physiological principles to the engineering and design of products, processes, and systems. Primary goals of human factors engineering are to reduce human error, increase productivity and system availability, and enhance safety, health and comfort with a specific focus on the interaction between the human and equipment.
Mica Endsley is an American engineer and a former Chief Scientist of the United States Air Force.
Automation bias is the propensity for humans to favor suggestions from automated decision-making systems and to ignore contradictory information produced without automation, even if that information is correct. Automation bias stems from the social psychology literature, which found a bias in human-human interaction showing that people assign more positive evaluations to decisions made by humans than to a neutral object. The same type of positivity bias has been found for human-automation interaction, where automated decisions are rated more positively than neutral ones. This has become a growing problem for decision making as intensive care units, nuclear power plants, and aircraft cockpits have increasingly integrated computerized system monitors and decision aids, largely to factor out possible human error. Errors of automation bias tend to occur when decision-making is dependent on computers or other automated aids and the human is in an observatory role but able to make decisions. Examples of automation bias range from urgent matters like flying a plane on automatic pilot to such mundane matters as the use of spell-checking programs.
Human performance modeling (HPM) is a method of quantifying human behavior, cognition, and processes. It is a tool used by human factors researchers and practitioners for both the analysis of human function and for the development of systems designed for optimal user experience and interaction. It is a complementary approach to other usability testing methods for evaluating the impact of interface features on operator performance.
In aviation, the SHELL model is a conceptual model of human factors that helps to clarify the location and cause of human error within an aviation environment.
The out-of-the-loop performance problem arises when an operator suffers a performance decrement as a consequence of automation. The potential loss of skills and of situation awareness caused by vigilance and complacency problems may make operators of automated systems unable to operate manually in case of system failure. Highly automated systems reduce the operator to a monitoring role, which diminishes the operator's chances of understanding the system. It is related to mind wandering.