Situation awareness

Situational awareness or situation awareness (SA) is the perception of environmental elements and events with respect to time or space, the comprehension of their meaning, and the projection of their future status. [1]

Situation awareness has been recognized as a critical, yet often elusive, foundation for successful decision-making across a broad range of situations, many of which involve the protection of human life and property, including law enforcement, aviation, air traffic control, ship navigation, [2] health care, [3] emergency response, military command and control operations, self defense, [4] and offshore oil and nuclear power plant management. [5] Lacking or inadequate situation awareness has been identified as one of the primary factors in accidents attributed to human error. [6]

The formal definition of SA is broken down into three segments: perception of the elements in the environment, comprehension of the situation, and projection of future status. [7] Three facets of SA have been in focus in research: SA states, SA systems, and SA processes. SA states refers to the actual awareness of the situation. SA systems refers to the distribution of SA in teams and between objects in the environment, and to the exchange of SA between system parts. SA processes refers to the updating of SA states, and what guides the moment-to-moment change of SA. [8]

History

Although the term itself is fairly recent, the concept has roots in the history of military theory—it is recognizable in Sun Tzu's The Art of War, for example.[citation needed] The term can be traced to World War I, where it was recognized as a crucial skill for crews in military aircraft. [9]

There is evidence that the term situational awareness was first employed at the Douglas Aircraft Company during human factors engineering research while developing vertical and horizontal situation displays and evaluating digital-control placement for the next generation of commercial aircraft. Research programs in flight-crew computer interaction [10] and mental workload measurement [11] built on the concept of awareness measurement from a series of experiments that measured contingency awareness during learning, [12] [13] and later extended to mental workload and fatigue. [14]

Situation awareness appears in the technical literature as early as 1983, when describing the benefits of a prototype touch-screen navigation display. [15] During the early 1980s, integrated “vertical-situation” and “horizontal-situation” displays were being developed for commercial aircraft to replace multiple electro-mechanical instruments. Integrated situation displays combined the information from several instruments enabling more efficient access to critical flight parameters, thereby improving situational awareness and reducing pilot workload.

Before being widely adopted by human factors scientists in the 1990s, the term is said to have been used by United States Air Force (USAF) fighter aircrew returning from war in Korea and Vietnam. [16] They identified having good SA as the decisive factor in air combat engagements—the "ace factor". [17] Survival in a dogfight was typically a matter of observing the opponent's current move and anticipating his next move a fraction of a second before he could observe and anticipate it himself. USAF pilots also came to equate SA with the "observe" and "orient" phases of the famous observe-orient-decide-act loop (OODA loop), or Boyd cycle, as described by the USAF war theorist Col. John Boyd. In combat, the winning strategy is to "get inside" your opponent's OODA loop, not just by making one's own decisions quicker, but also by having better SA than one's opponent, and even changing the situation in ways that the opponent cannot monitor or even comprehend. Losing one's own SA, in contrast, equates to being "out of the loop".

Clearly, SA has far-reaching applications, as it is necessary for individuals and teams to function effectively in their environment. Thus, SA research is extending beyond the field of aviation into a wide variety of environments. Currently, SA is being studied in such diverse areas as air traffic control, nuclear power plant operation, vehicle operation, and anesthesiology. [18] [19] [20] [21] [22]

Several cognitive processes related to situation awareness are briefly described in this section. The matrix shown below attempts to illustrate the relationship among some of these concepts. [23] Note that situation awareness and situational assessment are more commonly discussed in complex information-fusion domains such as aviation and military operations, and relate more to achieving immediate tactical objectives. [24] [25] [26] Sensemaking and achieving understanding are more commonly found in industry and the organizational psychology literature, and often relate to achieving long-term strategic objectives.

Objective (phase)         | Process                | Outcome
Tactical (short-term)     | situational assessment | situation awareness
Strategic (long-term)     | sensemaking            | understanding
Scientific (longer-term)  | analysis               | prediction

Situational understanding

Situation awareness is sometimes confused with the term "situational understanding." In the context of military command and control applications, situational understanding refers to the "product of applying analysis and judgment to the unit's situation awareness to determine the relationships of the factors present and form logical conclusions concerning threats to the force or mission accomplishment, opportunities for mission accomplishment, and gaps in information". [27] Situational understanding is the same as Level 2 SA in the Endsley model—the comprehension of the meaning of the information as integrated with each other and in terms of the individual's goals. It is the "so what" of the data that is perceived.

Situational assessment

In brief, situation awareness is viewed as "a state of knowledge," and situational assessment as "the processes" used to achieve that knowledge. Endsley argues that "it is important to distinguish the term situation awareness, as a state of knowledge, from the processes used to achieve that state. [1] These processes, which may vary widely among individuals and contexts, will be referred to as situational assessment or the process of achieving, acquiring, or maintaining SA." Note that SA is not only produced by the processes of situational assessment, it also drives those same processes in a recurrent fashion. For example, one's current awareness can determine what one pays attention to next and how one interprets the information perceived. [28]

Mental models

Accurate mental models are one of the prerequisites for achieving SA. [29] [30] A mental model can be described as a set of well-defined, highly organized yet dynamic knowledge structures developed over time from experience. [31] [32] The volume of available data inherent in complex operational environments can overwhelm the capability of novice decision makers to attend, process, and integrate this information efficiently, resulting in information overload and negatively impacting their SA. [33] In contrast, experienced decision makers assess and interpret the current situation (Level 1 and 2 SA) and select an appropriate action based on conceptual patterns stored in their long-term memory as "mental models". [34] Cues in the environment activate these mental models, which in turn guide their decision making process.

Sensemaking

Klein, Moon, and Hoffman distinguish between situation awareness and sensemaking as follows:

...situation awareness is about the knowledge state that's achieved—either knowledge of current data elements, or inferences drawn from these data, or predictions that can be made using these inferences. In contrast, sensemaking is about the process of achieving these kinds of outcomes, the strategies, and the barriers encountered. [35]

In brief, sensemaking is viewed more as "a motivated, continuous effort to understand connections (which can be among people, places, and events) in order to anticipate their trajectories and act effectively", [36] rather than the state of knowledge underlying situation awareness. Endsley points out that as an effortful process, sensemaking actually covers only a subset of the processes used to maintain situation awareness. [37] In the vast majority of cases, SA is instantaneous and effortless, proceeding from pattern recognition of key factors in the environment—"The speed of operations in activities such as sports, driving, flying and air traffic control practically prohibits such conscious deliberation in the majority of cases, but rather reserves it for the exceptions." Endsley also points out that sensemaking is backward focused, forming reasons for past events, while situation awareness is typically forward looking, projecting what is likely to happen in order to inform effective decision processes. [37]

Theoretical model

SA can be described in terms of a holistic framework of SA systems, states, and processes. [8] SA descriptions usually focus on one of the three aspects, or on combinations. SA states can be described as:

Objects: Awareness of various objects in the world, and their current status. Objects and their status may indicate particular situations (that they are about to occur, that they are ongoing, etc.); in that role, they are often referred to as cues.

Frames: Awareness of what kind of situation is on-going, e.g. a runway incursion where an aircraft is about to collide with some object on the runway.

Implications: Awareness of objects within frames, of what their current status means in a particular situation. E.g. the implications of the current speed of the aircraft, and the distance to an object on the runway, in a runway incursion situation. The implications refer to time and space, to an event horizon.

Event horizon: An awareness of plans and events in time and space. It includes an awareness of what has happened (useful for diagnosis, to achieve SA, to frame situations). It also includes prognosis, an awareness of what might happen next: on the one hand, what might occur based on diagnosis and the current situation, and on the other, what is expected given current plans and intentions.

All four aspects may drive SA processes. Being aware of the status of particular objects (cues), one might infer that particular situations are ongoing, and frame the objects accordingly. The cues then drive re-framing of situations. Conversely, a particular frame, or preconception of a situation, may drive the perception of objects: having noticed that a landing is about to occur, an air traffic controller will usually look for specific objects in the environment and update the awareness of their status. Further, realizing the implications of objects' status drives the process of deciding what to attend to next; for example, knowing that a vehicle is approaching a runway where a landing is about to take place, an air traffic controller may monitor its progress. Event horizon awareness may also guide SA; for example, if one plans to stop the car at a gas station, one may look for gas station signs.

Further, to describe SA in teams, for example, the distribution of SA must be considered, e.g. in terms of:

Shared SA: What SA different agents have in common

Task SA: What SA different agents have, that they need to perform their tasks

Transactive SA: Exchange of SA between system parts

Buffering SA: Awareness of different accounts (e.g., different frames) of situations, in various parts of the system.

Causal models of SA

A recent holistic framework [38] has shown how SA emerges from causal mental models, where beliefs about causal relationships in the world are represented in the structure of the model and beliefs about the state of the world are reflected in the state of the model. Causal models support inferences about causes and consequences, as well as counterfactual thinking, [39] and allow us to ask why, so-what, and what-if questions that support SA comprehension and projection, as well as naturalistic decision-making. When instantiated as a Bayesian network, these models also allow evaluation of the search for information and the integration of new information with existing knowledge. A computational causal model quantitatively predicted search behavior and changes in SA in fighter pilots. [38]
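The Bayesian reading of this idea can be sketched in a few lines: perceiving a cue (Level 1) updates a belief over the hidden situation (Level 2 comprehension), and the updated belief is pushed forward in time (Level 3 projection). The states, cues, and probabilities below are invented for illustration and are not taken from the cited model.

```python
def normalize(d):
    """Rescale a dict of weights so its values sum to 1."""
    total = sum(d.values())
    return {k: v / total for k, v in d.items()}

# Comprehension: prior belief over the hidden situation (the "frame")
prior = {"incursion": 0.05, "clear": 0.95}

# Causal structure: the situation generates observable cues.
# P(cue = "vehicle near runway" | situation) -- hypothetical values
likelihood = {"incursion": 0.8, "clear": 0.1}

# Perceiving the cue updates comprehension via Bayes' rule
posterior = normalize({s: prior[s] * likelihood[s] for s in prior})

# Projection: push the belief one step forward in time.
# P(conflict at next step | situation now) -- hypothetical transition
p_conflict = {"incursion": 0.6, "clear": 0.01}
projection = sum(posterior[s] * p_conflict[s] for s in posterior)
```

Observing the cue raises the belief in an incursion from 0.05 to roughly 0.30; the projected conflict probability weights each possible situation by that updated belief, which is the sense in which comprehension feeds projection.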

Endsley's model

The SA states framework above extends an older (but simpler) theoretical framework of SA provided by Dr. Mica Endsley (1995b), which has historically been widely used. Endsley's model describes SA states and illustrates three stages or steps of SA formation: perception, comprehension, and projection.

Perception (Level 1 SA): The first step in achieving SA is to perceive the status, attributes, and dynamics of relevant elements in the environment. Thus, Level 1 SA, the most basic level of SA, involves the processes of monitoring, cue detection, and simple recognition, which lead to an awareness of multiple situational elements (objects, events, people, systems, environmental factors) and their current states (locations, conditions, modes, actions).

Comprehension (Level 2 SA): The next step in SA formation involves a synthesis of disjointed Level 1 SA elements through the processes of pattern recognition, interpretation, and evaluation. Level 2 SA requires integrating this information to understand how it will impact upon the individual's goals and objectives. This includes developing a comprehensive picture of the world, or of that portion of the world of concern to the individual.

Projection (Level 3 SA): The third and highest level of SA involves the ability to project the future actions of the elements in the environment. Level 3 SA is achieved through knowledge of the status and dynamics of the elements and comprehension of the situation (Levels 1 and 2 SA), and then extrapolating this information forward in time to determine how it will affect future states of the operational environment.

Endsley's model of SA also illustrates several variables that can influence the development and maintenance of SA, including individual, task, and environmental factors. For example, individuals vary in their ability to acquire SA; thus, simply providing the same system and training will not ensure similar SA across different individuals. Endsley's model shows how SA "provides the primary basis for subsequent decision making and performance in the operation of complex, dynamic systems" (Endsley, 1995a, p. 65). Although alone it cannot guarantee successful decision making, SA does support the necessary input processes (e.g., cue recognition, situation assessment, prediction) upon which good decisions are based (Artman, 2000).

Endsley's model of SA. This is a synthesis of versions she has given in several sources, notably Endsley (1995a) and Endsley et al. (2000). Drawn by Dr. Peter Lankton, May 2007.

SA also involves both a temporal and a spatial component. Time is an important concept in SA, as SA is a dynamic construct, changing at a tempo dictated by the actions of individuals, task characteristics, and the surrounding environment. As new inputs enter the system, the individual incorporates them into this mental representation, making changes as necessary in plans and actions in order to achieve the desired goals. SA also involves spatial knowledge about the activities and events occurring in a specific location of interest to the individual. Thus, the concept of SA includes perception, comprehension, and projection of situational information, as well as temporal and spatial components.

In summary, the model describes SA formation as a progression through perception, comprehension, and projection, shaped by individual, task, and environmental factors, and feeding into decision making and performance.

For a more complete description of the model, see Endsley (1995b) and Endsley (2004). See also Endsley (2000) for a review of other models of SA.

Criticism of model

Any model of situation awareness depends on cognitive and shared cognitive processes, and yet '...models of SA refer to cognitive processes in general terms, but do not specify exactly what processes are involved and to what extent.' (Banbury & Tremblay, 2004, p. xiii). This criticism illustrates the difficulty that cognitive science has in addressing a concept such as SA, which through its definition and assumptions appears to stand robustly; however, when its theorized processes are examined at the cognitive level of analysis, its assumptions must be radically reviewed. Researchers have regularly raised these concerns, notably in Flach (1995) and more recently in Banbury & Tremblay (2004). To date, the most widely cited model of SA lacks support from cognitive science, and one notable observation still stands:

'The test of situation awareness as a construct will be in its ability to be operationalized in terms of objective, clearly specified independent (stimulus manipulation) and dependent (response difference) variables ... Otherwise, SA will be yet another buzzword to cloak scientists' ignorance.' (Flach, J., 1995, p. 155)

Another criticism of the model comes from a 2015 study of situational awareness in tasks where relevant knowledge could be obtained through mediums other than directly asking a collaborator. It found that in such tasks, verbal communication lengthens the time required to complete a task compared with people completing the task individually. [40]

In team operations

In many systems and organizations, people work not just as individuals, but as members of a team. Thus, it is necessary to consider the SA of not just individual team members, but also the SA of the team as a whole. To begin to understand what is needed for SA within teams, it is first necessary to clearly define what constitutes a team. A team is not just any group of individuals; rather teams have a few defining characteristics. As defined by Salas et al. (1992), a team is:

"a distinguishable set of two or more people who interact dynamically, interdependently and adaptively toward a common and valued goal/objective/mission, who have each been assigned specific roles or functions to perform, and who have a limited life span of membership."

Team SA

Team SA is defined as "the degree to which every team member possesses the SA required for his or her responsibilities" (Endsley, 1995b, p. 39; see also Endsley, 1989). The success or failure of a team depends on the success or failure of each of its team members. If any one of the team members has poor SA, it can lead to a critical error in performance that can undermine the success of the entire team. By this definition, each team member needs to have a high level of SA on those factors that are relevant for his or her job. It is not sufficient for one member of the team to be aware of critical information if the team member who needs that information is not aware.

In a team, each member has a subgoal pertinent to his/her specific role that feeds into the overall team goal. Associated with each member's subgoal are a set of SA elements about which he/she is concerned. Team SA, therefore, can be represented as shown in Figure 2. As the members of a team are essentially interdependent in meeting the overall team goal, some overlap between each member's subgoal and their SA requirements will be present. It is this subset of information that constitutes much of team coordination. That coordination may occur as a verbal exchange, a duplication of displayed information, or by some other means.

Shared SA

Shared situation awareness can be defined as "the degree to which team members possess the same SA on shared SA requirements" (Endsley & Jones, 1997, p. 47; 2001, p. 48). As implied by this definition, there are information requirements that are relevant to multiple team members. A major part of teamwork involves the area where these SA requirements overlap—the shared SA requirements that exist as a function of the essential interdependency of the team members. In a poorly functioning team, two or more members may have different assessments on these shared SA requirements and thus behave in an uncoordinated or even counter-productive fashion. Yet in a smoothly functioning team, each team member shares a common understanding of what is happening on those SA elements that are common—shared SA. Thus, shared SA refers to the overlap between the SA requirements of the team members, as presented in Figure 3. As depicted by the clear areas of the figure, not all information needs to be shared. Clearly, each team member is aware of much that is not pertinent to the others on the team. Sharing every detail of each person's job would only create a great deal of "noise" to sort through to get needed information. It is only that information which is relevant to the SA requirements of each team member that is needed.

Team SA model

The situation awareness of the team as a whole, therefore, is dependent upon both (1) a high level of SA among individual team members for the aspects of the situation necessary for their job; and (2) a high level of shared SA between team members, providing an accurate common operating picture of those aspects of the situation common to the needs of each member (Endsley & Jones, 2001). Endsley and Jones (1997; 2001) describe a model of team situation awareness as a means of conceptualizing how teams develop high levels of shared SA across members. Each of these four factors—requirements, devices, mechanisms and processes—acts to help build team and shared SA.

1. Team SA requirements – the degree to which the team members know which information needs to be shared, including their higher level assessments and projections (which are usually not otherwise available to fellow team members), and information on team members' task status and current capabilities.

2. Team SA devices – the devices available for sharing this information, which can include direct communication (both verbal and non-verbal), shared displays (e.g., visual or audio displays, or tactile devices), or a shared environment. As non-verbal communication, such as gestures and display of local artifacts, and a shared environment are usually not available in distributed teams, this places far more emphasis on verbal communication and communication technologies for creating shared information displays.

3. Team SA mechanisms – the degree to which team members possess mechanisms, such as shared mental models, which support their ability to interpret information in the same way and make accurate projections regarding each other's actions. The possession of shared mental models can greatly facilitate communication and coordination in team settings.

4. Team SA processes – the degree to which team members engage in effective processes for sharing SA information which may include a group norm of questioning assumptions, checking each other for conflicting information or perceptions, setting up coordination and prioritization of tasks, and establishing contingency planning among others.

In time critical decision-making processes

There are many industries where it is critical to make a correct decision within a strict time limit, based on the decision-maker's knowledge of the current situation: for example, air traffic controllers or medical providers (e.g. anesthesiologists). In these situations the key decision maker is commonly supported by other team members or by complex monitoring systems feeding them information, which can involve multiple sources and formats of information. Even in these time-critical situations, the importance of situation awareness (SA) is not constant: better SA matters more to the outcome in non-standard situations, such as points of high information traffic, extraneous activity and unforeseeable events. These 'points of fracture' are likely to impose additional workload on individuals and therefore affect both their SA and the time taken to make the decision. At the critical point, the situational awareness used to make the decision is directly affected by the cognitive workload of gaining, comprehending, and processing the incoming SA, both general background SA and the SA specifically related to the decision (Smith, K. T. 2013). [41] In other words, when everything is going smoothly, the level of SA one holds is not as critical as when something unusual happens or something goes wrong.

Research into the decision-making process is a growing area of interest, and the identification of this type of relationship has led to the development of at least one integrated conceptual framework (developed by K Tara Smith) that attempts to accommodate all of the factors that impact the decision-making process, defining how each affects the individual's ability to acquire SA. This involves aligning the terms and concepts used by different research areas so that causal relationships can be identified and defined.

This approach of integrating situation awareness, workload, signal processing theory, decision theory, and related fields tends to subtly change the questions asked during analysis: from quantifying and qualifying the SA itself to measuring the probabilistic aspects of a decision (such as the number of interrelationships, the certainty and time-lag of arriving information, and the risk to the desired outcome or effect) together with the processing aspects (the number of signals, and the accuracy, completeness, and operational importance of the information). In other words, instead of asking whether a modification to the system provides more SA, we ask whether the modification provides more SA in a form that can be used at the time when it is needed.

Measurement

While the SA construct has been widely researched, the multivariate nature of SA poses a considerable challenge to its quantification and measurement (for a detailed discussion on SA measurement, see Endsley & Garland, 2000; Fracker, 1991a; 1991b). In general, techniques vary in terms of direct measurement of SA (e.g., objective real-time probes or subjective questionnaires assessing perceived SA) or methods that infer SA based on operator behavior or performance. Direct measures are typically considered to be "product-oriented" in that these techniques assess an SA outcome; inferred measures are considered to be "process-oriented," focusing on the underlying processes or mechanisms required to achieve SA (Graham & Matthews, 2000). These SA measurement approaches are further described next.

Objective measures

Objective measures directly assess SA by comparing an individual's perceptions of the situation or environment to some "ground truth" reality. Specifically, objective measures collect data from the individual on his or her perceptions of the situation and compare them to what is actually happening to score the accuracy of their SA at a given moment in time. Thus, this type of assessment provides a direct measure of SA and does not require operators or observers to make judgments about situational knowledge on the basis of incomplete information. Objective measures can be gathered in one of three ways: real-time as the task is completed (e.g., "real-time probes" presented as open questions embedded as verbal communications during the task – Jones & Endsley, 2000), during an interruption in task performance (e.g., situation awareness global assessment technique (SAGAT) – Endsley, 1995a, or the WOMBAT situational awareness and stress tolerance test mostly used in aviation since the late 1980s and often called HUPEX in Europe), or post-test following completion of the task.
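The comparison-to-ground-truth step that objective techniques such as SAGAT rely on can be sketched as follows. The queries, answers, and tolerance bands here are hypothetical, not drawn from published SAGAT materials; real administrations use validated, domain-specific query sets.

```python
# Hypothetical SAGAT-style scoring: freeze the task, query the operator,
# and score each answer against the simulation's ground truth.
ground_truth = {"altitude_ft": 12000, "heading_deg": 270, "nearest_traffic": "TWA514"}
operator_answers = {"altitude_ft": 11800, "heading_deg": 250, "nearest_traffic": "TWA514"}
tolerance = {"altitude_ft": 500, "heading_deg": 10}  # acceptable error bands (invented)

def score_query(key):
    """Return True if the operator's answer counts as correct for this query."""
    truth, answer = ground_truth[key], operator_answers[key]
    if key in tolerance:  # numeric query: correct if within the tolerance band
        return abs(answer - truth) <= tolerance[key]
    return answer == truth  # categorical query: exact match required

# Fraction of queries answered correctly at this freeze point
sa_score = sum(score_query(k) for k in ground_truth) / len(ground_truth)
```

Here the altitude answer falls inside its band and the traffic call sign matches, but the heading misses its band, giving a score of 2/3 for this freeze; in practice such scores are aggregated over many freezes and queries per SA level.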

Subjective measures

Subjective measures directly assess SA by asking individuals to rate their own or the observed SA of individuals on an anchored scale (e.g., participant situation awareness questionnaire (PSAQ) – Strater, Endsley, Pleban, & Matthews, 2001; the situation awareness rating technique (SART) – Taylor, 1989). Subjective measures of SA are attractive in that they are relatively straightforward and easy to administer. However, several limitations should be noted. Individuals making subjective assessments of their own SA are often unaware of information they do not know (the "unknown unknowns"). Subjective measures also tend to be global in nature, and, as such, do not fully exploit the multivariate nature of SA to provide the detailed diagnostics available with objective measures. Nevertheless, self-ratings may be useful in that they can provide an assessment of operators' degree of confidence in their SA and their own performance. Measuring how SA is perceived by the operator may provide information as important as the operator's actual SA, since errors in perceived SA quality (over-confidence or under-confidence in SA) may have just as harmful an effect on an individual's or team's decision-making as errors in their actual SA (Endsley, 1998).

Subjective estimates of an individual's SA may also be made by experienced observers (e.g., peers, commanders, or trained external experts). These observer ratings may be somewhat superior to self-ratings of SA because more information about the true state of the environment is usually available to the observer than to the operator, who may be focused on performing the task (i.e., trained observers may have more complete knowledge of the situation). However, observers have only limited knowledge about the operator's concept of the situation and cannot have complete insight into the mental state of the individual being evaluated. Thus, observers are forced to rely more on operators' observable actions and verbalizations in order to infer their level of SA. In this case, such actions and verbalizations are best assessed using performance and behavioral measures of SA, as described next.

Performance and behavioral measures

Performance measures "infer" SA from the end result (i.e., task performance outcomes), based on the assumption that better performance indicates better SA. Common performance metrics include quantity of output or productivity level, time to perform the task or respond to an event, and the accuracy of the response or, conversely, the number of errors committed. The main advantage of performance measures is that these can be collected objectively and without disrupting task performance. However, although evidence exists to suggest a positive relation between SA and performance, this connection is probabilistic and not always direct and unequivocal (Endsley, 1995b). In other words, good SA does not always lead to good performance and poor SA does not always lead to poor performance (Endsley, 1990). Thus, performance measures should be used in conjunction with other measures of SA that directly assess this construct.

Behavioral measures also "infer" SA from the actions that individuals choose to take, based on the assumption that good actions will follow from good SA and vice versa. Behavioral measures rely primarily on observer ratings, and are, thus, somewhat subjective in nature. To address this limitation, observers can be asked to evaluate the degree to which individuals are carrying out actions and exhibiting behaviors that would be expected to promote the achievement of higher levels of SA (see, for example, the situation awareness behaviorally anchored rating scale (SABARS) – Matthews, Pleban, Endsley, & Strater, 2000; Strater et al., 2001). This approach removes some of the subjectivity associated with making judgments about an individual's internal state of knowledge by allowing them to make judgments about SA indicators that are more readily observable.

Process indices

Process indices examine how individuals process information in their environment, such as by analyzing communication patterns between team members or using eye tracking devices. Team communication (particularly verbal communication) supports the knowledge building and information processing that leads to SA construction (Endsley & Jones, 1997). Thus, since SA may be distributed via communication, computational linguistics and machine learning techniques can be combined with natural language analytical techniques (e.g., Latent semantic analysis) to create models that draw on the verbal expressions of the team to predict SA and task performance (Bolstad, Cuevas, Gonzalez, & Schneider, 2005; Bolstad, Foltz, Franzke, Cuevas, Rosenstein, & Costello, 2007). Although evidence exists to support the utility of communication analysis for predicting team SA (Foltz, Bolstad, Cuevas, Franzke, Rosenstein, & Costello, in press), time constraints and technological limitations (e.g., cost and availability of speech recording systems and speech-to-text translation software) may make this approach less practical and viable in time-pressured, fast-paced operations.
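As a toy illustration of communication-based prediction (not the latent-semantic-analysis pipeline cited above, which adds a dimensionality-reduction step over a large corpus), team utterances can be represented as bag-of-words vectors and compared with cosine similarity; the phrases below are invented.

```python
# Compare an utterance against a reference phrase describing the situation:
# higher lexical overlap is taken as a crude proxy for shared awareness.
from collections import Counter
from math import sqrt

def vectorize(text):
    """Bag-of-words term-frequency vector for a lowercase-split utterance."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

reference = vectorize("vehicle approaching runway two seven hold position")
utterance_good = vectorize("vehicle on runway two seven holding position")
utterance_poor = vectorize("weather looks fine this afternoon")

sim_good = cosine(reference, utterance_good)
sim_poor = cosine(reference, utterance_poor)
```

The task-relevant utterance scores well above the off-topic one; published approaches train such linguistic features against independently measured SA rather than using raw similarity directly.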

Psycho-physiological measures also serve as process indices of operator SA by providing an assessment of the relationship between human performance and a corrected change in the operator's physiology (e.g., French, Clark, Pomeroy, Seymour, & Clarke, 2007). In other words, cognitive activity is associated with changes in the operator's physiological states. For example, the operator's overall functional state (as assessed using psycho-physiological measures, such as electroencephalographic (EEG) data, eyeblinks, and cardiac activity) may provide an indication as to whether the operator is sleep fatigued at one end of the continuum, or mentally overloaded at the other end (Wilson, 2000). Other psycho-physiological measures, such as event related potentials (ERP), event related desynchronization (ERD), transient heart rate (HR), and electrodermal activity (EDA), may be useful for evaluating an operator's perception of critical environmental cues, that is, to determine if the operator has detected and perceived a task-relevant stimulus (Wilson, 2000). In addition, it is also possible to use psycho-physiological measures to monitor operators' environmental expectancies, that is, their physiological responses to upcoming events, as a measure of their current level of SA (Wilson, 2000).

Multi-faceted approach to measurement

The multivariate nature of SA significantly complicates its quantification and measurement, as a given metric may tap into only one aspect of the operator's SA. Further, studies have shown that different types of SA measures do not always correlate strongly with each other (cf. Durso, Truitt, Hackworth, Crutchfield, Nikolic, Moertl, Ohrt, & Manning, 1995; Endsley, Selcon, Hardiman, & Croft, 1998; Vidulich, 2000). Accordingly, rather than relying on a single approach or metric, valid and reliable measurement of SA should utilize a battery of distinct yet related measures that complement each other (e.g., Harwood, Barnett, & Wickens, 1988). Such a multi-faceted approach to SA measurement capitalizes on the strengths of each measure while minimizing the limitations inherent in each.
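As a hypothetical illustration of combining such a battery, the sketch below standardizes each measure across participants (z-scores, so that differently scaled measures become comparable) and averages them into one composite per participant. The measure names, values, and equal weighting are assumptions for illustration, not an established metric.

```python
import statistics

def zscores(values):
    # Standardize one measure across participants (mean 0, sd 1).
    mu = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mu) / sd for v in values]

def composite_sa(battery):
    # battery: measure name -> per-participant scores, all oriented so
    # that higher means better SA. Returns one composite per participant.
    standardized = [zscores(scores) for scores in battery.values()]
    return [statistics.mean(col) for col in zip(*standardized)]

# Hypothetical scores for three participants on three measure types.
battery = {
    "sagat_query_accuracy": [0.70, 0.55, 0.90],   # direct query measure
    "sabars_observer_rating": [3.8, 3.1, 4.5],    # behavioral measure
    "comm_overlap_index": [0.42, 0.30, 0.61],     # process index
}
scores = composite_sa(battery)
```

Because each z-scored measure sums to zero across participants, the composites do too; the composite only ranks participants relative to one another, which is one reason a battery is used for comparison rather than as an absolute SA scale.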

Training

Following Endsley's paradigm, and combining a cognitive resource management model [42] with neurofeedback techniques, the Spanish pedagogue María Gabriela López García (2010) developed and implemented a new SA training pattern. [43] The first organization to implement this pattern, designed by López García, was the Spanish Air Force (SPAF). She has trained EF-18 fighter pilots and Canadair firefighters. [44]

This situation awareness training aims to prevent the loss of SA and to give pilots the cognitive resources to operate, at all times, below the maximum workload they can withstand. The result is not only a lower probability of incidents and accidents due to human factors, but also hours of operation at optimum efficiency, extending the operating life of both systems and operators. [45]

On-the-job examples

Emergency medical call-outs

In first aid medical training provided by the American Red Cross, the need to be aware of the situation within the area of influence as one approaches an individual requiring medical assistance is the first aspect for responders to consider. [46] Examining the area and being aware of potential hazards, including the hazards that may have caused the injuries being treated, helps ensure that responders do not themselves become injured and require treatment as well.

Situation awareness for first responders in medical situations also includes evaluating and understanding what happened, [47] both to avoid injury to responders and to provide information by radio to other rescue agencies that may need to know the situation prior to their arrival on the scene.

In a medical context, situation awareness is applied to avoid further injury to already-injured individuals, to avoid injury to medical responders, and to inform other potential responders of hazardous conditions prior to their arrival.

Vehicle driving

A loss in situational awareness has led to many transportation disasters, including the 2015 Philadelphia train derailment. [48]

Search and rescue

Within the search and rescue context, situational awareness is applied primarily to avoid injury to search crews; however, awareness of the environment, the lay of the land, and the many other factors of influence within one's surroundings also assists in locating injured or missing individuals. [49] Public safety agencies increasingly use applications such as ATAK on mobile devices to improve situational awareness.

Forestry crosscut saw / chainsaw

In the United States Forest Service, the use of chainsaws and crosscut saws requires training and certification. [50] A great deal of that training describes situational awareness as encompassing not only environmental awareness but also self-awareness, [51] which includes being aware of one's own emotional attitude, tiredness, and even caloric intake.

Situational awareness in the forest context also includes evaluating the environment and the potential safety hazards within a saw crew's area of influence. As a sawyer approaches a task, the ground, wind, cloud cover, hillsides, and many other factors are examined and considered proactively as part of a trained sawyer's ingrained habits.

Dead or diseased trees within the reach of saw team crews are evaluated, as are the strength and direction of the wind. The lay of tree sections to be bucked, or the lean of a tree to be felled, is evaluated in the context of where the tree will fall or move when cut, where the other members of the saw team are located and how they are moving, and whether hikers are within the area of influence and whether they are moving or stationary.

Law enforcement

Law enforcement training includes maintaining situational awareness of what is going on around the officer before, during, and after interactions with the general public, [52] including in areas that are not the focus of the officer's immediate task.

Cybersecurity threat operations

In cybersecurity, situational awareness for threat operations is the ability to perceive threat activity and vulnerabilities in context so that data, information, knowledge, and wisdom can be actively defended from compromise. Situational awareness is achieved by developing and using solutions that often consume data and information from many different sources. Technology and algorithms are then used to apply knowledge and wisdom in order to discern patterns of behavior that point to possible, probable, and real threats.

Situational awareness for cybersecurity threat operations teams appears in the form of a condensed, enriched, often graphical, prioritized, and easily searchable view of systems that are inside or related to security areas of responsibility (such as corporate networks or those used for national security interests). Different studies have analyzed the perception of security and privacy in the context of eHealth, [53] network security, [54] or using collaborative approaches to improve the awareness of users. [55] There are also research efforts to automate the processing of communication network information in order to obtain or improve cyber-situational awareness. [56]
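As an illustrative sketch of the fusion-and-prioritization idea (not any particular product's method), the fragment below fuses events from several hypothetical log sources into one per-host risk score, weighting indicators by an assumed severity table and rewarding corroboration across independent sources. All source names, indicators, and weights are invented.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Event:
    source: str      # e.g. "ids", "auth_log", "netflow" (illustrative)
    host: str
    indicator: str   # e.g. "port_scan", "failed_login", "beaconing"

# Assumed severity weights; in practice these would come from threat
# intelligence and tuning, not a hard-coded table.
SEVERITY = {"port_scan": 1, "failed_login": 2, "beaconing": 5}

def prioritize(events):
    # Fuse events from many sources into one per-host risk score,
    # multiplying by the number of distinct corroborating sources.
    score = defaultdict(float)
    sources = defaultdict(set)
    for e in events:
        score[e.host] += SEVERITY.get(e.indicator, 1)
        sources[e.host].add(e.source)
    ranked = {h: s * len(sources[h]) for h, s in score.items()}
    return sorted(ranked.items(), key=lambda kv: kv[1], reverse=True)

events = [
    Event("ids", "10.0.0.5", "port_scan"),
    Event("auth_log", "10.0.0.5", "failed_login"),
    Event("netflow", "10.0.0.9", "beaconing"),
]
ranking = prioritize(events)
```

Here the host seen by two independent sources outranks the host with a single high-severity event, which is the kind of condensed, prioritized view described above.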

Situation awareness-based agency transparency (SAT) model

As the capabilities of technological agents increase, it becomes more important that their actions and underlying rationale become transparent. In the military realm, agent transparency has been investigated as unmanned vehicles are employed more frequently. In 2014, researchers at the U.S. Army Research Laboratory reported the Situation Awareness-based Agent Transparency (SAT) model, designed to increase transparency through user interface design. Six barriers have been determined to discourage human trust in automation, including "'low observability, predictability, directability and auditability'...[and] 'low mutual understanding of common goals.'" [57] The researchers designed three levels of situational awareness transparency based on Endsley's theory of perception, comprehension, and projection; the greater the level of situational awareness transparency, they claimed, the more information the agent conveys to the user. [58]

A 2018 publication from the U.S. Army Research Laboratory evaluated how varying transparency levels in the SAT model affect operator workload and a human's understanding of when it is necessary to intervene in the agent's decision making. The researchers refer to this supervisory judgement as calibration. The group split their SAT model research into two efforts: the Human Agent Transparency for Multi-UxV Management (IMPACT) project and the Autonomous Squad Member (ASM) project. [57]

SAT for human agent transparency in multi-UxV management

Scientists provided the three standard levels of SAT, plus a fourth level that included the agent's level of uncertainty about its decisions, in unmanned vehicles. The stated goal of this research was to determine how modifying levels of SAT affected user performance, situation awareness, and confidence in the agent. The scientists stated that their experimental results showed that increased agent transparency improved operator performance and human confidence in the agent without a significant effect on workload. When the agent communicated its level of uncertainty in the task assigned, participants displayed more trust in the agent. [59]

SAT in an autonomous squad member

The ASM research was conducted using a simulation game in which the participant had to complete a training course with an ASM, a ground robot that communicates with infantry. The participants had to multitask, evaluating potential threats while monitoring the ASM's communications on the interface. According to that research, experimental results demonstrated that the greatest confidence calibration occurred when the agent communicated information at all three levels of SAT. [59] The group of scientists from the U.S. Army Research Laboratory also developed transparency visualization concepts in which the agent communicates its plans, motivations, and projected outcomes through icons; the agent was reported to be able to relate its resource usage, reasoning, predicted resource loss, progress toward task completion, and so on. [57] Unlike in the IMPACT research, however, when the agent informed the user of its level of uncertainty in decision making, no increase in trust was observed. [59]

Methods of gaining situational awareness

Crowdsourcing

Crowdsourcing, made possible by the rise of social media and ubiquitous mobile access, has the potential to considerably enhance the situation awareness of both responsible authorities and citizens themselves in emergency and crisis situations by using "citizens as sensors". [60] [61] [62] [63] [64] [65] [66] [67] For instance, analysis of content posted on online social media such as Facebook and Twitter using data mining, machine learning, and natural language processing techniques may provide situational information. [68] A crowdsourcing approach to sensing, particularly in crisis situations, has been referred to as 'crowdsensing'. [69] Crowdmapping is a subtype of crowdsourcing [70] [71] in which crowd-generated inputs, such as captured communications and social media feeds, are combined with geographic data to create a digital map that is as up-to-date as possible; [72] [73] [74] [75] such a map can improve situational awareness during an incident and be used to support incident response. [76]
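A minimal sketch of the "citizens as sensors" pipeline, under the assumption of a hypothetical hazard vocabulary: filter geotagged posts for hazard terms and bin the hits into grid cells, producing a coarse crowdmapped incident-density surface. A real pipeline would use trained NLP classifiers rather than a keyword list, and the posts below are invented.

```python
from collections import Counter

# Hypothetical hazard vocabulary; a stand-in for a trained classifier.
HAZARD_TERMS = {"flood", "fire", "collapsed", "trapped", "smoke"}

def crowdmap(posts, cell_size=0.1):
    # posts: (text, lat, lon) tuples from citizen reports. Keep posts
    # mentioning a hazard term and bin them into lat/lon grid cells,
    # yielding a coarse incident-density map.
    grid = Counter()
    for text, lat, lon in posts:
        if HAZARD_TERMS & set(text.lower().split()):
            cell = (round(lat / cell_size) * cell_size,
                    round(lon / cell_size) * cell_size)
            grid[cell] += 1
    return grid

posts = [
    ("Smoke and fire near the bridge", 40.71, -74.01),
    ("Street flood rising fast", 40.72, -74.02),
    ("Great coffee this morning", 40.75, -74.00),
]
density = crowdmap(posts)
```

The two hazard reports land in the same grid cell while the irrelevant post is discarded, so the resulting map highlights one hotspot rather than three raw points.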

Cloud-based G.I.S. display of structured data

Since 2012 the National Information Sharing Consortium (NISC) has worked to provide "the right information to the right people at the right time" through common terminology among the emergency management community and first responders, with a mission of standardizing the structured geospatial data to be shared online over a variety of platforms. The result is a Common Operating Picture (COP) that presents accurate and timely information visually, both at the strategic level for decision makers and at the tactical level for people on site. The NISC promotes the sharing of code, data widgets, and training in order to increase the quality of situational awareness. Large-scale exercises, such as the week-long Capstone-14 event coordinated by the Central United States Earthquake Consortium (CUSEC), and actual operational use of these data-sharing methods have also advanced the work of the NISC in expanding the usability of GIS-based information sharing for enhanced situational awareness.

Military training methods

Two training scenarios are designed to increase the situational awareness skills of military professionals and of first responders in police and emergency services. The first, Kim's Game, is most common in the Marine Corps sniper school and in police academies. The name derives from Rudyard Kipling's novel Kim, in which the game forms part of a spy school's training. The game involves a tray of various items, such as spoons, pencils, and bullets, with which the soldiers would be familiar. The participants are given one minute to view all of the items before they are covered with a blanket. Each participant then individually lists the items they saw; the one with the most correct answers wins. [77] The same game is played in Scout and Girl Guide groups to teach children quick memorisation skills.

The second method is a more practical military application of Kim's Game. It starts with a field area (jungle, bush, or forest) about five meters wide and ten meters deep, in which various items, some camouflaged and some not, are placed on the ground and in the trees at eye level. Again, these items are ones familiar to the soldiers undergoing the exercise. The participants are given ten minutes to view the area from one place and take a mental note of the items they see. Once the ten minutes are up, each soldier is required to perform repetitions of exercises such as burpees, designed to simulate the stress of a physically demanding environment. On completing the exercises, each participant lists the items they saw, and the points are tallied at the end to find the winner. [77]
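The scoring in both variants of Kim's Game reduces to comparing the recalled items against those shown. A small sketch, with invented items and participant names:

```python
def score_kims_game(shown, recalled):
    # One point per correctly recalled item; wrong guesses score nothing.
    return len(set(shown) & set(recalled))

# Hypothetical tray contents and participant answers.
tray = {"spoon", "pencil", "bullet", "compass", "map", "whistle"}
answers = {
    "soldier_a": {"spoon", "pencil", "map", "radio"},   # "radio" not shown
    "soldier_b": {"spoon", "bullet", "compass", "map", "whistle"},
}
scores = {name: score_kims_game(tray, items) for name, items in answers.items()}
winner = max(scores, key=scores.get)
```

A variant could subtract a point for each item recalled that was not on the tray, penalizing guessing; the rules described above only count correct answers.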


References

  1. Endsley, 1995b, p. 36
  2. Nullmeyer, Stella, Montijo, & Harden 2005
  3. Schulz, C. M., et al., "Situation Awareness in Anesthesia – Concept and Research", Anesthesiology, 2013.
  4. Blandford & Wong 2004; Gorman, Cooke, & Winner 2006
  5. Flin & O'Connor, 2001
  6. Hartel, Smith, & Prince, 1991; Merket, Bergondy, & Cuevas-Mesa, 1997; Nullmeyer, Stella, Montijo, & Harden, 2005
  7. Endsley, Mica; Jones, Debra (2016-04-19). Designing for Situation Awareness (Second ed.). CRC Press. p. 13. ISBN 978-1-4200-6358-5.
  8. Lundberg, 2015
  9. Press, 1986
  10. Biferno, M.A. "Flight Crew Computer Interaction", Douglas Aircraft Company, Internal Research and Development. Long Beach, CA.
  11. Biferno, M.A., "Mental Workload Measurement", Douglas Aircraft Company, Internal Research and Development, Long Beach, CA.
  12. Dawson, M.E., Biferno, M.A. (1973). "Concurrent measurement of awareness and electrodermal classical conditioning", Journal of Experimental Psychology', 101, 55-62.
  13. Biferno, M.A.; Dawson, M.E. (1977). "The onset of contingency awareness and electrodermal classical conditioning: An analysis of temporal relationships during acquisition and extinction". Psychophysiology . 14 (2): 164–171. doi:10.1111/j.1469-8986.1977.tb03370.x. PMID   847068.
  14. Biferno, M.A. (1985). "Relationship between event-related potential components and ratings of workload and fatigue", NASA-Ames, Moffett Field, CA, NASA contract report 177354.
  15. Biferno, M. A. & Stanley, D. L. (1983). The Touch-Sensitive Control/Display Unit: A promising Computer Interface. Technical Paper 831532, Aerospace Congress & Exposition, Long Beach, CA: Society of Automotive Engineers.
  16. Watts, 2004
  17. Spick, 1988
  18. Endsley, 1995b
  19. Gaba, Howard & Small, 1995
  20. Collier & Follesf, 1995
  21. Bolstad, 2000
  22. Sollenberger & Stein, 1995
  23. S.M. Fiore, personal communication, November 6, 2007
  24. Blasch, E., Bosse, E., and Lambert, D. A., High-Level Information Fusion Management and Systems Design, Artech House, Norwood, MA, 2012.
  25. Boddhu, Sanjay K., et al. (2012). "Increasing situational awareness using smartphones." SPIE Defense, Security, and Sensing. International Society for Optics and Photonics, 2012.
  26. Sanjay Kumar Boddhu, Matt McCartney, Oliver Ceccopieri, et al., "A collaborative smartphone sensing platform for detecting and tracking hostile drones", Proceedings of SPIE Vol. 8742, 874211 (2013)
  27. Dostal, 2007
  28. Endsley, 2000
  29. Endsley & Jones, 1997
  30. Sarter & Woods, 1991
  31. Glaser, 1989
  32. Kozlowski, 1998
  33. Endsley, 1997
  34. Serfaty, MacMillan, Entin, & Entin, 1997
  35. Klein, Moon, and Hoffman, 2006
  36. Klein, Moon, and Hoffman, 2006, p. 71
  37. Endsley, 2004
  38. McAnally, K.; Davey, C.; White, D.; Stimson, M.; Mascaro, S.; Korb, K. (2018). "Inference in the wild: A framework for human situation assessment and a case study of air combat". Cognitive Science. 42 (7): 2181–2204. doi:10.1111/cogs.12636. PMID 29936702.
  39. Sloman, S. (2005). Causal models: How people think about the world and its alternatives. Oxford University Press. ISBN   9780195183115 . Retrieved 2 May 2019.
  40. Kozlov, Michail; Engelmann, Tanja; Buder, Jurgen; Hesse, Friedrich W. (October 2015). "Is knowledge best shared or given to individuals? Expanding the Content-based Knowledge Awareness paradigm". Computers in Human Behavior. 51: 15–23. doi:10.1016/j.chb.2015.04.029.
  41. Smith, K. T. (2013) Building a human capability decision engine. Contemporary Ergonomics and Human Factors 2013 Proceedings of the international conference on Ergonomics & Human Factors 2013, 395–402 http://www.crcnetbase.com/doi/abs/10.1201/b13826-84
  42. Simmon, D.A. (1998). Boeing 757 CFIT Accident at Cali, Colombia, becomes focus of lessons learned. Flight Safety Digest, 17, 1-31.
  43. Revista Aviador (official magazine of the Spanish Commercial Pilots Association), July–August 2011, No. 61, pp. 38–39.
  44. Revista de Aeronáutica y Astronáutica (official Spanish Air Force magazine), May 2012, pp. 436–439.
  45. Rasmussen, Jens, et al. Cognitive Systems Engineering.
  46. First Aid, Protect Yourself, American Red Cross – Accessed 01/Aug/13
  47. First Aid, Understanding What Happened – Accessed 01/Aug/13
  48. Accident Report NTSB/RAR-16/02, PB2016-103218: Derailment of Amtrak Passenger Train 188, Philadelphia, Pennsylvania, May 12, 2015, National Transportation Safety Board (adopted May 17, 2016).
  49. Mountain Rescue Association Blog, Situational Awareness in Mountain Rescue Operations – Accessed 01/Aug/13
  50. US Forest Service, Chain Saw and Crosscut Saw Training Course – Accessed 01/Aug/13
  51. U.S. Forest Service, Chapter 2, Page 7, Situational Awareness (PDF) – Accessed 01/Aug/13
  52. Police Chief, Improving Situational Awareness Archived 2014-01-08 at the Wayback Machine – Accessed 01/Aug/13
  53. Bellekens, Xavier; Hamilton, Andrew; Seeam, Preetila; Nieradzinska, Kamila; Franssen, Quentin; Seeam, Amar (2016). Pervasive eHealth services a security and privacy risk awareness survey (PDF). 2016 International Conference on Cyber Situational Awareness, Data Analytics and Assessment (CyberSA). pp. 1–4. doi:10.1109/CyberSA.2016.7503293. ISBN   978-1-5090-0703-5. S2CID   14502409.
  54. Best, Daniel M.; Bohn, Shawn; Love, Douglas; Wynne, Adam; Pike, William A. (2010). Real-time visualization of network behaviors for situational awareness. Proceedings of the Seventh International Symposium on Visualization for Cyber Security - VizSec '10. pp. 79–90. doi:10.1145/1850795.1850805. ISBN   9781450300131. S2CID   8520455.
  55. Mathews, Mary; Halvorsen, Paul; Joshi, Anupam; Finin, Tim (2012). A Collaborative Approach to Situational Awareness for CyberSecurity. Proceedings of the 8th IEEE International Conference on Collaborative Computing: Networking, Applications and Worksharing. doi:10.4108/icst.collaboratecom.2012.250794. ISBN   978-1-936968-36-7. S2CID   14135227.
  56. Sikos, Leslie; Stumptner, Markus; Mayer, Wolfgang; Howard, Catherine; Voigt, Shaun; Philp, Dean (2018), Automated Reasoning over Provenance-Aware Communication Network Knowledge in Support of Cyber-Situational Awareness, Lecture Notes in Computer Science, 11062, Cham: Springer, pp. 132–143, doi:10.1007/978-3-319-99247-1_12, ISBN   978-3-319-99246-4
  57. "Army scientists improve human-agent teaming by making AI agents more transparent | U.S. Army Research Laboratory". arl.army.mil. Retrieved 2018-08-15.
  58. Boyce, Michael; Chen, Joyce; Selkowitz, Andrew; Lakhmani, Shan (May 2015). "Agent Transparency for an Autonomous Squad Member" (PDF). Retrieved 2018-07-28.
  59. Chen, Jessie Y. C.; Lakhmani, Shan G.; Stowers, Kimberly; Selkowitz, Anthony R.; Wright, Julia L.; Barnes, Michael (2018-02-23). "Situation awareness-based agent transparency and human-autonomy teaming effectiveness". Theoretical Issues in Ergonomics Science. 19 (3): 259–282. doi:10.1080/1463922x.2017.1315750. ISSN 1463-922X. S2CID 115436644.
  60. "CrowdSA - Crowdsourced Situation Awareness for Crisis Management". cis.jku.at. Retrieved 9 January 2017.
  61. "Situation Awareness and Relief System During Disaster Events" (PDF). International Journal of Research in Science & Engineering. Retrieved 9 January 2017.
  62. "Crowdsourcing public safety: Building community resilience by enhancing citizen situation awareness capability | RISE:2017". RISE:2017, Northeastern University. Retrieved 9 January 2017.
  63. Shepard, Steven (2014-07-06). Telecommunications Crash Course, Third Edition. McGraw Hill Professional. ISBN 9780071797115. Retrieved 9 January 2017.
  64. Poblet, Marta; García-Cuesta, Esteban; Casanovas, Pompeu (2014). Crowdsourcing Tools for Disaster Management: A Review of Platforms and Methods (PDF). Lecture Notes in Computer Science. 8929. pp. 261–274. doi:10.1007/978-3-662-45960-7_19. ISBN 978-3-662-45959-1. ISSN 0302-9743. Retrieved 9 January 2017.
  65. "Crowdsourcing Information for Enhanced Disaster Situation Awareness and Emergency Preparedness and Response" (PDF). Retrieved 9 January 2017.
  66. Arkian, Hamid Reza; Diyanat, Abolfazl; Pourkhalili, Atefe (2017). "MIST: Fog-based data analytics scheme with cost-efficient resource provisioning for IoT crowdsensing applications". Journal of Network and Computer Applications. 82: 152–165. doi:10.1016/j.jnca.2017.01.012.
  67. Basu, Moumita; Bandyopadhyay, Somprakash; Ghosh, Saptarshi (2016). "Post Disaster Situation Awareness and Decision Support Through Interactive Crowdsourcing". Procedia Engineering. 159: 167–173. doi: 10.1016/j.proeng.2016.08.151 .
  68. Basu, Moumita; Bandyopadhyay, Somprakash; Ghosh, Saptarshi (2016). "Post Disaster Situation Awareness and Decision Support Through Interactive Crowdsourcing" (PDF). Procedia Engineering. 159: 167–173. doi: 10.1016/j.proeng.2016.08.151 . Retrieved 9 January 2017.
  69. Haddawy, Peter; Frommberger, Lutz; Kauppinen, Tomi; De Felice, Giorgio; Charkratpahu, Prae; Saengpao, Sirawaratt; Kanchanakitsakul, Phanumas (1 January 2015). Situation Awareness in Crowdsensing for Disease Surveillance in Crisis Situations (PDF). Proceedings of the Seventh International Conference on Information and Communication Technologies and Development. pp. 38:1–38:5. doi:10.1145/2737856.2737879. ISBN 9781450331630. S2CID 3026308. Retrieved 9 January 2017.
  70. Aitamurto, Tanja (8 May 2015). "Crowdsourcing as a Knowledge-Search Method in Digital Journalism". Digital Journalism. 4 (2): 280–297. doi:10.1080/21670811.2015.1034807. ISSN 2167-0811. S2CID 156243124. Retrieved 6 January 2017.
  71. Aitamurto, Tanja (1 October 2015). "Motivation Factors in Crowdsourced Journalism: Social Impact, Social Change, and Peer Learning". Retrieved 6 January 2017.
  72. Sutter, John D. "Ushahidi: How to 'crowdmap' a disaster". CNN. Retrieved 6 January 2017.
  73. The Impact of Crowdsourcing on Organisational Practices: The Case of Crowdmapping. ISBN   978-3-00-050284-2 . Retrieved 6 January 2017.
  74. "Crowdsourced counter-surveillance: Examining the subversion of random breath testing stations by social media facilitated crowdsourcing".
  75. "Concepts to Know: Crowdmapping". Kimo Quaintance. 4 September 2011. Retrieved 6 January 2017.
  76. "Chemical Hazards and Poisons Report" (PDF). Public Health England. Retrieved 6 January 2017.
  77. "Situational Awareness: How You Can Master Survival, Work and Life". The Prepping Guide. 2017-09-13. Retrieved 2017-12-13.

Sources