IMPRINT (Improved Performance Research Integration Tool)

Developer(s): Alion Science and Technology, Army Research Laboratory, U.S. Army CCDC Data and Analysis Center
Stable release: 4.6.60.0
Written in: C# (.NET Framework)
Operating system: Microsoft Windows
Type: Discrete-event simulation
Website: www.microsaintsharp.com/home/tools

The Improved Performance Research Integration Tool (IMPRINT) is a discrete-event simulation and human performance modeling software tool developed by the Army Research Laboratory and Micro Analysis and Design (acquired by Alion Science and Technology). It is developed using the .NET Framework. IMPRINT allows users to create discrete-event simulations as visual task networks with logic defined using the C# programming language. IMPRINT is primarily used by the United States Department of Defense to simulate the cognitive workload of its personnel when interacting with new and existing technology to determine manpower requirements and evaluate human performance. [1]


IMPRINT allows users to develop and run stochastic models of operator and team performance. It includes three modules: Operations, Maintenance, and Forces.

In the Operations module, users develop networks of discrete events (tasks) that are performed to achieve mission outcomes. These tasks are associated with operator workload, which the user assigns with guidance in IMPRINT. Once the user has developed a model, it can be run to predict the probability of mission success (e.g., accomplishment of certain objectives or completion of tasks within a given time frame), time to complete the mission, workload experienced by the operators, and the sequence and timeline of tasks throughout the mission.

Using the Maintenance module, users can predict maintenance manpower requirements, manning requirements, and operational readiness, among other important maintenance drivers. Maintenance models consist of scenarios, segments, systems, subsystems, components, and repair tasks. The underlying built-in stochastic maintenance model simulates the flow of systems into segments of a scenario and the performance of maintenance actions to estimate maintenance manhours for defined systems.

The Forces module allows users to predict comprehensive, multilevel manpower requirements for large organizations composed of a diverse set of positions and roles. Each force unit consists of a set of activities (planned and unplanned) and jobs. This information, when modeled, helps predict the manpower needed to perform the routine and unplanned work done by a force unit.

IMPRINT helps users assess the integration of personnel and system performance throughout the system lifecycle, from concept and design to field testing and system upgrades. In addition, IMPRINT can help predict the effects of training and personnel factors (e.g., as defined by Military Occupational Specialty) on human performance and mission success. IMPRINT also has built-in functions to predict the effects of stressors (e.g., heat, cold, vibration, fatigue, use of protective clothing) on operator performance (task completion time, task accuracy).

The IMPRINT Operations module uses a task network, a series of functions that decompose into tasks, to create human performance models. [2] Functions and tasks in IMPRINT models usually represent atomic units of larger human or system behaviors. One of IMPRINT's main features is its ability to model human workload: users can specify visual, auditory, cognitive, and psychomotor workload levels for individual tasks, and these values are used to measure the overall workload of operators in the system and can influence task performance. [3] [4]

History

The IMPRINT tool grew out of common U.S. Air Force, Navy, and Army manpower, personnel, and training (MPT) concerns identified in the mid-1970s: how to estimate MPT constraints and requirements early in system acquisition, and how to enter those considerations into the design and decision-making process. The U.S. Navy first developed the HARDMAN (HARDware vs. MANpower) Comparability Methodology (HCM). The Army then tailored the manual HCM, which became known as HARDMAN I, for application to a broad range of weapon systems and later developed an automated version, HARDMAN II. In HARDMAN I and II, however, there was no direct link between MPT and performance. To remedy this shortcoming, the U.S. Army began developing a set of software analysis modules in the mid-1980s. [5] This set of modules was called HARDMAN III, and although the name was the same, it used a fundamentally different approach to addressing MPT concerns than previous methods: it provided an explicit link between MPT variables and soldier-system performance. [6]

HARDMAN II.2 tool: HARDMAN II was formerly called MIST (Man Integrated Systems Technology). HARDMAN II.2 was first released by the Army Research Institute (ARI) in 1985. It required a VAX-11 computer to host the suite of analytical processes. An upgraded version was released in 1990.

HARDMAN III tools: HARDMAN III was a major development effort of the Army Research Institute's (ARI) System Research Laboratory (which has since become part of the ARL HRED). The contract that supported the work was let in a three-phase development process. [7] Each phase resulted in multiple awards to contractors, based on a competitive evaluation of the work each contractor produced in the previous phase. The first phase, Concept Development, began in September 1986 and was completed in April 1987. Phase 2, Requirements Specification, ran from June 1987 to January 1988. Phase 3 ran from April 1988 to August 1990.

HARDMAN III was Government-owned and consisted of a set of automated aids to assist analysts in conducting MANPRINT analyses. As PC DOS-based software, the HARDMAN III aids provided a means for estimating manpower, personnel, and training (MPT) constraints and requirements for new weapon systems very early in the acquisition process. The DOS environment imposed several limitations on the HARDMAN III tool set. The most significant problem was the 640 KB RAM limitation: the original HARDMAN III tools had to be designed so that pieces of the analyses could fit within these RAM blocks. However, the power of a MANPRINT analysis lies in the integration of quantitative variables across the domains of the study. To support a tradeoff of, say, manpower against personnel, the analyst must be able to consider them in an integrated fashion. Unfortunately, the DOS environment forced the flow of data across the analytical domains to be more stilted and deliberate than was ideal.

Furthermore, the DOS environment limited the scope of analysis that could be conducted. Since a HARDMAN III analysis is task-based and includes simulation models of system missions, the amount of data that could be managed at once had to fit under the RAM constraints. This led to a limit of 400 operations tasks and 500 maintenance tasks.

The nine modules in HARDMAN III were:

  1. MANpower-based System EVALuation aid (MAN-SEVAL): MAN-SEVAL was used to assess human workload.
    1. Workload Analysis Aid (WAA): integrated two key technologies: Micro SAINT simulation and a modified McCracken-Aldrich workload assessment methodology. The modified McCracken-Aldrich methodology was used to assess four workload components (visual, auditory, cognitive, and psychomotor) for each operator. Each task was assigned a scaled value for the four workload components. When the simulation was run, operator workload was tracked over time and could be displayed graphically.
    2. Maintenance Manpower Analysis Aid (MAMA): used to predict maintenance requirements and system availability.
  2. PERsonnel-based System EVALuation aid (PER-SEVAL): PER-SEVAL was used to assess crew performance in terms of time and accuracy. PER-SEVAL had three major components used to predict crew performance: (1) performance-shaping functions that predicted task times and accuracies based on personnel characteristics (e.g., Armed Forces Qualification Test, or AFQT, scores) and estimated sustainment training frequencies; (2) stressor degradation algorithms that diminished task performance to reflect the presence of heat, cold, noise, lack of sleep, and mission-oriented protective posture (MOPP) gear; and (3) simulation models that aggregated estimates of individual task performance and produced system performance estimates.
  3. System Performance and RAM Criteria Estimation Aid (SPARC): Helped Army combat developers identify comprehensive and unambiguous system performance requirements needed to accomplish various missions.
  4. MANpower CAPabilities analysis aid (MANCAP): The objective of MANCAP was to help users estimate maintenance manhour requirements at the system unit level. MANCAP let the analyst perform trade-off analyses between (1) the amounts of time systems are available for combat, given specified numbers and types of maintainers, (2) how often systems fail because of component reliability, and (3) how quickly systems can be repaired when one or more components have failed. MANCAP was originally inspired by the Air Force's Logistics Composite Model (LCOM). The results of MANCAP were used as the basis for estimating Army-wide manpower requirements in FORCE.
  5. Human Operator Simulator (HOS): HOS was a tool used to develop improved estimates of task time and accuracy. HOS had built-in models of particular subtasks (called micromodels), such as "hand movement," which helped analysts better estimate how long it would take an operator to do a certain task.
  6. Manpower CONstraints aid (M-CON): Identified the maximum crew size for operators and maintainers and the maximum Direct Productive Annual Maintenance Manhours (DPAMMH).
  7. Personnel CONstraints aid (P-CON): Estimated the significant personnel characteristics that describe and limit the capabilities of the probable soldier population from which the new system's operators and maintainers would come.
  8. Training CONstraints aid (T-CON): T-CON was designed to be used by the Government to identify the types of training programs likely to be available to support new systems and what the training program for the new system was likely to look like. It also estimated the maximum time needed to train the new system's operators and maintainers, given available training resources.
  9. Force Analysis Aid (FORCE): Provided an Army-wide assessment of manpower requirements and constraints based on estimating numbers of people and impacts by types of people (i.e., ASVAB score and MOS).

IMPRINT was originally named the Integrated MANPRINT Tools and was first released in 1995. It was a Windows application that merged the functionality of the nine HARDMAN III tools into one application. In 1997, IMPRINT was renamed the Improved Performance Research Integration Tool; the name changed, but the acronym remained the same. Between 1995 and 2006, several enhancements were made to IMPRINT and new releases (versions 2 through 6) were made available. IMPRINT Pro, introduced in 2007, featured a new interface design and complete integration with the Micro Saint Sharp simulation engine; it offered enhanced analytical capabilities and moved IMPRINT from an Army tool to a tri-service tool. IMPRINT has continued to evolve since, with new enhancements continually added and new releases made freely available to the user community. IMPRINT has over 800 users supporting Army, Navy, Air Force, Marine Corps, NASA, DHS, DoT, Joint, and other organizations across the country.

Discrete event simulation in IMPRINT

Simulations, or Missions as IMPRINT refers to them, contain a task network called a Network Diagram. The network diagram contains a series of tasks connected by paths, which determine control flow. System objects called entities flow through the network to create a simulation. IMPRINT also includes lower-level features such as global variables and subroutines called macros. [8]

Tasks

The task node is the primary element driving the simulation's outcome. Task nodes simulate system behavior through programmer-specified effects, task duration, failure rates, and pathing. Task effects are programmer-specified C# expressions that manipulate variables and data structures when a task is invoked. Task duration can be specified as a fixed value, through a probability distribution, or via a C# expression. Programmers can specify task success in a similar way; task success influences the effects of the task node and the pathing of the entity. Failure consequences include task repetition, task change, and mission failure, among other options. Control flow and pathing can also be specified by the programmer. IMPRINT also provides a number of other node types with special functionality.

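The IMPRINT user interface itself is not shown here, but a brief, hypothetical C# sketch can illustrate how the pieces of a task node (an effect expression, a stochastic duration, and a success probability) fit together. All names below (SimTask, Effect, Duration, and so on) are illustrative assumptions, not IMPRINT's API:

    using System;

    // Hypothetical sketch of what an IMPRINT-style task node combines:
    // an effect (arbitrary C# run on entry), a stochastic duration, and
    // a success probability that drives failure consequences and pathing.
    class SimTask
    {
        public static readonly Random Rng = new Random();

        public string Name;
        public Action Effect;              // manipulates model variables on entry
        public Func<double> Duration;      // fixed value, expression, or distribution draw
        public double SuccessProbability;  // chance the task completes successfully

        public bool Execute(out double elapsed)
        {
            Effect?.Invoke();
            elapsed = Duration();
            return Rng.NextDouble() < SuccessProbability;
        }
    }

    class TaskDemo
    {
        static int roundsFired;  // stands in for a user-defined global variable

        // Normally distributed draw (Box-Muller), e.g. mean 5 s, sd 1 s.
        static double Normal(double mean, double sd)
        {
            double u1 = 1.0 - SimTask.Rng.NextDouble();
            double u2 = SimTask.Rng.NextDouble();
            return mean + sd * Math.Sqrt(-2.0 * Math.Log(u1)) * Math.Sin(2.0 * Math.PI * u2);
        }

        static void Main()
        {
            var engage = new SimTask
            {
                Name = "Engage target",
                Effect = () => roundsFired++,
                Duration = () => Normal(5.0, 1.0),
                SuccessProbability = 0.9
            };

            bool ok = engage.Execute(out double t);
            Console.WriteLine($"{engage.Name}: {(ok ? "success" : "failure")} in {t:F2} s");
        }
    }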

Entities

Entities are dynamic objects which arrive into the system and move through the task network. Entities flow from one task to the next based on the task's path logic. When an entity enters a task, the task's effects are triggered. When the task concludes, the entity moves to the next task. One entity is generated by default at the beginning of the simulation; more can be generated at any point in the simulation based on programmer-specified logic. When all entities reach the end node or are destroyed, the simulation concludes. [8]

Events

Events are occurrences that happen at an instant of simulated time within IMPRINT and change the global state of the system. An event can be the arrival or departure of an entity, the completion of a task, or some other occurrence. Events are stored in a master event log, which records every scheduled event and the simulated time at which it occurs. Due to the stochastic nature of discrete-event simulation, an event will often trigger the generation of a random variate to determine the next time that same event will occur. Thus, as events occur in the simulation, the event log is altered. [8]
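
As a sketch of this mechanism (not IMPRINT's implementation), the fragment below keeps a future event list ordered by simulated time; processing an arrival draws an exponential random variate to schedule the next arrival, altering the event log as the run unfolds. It assumes .NET 6 or later for PriorityQueue:

    using System;
    using System.Collections.Generic;

    // Minimal master-event-list loop: dequeue the next event in time order,
    // advance the clock, and let the event schedule its own next occurrence.
    class EventListDemo
    {
        static void Main()
        {
            var rng = new Random(42);
            var eventLog = new PriorityQueue<string, double>();  // event, simulated time
            double clock = 0.0, horizon = 20.0;

            eventLog.Enqueue("entity arrival", 0.0);  // first entity enters at t = 0

            while (eventLog.TryDequeue(out string ev, out double time) && time < horizon)
            {
                clock = time;  // jump directly to the next event's time
                Console.WriteLine($"t={clock,6:F2}  {ev}");

                if (ev == "entity arrival")
                {
                    // Exponential interarrival time (mean 4 time units): the
                    // random variate that schedules this event's next occurrence.
                    double next = clock - 4.0 * Math.Log(1.0 - rng.NextDouble());
                    eventLog.Enqueue("entity arrival", next);
                }
            }
        }
    }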

Control flow

Once a task concludes, the invoking entity moves to another node which is directly connected to the current node in the task network. Nodes can connect to any number of other tasks, so IMPRINT provides a number of pathing options to determine the task to which the entity moves. [8]
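
One common pathing option is probabilistic branching, sketched below with invented branch names and probabilities; IMPRINT's condition-based paths could be modeled the same way with a C# predicate per branch:

    using System;

    // Probabilistic pathing sketch: the entity leaving a task picks one
    // successor according to branch probabilities that sum to 1.
    class PathingDemo
    {
        static void Main()
        {
            var rng = new Random();
            (string Task, double P)[] branches =
            {
                ("Reacquire target", 0.2),
                ("Assess damage",    0.5),
                ("Return to base",   0.3),
            };

            double roll = rng.NextDouble(), cumulative = 0.0;
            foreach (var (task, p) in branches)
            {
                cumulative += p;
                if (roll < cumulative)
                {
                    Console.WriteLine($"Entity routed to: {task}");
                    break;
                }
            }
        }
    }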

Variables and macros

IMPRINT has a number of global variables used by the system throughout a simulation. IMPRINT provides the public global variable Clock, which tracks the simulation's current time, as well as private variables such as operator workload values. IMPRINT allows the modeler to create custom global variables that can be accessed and modified in any task node. Variables can be of any type native to C#, but the software provides a list of suggested variable types, including C# primitive data types and basic data structures. IMPRINT also lets the programmer create globally accessible subroutines called macros. Macros work as C# functions and can specify parameters, manipulate data, and return data. [8]
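
A rough sketch of the idea, using invented names rather than IMPRINT's built-ins: a user-defined global that any task effect can modify, and a macro written as an ordinary C# function that takes parameters and returns a value:

    using System;
    using System.Collections.Generic;

    // Invented stand-ins for model state: a simulation clock and a
    // user-defined global collection that task effects append to.
    static class Globals
    {
        public static double Clock;
        public static List<double> DetectionTimes = new List<double>();
    }

    class MacroDemo
    {
        // "Macro": reusable logic callable from any task's effect expression.
        static double MeanDetectionTime(List<double> times)
        {
            if (times.Count == 0) return 0.0;
            double sum = 0.0;
            foreach (double t in times) sum += t;
            return sum / times.Count;
        }

        static void Main()
        {
            // A task effect might record the clock whenever a detection occurs...
            Globals.Clock = 12.7; Globals.DetectionTimes.Add(Globals.Clock);
            Globals.Clock = 31.4; Globals.DetectionTimes.Add(Globals.Clock);

            // ...and a macro can summarize it from any task node.
            Console.WriteLine($"Mean detection time: {MeanDetectionTime(Globals.DetectionTimes):F1}");
        }
    }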

Human performance modeling

IMPRINT's workload management abilities allow users to model realistic operator actions under different work overload conditions. [4] IMPRINT allows users to specify Warfighters, which represent human operators in the modeled system. Each task in IMPRINT is associated with at least one Warfighter, and Warfighters can be assigned to any number of tasks, including tasks which execute concurrently. [4] IMPRINT tasks can be assigned VACP workload values. [3] The VACP method allows modelers to identify the visual, auditory, cognitive, and psychomotor workload of each IMPRINT task. In an IMPRINT task, each resource can be given a workload value between 0 and 7, with 0 being the lowest possible workload and 7 being the highest possible workload for that resource. The VACP scale for each resource provides verbal anchors for certain scale values: for instance, a visual workload of 0.0 corresponds to “no visual activity”, while a visual workload of 7.0 corresponds to continuous visual scanning, searching, and monitoring. [9] When a Warfighter is executing a task, their workload is increased by the VACP values assigned to that task. An IMPRINT plugin module was proposed in 2013 to improve the cognitive workload estimation within IMPRINT and make the overall calculation less linear. [10] IMPRINT's custom reporting feature allows modelers to view the workload over time of the Warfighters in their models, and workload monitor nodes allow modelers to view the workload of a specific Warfighter as the simulation executes. [8]
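
The bookkeeping this implies can be sketched as follows; the task ratings and the simple summation across concurrent tasks are illustrative assumptions, not IMPRINT's internal workload calculation:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Each task carries visual, auditory, cognitive, and psychomotor
    // ratings on the 0-7 VACP scale; here an operator's instantaneous
    // workload is taken as the sum over their concurrently active tasks.
    record VacpTask(string Name, double V, double A, double C, double P)
    {
        public double Total => V + A + C + P;
    }

    class WorkloadDemo
    {
        static void Main()
        {
            var active = new List<VacpTask>
            {
                // 7.0 visual = continuous scanning, searching, monitoring
                new VacpTask("Scan for targets", 7.0, 1.0, 1.0, 0.0),
                new VacpTask("Radio report",     0.0, 3.0, 1.2, 2.2),
            };

            double workload = active.Sum(t => t.Total);
            Console.WriteLine($"Instantaneous operator workload: {workload:F1}");
        }
    }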

Research

IMPRINT has been used by scientists at the Army Research Laboratory to study unmanned aerial systems, [11] [12] the workload of warfighter crews, [13] [14] and human-robot interaction. [15] The United States Air Force and the Air Force Institute of Technology have used IMPRINT to study automated systems, [16] [17] human systems integration, [18] and adaptive automation, [19] among other things. The Air Force Institute of Technology in particular is using IMPRINT to research the prediction of operator performance, mental workload, situational awareness, trust, and fatigue in complex systems. [20]

References

  1. Rusnock, C. F., & Geiger, C. D. (2013). Using Discrete-Event Simulation for Cognitive Workload Modeling and System Evaluation. Proceedings of the 2013 Industrial and Systems Engineering Research Conference, 2485–2494. Retrieved from http://search.proquest.com/openview/b77033807ade34134e81d078a4513631/1?pq-origsite=gscholar
  2. Laughery, R. (1999). Using discrete-event simulation to model human performance in complex systems. In Proceedings of the 31st conference on Winter simulation Simulation---a bridge to the future - WSC ’99 (Vol. 1, pp. 815–820). New York, New York, USA: ACM Press. http://doi.org/10.1145/324138.324506
  3. Mitchell, D. K. (2003). Advanced Improved Performance Research Integration Tool (IMPRINT) Vetronics Technology Test Bed Model Development.
  4. IMPRINT Pro User Guide, Vol. 1. http://www.arl.army.mil/www/pages/446/IMPRINTPro_vol1.pdf
  5. Kaplan, J. D. (1991). Synthesizing the effects of manpower, personnel, training and human engineering. In E. Boyle, J. Ianni, J. Easterly, S. Harper, & M. Korna (Eds.), Human centered technology for maintainability: Workshop proceedings (AL-TP-1991-0010) (pp. 273-283). Wright-Patterson AFB, OH: Armstrong Laboratory.
  6. Allender, L., Lockett, J., Headley, D., Promisel, D., Kelley, T., Salvi, L., Richer, C., Mitchell, D., & Feng, T. “HARDMAN III and IMPRINT Verification, Validation, and Accreditation Report.” Prepared for the US Army Research Laboratory, Human Research & Engineering Directorate, December 1994.
  7. Adkins, R., & Dahl (Archer), S. G. “Final Report for HARDMAN III, Version 4.0.” Report E-482U, prepared for US Army Research Laboratory, July 1993.
  8. IMPRINT Pro User Guide, Vol. 2. http://www.arl.army.mil/www/pages/446/IMPRINTPro_vol2.pdf
  9. Mitchell, D. K. (2000). Mental Workload and ARL Workload Modeling Tools (ARL-TN-161). Aberdeen Proving Ground.
  10. Cassenti, D. N., Kelley, T. D., & Carlson, R. A. (2013, November). Differences in performance with changing mental workload as the basis for an IMPRINT plug-in proposal. In 22nd Annual Conference on Behavior Representation in Modeling and Simulation, Ottawa, Canada.
  11. Hunn, B. P., & Heuckeroth, O. H. (2006). A shadow unmanned aerial vehicle (UAV) improved performance research integration tool (IMPRINT) model supporting future combat systems. Human
  12. Hunn, B. P., Schweitzer, K. M., Cahir, J. A., & Finch, M. M. (2008). IMPRINT Analysis of an Unmanned Air System Geospatial Information Process. Scenario, (July).
  13. Salvi, L. (2001). Development of Improved Performance Research Integration Tool (IMPRINT) Performance Degradation Factors for the Air Warrior Program. Retrieved from papers2://publication/uuid/197638AB-1200-4BFE-A922-E5E12FB25BD6
  14. Mitchell, D. K. (2009). Workload Analysis of the Crew of the Abrams V2 SEP: Phase I Baseline IMPRINT Model. Engineering, (September).
  15. Pomranky, R. A. (2006). Human Robotics Interaction Army Technology Objective Raven Small Unmanned Aerial Vehicle Task Analysis and Modeling. ARL-TR-3717.
  16. Colombi, J. M., Miller, M. E., Schneider, M., McGrogan, J., Long, D. S., & Plaga, J. (2011). Predictive Mental Workload Modeling for Semiautonomous System Design: Implications for Systems of Systems. Systems Engineering, 14(3), 305–326. http://doi.org/10.1002/sys
  17. Goodman, T., Miller, M., & Rusnock, C. (2015). Incorporating Automation: Using Modeling and Simulation to Enable Task Re-Allocation. In Proceedings of the 2015 Winter Simulation Conference (pp. 2388–2399). Huntington Beach, CA: IEEE. http://doi.org/10.1073/pnas.0703993104
  18. Miller, M., Colombi, J., & Tvaryanas, A. (2013). Human systems integration. Handbook of Industrial and Systems Engineering, Second Edition, 197–216. http://doi.org/doi:10.1201/b15964-15
  19. Boeke, D., Miller, M., Rusnock, C., & Borghetti, B. J. (2015). Exploring Individualized Objective Workload Prediction with Feedback for Adaptive Automation. In S. Cetinkaya & J. K. Ryan (Eds.), Proceedings of the 2015 Industrial and Systems Engineering Research Conference (pp. 1437–1446). Nashville, TN.
  20. Rusnock, C. F., Boubin, J. G., Giametta, J. J., Goodman, T. J., Hillesheim, A. J., Kim, S., … Watson, M. E. (2016). The Role of Simulation in Designing Human-Automation Systems. In Foundations of Augmented Cognition Neuroergonomics and Operational Neuroscience: Part II (pp. 361–370). http://doi.org/10.1007/978-3-642-02812-0