Live, virtual, and constructive

Live, Virtual, & Constructive (LVC) Simulation is a broadly used taxonomy for classifying Modeling and Simulation (M&S). However, categorizing a simulation as a live, virtual, or constructive environment is problematic since there is no clear division among these categories. The degree of human participation in a simulation is infinitely variable, as is the degree of equipment realism. The categorization of simulations also lacks a category for simulated people working real equipment. [1]

Categories

The LVC categories are defined by the United States Department of Defense in the Modeling and Simulation Glossary [2] as follows:

Other associated terms are as follows:

Other definitions used in LVC discussions are drawn from Webster's dictionary:

  1. Enterprise: a project or undertaking that is especially difficult, complicated, or risky
    • A: a unit of economic organization or activity; especially: a business organization
    • B: a systematic purposeful activity
  2. Environment: The aggregate of surrounding things, conditions or influences; surroundings
  3. Construct: To make or form by combining or arranging components
  4. Component: One of the parts of something

Current and emerging technology to enable true LVC capability for Combat Air Forces ("CAF") training requires standardized definitions of CAF LVC events to be debated and developed. The dictionary terms above provide a solid foundation for understanding the fundamental structure of the LVC topic as applied universally to DoD activities. The terms and use cases described below are a guidepost for doctrine that uses these terms to eliminate misunderstanding. The following paragraph uses these terms to lay out the global view, which is explained in detail throughout the rest of the document. In short:

Training and Operational Test are conducted through the combined use of three separate Constructs (Live, Simulator and Ancillary) which are in turn made up of several enabling Components to prepare, test and/or train warfighters in their respective disciplines. The LVC Enterprise, a component of the Live construct, is the totality of personnel, hardware and software that enables warfighters to combine three historically disparate Environments (Live, Virtual and Constructive) to improve performance in their combat role.

Central to a functionally accurate understanding of the paragraph above is a working knowledge of the Environment definitions, provided below for clarity:

The Environments (L, V, & C) by themselves are generally well understood and apply universally to a diverse range of disciplines, such as the medical field, law enforcement, or operational military applications. Using the medical field as an example, the Live Environment can be a doctor performing CPR on a human patient in a critical real-world situation. In this same context, the Virtual Environment would include a doctor practicing CPR on a training mannequin, and the Constructive Environment is the software within the training mannequin that drives its behavior. In a second example, consider fighter pilot training or operational testing. The Live Environment is the pilot flying the combat aircraft. The Virtual Environment would include that same pilot flying a simulator. The Constructive Environment includes the networks, computer-generated forces, weapons servers, and so on that enable the Live and Virtual Environments to be connected and interact.

Although there are clearly secondary and tertiary training benefits, it is important to understand that combining one or more environments for the purpose of improving Live, real-world performance is the sole reason the LVC concept was created. However, when referring to specific activities or programs designed to integrate the environments across the enterprise, the use and application of terms differ widely across the DoD. Therefore, the words that describe specifically how future training or operational testing will be accomplished require standardization as well. This is best described by backing away from technical terminology and thinking about how human beings actually prepare for their specific combat responsibilities. In practice, human beings prepare for their roles in one of three Constructs: Live (with actual combat tools), in a Simulator of some kind, or in other Ancillary ways (tests, academics, computer-based training, etc.). Actions within each of the Constructs are further broken down into Components that specify differing ways to get the job done or achieve training objectives. The three Constructs are described below:

Live Construct

Live is one of the three constructs, representing humans operating their respective discipline's operational system. Operational system examples include a tank, a naval vessel, an aircraft, or eventually even a deployed surgical hospital. The three components of the Live Construct follow:

Simulator Construct

The Simulator Construct is a combination of Virtual and Constructive (VC), and is composed of humans operating simulated devices in lieu of Live operational systems. The Simulator Construct consists of three components:

Ancillary Construct

The Ancillary Construct is the third construct, distinct from Live and Simulator, in which training is accomplished via many components (the following list is not all-inclusive):

Utilizing the definitions above, the following table provides a graphical representation of how the terms relate in the context of CAF Training or Operational Test:

Using the table above as a guide, it is clear that LVC activity is the use of the Virtual and Constructive environments to enhance scenario complexity for the Live environment, and nothing more. An LVC system must have a bi-directional, adaptable, ad hoc, and secure communication system between the Live environment and the VC environment. Most importantly, LVC used as a verb is an integrated interaction of the three environments with the Live environment always present. For example, a Simulator Construct VC event should be called something other than LVC, such as Distributed Mission Operations (DMO). In the absence of the Live environment, LVC and LC do not exist, making the use of the LVC term wholly inappropriate as a descriptor.

As the LVC Enterprise pertains to a training program, LVC lines of effort are rightly defined as “a collaboration of OSD, HAF, MAJCOM, Joint and Coalition efforts toward a technologically sound and fiscally responsible path for training to enable combat readiness.” The “lines of effort,” in this case, would not include Simulator Construct programs and development but would be limited to the Construct that includes the LVC Enterprise. The other common term, “Doing LVC,” would then imply “readiness training conducted utilizing an integration of Virtual and Constructive assets for augmenting Live operational system scenarios and mission objective outcomes.” Likewise, LVC-Operational Training (in a CAF fighter training context), or “LVC-OT,” refers to the tools and effort required to integrate Live, Virtual, and Constructive mission systems, when needed, to tailor robust and cost-efficient methods of Operational Training and/or Test.

Misused and extraneous terms

To ensure clarity of discussions and eliminate misunderstanding, when speaking in the LVC context, only the terms in this document should be used to describe the environments, constructs, and components. Words like “synthetic” and “digi” should be replaced with “Constructive” or “Virtual” instead. Additionally, Embedded Training (ET) systems, defined as a localized or self-contained Live/Constructive system (like on the F-22 or F-35) should not be confused with or referred to as LVC systems.

History

[Figure: LVC Simulation Architectures Venn Diagram]
[Figure: Usage Frequency of Simulation Architectures]

Prior to 1990, the field of M&S was marked by fragmentation and limited coordination between activities across key communities. In recognition of these deficiencies, Congress directed the Department of Defense (DoD) to “... establish an Office of the Secretary of Defense (OSD) level joint program office for simulation to coordinate simulation policy, to establish interoperability standards and protocols, to promote simulation within the military departments, and to establish guidelines and objectives for coordination [sic] of simulation, wargaming, and training.” (ref Senate Authorization Committee Report, FY91, DoD Appropriations Bill, SR101-521, pp. 154–155, October 11, 1990) Consistent with this direction, the Defense Modeling and Simulation Office (DMSO) was created, and shortly afterwards many DoD Components designated organizations and/or points of contact to facilitate coordination of M&S activities within and across their communities. For over a decade, the ultimate goal of the DoD in M&S has been to create an LVC integrating architecture (LVC-IA) that can assemble models and simulations quickly into an operationally valid LVC environment used to train, develop doctrine and tactics, formulate operational plans, and assess warfighting situations. A common use of these LVC environments will promote closer interaction between the operations and acquisition communities. These M&S environments will be constructed from composable components interoperating through an integrated architecture. A robust M&S capability enables the DoD to meet operational and support objectives effectively across the diverse activities of the military services, combatant commands, and agencies. [5] [6]

The number of available architectures has increased over time. M&S trends indicate that once a community of use develops around an architecture, that architecture is likely to be used regardless of new architectural developments. M&S trends also indicate that few, if any, architectures will be retired as new ones come online. When a new architecture is created to replace one or more of the existing set, the likely outcome is that one more architecture will be added to the available set. As the number of mixed-architecture events increases over time, the inter-architecture communication problem increases as well. [7]
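The combinatorial pressure described above can be made concrete with a short sketch: if every pair of architectures in a mixed event needs its own gateway or bridge, the number of translation paths grows quadratically with the number of architectures. The function below is illustrative arithmetic, not part of any DoD tooling.

```python
def pairwise_gateways(n: int) -> int:
    """Number of distinct gateways needed to bridge every pair of n architectures."""
    return n * (n - 1) // 2

# Five architectures (e.g., DIS, HLA, TENA, CTIA, ALSP) already imply
# ten possible pairings; adding a sixth raises that to fifteen.
print(pairwise_gateways(5))
print(pairwise_gateways(6))
```

This is why "one more architecture" compounds, rather than simplifies, the interoperability problem.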

M&S has made significant progress in enabling users to link critical resources through distributed architectures.

In the mid-1980s, SIMNET became the first successful implementation of large-scale, real-time, man-in-the-loop simulator networking for team training and mission rehearsal in military operations. One of the earliest successes of the SIMNET program was the demonstration that geographically dispersed simulation systems could support distributed training by interacting with each other across network connections. [8]

The Aggregate Level Simulation Protocol (ALSP) extended the benefits of distributed simulation to the force-level training community so that different aggregate-level simulations could cooperate to provide theater-level experiences for battle-staff training. The ALSP has supported an evolving “confederation of models” since 1992, consisting of a collection of infrastructure software and protocols for both inter-model communication through a common interface and time advance using a conservative Chandy-Misra-based algorithm. [9]
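The core rule of a conservative, Chandy-Misra-style time advance can be sketched briefly. This is a hedged illustration of the general technique, not ALSP's actual implementation: a simulation may only advance to the minimum guaranteed time of its peers, where each peer's guarantee is its local clock plus its declared lookahead.

```python
def lower_bound_time_stamp(peers):
    """peers maps a peer name to (local_time, lookahead).
    Returns the latest simulation time this node may safely advance to,
    because no peer can send an event earlier than its time + lookahead."""
    return min(t + la for t, la in peers.values())

peers = {"ground": (100.0, 5.0), "air": (98.0, 1.0), "logistics": (110.0, 0.5)}
print(lower_bound_time_stamp(peers))  # 99.0 -- limited by the "air" model
```

The conservatism is visible in the example: one slow, low-lookahead peer holds back the whole confederation, which is the classic trade-off of this family of algorithms.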

At about the same time, the SIMNET protocol evolved and matured into the Distributed Interactive Simulation (DIS) Standard. DIS allowed an increased number of simulation types to interact in distributed events, but was primarily focused on the platform-level training community. DIS provided an open network protocol standard for linking real-time platform-level wargaming simulations. [10]
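The flavor of a platform-level protocol like DIS can be suggested with a radically simplified "entity state" message. Real DIS (IEEE 1278.1) defines exact binary PDU layouts with many more fields; the field set and packing below are assumptions made for the sketch only.

```python
import struct

# Illustrative layout: network byte order, an unsigned short entity id,
# and three doubles for world position. NOT the IEEE 1278.1 wire format.
FMT = "!Hddd"

def pack_entity_state(entity_id, x, y, z):
    return struct.pack(FMT, entity_id, x, y, z)

def unpack_entity_state(data):
    return struct.unpack(FMT, data)

msg = pack_entity_state(42, 1000.0, 2000.0, -50.0)
print(unpack_entity_state(msg))  # (42, 1000.0, 2000.0, -50.0)
```

The key idea DIS standardized is exactly this: an open, agreed binary layout that any compliant simulator can emit and consume without coordinating with the others in advance.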

In the mid-1990s, the Defense Modeling and Simulation Office (DMSO) sponsored the High Level Architecture (HLA) initiative. Designed to support and eventually supplant both DIS and ALSP, the effort began with investigations to prototype an infrastructure capable of supporting these two disparate applications. The intent was to combine the best features of DIS and ALSP into a single architecture that could also support uses in the analysis and acquisition communities while continuing to support training applications.

The DoD test community started development of alternate architectures based on their perception that HLA yielded unacceptable performance and included reliability limitations. The real-time test range community started development of the Test and Training Enabling Architecture (TENA) to provide low-latency, high-performance service in the hard-real-time application of integrating live assets in the test-range setting. TENA, through its common infrastructure, including the TENA Middleware and other complementary architecture components, such as the TENA Repository, Logical Range Archive, and other TENA utilities and tools, provides the architecture and software implementation and capabilities necessary to quickly and economically enable interchangeability among range systems, facilities, and simulations. [11] [12] [13]

Similarly, the U.S. Army started the development of the Common Training Instrumentation Architecture (CTIA) to link a large number of live assets requiring a relatively narrowly bounded set of data for purposes of providing After Action Reviews (AARs) on Army training ranges in the support of large-scale exercises. [14]

Other efforts that make the LVC architecture space more complex include universal interchangeability software packages such as OSAMS [15] or CONDOR [16] developed and distributed by commercial vendors.

As of 2010, all of the DoD architectures remain in service with the exception of SIMNET. Of the remaining architectures (CTIA, DIS, HLA, ALSP, and TENA), some are in early and growing use (e.g., CTIA, TENA) while others have seen a user-base reduction (e.g., ALSP). Each of the architectures provides an acceptable level of capability within the areas where it has been adopted. However, DIS, HLA, TENA, and CTIA-based federations are not inherently interoperable with each other. When simulations rely on different architectures, additional steps must be taken to ensure effective communication between all applications. These additional steps, typically involving interposing gateways or bridges between the various architectures, may introduce increased risk, complexity, cost, level of effort, and preparation time. Additional problems extend beyond the implementation of individual simulation events. As a single example, the ability to reuse supporting models, personnel (expertise), and applications across the different protocols is limited. The limited inherent interoperability between the different protocols introduces a significant and unnecessary barrier to the integration of live, virtual, and constructive simulations.
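The gateway role described above can be sketched as a translation function between two representations of the same entity. The field names on both sides are hypothetical; real gateways must also reconcile coordinate systems, enumerations, dead reckoning, and timing, which is where the added risk and effort arise.

```python
def dis_like_to_hla_like(pdu):
    """Translate a DIS-style entity update (flat dict of invented fields)
    into an HLA-style object attribute update (also invented)."""
    return {
        "objectName": f"entity-{pdu['entity_id']}",
        "attributes": {
            "WorldLocation": (pdu["x"], pdu["y"], pdu["z"]),
            "Marking": pdu.get("marking", ""),
        },
    }

update = dis_like_to_hla_like({"entity_id": 7, "x": 1.0, "y": 2.0, "z": 3.0})
print(update["objectName"])  # entity-7
```

Every architecture pair in a mixed event needs its own such mapping, which is why gateways are a recurring cost rather than a one-time fix.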

Challenges

The current status of LVC interoperability is fragile and subject to several recurring problems that must be resolved (often anew) whenever live, virtual, or constructive simulation systems are to be components in a mixed-architecture simulation event. Some of the attendant problems stem from simulation system capability limitations and other system-to-system incompatibilities. Other types of problems arise from the general failure to provide a framework that achieves a more complete semantic-level interoperability between disparate systems. [17] Interoperability, integration, and composability have been identified as the most technically challenging aspects of an LVC-IA since at least 1996. The Study on the Effectiveness of Modeling and Simulation in the Weapon System Acquisition Process [18] identified cultural and managerial challenges as well. By definition, an LVC-IA is a sociotechnical system: a technical system that interacts directly with people. The following table identifies the 1996 challenges associated with the technical, cultural, and managerial aspects, along with the challenges or gaps found in a 2009 study. [19] The table shows there is little difference between the challenges of 1996 and the challenges of 2009.

Technical
  1996 Challenges:
    • Interoperability
    • Data Description Availability
    • Data Security and Sensitivity
    • Physics-based M&S
    • Hardware and Software Limitations
    • Variable Resolution
  2009 Challenges:
    • Interoperability
    • Data Discovery
    • Security
    • Representative, Composable and Validated Models
    • Fault Monitoring and Persistence
    • Fidelity, Scale and Resolution Filters

Cultural
  1996 Challenges:
    • Acquisition Process
    • Incentives for M&S Use
    • M&S Workforce (Training and Access)
    • Acceptance of M&S
  2009 Challenges:
    • Process Tools
    • Communities of Practice
    • Workforce Training and Collaboration
    • Infrastructure

Managerial
  1996 Challenges:
    • Office of the Secretary of Defense Guidance
    • Ownership of Data and Models
    • VV&A
    • Funding Process
    • Use of System Model
  2009 Challenges:
    • Governance, Standards, Policies
    • Data and Model Mediation
    • VV&A
    • Consistent Funding
    • Efficient Use and Best Practices

Approaches to a solution

[Figure: Zeigler's Architecture for Modeling and Simulation]
[Figure: M&S in the JCIDS process]

A virtual or constructive model usually focuses on the fidelity or accuracy of the element being represented. A live simulation, by definition, represents the highest fidelity, since it is reality. But a simulation quickly becomes more difficult when it is created from various live, virtual, and constructive elements, or from sets of simulations with various network protocols, where each simulation consists of a set of live, virtual, and constructive elements. LVC simulations are sociotechnical systems due to the interaction between people and technology in the simulation. The users represent stakeholders from across the acquisition, analysis, testing, training, planning, and experimentation communities. M&S occurs across the entire Joint Capabilities Integration and Development System (JCIDS) lifecycle; see the "M&S in the JCIDS process" figure. An LVC-IA is also considered an ultra-large-scale (ULS) system due to its use by a wide variety of stakeholders with conflicting needs and its continuously evolving construction from heterogeneous parts. [20] By definition, people are not just users but elements of an LVC simulation.

During the development of various LVC-IA environments, attempts to understand the foundational elements of integration, composability, and interoperability emerged. As of 2010, our understanding of these three elements is still evolving, just as software development continues to evolve. Consider software architecture: as a concept, it was first identified in the research work of Edsger Dijkstra in 1968 and David Parnas in the early 1970s, yet it was only adopted by ISO in 2007 as ISO/IEC 42010:2007. Integration is routinely described using the methods of architectural and software patterns. The functional elements of integration can be understood thanks to the universality of integration patterns, e.g., mediation (intra-communication) and federation (inter-communication), as well as process, data synchronization, and concurrency patterns.

An LVC-IA depends on the interoperability and composability attributes, not just their technical aspects but their social or cultural aspects as well. There are sociotechnical challenges, as well as ULS system challenges, associated with these features. An example of a cultural aspect is the problem of composition validity: in a ULS system, the ability to control all interfaces to ensure a valid composition is extremely difficult, and VV&A paradigms are challenged to identify a level of acceptable validity.

Interoperability

The study of interoperability concerns methodologies for interoperating different systems distributed over a network. Andreas Tolk introduced the Levels of Conceptual Interoperability Model (LCIM), which identifies seven levels of interoperability among participating systems as a method to describe technical interoperability and the complexity of interoperations. [21]

Bernard Zeigler's theory of modeling and simulation builds on three basic levels of interoperability:

The pragmatic level focuses on the receiver's interpretation of messages in the context of application relative to the sender's intent. The semantic level concerns definitions and attributes of terms and how they are combined to provide shared meaning to messages. The syntactic level focuses on the structure of messages and adherence to the rules governing that structure. The linguistic interoperability concept supports a simultaneous testing environment at multiple levels.
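The three linguistic levels can be illustrated with a small sketch applied to one message. The message format, the unit convention, and the "climb advisory" rule are all invented for the example; the point is only where each check lives.

```python
MSG_FIELDS = {"sender", "altitude", "unit"}

def syntactic_ok(msg):
    """Syntactic level: does the message obey the agreed field structure?"""
    return MSG_FIELDS.issubset(msg)

def semantic_value_m(msg):
    """Semantic level: shared meaning -- normalize altitude to meters."""
    return msg["altitude"] * (0.3048 if msg["unit"] == "ft" else 1.0)

def pragmatic_interpret(msg, receiver_context):
    """Pragmatic level: the receiver's interpretation relative to intent."""
    alt_m = semantic_value_m(msg)
    return "climb advisory" if receiver_context == "atc" and alt_m < 300 else "telemetry"

msg = {"sender": "jet-1", "altitude": 900, "unit": "ft"}
print(syntactic_ok(msg), pragmatic_interpret(msg, "atc"))
```

Two systems can pass the syntactic check yet still disagree semantically (feet versus meters) or pragmatically (what the receiver does with the value), which is exactly the layering the LCIM makes explicit.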

The LCIM associates the lower layers with the problems of simulation interoperation, while the upper layers relate to the problems of reuse and composition of models. The model's authors conclude that “simulation systems are based on models and their assumptions and constraints. If two simulation systems are combined, these assumptions and constraints must be aligned accordingly to ensure meaningful results”. This suggests that the levels of interoperability identified in the area of M&S can serve as guidelines for discussion of information exchange in general.

The Zeigler architecture provides an architecture description language, or conceptual model, in which to discuss M&S. The LCIM provides a conceptual model as a means to discuss integration, interoperability, and composability. The three linguistic elements relate the LCIM to the Zeigler conceptual model. Architectural and structural complexity are an area of research in systems theory that measures cohesion and coupling, based on metrics commonly used in software development projects. Zeigler, Kim, and Praehofer present a theory of modeling and simulation that provides a conceptual framework and an associated computational approach to methodological problems in M&S. The framework provides a set of entities and relations among the entities that, in effect, present an ontology of the M&S domain. [22]

Composability

Petty and Weisel [23] formulated the current working definition: "Composability is the capability to select and assemble simulation components in various combinations into simulation systems to satisfy specific user requirements." Both technical capability and user interaction are required, indicating that a sociotechnical system is involved. The ability of a user to access data or models is an important factor when considering composability metrics: if the user does not have visibility into a repository of models, the aggregation of models becomes problematic.
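Petty and Weisel's definition can be sketched minimally: components are selected from a visible repository and assembled into a simulation system. The repository contents and the compose() contract are assumptions made for illustration; the failure branch shows why repository visibility matters.

```python
# Hypothetical repository: each component transforms the simulation state.
REPOSITORY = {
    "terrain": lambda state: {**state, "terrain": "loaded"},
    "weather": lambda state: {**state, "weather": "clear"},
    "threats": lambda state: {**state, "threats": 3},
}

def compose(component_names, initial_state=None):
    """Assemble the selected components, in order, into one simulation state."""
    state = dict(initial_state or {})
    for name in component_names:
        if name not in REPOSITORY:   # no visibility into the repository -> no composition
            raise KeyError(f"component not in repository: {name}")
        state = REPOSITORY[name](state)
    return state

print(compose(["terrain", "weather"]))  # {'terrain': 'loaded', 'weather': 'clear'}
```

Different user requirements select different combinations from the same repository, which is the "various combinations" clause of the definition.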

In Improving the Composability of Department of Defense Models and Simulations, the factors associated with the ability to provide composability are as follows:

Tolk [25] introduced an alternative view on Composability, focusing more on the need for conceptual alignment:

The M&S community understands interoperability quite well as the ability to exchange information and to use the data exchanged in the receiving system. Interoperability can be engineered into a system or a service after definition and implementation. ...

Composability is different from interoperability. Composability is the consistent representation of truth in all participating systems. It extends the ideas of interoperability by adding the pragmatic level to cover what happens within the receiving system based on the received information. In contrast to interoperability, composability cannot be engineered into a system after the fact; it often requires significant changes to the simulation.

In other words: propertied concepts, if they are modeled in more than one participating system, have to represent the same truth. Composable systems are not allowed to give different answers to the same question. The requirement for consistent representation of truth supersedes the requirement for meaningful use of received information known from interoperability.
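The "same truth" requirement lends itself to a tiny consistency test: two participating systems asked the same question must not return different answers. The two stand-in models and their question set below are invented for the sketch.

```python
def detection_range_a(target):
    return {"tank": 8.0, "jet": 40.0}[target]   # km, as modeled by system A

def detection_range_b(target):
    return {"tank": 8.0, "jet": 25.0}[target]   # km, system B disagrees on jets

def consistent_truth(model_a, model_b, questions):
    """Composability check: both systems must answer every shared question identically."""
    return all(model_a(q) == model_b(q) for q in questions)

print(consistent_truth(detection_range_a, detection_range_b, ["tank"]))         # True
print(consistent_truth(detection_range_a, detection_range_b, ["tank", "jet"]))  # False
```

The two systems may interoperate perfectly at the message level and still fail this check, which is Tolk's point: composability is a property of the models, not of the wire.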

LVC requires integratability, interoperability, and composability

Page et al. [26] suggest defining integratability as contending with the physical/technical realm of connections between systems (hardware and firmware, protocols, networks, etc.); interoperability as contending with the software and implementation details of interoperations (exchange of data elements via interfaces, the use of middleware, mapping to common information exchange models, etc.); and composability as contending with the alignment of issues at the modeling level. As captured by Tolk, [27] among others, successful interoperation of LVC component solutions requires integratability of infrastructures, interoperability of systems, and composability of models. LVC architectures must holistically address all three aspects in well-aligned systemic approaches.

Economic drivers

To produce the greatest impact from its investments, the DoD needs to manage its M&S programs using an enterprise-type approach. This includes identifying gaps in M&S capabilities that are common across the enterprise, providing seed money to fund projects that have widely applicable payoffs, and conducting M&S investment across the Department in ways that are systematic and transparent. In particular, “Management processes for models, simulations, and data that … Facilitate the cost effective and efficient development of M&S systems and capabilities….”, as cited in the vision statement, require comprehensive Departmental M&S best-practice investment strategies and processes. M&S investment management requires metrics, both for quantifying the extent of potential investments and for identifying and understanding the full range of benefits resulting from these investments. There is at this time no consistent guidance for such practice. [28]

[Figure: LVC Continuum]

The development & use costs associated with LVC can be summarized as follows: [29] [30]

In contrast, the fidelity of M&S is highest in Live, lower in Virtual, and lowest in Constructive. As such, DoD policy is a mixed use of LVC through the military acquisition life cycle, also known as the LVC Continuum. In the LVC Continuum figure, the JCIDS process is related to the relative use of LVC through the military acquisition life cycle.

See also

Related Research Articles

Simulation: imitation of the operation of a real-world process or system over time

A simulation is an imitative representation of a process or system that could exist in the real world. In this broad sense, simulation can often be used interchangeably with model. Sometimes a clear distinction between the two terms is made, in which simulations require the use of models; the model represents the key characteristics or behaviors of the selected system or process, whereas the simulation represents the evolution of the model over time. Another way to distinguish between the terms is to define simulation as experimentation with the help of a model. This definition includes time-independent simulations. Often, computers are used to execute the simulation.

Flight simulator: technology used for training aircrew

A flight simulator is a device that artificially re-creates aircraft flight and the environment in which it flies, for pilot training, design, or other purposes. It includes replicating the equations that govern how aircraft fly, how they react to applications of flight controls, the effects of other aircraft systems, and how the aircraft reacts to external factors such as air density, turbulence, wind shear, cloud, precipitation, etc. Flight simulation is used for a variety of reasons, including flight training, the design and development of the aircraft itself, and research into aircraft characteristics and control handling qualities.

Distributed Interactive Simulation (DIS) is an IEEE standard for conducting real-time platform-level wargaming across multiple host computers and is used worldwide, especially by military organizations but also by other agencies such as those involved in space exploration and medicine.

Composability is a system design principle that deals with the inter-relationships of components. A highly composable system provides components that can be selected and assembled in various combinations to satisfy specific user requirements. In information systems, the essential features that make a component composable are that it be self-contained (modular) and stateless.

Air Force Agency for Modeling and Simulation: military unit

The United States Air Force established the Air Force Agency for Modeling and Simulation (AFAMS) in June 1996 in Orlando, Florida. The AFAMS mission is to enhance and leverage modeling and simulation to support and facilitate integrated, realistic, and efficient operational training across warfighting domains to enable full-spectrum readiness. The AFAMS vision is to advance readiness through Live, Virtual and Constructive (LVC) training.

SIMNET was a wide area network with vehicle simulators and displays for real-time distributed combat simulation: tanks, helicopters and airplanes in a virtual battlefield. SIMNET was developed for and used by the United States military. SIMNET development began in the mid-1980s, was fielded starting in 1987, and was used for training until successor programs came online well into the 1990s.

Simulation software is based on the process of modeling a real phenomenon with a set of mathematical formulas. It is, essentially, a program that allows the user to observe an operation through simulation without actually performing that operation. Simulation software is used widely to design equipment so that the final product will be as close to design specs as possible without expensive in-process modification. Simulation software with real-time response is often used in gaming, but it also has important industrial applications. When the penalty for improper operation is costly, as for airline pilots, nuclear power plant operators, or chemical plant operators, a mock-up of the actual control panel is connected to a real-time simulation of the physical response, giving valuable training experience without fear of a disastrous outcome.

The Aggregate Level Simulation Protocol (ALSP) is a protocol and supporting software that enables simulations to interoperate with one another. Replaced by the High Level Architecture (HLA), it was used by the US military to link analytic and training simulations.

MIMIC Simulator is a product suite from Gambit Communications consisting of simulation software in the network and systems management space.

An instructional simulation, also called an educational simulation, is a simulation of some type of reality but which also includes instructional elements that help a learner explore, navigate or obtain more information about that system or environment that cannot generally be acquired from mere experimentation. Instructional simulations are typically goal oriented and focus learners on specific facts, concepts, or applications of the system or environment. Today, most universities make lifelong learning possible by offering a virtual learning environment (VLE). Not only can users access learning at different times in their lives, but they can also immerse themselves in learning without physically moving to a learning facility, or interact face to face with an instructor in real time. Such VLEs vary widely in interactivity and scope; examples include virtual classes, virtual labs, virtual programs, virtual libraries, and virtual training. Researchers have classified VLEs into four types.

Modeling and simulation (M&S) is the use of models as a basis for simulations to develop data utilized for managerial or technical decision making.

A dynamic terrain is the representation of terrain together with the capability for modification during a simulation.

The Simulation Interoperability Standards Organization (SISO) is an organization dedicated to the promotion of modeling and simulation interoperability and reuse for the benefit of diverse modeling and simulation communities, including developers, procurers, and users, worldwide.

Vortex Studio is a simulation software platform developed by CM Labs Simulations. It features a real-time physics engine that simulates rigid body dynamics, collision detection, contact determination, and dynamic reactions. It also contains model import and preparation tools, an image generator, and networking tools for distributed simulation, all accessed through a desktop editor's GUI. Vortex adds physically accurate motion and interactions to objects in visual-simulation applications for operator training, mission planning, product concept validation, heavy machinery and robotics design and testing, haptic devices, and immersive virtual reality (VR) environments.
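The collision detection and contact determination steps mentioned above can be illustrated for the simplest case (this is a generic sketch, not Vortex Studio's API): given two spheres, decide whether they touch and, if so, report the contact normal and penetration depth that a dynamics solver would use to compute the reaction.

```python
# Illustrative contact determination for two spheres (hypothetical
# helper, not a real engine API).
import math

def sphere_contact(p1, r1, p2, r2):
    """Return (unit_normal, penetration) if the spheres overlap, else None.

    Assumes the two centres are not coincident.
    """
    dx = [b - a for a, b in zip(p1, p2)]
    dist = math.sqrt(sum(c * c for c in dx))
    penetration = (r1 + r2) - dist
    if penetration <= 0:
        return None                       # no contact
    normal = [c / dist for c in dx]       # points from sphere 1 to sphere 2
    return normal, penetration
```

Real engines generalize this to arbitrary meshes and feed each contact (point, normal, depth) into a constraint solver each frame.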

Mounted Warfare TestBed (MWTB) at Fort Knox, Kentucky, was the premier site for distributed simulation experiments in the US Army for over 20 years. It used simulation systems, including fully manned virtual simulators and computer-generated forces, to perform experiments that examined current and future weapon systems, concepts, and tactics.

The virtual world framework (VWF) is a means to connect robust, immersive 3D entities with other entities, virtual worlds, content, and users via web browsers. It allows client-server applications to be delivered in a lightweight manner through web browsers and synchronizes state so that multiple users can interact with common objects and environments. For example, using VWF, a developer can take video lesson plans, component objects, and avatars and insert them into an existing or newly created virtual landscape, where they interact with the native objects and users via a VWF interface.
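The synchronization idea can be sketched abstractly (an assumed server-authoritative design for illustration, not the VWF protocol): a server holds the authoritative state of shared objects, every change is routed through it and re-broadcast, and late joiners receive a snapshot, so all connected views converge.

```python
# Toy shared-object synchronization sketch (hypothetical design).

class SyncServer:
    """Authoritative holder of shared-object state."""
    def __init__(self):
        self.state = {}      # shared objects: id -> properties
        self.clients = []    # connected client replicas

    def connect(self, client):
        self.clients.append(client)
        client.state = dict(self.state)   # late joiners get a snapshot

    def update(self, obj_id, props):
        self.state[obj_id] = props
        for client in self.clients:       # broadcast to every replica
            client.state[obj_id] = props

class Client:
    """A browser-side replica of the shared state."""
    def __init__(self, server):
        self.state = {}
        server.connect(self)
```

Because every mutation flows through one ordering point, all replicas apply updates in the same sequence and stay consistent.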

Test and Training Enabling Architecture (TENA) is an architecture designed to bring interoperability to United States Department of Defense test and training systems. TENA is designed to promote integrated testing and simulation-based acquisition through the use of a large-scale, distributed, real-time synthetic environment, which integrates testing, training, simulation, and high-performance computing technologies, distributed across many facilities, using a common architecture.

MAK Technologies, formerly doing business as VT MAK, Inc., is a software company based in Cambridge, Massachusetts, that provides commercial off-the-shelf (COTS) modeling and simulation software. The company develops and sells software for distributed simulations that system integrators, governments, and research institutions use to build and populate 3D simulated environments. Users include the medical, aerospace, defense, and transportation industries. In addition to offering COTS software, MAK provides the following services: simulation content creation, software customization, interoperability, research and development, and training.

The CAPE-OPEN Interface Standard consists of a series of specifications to expand the range of application of process simulation technologies. The CAPE-OPEN specifications define a set of software interfaces that allow plug and play inter-operability between a given Process Modelling Environment and a third-party Process Modelling Component.
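The plug-and-play idea behind interface standards such as CAPE-OPEN can be sketched as follows (the interface and class names here are illustrative, not the real specification): any third-party component that implements the agreed interface can be dropped into any modelling environment that calls only through that interface.

```python
# Hypothetical plug-and-play interface sketch, loosely modelled on the
# CAPE-OPEN idea of a unit-operation contract between a process
# modelling environment and third-party components.
from abc import ABC, abstractmethod

class UnitOperation(ABC):
    """The agreed contract between environment and component."""
    @abstractmethod
    def calculate(self, inlet: dict) -> dict: ...

class SimpleHeater(UnitOperation):
    """Third-party component: raises stream temperature by a fixed delta."""
    def __init__(self, delta_t: float):
        self.delta_t = delta_t

    def calculate(self, inlet: dict) -> dict:
        return {**inlet, "T": inlet["T"] + self.delta_t}

def run_flowsheet(units, feed: dict) -> dict:
    """Environment code: knows only the UnitOperation interface."""
    stream = feed
    for unit in units:
        stream = unit.calculate(stream)
    return stream
```

Because `run_flowsheet` depends only on the abstract interface, components from different vendors are interchangeable without changing the environment.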

References

  1. "DoD Modeling and Simulation (M&S) Glossary", DoD 5000.59-M, DoD, January 1998 (PDF). Archived from the original on 2007-07-10. Retrieved 2009-04-22.
  2. "US Department of Defense Modeling and Simulation Glossary" (PDF).
  3. "Policy, information and guidance on the Modelling and Simulation aspects of UK MOD Defence Acquisition version 1.0.3 - May 2010", "Acquisition Operating Framework". Archived from the original on 2011-09-04. Retrieved 2010-11-21.
  4. "Eurosim: Eurosim".
  5. Strategic Vision for DOD Modeling and Simulation; http://www.msco.mil/files/Strategic_Vision_Goals.pdf, 2007
  6. "Modeling and Simulation Master Plan", DoD 5000.59-P, Oct 1995, http://www.everyspec.com/DoD/DoD-PUBLICATIONS/DoD5000--59_P_2258/
  7. Henninger, Amy E., Cutts, Dannie, Loper, Margaret, et al., "Live Virtual Constructive Architecture Roadmap (LVCAR) Final Report", Institute for Defense Analysis, Sept. 2008 (PDF). Archived from the original on 2011-07-22. Retrieved 2010-11-27.
  8. Miller, D. C.; Thorpe, J. A. (1995). "SIMNET: the advent of simulator networking", Proceedings of the IEEE, Vol. 83, No. 8, Aug 1995, pp. 1114-1123; cited in Henninger, Amy, et al., "Live Virtual Constructive Architecture Roadmap Final Report"
  9. Weatherly, Richard M.; Wilson, Annette L.; Canova, Bradford S.; Page, Ernest H.; Zabek, Anita A.; Fischer, Mary C. (1996). "Advanced distributed simulation through the Aggregate Level Simulation Protocol". Proceedings of HICSS-29: 29th Hawaii International Conference on System Sciences. p. 407. CiteSeerX 10.1.1.37.4784. doi:10.1109/HICSS.1996.495488. ISBN 978-0-8186-7324-5. S2CID 16082035.
  10. Murray, Robert; "DIS Overview and Version 7 Information", SISO; http://www.sisostds.org/DesktopModules/Bring2mind/DMX/Download.aspx?Command=Core_Download&EntryId=29289&PortalId=0&TabId=105
  11. Hudges, Ed; The Test and Training Enabling Architecture (TENA): Enabling Interchangeability Among Ranges, Facilities, and Simulations (PDF). Archived from the original on 2011-07-06. Retrieved 2010-11-28.
  12. Powell, E.; Range System Interchangeability. In the Proceedings of Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC); 2005
  13. Powell, E. T., and J. R. Noseworthy (2012) “The Test and Training Enabling Architecture (TENA)”. In Engineering Principles of Combat Modeling and Distributed Simulation, edited by A. Tolk, Chapter 20, pp. 449–477. Hoboken, NJ: John Wiley & Sons.
  14. "COMBAT TRAINING CENTER - INSTRUMENTATION SYSTEM", PEO STRI; http://www.peostri.army.mil/combat-training-center-instrumentation-system-ctc-is-
  15. Steinman, Jeffrey;"A Proposed Open System Architecture for Modeling and Simulation";presentation to JPEO; 2007;http://www.dtic.mil/ndia/2007cbis/wednesday/steinmanWed430.pdf
  16. Wallace, Jeffrey W.; Hannibal, Barbara J. (2006). "A Naturalistic Approach to Complex, Intelligent System Development and Integration". Proceedings of the 2006 International Conference on Artificial Intelligence, ICAI 2006. Vol. 2. CiteSeerX   10.1.1.85.4259 .
  17. Bernard Zeigler, Saurabh Mittal, Xiaolin Hu; "Towards a Formal Standard for Interoperability in M&S/System of Systems Integration", AFCEA-George Mason University Symposium, May 2008 (PDF). Archived from the original on 2010-07-02. Retrieved 2010-11-27.
  18. Patenaude, A;"Study on the Effectiveness of Modeling and Simulation in the Weapon System Acquisition Process";SAIC for the Director, Test, Systems Engineering and Evaluation Office of the Under Secretary of Defense for Acquisition, Logistics and Technology; 1996; http://www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA327774&Location=U2&doc=GetTRDoc.pdf
  19. Funaro, Gregory, "Measures of Effectiveness for Live, Virtual, Constructive Integrated Architectures", 09F-SIW-028, SISO Conference, 2009.
  20. "SEI Digital Library". June 2006.
  21. Chungman Seo, Bernard P. Zeigler; "DEVS Namespace for Interoperable DEVS/SOA"; Proceedings of the 2009 Winter Simulation Conference (PDF). Archived from the original on 2010-06-27. Retrieved 2010-11-27.
  22. Zeigler, B. P., Kim, T.G., and Praehofer, H., Theory of Modeling and Simulation, New York, NY, Academic Press, 2000.
  23. Petty, M.D. and Weisel, E.W. (2003). A Composability Lexicon. Proceedings IEEE Spring Simulation Interoperability Workshop, IEEE CS Press; http://www.cs.virginia.edu/~rgb2u/03S-SIW-023.doc
  24. Davis, P.K. and Anderson, R.H. (2003). Improving the Composability of Department of Defense Models and Simulations. RAND Corporation
  25. Simon J. E Taylor, Azam Khan, Katherine L. Morse, Andreas Tolk, Levent Yilmaz, Justyna Zander, and Pieter J. Mosterman (2015): “Grand Challenges for Modeling and Simulation: Simulation Everywhere - From Cyberinfrastructure to Clouds to Citizens,” SIMULATION Vol.91, pp. 648-665, DOI: 10.1177/0037549715590594
  26. Page, E.H., Briggs, R., and Tufarolo, J.A. (2004). Toward a Family of Maturity Models for the Simulation Interconnection Problem. Proceedings of the Spring 2004 Simulation Interoperability Workshop, IEEE CS Press
  27. Tolk, A. (2010). Interoperability and Composability. Chapter 12 in J.A. Sokolowski and C.M. Banks (Eds): Modeling and Simulation Fundamentals - Theoretical Underpinnings and Practical Domains, John Wiley, 403-433
  28. AEgis;Metrics for Modeling and Simulation (M&S) Investments, REPORT No. TJ-042608-RP013;2008;http://www.msco.mil/files/MSCO%20Online%20Library/MSCO%20-%20Metrics%20for%20M&S%20Investments%20-%20Final%20Report.pdf Archived 2011-07-22 at the Wayback Machine
  29. Kelly, Michael J., Ratcliff, Allen, and Phillips, Mark, "The Application of Live, Virtual and Constructive Simulation to Training for Operations Other Than War", Simulation Industry Association of Australia, 3 February 1997
  30. Furness, Zach, Tyler, John, "Fully Automated Simulation Forces (FAFs): A Grand Challenge for Military Training", 01F-SIW-007, Simulation Interoperability Standards Organization, 2001