Co-simulation

In co-simulation, the different subsystems that form a coupled problem are modeled and simulated in a distributed manner. Hence, the modeling is done at the subsystem level, without the coupled problem in mind. The coupled simulation is then carried out by running the subsystems in a black-box manner, with data exchanged between them during the simulation. Co-simulation can thus be considered the joint simulation of already well-established tools and formalisms, each simulated with its own suitable solver. [1] Co-simulation proves its advantage in the validation of multi-domain and cyber-physical systems by offering a flexible solution that allows multiple domains with different time steps to be considered at the same time. Because the computational load is shared among the simulators, co-simulation also makes the assessment of large-scale systems feasible. [2]

Abstraction layers of co-simulation framework

The following introduction and structuration are proposed in [3].

Establishing a co-simulation framework can be a challenging and complex task, because it requires strong interoperability among the participating elements, especially in the case of multi-formalism co-simulation. Harmonization, adaptation, and possibly changes to the standards and protocols actually employed in the individual models need to be made before those models can be integrated into the holistic framework. The generic layered structuration of a co-simulation framework [3] highlights the intersection of domains and the issues that need to be solved when designing such a framework. In general, a co-simulation framework consists of five abstraction layers:

Structuration of co-simulation framework

Conceptual
Description: The highest level, where the models are considered as black boxes; this level concerns the representation of the co-simulation framework.
Associated issues: Generic structure of the framework; meta-modeling of the components.

Semantic
Description: This level concerns the signification and the role of the co-simulation framework with respect to the open questions of the investigated system and the studied phenomenon.
Associated issues: Signification of individual models; interaction graph among the models; signification of each interaction.

Syntactic
Description: This level concerns the formalization of the co-simulation framework.
Associated issues: Formalization of individual models in their respective domains; specification and handling of the differences between one formalism and another.

Dynamic
Description: This level concerns the execution of the co-simulation framework, the synchronization techniques, and the harmonization of different models of computation.
Associated issues: Order of execution and causality of models; harmonization of different models of computation; resolution of potential conflicts in the simultaneity of actions.

Technical
Description: This level concerns the implementation details and the evaluation of the simulation.
Associated issues: Distributed or centralized implementation; robustness of the simulation; reliability and efficiency of the simulation.

From the conceptual structuration, the architecture on which the co-simulation framework is developed and the formal semantic relations/syntactic formulation are defined. The detailed technical implementation and the synchronization techniques are covered by the dynamic and technical layers.

Problem Partitioning - Architecture of co-simulation

The partitioning procedure is the process of spatially separating the coupled problem into multiple partitioned subsystems. Information is exchanged either through ad-hoc interfaces or via an intermediate buffer governed by a master algorithm. The master algorithm (where one exists) is responsible for instantiating the simulators and for orchestrating the information exchange (simulator-simulator or simulator-orchestrator). [3]
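To make the orchestration concrete, the loop of a master algorithm can be sketched as follows. This is a toy example, not any particular tool's API: the `Simulator` class and its `do_step`/`get_output` methods are invented for illustration, and each subsystem is a simple first-order system advanced with explicit Euler. The master instantiates the black-box subsystems and exchanges their outputs at fixed communication points.

```python
class Simulator:
    """Toy black-box subsystem: dx/dt = a*x + b*u, advanced with explicit Euler."""

    def __init__(self, a, b, x0):
        self.a, self.b, self.x = a, b, x0

    def do_step(self, u, dt):
        """Advance one communication step with the input u held constant."""
        self.x += dt * (self.a * self.x + self.b * u)

    def get_output(self):
        return self.x


def master(sim1, sim2, dt, t_end):
    """Orchestrate two simulators: each one's output is the other's input."""
    t = 0.0
    y1, y2 = sim1.get_output(), sim2.get_output()
    while t < t_end - 1e-12:
        sim1.do_step(y2, dt)   # subsystems advance as black boxes
        sim2.do_step(y1, dt)
        y1, y2 = sim1.get_output(), sim2.get_output()  # exchange at the communication point
        t += dt
    return y1, y2
```

In this sketch the data exchange happens only at the communication points; how and when each subsystem consumes the other's output is exactly where the coupling methods differ.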

Coupling methods

Co-simulation coupling methods can be classified into operational integration and formal integration, depending on the abstraction layers involved. In general, operational integration is used in co-simulation for a specific problem and aims for interoperability at the dynamic and technical layers (i.e. signal exchange). Formal integration, on the other hand, allows interoperability at the semantic and syntactic levels via either model coupling or simulator coupling. Formal integration often involves a master federate that orchestrates the semantics and syntax of the interaction among the simulators.

From a dynamic and technical point of view, it is necessary to consider the synchronization techniques and communication patterns in the process of implementation.

Communication Patterns

There exist three principal communication patterns for master algorithms: the Gauss-Seidel variant, the Jacobi variant, and transmission line modelling (TLM). The names of the first two are derived from their structural similarity to the numerical methods of the same name.

In co-simulation, the Jacobi variant is often preferred. The reason is that the Jacobi method is easy to convert into an equivalent parallel algorithm, while there are difficulties in doing so for the Gauss-Seidel method. [4]

Gauss-Seidel (serial)

Figure: Gauss-Seidel iteration sequence for two subsystems.
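The serial pattern can be sketched with two toy first-order subsystems (explicit Euler, invented for illustration, not a real tool's API): within each macro step, subsystem 1 advances using subsystem 2's output from the previous communication point, and subsystem 2 then advances using subsystem 1's freshly computed output.

```python
def euler_step(x, u, dt):
    """Explicit Euler step of the toy subsystem dx/dt = -x + u."""
    return x + dt * (-x + u)


def gauss_seidel(x1, x2, dt, n_steps):
    """Serial (Gauss-Seidel) coupling of two subsystems with u1 = x2, u2 = x1."""
    for _ in range(n_steps):
        x1 = euler_step(x1, x2, dt)  # uses x2 from the previous communication point
        x2 = euler_step(x2, x1, dt)  # uses the freshly updated x1 (serial dependency)
    return x1, x2
```

The second call depends on the result of the first, which is what makes this scheme inherently sequential.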

Jacobi (parallel)

Figure: Jacobi iteration sequence for two subsystems.
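The parallel pattern, sketched with the same toy explicit-Euler subsystems (again an invented illustration), differs only in that both subsystems advance from the outputs of the previous communication point, so the two updates are independent of each other:

```python
def euler_step(x, u, dt):
    """Explicit Euler step of the toy subsystem dx/dt = -x + u."""
    return x + dt * (-x + u)


def jacobi(x1, x2, dt, n_steps):
    """Parallel (Jacobi) coupling: both subsystems use the previous exchange."""
    for _ in range(n_steps):
        x1_new = euler_step(x1, x2, dt)  # uses the old x2
        x2_new = euler_step(x2, x1, dt)  # uses the old x1, independent of x1_new
        x1, x2 = x1_new, x2_new          # exchange at the communication point
    return x1, x2
```

Because `x1_new` and `x2_new` do not depend on each other, the two calls could be dispatched to separate processes, which is the parallelism advantage noted above.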

Transmission line modelling, TLM

In transmission line modelling (also known as bi-directional delay line modelling), a capacitance (or inductance) is substituted with a transmission line element with wave propagation, whose time delay is set to one time step. In this way, a physically motivated time delay is introduced, which means that the system can be partitioned at this location. Numerical stability is ensured because no numerical error is introduced; instead, a modelling error is introduced, which is more benign. This pattern is usually the simplest to implement, since it results in an explicit scheme.
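As an illustration only (the function names and the electrical framing are invented for this sketch, not taken from a specific tool), the delay-line idea can be shown with a transmission line of characteristic impedance Zc coupling a Thevenin source and a resistive load. Each side computes its outgoing wave from the wave that left the other side one time step earlier, so the two sides can be simulated independently and the scheme is explicit.

```python
def reflection(R, Zc):
    """Reflection coefficient of a resistive termination R on a line of impedance Zc."""
    return (R - Zc) / (R + Zc)


def tlm_simulate(v_source, r_source, r_load, Zc, n_steps):
    """Couple a Thevenin source and a resistive load through a one-step delay line."""
    a = 0.0  # wave travelling towards the load (emitted by the source side)
    b = 0.0  # wave travelling back towards the source (emitted by the load side)
    for _ in range(n_steps):
        # Each side only needs the other side's wave from the previous step.
        a_next = v_source * Zc / (r_source + Zc) + reflection(r_source, Zc) * b
        b_next = reflection(r_load, Zc) * a
        a, b = a_next, b_next
    # Voltage at the load: incident wave plus its reflection.
    return a * (1.0 + reflection(r_load, Zc))
```

In steady state the waves settle so that the load voltage approaches the resistive voltage-divider value v_source * r_load / (r_source + r_load); the one-step delay shows up only as a transient, i.e. as a modelling error rather than a numerical one.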

Related Research Articles

Interoperability is a characteristic of a product or system to work with other products or systems. While the term was initially defined for information technology or systems engineering services to allow for information exchange, a broader definition takes into account social, political, and organizational factors that impact system-to-system performance.

System of systems is a collection of task-oriented or dedicated systems that pool their resources and capabilities together to create a new, more complex system which offers more functionality and performance than simply the sum of the constituent systems. Currently, systems of systems is a critical research discipline for which frames of reference, thought processes, quantitative analysis, tools, and design methods are incomplete. The methodology for defining, abstracting, modeling, and analyzing system of systems problems is typically referred to as system of systems engineering.

An agent-based model (ABM) is a computational model for simulating the actions and interactions of autonomous agents in order to understand the behavior of a system and what governs its outcomes. It combines elements of game theory, complex systems, emergence, computational sociology, multi-agent systems, and evolutionary programming. Monte Carlo methods are used to understand the stochasticity of these models. Particularly within ecology, ABMs are also called individual-based models (IBMs). A review of recent literature on individual-based models, agent-based models, and multiagent systems shows that ABMs are used in many scientific domains including biology, ecology and social science. Agent-based modeling is related to, but distinct from, the concept of multi-agent systems or multi-agent simulation in that the goal of ABM is to search for explanatory insight into the collective behavior of agents obeying simple rules, typically in natural systems, rather than in designing agents or solving specific practical or engineering problems.

Fluid–structure interaction (FSI) is the interaction of some movable or deformable structure with an internal or surrounding fluid flow. Fluid–structure interactions can be stable or oscillatory. In oscillatory interactions, the strain induced in the solid structure causes it to move such that the source of strain is reduced, and the structure returns to its former state only for the process to repeat.

Semantic interoperability is the ability of computer systems to exchange data with unambiguous, shared meaning. Semantic interoperability is a requirement to enable machine computable logic, inferencing, knowledge discovery, and data federation between information systems.

Statistical energy analysis (SEA) is a method for predicting the transmission of sound and vibration through complex structural acoustic systems. The method is particularly well suited for quick system level response predictions at the early design stage of a product, and for predicting responses at higher frequencies. In SEA a system is represented in terms of a number of coupled subsystems and a set of linear equations are derived that describe the input, storage, transmission and dissipation of energy within each subsystem. The parameters in the SEA equations are typically obtained by making certain statistical assumptions about the local dynamic properties of each subsystem (similar to assumptions made in room acoustics and statistical mechanics). These assumptions significantly simplify the analysis and make it possible to analyze the response of systems that are often too complex to analyze using other methods (such as finite element and boundary element methods).

Contact dynamics deals with the motion of multibody systems subjected to unilateral contacts and friction. Such systems are omnipresent in many multibody dynamics applications.

Business semantics management (BSM) encompasses the technology, methodology, organization, and culture that brings business stakeholders together to collaboratively realize the reconciliation of their heterogeneous metadata; and consequently the application of the derived business semantics patterns to establish semantic alignment between the underlying data structures.

Live, Virtual, & Constructive (LVC) Simulation is a broadly used taxonomy for classifying Modeling and Simulation (M&S). However, categorizing a simulation as a live, virtual, or constructive environment is problematic since there is no clear division among these categories. The degree of human participation in a simulation is infinitely variable, as is the degree of equipment realism. The categorization of simulations also lacks a category for simulated people working real equipment.

This article documents the effort of the Health Level Seven (HL7) community, and specifically the former HL7 Architecture Board (ArB), to develop an interoperability framework that would support services, messages, and the Clinical Document Architecture (CDA) ISO 10871.

The JAUS Tool Set (JTS) is a software engineering tool for the design of software services used in a distributed computing environment. JTS provides a Graphical User Interface (GUI) and supporting tools for the rapid design, documentation, and implementation of service interfaces that adhere to the Society of Automotive Engineers' standard AS5684A, the JAUS Service Interface Design Language (JSIDL). JTS is designed to support the modeling, analysis, implementation, and testing of the protocol for an entire distributed system.

Model Driven Interoperability (MDI) is a methodological framework, which provides a conceptual and technical support to make interoperable enterprises using ontologies and semantic annotations, following model driven development (MDD) principles.

Medical device connectivity is the establishment and maintenance of a connection through which data is transferred between a medical device, such as a patient monitor, and an information system. The term is used interchangeably with biomedical device connectivity or biomedical device integration. By eliminating the need for manual data entry, potential benefits include faster and more frequent data updates, diminished human error, and improved workflow efficiency.

Unified interoperability is the property of a system that allows for the integration of real-time and non-real time communications, activities, data, and information services and the display and coordination of those services across systems and devices. Unified interoperability provides the capability to communicate and exchange processing across different applications, data, and infrastructure.

MOOSE is an object-oriented C++ finite element framework for the development of tightly coupled multiphysics solvers from Idaho National Laboratory. MOOSE makes use of the PETSc non-linear solver package and libmesh to provide the finite element discretization.

System-level simulation (SLS) is a collection of practical methods used in the field of systems engineering, in order to simulate, with a computer, the global behavior of large cyber-physical systems.

The CAPE-OPEN Interface Standard consists of a series of specifications to expand the range of application of process simulation technologies. The CAPE-OPEN specifications define a set of software interfaces that allow plug and play inter-operability between a given Process Modelling Environment and a third-party Process Modelling Component.

In numerical analysis, multi-time-step integration, also referred to as multiple-step or asynchronous time integration, is a numerical time-integration method that uses different time-steps or time-integrators for different parts of the problem. There are different approaches to multi-time-step integration. They are based on domain decomposition and can be classified into strong (monolithic) or weak (staggered) schemes. Using different time-steps or time-integrators in the context of a weak algorithm is rather straightforward, because the numerical solvers operate independently. However, this is not the case in a strong algorithm. In the past few years a number of research articles have addressed the development of strong multi-time-step algorithms. In either case, strong or weak, the numerical accuracy and stability needs to be carefully studied. Other approaches to multi-time-step integration in the context of operator splitting methods have also been developed; i.e., multi-rate GARK method and multi-step methods for molecular dynamics simulations.

References

  1. Steinbrink, Cornelius (2017). "Simulation-based Validation of Smart Grids – Status Quo and Future Research Trends". Industrial Applications of Holonic and Multi-Agent Systems. Lecture Notes in Computer Science, vol. 10444, pp. 171–185. arXiv:1710.02315. doi:10.1007/978-3-319-64635-0_13. ISBN 978-3-319-64634-3. S2CID 10022783.
  2. Andersson, Håkan (2018-09-11). A Co-Simulation Approach for Hydraulic Percussion Units. Linköping University Electronic Press. ISBN 978-91-7685-222-4.
  3. Nguyen, V.H.; Besanger, Y.; Tran, Q.T.; Nguyen, T.L. (29 November 2017). "On Conceptual Structuration and Coupling Methods of Co-Simulation Frameworks in Cyber-Physical Energy System Validation". Energies. 10 (12): 1977. doi:10.3390/en10121977. Material was copied from this source, which is available under a Creative Commons Attribution 4.0 International License.
  4. Heath, Michael T. Scientific Computing: An Introductory Survey. SIAM.