Software process simulation

Like any simulation, software process simulation (SPS) is the numerical evaluation of a mathematical model that imitates the behavior of the software development process being modeled. SPS can model the dynamic nature of software development and handle the uncertainty and randomness inherent in it. [1]
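
As a minimal illustration of this idea (an illustrative sketch in Python, not taken from the cited sources; the distribution parameters are hypothetical), the following fragment treats per-task effort as a random variable and uses Monte Carlo sampling to estimate the spread of total project effort, giving both a median and a risk-aware 90th-percentile estimate:

```python
import random

def simulate_project(num_tasks=20, runs=1000, seed=42):
    """Monte Carlo estimate of total project effort (person-days)."""
    random.seed(seed)
    durations = []
    for _ in range(runs):
        # Each task's effort is drawn from a triangular distribution:
        # optimistic 2, most likely 5, pessimistic 10 person-days (hypothetical values).
        total = sum(random.triangular(2, 10, 5) for _ in range(num_tasks))
        durations.append(total)
    durations.sort()
    return {
        "median": durations[runs // 2],
        "p90": durations[int(runs * 0.9)],  # risk-aware 90th-percentile estimate
    }

print(simulate_project())
```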

Uses of software process simulation

The following main purposes have been proposed for SPS: strategic management; planning; control and operational management; process improvement and technology adoption; understanding; and training and learning. [2]

How to do software process simulation

Software process simulation starts with identifying a question to be answered. The question could, for example, concern the assessment of an alternative, such as incorporating a new practice in the software development process. Introducing such changes in the actual development process would be expensive, and if the consequences of the change are not positive the implications can be dire for the organization. Through simulation, an initial assessment of such changes is therefore attempted on the model instead of an active development project. Based on this problem description, an appropriate scope of the process is chosen and a simulation approach is selected to model the development process. The model is then calibrated using empirical data and used to conduct simulation-based investigations. A detailed description of each step in general can be found in Balci's work, [5] and a comprehensive overview specific to software process simulation can be found in Ali et al. [6]
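
As a sketch of what such a simulation-based investigation might look like in practice (a hypothetical Python model with made-up calibration values, not a published model), the following fragment compares the current process against a proposed change, here the introduction of code reviews, by simulating total effort under both configurations:

```python
import random

def simulate_release(review_enabled, tasks=30, runs=2000, seed=1):
    """Mean total effort (person-days) over many simulated releases."""
    random.seed(seed)
    results = []
    for _ in range(runs):
        effort = 0.0
        for _ in range(tasks):
            dev = random.triangular(1, 6, 3)      # development effort per task
            defects = random.randint(0, 3)        # defects injected per task
            if review_enabled:
                dev += 0.5                        # assumed review overhead per task
                # Assume each defect is caught in review with 60% probability.
                defects = sum(1 for _ in range(defects) if random.random() > 0.6)
            effort += dev + defects * random.triangular(0.5, 4, 1.5)  # rework cost
        results.append(effort)
    return sum(results) / runs

baseline = simulate_release(review_enabled=False)
with_reviews = simulate_release(review_enabled=True)
print(f"mean effort without reviews: {baseline:.1f} person-days")
print(f"mean effort with reviews:    {with_reviews:.1f} person-days")
```

Comparing the two simulated configurations gives an initial, inexpensive indication of whether the proposed practice is worth piloting on a real project.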

In a recent initiative by the ACM Special Interest Group on Software Engineering (SIGSOFT), a standard for assessing simulation-based scientific studies has been proposed. [7]

Examples of using software process simulation for practical issues in industrial settings

Reported industrial applications include simulation-assisted value stream mapping for software product development [8] and simulation-based decision support for deciding when to automate software testing. [9]

Key venues

Software process simulation has been an active research area for many decades. Some of the key venues include the International Conference on Software and Systems Process (ICSSP) [10] and its predecessor, the Workshop on Software Process Simulation Modeling (ProSim), which ran from 1998 to 2004. [11]

Related research articles

In computer science, static program analysis is the analysis of computer programs performed without executing them, in contrast with dynamic program analysis, which is performed on programs during their execution in the integrated environment.

Software architecture is the set of structures needed to reason about a software system and the discipline of creating such structures and systems. Each structure comprises software elements, relations among them, and properties of both elements and relations.

Code review is a software quality assurance activity in which one or more people check a program, mainly by viewing and reading parts of its source code, either after implementation or as an interruption of implementation. At least one of the persons must not have authored the code. The persons performing the checking, excluding the author, are called "reviewers".

In software project management, software testing, and software engineering, verification and validation is the process of checking that a software system meets specifications and requirements so that it fulfills its intended purpose. It may also be referred to as software quality control. It is normally the responsibility of software testers as part of the software development lifecycle. In simple terms, software verification is: "Assuming we should build X, does our software achieve its goals without any bugs or gaps?" On the other hand, software validation is: "Was X what we should have built? Does X meet the high-level requirements?"

A multi-agent system is a computerized system composed of multiple interacting intelligent agents. Multi-agent systems can solve problems that are difficult or impossible for an individual agent or a monolithic system to solve. Intelligence may include methodic, functional, procedural approaches, algorithmic search or reinforcement learning.

Prognostics is an engineering discipline focused on predicting the time at which a system or a component will no longer perform its intended function. This lack of performance is most often a failure beyond which the system can no longer be used to meet desired performance. The predicted time then becomes the remaining useful life (RUL), which is an important concept in decision making for contingency mitigation. Prognostics predicts the future performance of a component by assessing the extent of deviation or degradation of a system from its expected normal operating conditions. The science of prognostics is based on the analysis of failure modes, detection of early signs of wear and aging, and fault conditions. An effective prognostics solution is implemented when there is sound knowledge of the failure mechanisms that are likely to cause the degradations leading to eventual failures in the system. It is therefore necessary to have initial information on the possible failures in a product. Such knowledge is important to identify the system parameters that are to be monitored. A potential use of prognostics is in condition-based maintenance. The discipline that links studies of failure mechanisms to system lifecycle management is often referred to as prognostics and health management (PHM), sometimes also system health management (SHM) or—in transportation applications—vehicle health management (VHM) or engine health management (EHM). Technical approaches to building models in prognostics can be categorized broadly into data-driven approaches, model-based approaches, and hybrid approaches.
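
As a toy sketch of the data-driven approach (hypothetical readings and failure threshold, and a deliberately simple linear model), the following Python fragment fits a trend to observed degradation measurements and extrapolates it to a failure threshold to estimate the remaining useful life:

```python
def estimate_rul(times, wear, failure_threshold):
    """Fit wear = a*t + b by least squares and extrapolate to the threshold."""
    n = len(times)
    mean_t = sum(times) / n
    mean_w = sum(wear) / n
    a = sum((t - mean_t) * (w - mean_w) for t, w in zip(times, wear)) / \
        sum((t - mean_t) ** 2 for t in times)
    b = mean_w - a * mean_t
    t_fail = (failure_threshold - b) / a
    return t_fail - times[-1]   # remaining time from the last observation

# Hypothetical degradation readings (e.g. vibration amplitude) at hourly intervals.
hours = [0, 1, 2, 3, 4, 5]
wear = [0.10, 0.13, 0.17, 0.20, 0.24, 0.27]
print(f"Estimated RUL: {estimate_rul(hours, wear, failure_threshold=0.5):.1f} hours")
```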

Meta-process modeling is a type of metamodeling used in software engineering and systems engineering for the analysis and construction of models applicable and useful to some predefined problems.

Value-stream mapping, also known as material- and information-flow mapping, is a lean-management method for analyzing the current state and designing a future state for the series of events that take a product or service from the beginning of the specific process until it reaches the customer. A value stream map is a visual tool that displays all critical steps in a specific process and easily quantifies the time and volume taken at each stage. Value stream maps show the flow of both materials and information as they progress through the process.
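
As a small illustration (hypothetical stages and times), the following Python fragment quantifies a value stream by summing value-adding process time and non-value-adding wait time to obtain the total lead time and process-cycle efficiency:

```python
# Each stage: (name, value-adding process time in hours, non-value-adding wait time in hours).
stages = [
    ("specify feature", 4, 16),
    ("implement",       12, 8),
    ("code review",     2, 24),
    ("test",            6, 12),
    ("deploy",          1, 4),
]

process_time = sum(p for _, p, _ in stages)
lead_time = process_time + sum(w for _, _, w in stages)
print(f"total lead time:           {lead_time} h")
print(f"value-adding process time: {process_time} h")
print(f"process-cycle efficiency:  {process_time / lead_time:.0%}")
```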

Quality engineering is the discipline of engineering concerned with the principles and practice of product and service quality assurance and control. In software development, it is the management, development, operation and maintenance of IT systems and enterprise architectures to a high quality standard.

Search-based software engineering (SBSE) applies metaheuristic search techniques such as genetic algorithms, simulated annealing and tabu search to software engineering problems. Many activities in software engineering can be stated as optimization problems. Optimization techniques of operations research such as linear programming or dynamic programming are often impractical for large scale software engineering problems because of their computational complexity or their assumptions on the problem structure. Researchers and practitioners use metaheuristic search techniques, which impose few assumptions on the problem structure, to find near-optimal or "good-enough" solutions.
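
As a minimal sketch of the search-based idea (hypothetical coverage data and a deliberately simple fitness function), the following Python fragment uses a hill climb to look for a small subset of test cases that still covers every requirement:

```python
import random

# Hypothetical coverage data: which requirements each test case exercises.
coverage = {
    "t1": {"r1", "r2"}, "t2": {"r2", "r3"}, "t3": {"r1", "r4"},
    "t4": {"r3", "r4", "r5"}, "t5": {"r5"}, "t6": {"r1", "r5"},
}
all_reqs = set().union(*coverage.values())

def fitness(selection):
    """Penalise uncovered requirements heavily and suite size lightly."""
    covered = set().union(*(coverage[t] for t in selection)) if selection else set()
    return -10 * len(all_reqs - covered) - len(selection)

def hill_climb(steps=500, seed=0):
    random.seed(seed)
    current = set(random.sample(sorted(coverage), 3))   # random starting suite
    for _ in range(steps):
        neighbour = set(current)
        neighbour.symmetric_difference_update({random.choice(sorted(coverage))})  # flip one test
        if fitness(neighbour) >= fitness(current):
            current = neighbour
    return current

best = hill_climb()
print("selected tests:", sorted(best), "fitness:", fitness(best))
```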

A fuzzy cognitive map (FCM) is a cognitive map within which the relations between the elements of a "mental landscape" can be used to compute the "strength of impact" of these elements. Fuzzy cognitive maps were introduced by Bart Kosko, who added the computational element to the cognitive maps that Robert Axelrod had introduced as a formal way of representing social scientific knowledge and modeling decision making in social and political systems.
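
As a small sketch (hypothetical concepts and weights, using one common update rule among several variants), the following Python fragment iterates concept activations in a fuzzy cognitive map until they settle:

```python
import math

concepts = ["schedule pressure", "overtime", "defect rate", "rework"]
# weights[i][j] is the causal influence of concept i on concept j (hypothetical values).
weights = [
    [0.0, 0.8, 0.3, 0.0],   # schedule pressure drives overtime and, weakly, defects
    [0.0, 0.0, 0.6, 0.0],   # overtime drives the defect rate
    [0.0, 0.0, 0.0, 0.9],   # defects drive rework
    [0.5, 0.0, 0.0, 0.0],   # rework feeds back into schedule pressure
]

def squash(x):
    return 1.0 / (1.0 + math.exp(-x))   # keep activations in (0, 1)

state = [1.0, 0.0, 0.0, 0.0]             # start with high schedule pressure
for _ in range(20):                       # iterate until the map settles
    state = [squash(state[j] + sum(weights[i][j] * state[i] for i in range(len(state))))
             for j in range(len(state))]

for name, value in zip(concepts, state):
    print(f"{name}: {value:.2f}")
```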

In software development, effort estimation is the process of predicting the most realistic amount of effort required to develop or maintain software based on incomplete, uncertain and noisy input. Effort estimates may be used as input to project plans, iteration plans, budgets, investment analyses, pricing processes and bidding rounds.
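
As a simple illustration (a generic three-point technique with hypothetical task estimates, not a specific published model), the following Python fragment combines optimistic, most likely and pessimistic values into an expected effort with an uncertainty band:

```python
def pert_estimate(optimistic, most_likely, pessimistic):
    """Classic three-point estimate: weighted mean and a simple spread measure."""
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return expected, std_dev

# Hypothetical per-task estimates in person-days: (optimistic, most likely, pessimistic).
tasks = [(3, 5, 10), (8, 12, 25), (1, 2, 4)]
estimates = [pert_estimate(*t) for t in tasks]
expected_total = sum(e for e, _ in estimates)
# Standard deviations combine as the square root of the summed variances.
std_total = sum(s ** 2 for _, s in estimates) ** 0.5
print(f"expected effort: {expected_total:.1f} person-days (+/- {std_total:.1f})")
```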

FlexSim is a discrete-event simulation software package developed by FlexSim Software Products, Inc. The FlexSim product family currently includes the general-purpose FlexSim product and a healthcare systems modeling environment.

Continuous delivery (CD) is a software engineering approach in which teams produce software in short cycles, ensuring that the software can be reliably released at any time and, following a pipeline through a "production-like environment", without doing so manually. It aims at building, testing, and releasing software with greater speed and frequency. The approach helps reduce the cost, time, and risk of delivering changes by allowing for more incremental updates to applications in production. A straightforward and repeatable deployment process is important for continuous delivery.

Continuous deployment (CD) is a software engineering approach in which software functionalities are delivered frequently and through automated deployments.

INGENIAS is an open-source software framework for the analysis, design and implementation of multi-agent systems (MAS).

System-level simulation (SLS) is a collection of practical methods used in the field of systems engineering, in order to simulate, with a computer, the global behavior of large cyber-physical systems.

Predictive engineering analytics (PEA) is a development approach for the manufacturing industry that helps with the design of complex products. It concerns the introduction of new software tools, the integration between those, and a refinement of simulation and testing processes to improve collaboration between analysis teams that handle different applications. This is combined with intelligent reporting and data analytics. The objective is to let simulation drive the design, to predict product behavior rather than to react on issues which may arise, and to install a process that lets design continue after product delivery.

Vensim is simulation software developed by Ventana Systems. It primarily supports continuous simulation, with some discrete event and agent-based modelling capabilities. It is available commercially and as a free "Personal Learning Edition".

The Lincoln Adaptable Real-time Information Assurance Testbed (LARIAT) is a physical computing platform developed by the MIT Lincoln Laboratory as a testbed for network security applications. Use of the platform is restricted to the United States military, though some academic organizations can also use the platform under certain conditions.

References

  1. Ali, NB; Petersen, K; Wohlin, C (2014). "A Systematic Literature Review on the Industrial Use of Software Process Simulation". Journal of Systems and Software. 97: 65–85. CiteSeerX   10.1.1.717.3797 . doi:10.1016/j.jss.2014.06.059.
  2. Kellner, Marc I; Madachy, Raymond J; Raffo, David M (1999). "Software process simulation modeling: Why? What? How?". Journal of Systems and Software. 46 (2–3): 91–105. CiteSeerX   10.1.1.587.8752 . doi:10.1016/s0164-1212(99)00003-5.
  3. "Use of simulation for software process education: a case study" (PDF). Archived from the original (PDF) on 2016-03-04. Retrieved 2014-12-01.
  4. von Wangenheim, C.G.; Shull, F. (2009). "To Game or Not to Game?". IEEE Software. 26 (2): 92–94. doi:10.1109/MS.2009.54. S2CID   13354988.
  5. Balci, Osman (2012). "A Life Cycle for Modeling and Simulation". Simulation: Transactions of the Society for Modeling and Simulation International. 88 (7): 870–883.
  6. Ali, N.B.; Petersen, K. (2012). "A Consolidated Process for Software Process Simulation: State of the Art and Industry Experience". 38th EUROMICRO Conference on Software Engineering and Advanced Applications (SEAA). pp. 327–336. doi:10.1109/SEAA.2012.69. http://www.bth.se/fou/forskinfo.nsf/0/7e2b9e104c9956cec1257acf006a1282/$file/Consolidated%20process.pdf
  7. Franca, Breno. "Simulation (quantitative)". Empirical standards. Retrieved 25 February 2021.
  8. Ali, NB; Petersen, K; de França, BBN (2015). "Evaluation of simulation-assisted value stream mapping for software product development: Two industrial cases". Information and Software Technology. 68: 45–61. doi:10.1016/j.infsof.2015.08.005.
  9. Garousi, Vahid; Pfahl, Dietmar (2015). "When to automate software testing? A decision‐support approach based on process simulation". Journal of Software: Evolution and Process.
  10. "Icssp2015". Archived from the original on 2015-02-21. Retrieved 2014-12-01.
  11. http://www.verlag.fraunhofer.de/bookshop/artikel.jsp?v=220684