Software process simulation

Software process simulation (SPS) is the numerical evaluation of a mathematical model that imitates the behavior of the software development process being modeled. Like any simulation, SPS can capture the dynamic nature of software development and the uncertainty and randomness inherent in it. [1]

Uses of software process simulation

The following main purposes have been proposed for SPS: strategic management; planning; control and operational management; process improvement and technology adoption; understanding; and training and learning. [2] Simulation has also been used for software process education, including through simulation-based games. [3] [4]

How to do software process simulation

Software process simulation starts with identifying a question to be answered. The question could concern, for example, the assessment of an alternative, such as incorporating a new practice into the software development process. Introducing such a change in the actual development process is expensive, and if its consequences are not positive, the implications can be dire for the organization. Through simulation, an initial assessment of such a change can therefore be obtained from a model rather than from an active development project. Based on this problem description, an appropriate scope of the process is selected, and a simulation approach is chosen to model the development process. The resulting model is calibrated using empirical data and then used to conduct simulation-based investigations, as sketched below. A detailed description of each step in general can be found in Balci's work, [5] and a comprehensive overview specific to software process simulation can be found in Ali et al. [6]
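
To make the steps above concrete, the following is a minimal Monte Carlo sketch in Python of the kind of what-if question described: whether adding an inspection step reduces overall effort. All distributions and parameter values are illustrative assumptions, not calibrated data.

    import random

    # A minimal Monte Carlo sketch of a two-phase process (development -> test).
    # All parameter values below are illustrative assumptions, not calibrated data.
    def simulate_mean_effort(inspection, runs=10000, seed=1):
        rng = random.Random(seed)
        total = 0.0
        for _ in range(runs):
            size = rng.uniform(8, 12)                    # project size in KLOC
            defects = max(0.0, rng.gauss(20, 4)) * size  # injected defects
            effort = max(0.0, rng.gauss(10, 2)) * size   # development person-days
            if inspection:
                effort += 1.5 * size                     # cost of the inspection step
                defects *= 1 - rng.uniform(0.4, 0.6)     # share of defects removed early
            effort += 0.3 * defects                      # test/rework driven by remaining defects
            total += effort
        return total / runs

    print(f"without inspection: {simulate_mean_effort(False):.0f} person-days")
    print(f"with inspection:    {simulate_mean_effort(True):.0f} person-days")

Under these assumed parameters the inspection step pays for itself; varying such parameters in a calibrated model is far cheaper than experimenting on a live project.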

In a recent initiative by the ACM Special Interest Group on Software Engineering (SIGSOFT), a standard for assessing simulation-based scientific studies has been proposed. [7]

Examples of using software process simulation for practical issues in industrial settings

Reported industrial applications include the use of simulation to complement value stream mapping in software product development [8] and to support decisions about when to automate software testing. [9]

Key venues

Software process simulation has been an active research area for many decades. Some of the key venues include the International Conference on Software and Systems Process (ICSSP) [10] and its predecessor, the Workshop on Software Process Simulation Modeling (ProSim), held from 1998 to 2004. [11]

Related Research Articles

In computer science, static program analysis is the analysis of computer programs performed without executing them, in contrast with dynamic program analysis, which is performed on programs during their execution.
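
As a toy illustration of the static approach, the following Python sketch inspects a made-up snippet's syntax tree and flags calls to eval() without ever executing the code.

    import ast

    # A toy static analysis: find calls to eval() by walking the syntax tree,
    # never running the program under analysis. The snippet is a made-up example.
    SOURCE = '''
    x = input()
    result = eval(x)
    '''

    tree = ast.parse(SOURCE)
    for node in ast.walk(tree):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id == "eval"):
            print(f"line {node.lineno}: call to eval() detected without running the code")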

Code review is a software quality assurance activity in which one or several people check a program, mainly by viewing and reading parts of its source code, either after implementation or as an interruption of implementation. At least one of the persons must not be the code's author. The persons performing the checking, excluding the author, are called "reviewers".

In software project management, software testing, and software engineering, verification and validation (V&V) is the process of checking that a software system meets specifications and requirements so that it fulfills its intended purpose. It may also be referred to as software quality control. It is normally the responsibility of software testers as part of the software development lifecycle. In simple terms, software verification is: "Assuming we should build X, does our software achieve its goals without any bugs or gaps?" On the other hand, software validation is: "Was X what we should have built? Does X meet the high-level requirements?"

A multi-agent system is a computerized system composed of multiple interacting intelligent agents. Multi-agent systems can solve problems that are difficult or impossible for an individual agent or a monolithic system to solve. Intelligence may include methodic, functional, procedural approaches, algorithmic search or reinforcement learning.

An agent-based model (ABM) is a computational model for simulating the actions and interactions of autonomous agents in order to understand the behavior of a system and what governs its outcomes. It combines elements of game theory, complex systems, emergence, computational sociology, multi-agent systems, and evolutionary programming. Monte Carlo methods are used to understand the stochasticity of these models. Particularly within ecology, ABMs are also called individual-based models (IBMs). A review of recent literature on individual-based models, agent-based models, and multiagent systems shows that ABMs are used in many scientific domains including biology, ecology and social science. Agent-based modeling is related to, but distinct from, the concept of multi-agent systems or multi-agent simulation in that the goal of ABM is to search for explanatory insight into the collective behavior of agents obeying simple rules, typically in natural systems, rather than in designing agents or solving specific practical or engineering problems.
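
A minimal, hypothetical sketch of the idea in Python: agents of two types on a ring follow one simple local rule (move if both neighbours differ from you), and clustering emerges at the collective level. All parameters are arbitrary.

    import random

    # Minimal agent-based model: two agent types on a ring; an agent is unhappy
    # when both neighbours differ from it, and unhappy agents swap to random spots.
    rng = random.Random(0)
    N = 60
    agents = [rng.choice("AB") for _ in range(N)]

    def unhappy(i):
        return agents[i - 1] != agents[i] and agents[(i + 1) % N] != agents[i]

    for _ in range(2000):
        movers = [i for i in range(N) if unhappy(i)]
        if not movers:
            break                                    # everyone is content
        i, j = rng.choice(movers), rng.randrange(N)
        agents[i], agents[j] = agents[j], agents[i]  # relocate an unhappy agent

    print("".join(agents))   # runs of A and B grow: clustering emerges from a local rule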

The cleanroom software engineering process is a software development process intended to produce software with a certifiable level of reliability. The cleanroom process was originally developed by Harlan Mills and several of his colleagues including Alan Hevner at IBM. The focus of the cleanroom process is on defect prevention, rather than defect removal. The name "cleanroom" was chosen to evoke the cleanrooms used in the electronics industry to prevent the introduction of defects during the fabrication of semiconductors. The cleanroom process first saw use in the mid to late 1980s. Demonstration projects within the military began in the early 1990s. Recent work on the cleanroom process has examined fusing cleanroom with the automated verification capabilities provided by specifications expressed in CSP.

Value-stream mapping, also known as "material- and information-flow mapping", is a lean-management method for analyzing the current state and designing a future state for the series of events that take a product or service from the beginning of the specific process until it reaches the customer. A value stream map is a visual tool that displays all critical steps in a specific process and easily quantifies the time and volume taken at each stage. Value stream maps show the flow of both materials and information as they progress through the process.
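
The quantitative side of a value stream map can be sketched in a few lines of Python; the stages, processing times, and waiting times below are invented for illustration.

    # Value-stream arithmetic: value-added time vs. total lead time per stage.
    # Stage names and hours are made-up examples.
    stages = [
        # (stage, processing hours, waiting hours before the next stage)
        ("specify feature", 4, 40),
        ("implement",      16, 24),
        ("code review",     2, 30),
        ("test",            6, 12),
        ("deploy",          1,  0),
    ]

    value_added = sum(p for _, p, _ in stages)
    lead_time = sum(p + w for _, p, w in stages)
    print(f"value-added time:         {value_added} h")
    print(f"total lead time:          {lead_time} h")
    print(f"process-cycle efficiency: {value_added / lead_time:.0%}")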

In computer science, fault injection is a testing technique for understanding how computing systems behave when stressed in unusual ways. This can be achieved using physical- or software-based means, or using a hybrid approach. Widely studied physical fault injections include the application of high voltages, extreme temperatures and electromagnetic pulses on electronic components, such as computer memory and central processing units. By exposing components to conditions beyond their intended operating limits, computing systems can be coerced into mis-executing instructions and corrupting critical data.
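
A minimal software-based sketch, with a hypothetical fetch_record dependency and consumer: the dependency is wrapped so that it fails with a given probability, and the caller's behaviour under stress is observed.

    import random

    def fetch_record(key):
        return {"key": key, "value": 42}           # the real, reliable dependency

    def inject_faults(func, failure_rate, rng):
        # Wrap func so that it raises with the given probability.
        def faulty(*args, **kwargs):
            if rng.random() < failure_rate:
                raise IOError("injected fault")
            return func(*args, **kwargs)
        return faulty

    def consumer(fetch):
        try:
            return fetch("user-1")["value"]
        except IOError:
            return None                            # graceful-degradation path

    rng = random.Random(1)
    faulty_fetch = inject_faults(fetch_record, failure_rate=0.3, rng=rng)
    print([consumer(faulty_fetch) for _ in range(10)])   # mix of 42s and Nones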

Search-based software engineering (SBSE) applies metaheuristic search techniques such as genetic algorithms, simulated annealing and tabu search to software engineering problems. Many activities in software engineering can be stated as optimization problems. Optimization techniques of operations research such as linear programming or dynamic programming are often impractical for large scale software engineering problems because of their computational complexity or their assumptions on the problem structure. Researchers and practitioners use metaheuristic search techniques, which impose few assumptions on the problem structure, to find near-optimal or "good-enough" solutions.
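
As an illustration, the sketch below uses simple stochastic hill climbing to shrink a test suite while preserving requirement coverage; the coverage data is a made-up example.

    import random

    coverage = {                         # test -> requirements it covers (invented)
        "t1": {1, 2}, "t2": {2, 3}, "t3": {3, 4},
        "t4": {1, 4}, "t5": {5},    "t6": {2, 5},
    }
    all_reqs = set().union(*coverage.values())

    def covers_all(suite):
        return set().union(*(coverage[t] for t in suite)) == all_reqs

    rng = random.Random(0)
    suite = set(coverage)                # start from the full suite (always feasible)
    for _ in range(200):                 # hill climbing: try dropping one test at a time
        t = rng.choice(sorted(suite))
        if len(suite) > 1 and covers_all(suite - {t}):
            suite -= {t}                 # accept only coverage-preserving moves
    print(sorted(suite))                 # a near-minimal test suite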

In software development, effort estimation is the process of predicting the most realistic amount of effort required to develop or maintain software based on incomplete, uncertain and noisy input. Effort estimates may be used as input to project plans, iteration plans, budgets, investment analyses, pricing processes and bidding rounds.
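
One classic algorithmic approach can be sketched with the basic COCOMO model, effort = a * KLOC^b. The coefficients below are the published basic-COCOMO values for "organic" projects; treat them as illustrative rather than calibrated for any particular organization.

    # Basic-COCOMO-style estimate: effort in person-months from size in KLOC.
    def estimate_effort(kloc, a=2.4, b=1.05):      # "organic" project coefficients
        return a * kloc ** b

    def estimate_duration(effort_pm, c=2.5, d=0.38):
        return c * effort_pm ** d                  # schedule in months

    effort = estimate_effort(32)                   # a hypothetical 32 KLOC project
    print(f"effort:   {effort:.1f} person-months")
    print(f"schedule: {estimate_duration(effort):.1f} months")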

Reverse computation is a software application of the concept of reversible computing.

FlexSim is a discrete-event simulation software package developed by FlexSim Software Products, Inc. The FlexSim product family currently includes the general-purpose FlexSim product and a healthcare systems modeling environment.

DevOps is a set of practices that combines software development (Dev) and IT operations (Ops). It aims to shorten the systems development life cycle and provide continuous delivery with high software quality. DevOps is complementary with Agile software development; several DevOps aspects came from the Agile methodology.

Continuous delivery (CD) is a software engineering approach in which teams produce software in short cycles, ensuring that the software can be reliably released at any time and that releases require no manual steps. It aims at building, testing, and releasing software with greater speed and frequency. The approach helps reduce the cost, time, and risk of delivering changes by allowing for more incremental updates to applications in production. A straightforward and repeatable deployment process is important for continuous delivery.

Continuous deployment (CD) is a software engineering approach in which software functionalities are delivered frequently and through automated deployments.

INGENIAS is an open-source software framework for the analysis, design and implementation of multi-agent systems (MAS).

Model-based systems engineering (MBSE), according to the International Council on Systems Engineering (INCOSE), is the formalized application of modeling to support system requirements, design, analysis, verification and validation activities beginning in the conceptual design phase and continuing throughout development and later life cycle phases. MBSE is a technical approach to systems engineering that focuses on creating and exploiting domain models as the primary means of information exchange, rather than on document-based information exchange. MBSE technical approaches are commonly applied to a wide range of industries with complex systems, such as aerospace, defense, rail, automotive, manufacturing, etc.

System-level simulation (SLS) is a collection of practical methods used in the field of systems engineering, in order to simulate, with a computer, the global behavior of large cyber-physical systems.

Automatic bug-fixing is the automatic repair of software bugs without the intervention of a human programmer. It is also commonly referred to as automatic patch generation, automatic bug repair, or automatic program repair. The typical goal of such techniques is to automatically generate correct patches to eliminate bugs in software programs without causing software regression.
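
A toy generate-and-validate sketch of the idea: candidate patches (here, alternative comparison operators in a hypothetical max-like function) are generated, and the first one that passes the test suite is kept. Real repair tools search over program abstract syntax trees; this only illustrates the loop.

    import operator

    def make_max(cmp):                   # template with an unknown comparison
        return lambda a, b: a if cmp(a, b) else b

    tests = [((2, 3), 3), ((5, 1), 5), ((4, 4), 4)]

    def passes(candidate):
        return all(candidate(*args) == want for args, want in tests)

    buggy = make_max(operator.lt)        # the bug: uses < where > is needed
    print("buggy version passes tests:", passes(buggy))

    # Generate candidate patches and validate each against the test suite.
    for name, cmp in [("<", operator.lt), ("<=", operator.le),
                      (">", operator.gt), (">=", operator.ge)]:
        if passes(make_max(cmp)):
            print(f"patch found: replace the comparison with '{name}'")
            break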

Predictive engineering analytics (PEA) is a development approach for the manufacturing industry that helps with the design of complex products. It concerns the introduction of new software tools, the integration between those, and a refinement of simulation and testing processes to improve collaboration between analysis teams that handle different applications. This is combined with intelligent reporting and data analytics. The objective is to let simulation drive the design, to predict product behavior rather than to react on issues which may arise, and to install a process that lets design continue after product delivery.

References

  1. Ali, N.B.; Petersen, K.; Wohlin, C. (2014). "A Systematic Literature Review on the Industrial Use of Software Process Simulation". Journal of Systems and Software. 97: 65–85. CiteSeerX 10.1.1.717.3797. doi:10.1016/j.jss.2014.06.059.
  2. Kellner, Marc I.; Madachy, Raymond J.; Raffo, David M. (1999). "Software process simulation modeling: Why? What? How?". Journal of Systems and Software. 46 (2–3): 91–105. CiteSeerX 10.1.1.587.8752. doi:10.1016/s0164-1212(99)00003-5.
  3. "Use of simulation for software process education: a case study" (PDF).
  4. von Wangenheim, C.G.; Shull, F. (2009). "To Game or Not to Game?". IEEE Software. 26 (2): 92–94. doi:10.1109/MS.2009.54. S2CID 13354988.
  5. Balci, Osman (2012). "A Life Cycle for Modeling and Simulation". Simulation: Transactions of the Society for Modeling and Simulation International. 88 (7): 870–883.
  6. Ali, N.B.; Petersen, K. (2012). "A Consolidated Process for Software Process Simulation: State of the Art and Industry Experience". 38th EUROMICRO Conference on Software Engineering and Advanced Applications (SEAA). pp. 327–336. doi:10.1109/SEAA.2012.69. http://www.bth.se/fou/forskinfo.nsf/0/7e2b9e104c9956cec1257acf006a1282/$file/Consolidated%20process.pdf
  7. Franca, Breno. "Simulation (quantitative)". Empirical standards. Retrieved 25 February 2021.
  8. Ali, N.B.; Petersen, K.; de França, B.B.N. (2015). "Evaluation of simulation-assisted value stream mapping for software product development: Two industrial cases". Information and Software Technology. 68: 45–61. doi:10.1016/j.infsof.2015.08.005.
  9. Garousi, Vahid; Pfahl, Dietmar (2015). "When to automate software testing? A decision-support approach based on process simulation". Journal of Software: Evolution and Process.
  10. "ICSSP 2015". Archived from the original on 2015-02-21. Retrieved 2014-12-01.
  11. http://www.verlag.fraunhofer.de/bookshop/artikel.jsp?v=220684