The term process model is used in various contexts. For example, in business process modeling the enterprise process model is often referred to as the business process model.
Process models are processes of the same nature that are classified together into a model. Thus, a process model is a description of a process at the type level. Since the process model is at the type level, a process is an instantiation of it. The same process model is used repeatedly for the development of many applications and thus has many instantiations. One possible use of a process model is to prescribe how things must/should/could be done, in contrast to the process itself, which is really what happens. A process model is roughly an anticipation of what the process will look like. What the process shall be will be determined during actual system development. [2]
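To make the type/instance distinction concrete, the sketch below (hypothetical names, not from the cited literature) represents a process model as a reusable type-level description and a process as one instantiation of it:

```python
# A minimal sketch (hypothetical names): a process model as a type-level
# description, and a process as one concrete instantiation of it.

from dataclasses import dataclass, field


@dataclass
class ProcessModel:
    """Type level: prescribes the steps a process should follow."""
    name: str
    steps: list[str]


@dataclass
class Process:
    """Instance level: one concrete enactment of a process model."""
    model: ProcessModel
    project: str
    completed_steps: list[str] = field(default_factory=list)

    def perform(self, step: str) -> None:
        # The model prescribes what may happen; the instance records what does.
        if step not in self.model.steps:
            raise ValueError(f"{step!r} is not prescribed by {self.model.name}")
        self.completed_steps.append(step)


# The same model is instantiated repeatedly, once per development effort.
review = ProcessModel("code review", ["open PR", "review", "merge"])
p1 = Process(review, project="billing service")
p2 = Process(review, project="auth service")
p1.perform("open PR")
```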
The goals of a process model are to be:
- Descriptive: to track what actually happens during a process, taking the point of view of an external observer who looks at the way a process has been performed and determines the improvements needed to make it perform more effectively or efficiently.
- Prescriptive: to define the desired processes and how they should/could/might be performed, and to establish rules, guidelines, and behavior patterns which, if followed, would lead to the desired process performance.
- Explanatory: to provide explanations about the rationale of processes, and to explore and evaluate the several possible courses of action based on rational arguments.
From a theoretical point of view, meta-process modeling explains the key concepts needed to describe what happens in the development process, on what, when it happens, and why. From an operational point of view, meta-process modeling aims at providing guidance for method engineers and application developers. [1]
The activity of modeling a business process usually signals a need to change processes or to identify issues to be corrected. This transformation may or may not require IT involvement, although that is a common driver for the need to model a business process. Change management programmes are typically required to put the improved processes into practice. With advances in technology from larger platform vendors, the vision of business process models (BPM) becoming fully executable (and capable of round-trip engineering) is coming closer to reality every day. Supporting technologies include Unified Modeling Language (UML), model-driven architecture, and service-oriented architecture.
Process modeling addresses the process aspects of an enterprise business architecture, leading to an all-encompassing enterprise architecture. The relationships of business processes to the rest of the enterprise's systems, data, organizational structure, strategies, and so on create greater capabilities for analyzing and planning change. One real-world example is corporate mergers and acquisitions: understanding the processes in both companies in detail allows management to identify redundancies, resulting in a smoother merger.
Process modeling has always been a key aspect of business process reengineering and of continuous improvement approaches such as Six Sigma.
There are five types of coverage where the term process model has been defined differently: [3] activity-oriented, product-oriented, decision-oriented, context-oriented, and strategy-oriented.
Processes can be of different kinds; [2] these definitions "correspond to the various ways in which a process can be modelled": strategic processes, which investigate alternative ways of doing a thing and eventually produce a plan for doing it; tactical processes, which help in the achievement of a plan; and implementation processes, which are the lowest-level processes directly concerned with the details of the what and how of plan implementation.
Granularity refers to the level of detail of a process model and affects the kind of guidance, explanation and trace that can be provided. Coarse granularity restricts these to a rather limited level of detail whereas fine granularity provides more detailed capability. The nature of granularity needed is dependent on the situation at hand. [2]
Project managers, customer representatives, and general, top-level, or middle management require a rather coarse-grained process description, as they want to gain an overview of time, budget, and resource planning for their decisions. In contrast, software engineers, users, testers, analysts, or software system architects prefer a fine-grained process model, where the details of the model can provide them with instructions and important execution dependencies, such as the dependencies between people.
While notations for fine-grained models exist, most traditional process models are coarse-grained descriptions. Ideally, process models should provide a wide range of granularity (e.g., Process Weaver). [2] [7]
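To illustrate granularity, the sketch below (hypothetical activity and step names) shows the same process at coarse and fine grain; the fine-grained form adds the steps and execution dependencies that developers need, while managers may only want the coarse view:

```python
# A minimal sketch (hypothetical content): one process described at two
# levels of granularity. Coarse grain names the activities; fine grain
# breaks each into steps with execution dependencies.

coarse = ["design", "implement", "test"]

fine = {
    "design":    {"steps": ["gather requirements", "draft architecture",
                            "review architecture"],
                  "depends_on": []},
    "implement": {"steps": ["write code", "code review"],
                  "depends_on": ["design"]},
    "test":      {"steps": ["write tests", "run tests", "triage failures"],
                  "depends_on": ["implement"]},
}

# A manager reads the coarse list; a developer expands each activity.
for activity in coarse:
    detail = fine[activity]
    print(activity, "->", detail["steps"], "after", detail["depends_on"])
```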
It was found that, while process models were prescriptive, departures from the prescription could occur in actual practice. [6] Thus, frameworks for adapting methods evolved so that systems development methods could be matched to specific organizational situations, thereby improving their usefulness. The development of such frameworks is also called situational method engineering.
Method construction approaches can be organized in a flexibility spectrum ranging from 'low' to 'high'. [8]
Rigid methods lie at the 'low' end of this spectrum, whereas modular method construction lies at the 'high' end. Rigid methods are completely pre-defined and leave little scope for adapting them to the situation at hand, while modular methods can be modified and augmented to fit a given situation. Between these extremes, selecting from rigid methods allows each project to choose its method from a panel of rigid, pre-defined methods; selecting a path within a method consists of choosing the appropriate path for the situation at hand; and, finally, selecting and tuning a method allows each project to select methods from different approaches and tune them to the project's needs. [9]
Discussing the quality of process models also requires addressing the quality of modeling techniques, because in most existing frameworks created for understanding quality, the line between the quality of modeling techniques and the quality of the models that result from their application is not clearly drawn. The two are therefore treated separately here. Various frameworks have been developed to help understand the quality of process modeling techniques; one example is the quality-based modeling evaluation framework, known as the Q-Me framework, which is argued to provide a set of well-defined quality properties and procedures that make an objective assessment of these properties possible. [10] This framework also has the advantage of providing a uniform and formal description of model elements within one or more model types using a single modeling technique. [10] In short, it enables the assessment of both the product quality and the process quality of modeling techniques with regard to a set of predefined properties.
Quality properties that relate to business process modeling techniques discussed in [10] are: expressiveness, arbitrariness, suitability, comprehensibility, coherence, completeness, efficiency, and effectiveness.
To assess the quality of the Q-ME framework, it was used to illustrate the quality of the Dynamic Essential Modelling of Organizations (DEMO) business modeling technique.
It is stated that the evaluation of the Q-ME framework against the DEMO modeling technique revealed shortcomings of Q-ME. One particular shortcoming is that it does not include quantifiable metrics to express the quality of a business modeling technique, which makes it hard to compare the quality of different techniques in an overall rating.
There is also a systematic approach to the quality measurement of modeling techniques, known as complexity metrics, suggested by Rossi et al. (1996). Meta-model techniques are used as the basis for computing these complexity metrics. In comparison to the quality framework proposed by Krogstie, this quality measurement focuses more on the technique level than on the individual model level. [11]
Cardoso, Mendling, Neumann, and Reijers (2006) used complexity metrics to measure the simplicity and understandability of a design. This is supported by later research by Mendling et al., who argued that, without quality metrics to help question the quality properties of a model, a simple process may be modeled in a complex and unsuitable way. This, in turn, can lead to lower understandability, higher maintenance costs, and perhaps inefficient execution of the process in question. [12]
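As an illustration of such metrics, the sketch below computes two simple measures over a small, hypothetical process graph: element counts and Cardoso's control-flow complexity (CFC), which charges each XOR-split its fan-out, each OR-split 2^n - 1 for fan-out n, and each AND-split 1. The data structure and function names are illustrative, not taken from the cited papers' tooling.

```python
# A minimal sketch (not the cited authors' tooling): a process model as
# nodes with a kind and outgoing edges, plus two simple quality metrics.

from collections import Counter

# node -> (kind, successors); kinds: "task", "xor", "or", "and"
model = {
    "receive order":  ("task", ["check stock"]),
    "check stock":    ("xor",  ["reject order", "ship goods"]),   # XOR-split
    "reject order":   ("task", []),
    "ship goods":     ("and",  ["send invoice", "update stock"]), # AND-split
    "send invoice":   ("task", []),
    "update stock":   ("task", []),
}

def count_metrics(model):
    """Simple count metrics: number of elements of each kind."""
    return Counter(kind for kind, _ in model.values())

def control_flow_complexity(model):
    """Cardoso's CFC: an XOR-split adds its fan-out, an OR-split adds
    2^n - 1, an AND-split adds 1 (splits = nodes with >1 successor)."""
    cfc = 0
    for kind, successors in model.values():
        n = len(successors)
        if n <= 1:
            continue
        if kind == "xor":
            cfc += n
        elif kind == "or":
            cfc += 2 ** n - 1
        elif kind == "and":
            cfc += 1
    return cfc

print(count_metrics(model))            # Counter({'task': 4, 'xor': 1, 'and': 1})
print(control_flow_complexity(model))  # 2 (XOR) + 1 (AND) = 3
```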
The quality of the modeling technique is thus important in creating models of quality, as it contributes to the correctness and usefulness of the models.
The earliest process models reflected the dynamics of the process, with a practical process obtained by instantiation in terms of relevant concepts, available technologies, specific implementation environments, process constraints, and so on. [13]
A great deal of research has been done on the quality of models, but less attention has been paid to the quality of process models. Quality issues of process models cannot be evaluated exhaustively; however, there are four main guidelines and frameworks in practice: top-down quality frameworks, bottom-up metrics related to quality aspects, empirical surveys related to modeling techniques, and pragmatic guidelines. [14]
Hommes quotes Wang et al. (1994) [11] in stating that the main quality characteristics of models can all be grouped under two headings: correctness and usefulness. Correctness ranges from the model's correspondence to the phenomenon being modeled to its correspondence to the syntactic rules of the modeling language, and it is independent of the purpose for which the model is used. Usefulness, by contrast, means that the model is helpful for the specific purpose for which it was constructed in the first place. Hommes also makes a further distinction between internal correctness (empirical, syntactic, and semantic quality) and external correctness (validity).
A common starting point for defining the quality of a conceptual model is to look at the linguistic properties of the modeling language, of which syntax and semantics are most often applied.
A broader approach is based on semiotics rather than linguistics, as was done by Krogstie with the top-down quality framework known as SEQUAL. [15] [16] It defines several quality aspects based on the relationships between a model, knowledge externalization, domain, a modeling language, and the activities of learning, taking action, and modeling.
The framework does not, however, provide ways to determine the various degrees of quality, but it has been used extensively for business process modeling in empirical tests. [17] According to earlier research by Moody et al., [18] which used the conceptual model quality framework proposed by Lindland et al. (1994) to evaluate the quality of process models, three levels of quality [19] were identified: syntactic quality, semantic quality, and pragmatic quality.
The research found the quality framework to be both easy to use and useful for evaluating the quality of process models; however, it had limitations regarding reliability and the identification of defects. These limitations led to the refinement of the framework through subsequent research by Krogstie. The refined framework is the SEQUAL framework (Krogstie et al. 1995, refined further by Krogstie & Jørgensen, 2002), which includes three more quality aspects.
The dimensions of the conceptual quality framework [20] are defined as sets of statements:
- Modeling domain: the set of all statements that are relevant and correct for describing the problem domain.
- Language extension: the set of all statements that are possible given the grammar and vocabulary of the modeling languages used.
- Model externalization: the conceptual representation of the problem domain, defined as the set of statements about the problem domain that are actually made.
- Social actor interpretation and technical actor interpretation: the sets of statements that actors, human model users and the tools that interact with the model respectively, 'think' the conceptual representation of the problem domain contains.
- Participant knowledge: the set of statements that the human actors involved in the modeling process believe should be made to represent the problem domain.
These quality dimensions were later divided into two groups that deal with the physical and social aspects of the model.
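Because these dimensions are defined as sets of statements, the core quality notions of the underlying Lindland et al. framework can be written set-theoretically. A minimal sketch, with symbols introduced here for illustration (M = model externalization, D = modeling domain, L = language extension):

$$\text{syntactic quality: } M \setminus L = \emptyset \qquad \text{validity: } M \setminus D = \emptyset \qquad \text{completeness: } D \setminus M = \emptyset$$

That is, every model statement is legal in the language, every model statement is correct for the domain, and every relevant domain statement appears in the model.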
In later work, Krogstie et al. [15] stated that, while the extension of the SEQUAL framework fixed some of the limitations of the initial framework, other limitations remain. In particular, the framework is too static in its view of semantic quality, mainly considering models rather than modeling activities, and comparing these models to a static domain rather than seeing the model as a facilitator for changing the domain.
Also, the framework's definition of pragmatic quality is quite narrow, focusing on understanding, in line with the semiotics of Morris, while newer research in linguistics and semiotics has looked beyond mere understanding, at how the model is used and how it affects its interpreters.
The need for a more dynamic view in the semiotic quality framework is particularly evident when considering process models, which themselves often prescribe or even enact actions in the problem domain; hence a change to the model may also change the problem domain directly. Krogstie et al. therefore discuss the quality framework in relation to active process models and suggest a revised framework on this basis.
Further work by Krogstie et al. (2006) revised the SEQUAL framework to make it more appropriate for active process models by redefining physical quality with a narrower interpretation than in previous research. [15]
Another framework in use is the Guidelines of Modeling (GoM), [21] based on general accounting principles. It comprises six principles:
- Correctness.
- Clarity: deals with the comprehensibility and explicitness (system description) of model systems; comprehensibility relates to the graphical arrangement of the information objects and therefore supports the understandability of a model.
- Relevance: relates to the model and the situation being presented.
- Comparability: involves the ability to compare models, that is, semantic comparison between two models.
- Economic efficiency: the cost of producing the design needs at least to be covered by the proposed cost cuttings and revenue increases. Since the purpose of organizations is in most cases the maximization of profit, this principle defines the borderline for the modeling process.
- Systematic design: defines that there should be an accepted differentiation between diverse views within modeling.
Correctness, relevance, and economic efficiency are prerequisites for the quality of models and must be fulfilled, while the remaining guidelines are optional.
The two frameworks, SEQUAL and GoM, share a limitation: they cannot be used by people who are not competent in modeling. They provide major quality metrics but are not easily applicable by non-experts.
The use of bottom-up metrics related to quality aspects of process models attempts to bridge this gap for non-experts in modeling, but it is mostly theoretical, and no empirical tests have been carried out to support their use.
Most experiments carried out relate to the relationship between metrics and quality aspects, and these works have been done individually by different authors: Canfora et al. study the connection mainly between count metrics (for example, the number of tasks or splits) and the maintainability of software process models; [22] Cardoso validates the correlation between control-flow complexity and perceived complexity; and Mendling et al. use metrics to predict control-flow errors such as deadlocks in process models. [12] [23]
The results reveal that an increase in the size of a model appears to reduce its quality and comprehensibility. Further work by Mendling et al. investigates the connection between metrics and understanding. [24] [25] While some metrics are confirmed regarding their effect, personal factors of the modeler, such as competence, are also revealed as important for understanding the models.
The empirical surveys carried out so far still do not give clear guidelines or ways of evaluating the quality of process models, yet a clear set of guidelines is needed to guide modelers in this task. Pragmatic guidelines have been proposed by various practitioners, even though it is difficult to provide an exhaustive account of such guidelines from practice.
Most of the guidelines are not easily put into practice, but the "label activities verb-noun" rule has been suggested by practitioners before and analyzed empirically. From this research, [26] the value of process models depends not only on the choice of graphical constructs but also on their annotation with textual labels, which need to be analyzed. It was found that this labeling style results in models that are better understood than those using alternative labeling styles.
From earlier research and ways of evaluating process model quality, it has been seen that a process model's size, structure, modularity, and the expertise of the modeler affect its overall comprehensibility. [24] [27] Based on these, a set of guidelines was presented: [28] the Seven Process Modeling Guidelines (7PMG). These cover the verb-object labeling style, the number of elements in a model, the application of structured modeling, and the decomposition of a process model (a sketch of automated checks follows the list):
- G1: Use as few elements in the model as possible.
- G2: Minimize the routing paths per element.
- G3: Use one start and one end event.
- G4: Model as structured as possible.
- G5: Avoid OR routing elements.
- G6: Use verb-object activity labels.
- G7: Decompose a model with more than 50 elements.
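Several of these guidelines lend themselves to automated checking. The sketch below, reusing the illustrative node -> (kind, successors) structure from the metrics example above, checks G7 (element count), G3 (single start and end), and G5 (no OR elements); the function name and violation messages are assumptions for illustration, not the 7PMG authors' tooling.

```python
# A minimal sketch (illustrative, not the 7PMG authors' tooling): checking a
# few 7PMG guidelines on the `model` dict from the metrics sketch above.

def check_7pmg(model):
    """Return a list of guideline violations found in the model."""
    violations = []

    # G7: decompose a model with more than 50 elements.
    if len(model) > 50:
        violations.append("G7: more than 50 elements; decompose the model")

    # G3: use one start and one end event (here: nodes without predecessors
    # and nodes without successors, respectively).
    has_predecessor = {s for _, succs in model.values() for s in succs}
    starts = [n for n in model if n not in has_predecessor]
    ends = [n for n, (_, succs) in model.items() if not succs]
    if len(starts) != 1:
        violations.append(f"G3: {len(starts)} start nodes, expected 1")
    if len(ends) != 1:
        violations.append(f"G3: {len(ends)} end nodes, expected 1")

    # G5: avoid OR routing elements.
    if any(kind == "or" for kind, _ in model.values()):
        violations.append("G5: model contains OR routing elements")

    return violations

print(check_7pmg(model))  # sample model above: ['G3: 3 end nodes, expected 1']
```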
7PMG still has limitations in its use. The first is a validity problem: 7PMG does not relate to the content of a process model, but only to the way this content is organized and represented. It suggests ways of organizing different structures of the process model while the content is kept intact, but the pragmatic issue of what must be included in the model is still left out. The second limitation relates to the prioritization of the guidelines: the derived ranking has a small empirical basis, as it relies on the involvement of only 21 process modelers.
This could be seen, on the one hand, as a need for wider involvement of process modelers' experience, but it also raises the question of what alternative approaches may be available to arrive at a prioritized ranking. [28]