Jackson system development

Jackson System Development (JSD) is a linear software development methodology developed by Michael A. Jackson and John Cameron in the 1980s.

History

JSD was first presented by Michael A. Jackson in 1982, in a paper called "A System Development Method", [1] and in 1983 in System Development. [2] Jackson System Development is a method of system development that covers the software life cycle either directly or by providing a framework into which more specialized techniques can fit. Jackson System Development can start from the stage in a project when there is only a general statement of requirements. However, many projects that have used Jackson System Development actually started slightly later in the life cycle, doing the first steps largely from existing documents rather than directly with the users. The later steps of JSD produce the code of the final system. Jackson's first method, Jackson Structured Programming (JSP), is used to produce the final code. The output of the earlier steps of JSD is a set of program design problems, the design of which is the subject matter of JSP. Maintenance is also addressed by reworking whichever of the earlier steps are appropriate.

JSD continued to evolve, and a few new features were introduced into the method. These are described in a 1989 collection of papers by John Cameron, JSP and JSD, [3] and in the 1992 version (version 2) of the LBMS JSD manual. [4]

Development of the JSD method came to an end in the early 1990s as Jackson's thinking evolved into the Problem Frames Approach with the publication of Software Requirements and Specifications (1995) and Problem Frames: Analyzing and Structuring Software Development Problems (2000).

Principles of operation

The three basic principles of operation of JSD are:

  1. Development must start with describing and modelling the real world, rather than with specifying or prescribing the functions the system is to perform; the functional parts of the system are developed only on top of this real-world model.
  2. An adequate model of a time-ordered world must itself be time-ordered; the aim is to map progress in the real world onto progress in the system that models it.
  3. The system is implemented by transforming the specification into an efficient set of processes that can run on the available software and hardware.

JSD steps

When it was originally presented by Jackson in 1982, [1] the method consisted of six steps:

  1. Entity/action step
  2. Initial model step
  3. Interactive function step
  4. Information function step
  5. System timing step
  6. System implementation step

Later, some steps were combined to create a method with only three steps. [5]

  1. Modelling stage (analysis): the entity/action step and the entity structure step.
  2. Network stage (design): the initial model step, the function step, and the system timing step.
  3. Implementation stage (realisation): the implementation step.

Modelling stage

In the modelling stage the designer creates a collection of entity structure diagrams, identifying the entities in the system, the actions they perform, the time-ordering of the actions in the life of each entity, and the attributes of the actions and entities. Entity structure diagrams use the diagramming notation of Jackson Structured Programming structure diagrams. The purpose of these diagrams is to give a full description of the relevant aspects of the system and the organisation. Developers have to decide which things are important and which are not, so good communication between the developers and the users of the new system is very important.

This stage is the combination of the former entity/action step and the entity structures step.
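
To make this concrete, the sketch below is a minimal illustration, not taken from the method's published material: it treats a hypothetical library book entity whose life history is acquire, then any number of lend/return pairs, then dispose, and writes that time-ordering as a single sequential process in Python. The entity, its actions, and the function name are invented for this example.

```python
# Minimal sketch (illustrative only): the life history of a hypothetical
# "library book" entity -- acquire ; (lend ; return)* ; dispose -- written as
# one sequential process, the way JSD treats each entity.

def book_life(actions):
    """Consume a stream of action names and check that they follow the
    entity's time-ordering: acquire, then any number of lend/return
    pairs, then dispose."""
    it = iter(actions)

    def expect(name):
        action = next(it)
        if action != name:
            raise ValueError(f"expected {name!r}, got {action!r}")

    expect("acquire")                 # sequence: the first action
    for action in it:                 # iteration: zero or more lend/return pairs
        if action == "dispose":       # selection: the iteration ends with dispose
            return
        if action != "lend":
            raise ValueError(f"expected 'lend' or 'dispose', got {action!r}")
        expect("return")
    raise ValueError("life history ended without 'dispose'")

# A valid life history for one book entity:
book_life(["acquire", "lend", "return", "lend", "return", "dispose"])
```

The nesting of sequence, iteration, and selection in the code mirrors the structure that the corresponding entity structure diagram would express graphically.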

Network stage

In the network stage a model of the system as a whole is developed and represented as a system specification diagram (SSD), also known as a network diagram. Network diagrams show processes (rectangles) and how they communicate with each other, either via state vector connections (diamonds) or via datastream connections (circles). In this stage the functionality of the system is defined: each entity becomes a process or program in the network diagram, and external programs are added later. The purpose of these programs is to process input, calculate output, and keep the entity processes up to date. The whole system is described with these network diagrams, which are completed with descriptions of the data and the connections between the processes and programs.

The initial model step specifies a simulation of the real world. The function step adds to this simulation the further executable operations and processes needed to produce the output of the system. The system timing step provides synchronisation among the processes and introduces constraints. This stage is the combination of the former initial model step, function step, and system timing step.
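
As a rough illustration of a small network diagram fragment (the process names and data are invented, and Python queues merely stand in for datastream connections), the sketch below connects one model process to one function process:

```python
# Minimal sketch (names invented for illustration): one model process and one
# function process from a network diagram, connected by datastream connections
# represented here as simple FIFO queues.
from collections import deque

inputs = deque(["acquire", "lend", "return", "dispose"])   # datastream into the model
events = deque()                                            # datastream out of the model

def book_model_process():
    """Model process: consumes the input datastream and re-emits each
    action as an event record for downstream function processes."""
    while inputs:
        action = inputs.popleft()
        events.append({"entity": "book-1", "action": action})

def loan_report_function():
    """Function process: reads the model's output datastream and
    produces the system's visible output."""
    while events:
        record = events.popleft()
        print(f"{record['entity']}: {record['action']}")

book_model_process()
loan_report_function()
```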

Implementation stage

In the implementation stage the abstract network model of the solution is converted into a physical system, represented as a system implementation diagram (SID). The SID shows the system as a scheduler process that calls modules that implement the processes. Datastreams are represented as calls to inverted processes. Database symbols represent collections of entity state-vectors, and there are special symbols for file buffers (which must be implemented when processes are scheduled to run at different time intervals).

The central concern of the implementation step is optimization of the system. The number of processes has to be reduced, because it is impossible to give each process in the specification its own virtual processor. By means of transformation, processes are combined so that their number is limited to the number of available processors.
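
One informal way to picture this transformation (not the method's prescribed mechanics) is sketched below: Python generators stand in for inverted processes, and a single scheduler resumes each of them with the records intended for it. All names are illustrative.

```python
# Minimal sketch (illustrative only): an entity process "inverted" into a
# resumable routine, with a scheduler that owns control and feeds records to
# each inverted process. Python generators stand in for inverted processes.

def book_process(name):
    """A long-running entity process, inverted: each 'yield' suspends it
    until the scheduler passes it the next input record."""
    while True:
        action = yield                       # wait for the next record
        print(f"{name} handled {action}")
        if action == "dispose":
            return

def scheduler(processes, records):
    """Scheduler: calls (resumes) the inverted processes with the records
    intended for them, one record at a time."""
    for proc in processes.values():
        next(proc)                           # prime each generator
    for target, action in records:
        try:
            processes[target].send(action)   # resume the inverted process
        except StopIteration:
            pass                             # that entity's life history has ended

procs = {"book-1": book_process("book-1"), "book-2": book_process("book-2")}
scheduler(procs, [("book-1", "acquire"), ("book-2", "acquire"),
                  ("book-1", "lend"), ("book-1", "return"),
                  ("book-2", "dispose")])
```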

Designing the diagrams

Entity structure diagram (ESD)

The diagram shows how the action entities cooperate with the system. Entity structure diagram (ESD) notations:

Normally there would be only one action underneath a RecurringConstruct.

Network diagram (ND)

Network diagrams show the interaction between the processes. Sometimes they are referred to as system specification diagrams (SSDs). Network diagram (ND) notations:

The difference between a state vector connection and a datastream connection lies in which process is active. In a datastream connection, the process with the information, A, is the active process; it actively sends a message to the datastream reader B at a time that it (A, the sender) chooses. In a state vector inspection, the process with the information, A, is passive; it does nothing but let the reader process B inspect its (A's) state vector. B, the process doing the inspection, is the active process; it decides when it will read information from A. Roughly speaking, a datastream connection is an abstraction of message passing, while state vector inspection is an abstraction of polling (and of database retrieval).
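
The hypothetical sketch below (class and attribute names are invented) contrasts the two connection types: a queue models a datastream, where the writer A decides when to send, and a plain attribute models A's state vector, which the reader B polls whenever it chooses.

```python
# Minimal sketch (illustrative only) of the two connection types. In the
# datastream case the writer A is active and pushes messages; in the state
# vector case A is passive and the reader B polls A's current state.
from collections import deque

class WriterA:
    def __init__(self):
        self.stream = deque()   # datastream connection (circle in the network diagram)
        self.count = 0          # part of A's state vector (diamond in the network diagram)

    def do_action(self, action):
        self.count += 1
        self.stream.append(action)   # A chooses when to send: message passing

class ReaderB:
    def consume_stream(self, a):
        while a.stream:              # B reads whatever A has already sent
            print("via datastream:", a.stream.popleft())

    def inspect_state_vector(self, a):
        print("via state vector:", a.count)   # B chooses when to look: polling

a, b = WriterA(), ReaderB()
a.do_action("lend")
a.do_action("return")
b.consume_stream(a)          # messages arrive in the order A sent them
b.inspect_state_vector(a)    # only A's current state is visible, not its history
```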

References

  1. M. A. Jackson, "A System Development Method", in Tools and Notions for Program Construction: An Advanced Course, Cambridge University Press, 1982. Archived 2012-02-06 at the Wayback Machine.
  2. M. A. Jackson, System Development, Prentice Hall, 1983.
  3. John R. Cameron (ed.), JSP and JSD: The Jackson Approach to Software Development, IEEE Computer Society Press, 1989, ISBN 0-8186-8858-0.
  4. LBMS (Learmonth & Burchett Management Systems), Jackson System Development, Version 2.0 Method Manual, John Wiley & Sons, 1992, ISBN 0-471-93565-4.
  5. Decision Systems Inc. (2002), Jackson System Development. Accessed 24 November 2008.

Further reading