Safety-Critical Systems Club

Abbreviation: SCSC
Established: 1991
Legal status: Community interest company (CIC)
Purpose: Education, professional community
Membership: 1,348
Website: scsc.uk

The Safety-Critical Systems Club (SCSC) [1] is a professional association in the United Kingdom. [2] [3] It aims to share knowledge about safety-critical systems, including current and emerging practices in safety engineering, software engineering, and product and process safety standards. [4]

Activities

Since it started in 1991, the Club has met its objectives by holding regular one- and two-day seminars, publishing a newsletter three times per year, and running an annual conference – the Safety-critical Systems Symposium (SSS), for which it publishes proceedings. [5] In performing these functions, and in adding tutorials to its programme, the Club has been instrumental in helping to define the requirements for education and training in the safety-critical systems domain.

The SCSC also implements initiatives to improve professionalism in the field of safety-critical systems engineering, and organises various working groups to develop and maintain industry-standard guidance. Notable outputs of these groups include the Data Safety Guidance, Service Assurance Guidance and Safety Assurance Objectives for Autonomous Systems, which have been adopted by UK government organisations such as the NHS, [6] Dstl [7] [8] and the Ministry of Defence; [9] and the Goal Structuring Notation (GSN) community standard, which has influenced the development of the OMG's Structured Assurance Case Metamodel standard. [10]

History

The Safety-Critical Systems Club formally commenced operation on 1 May 1991 as the result of a contract placed by the UK Department of Trade and Industry (DTI) and the Science and Engineering Research Council (SERC). [11] [12] A report to the UK Parliamentary and Scientific Committee on the science of safety-critical systems led to the 'SafeIT' programme, which recommended formation of the Club. [13] As part of their safety-critical systems research programme, [14] the DTI and SERC awarded a three-year contract for organising and running the Safety-Critical Systems Club to the Institution of Electrical Engineers, [15] the British Computer Society, [16] and the University of Newcastle upon Tyne, with the last of these responsible for implementing the organisation. [12] The SCSC became self-sufficient in 1994, based at Newcastle University through the Centre for Software Reliability. [17] Activities included detailed technical work, such as planning and organising events and editing the SCSC newsletter and other publications. From the start, the UK Health and Safety Executive was an active supporter of the Club and, along with all the other organisations already mentioned, remains so.

It was intended that the Club should include in its ambit both technical and managerial personnel, and that it should facilitate communication among all sections of the safety-critical systems community.

The inaugural seminar, intended to introduce the Club to the safety-critical systems community, took place at UMIST, Manchester, on 11 July 1991 and attracted 256 delegates. The need for such an organisation was perceived by many in the software-engineering and safety-critical systems communities. [18]

Management of the SCSC moved to the University of York in 2016. [18] In 2020 it became an independent community interest company. [4] [19]

Related Research Articles

A safety-critical system or life-critical system is a system whose failure or malfunction may result in death or serious injury to people, loss or severe damage to equipment or property, or harm to the environment.

The Defence Evaluation and Research Agency (DERA) was a part of the UK Ministry of Defence (MoD) between 1995 and 2 July 2001. At the time it was the United Kingdom's largest science and technology organisation. It was regarded by its official history as 'a jewel in the crown' of both government and industry.

In the context of software engineering, software quality refers to two related but distinct notions: functional quality, which reflects how well the software conforms to its functional requirements or specifications, and structural quality, which concerns non-functional requirements such as robustness and maintainability.

Reliability engineering is a sub-discipline of systems engineering that emphasizes the ability of equipment to function without failure. Reliability describes the ability of a system or component to function under stated conditions for a specified period of time. Reliability is closely related to availability, which is typically described as the ability of a component or system to function at a specified moment or interval of time.

Integrated logistics support (ILS) is an approach used in systems engineering to lower a product's life-cycle cost and reduce the demand for logistics by optimising the maintenance system and easing product support. Although originally developed for military purposes, it is also widely used in commercial customer service organisations.

In functional safety, safety integrity level (SIL) is defined as the relative level of risk-reduction provided by a safety instrumented function (SIF), i.e. the measurement of the performance required of the SIF.

Software assurance (SwA) is a critical process in software development that ensures the reliability, safety, and security of software products. It involves a variety of activities, including requirements analysis, design reviews, code inspections, testing, and formal verification. One crucial component of software assurance is secure coding practices, which follow industry-accepted standards and best practices, such as those outlined by the Software Engineering Institute (SEI) in their CERT Secure Coding Standards (SCS).

A hazard analysis is used as the first step in a process to assess risk. The result of a hazard analysis is the identification of different types of hazards. A hazard is a potential condition that may or may not exist and that, singly or in combination with other hazards and conditions, may develop into an actual functional failure or accident (mishap). The particular sequence in which this happens is called a scenario, and each scenario has a probability of occurrence; a system often has many potential failure scenarios. Each scenario is also assigned a classification based on the worst-case severity of its end condition. Risk is the combination of probability and severity. Preliminary risk levels can be provided in the hazard analysis; the validation, more precise prediction (verification) and acceptance of risk are determined in the risk assessment. The main goal of both is to provide the best selection of means of controlling or eliminating the risk. The term is used in several engineering specialties, including avionics, food safety, occupational safety and health, process safety, and reliability engineering.

The Defence Science and Technology Laboratory (Dstl) is an executive agency of the Ministry of Defence of the United Kingdom. Its stated purpose is "to maximise the impact of science and technology for the defence and security of the UK". The agency is headed by Paul Hollinshead as its chief executive, with the board being chaired by Adrian Belton. Ministerial responsibility lies with the Minister for Defence Procurement.

The British Ministry of Defence Architecture Framework (MODAF) was an architecture framework which defined a standardised way of conducting enterprise architecture, originally developed by the UK Ministry of Defence. It has since been replaced with the NATO Architecture Framework.

The Chartered Quality Institute (CQI), formerly known as the Institute of Quality Assurance (IQA), is the chartered body for quality professionals. It improves the performance of organizations by developing their capabilities in quality management. As a registered charity, the CQI exists to advance education in, knowledge of, and the practice of quality in industry, the public sector, and the voluntary sector.

The Central Computer and Telecommunications Agency (CCTA) was a UK government agency providing computer and telecoms support to government departments.

SOUP stands for software of unknown pedigree, and is a term often used in the context of safety-critical and safety-involved systems such as medical software. SOUP is software that has not been developed with a known software development process or methodology, or which has unknown or no safety-related properties.

The Motor Industry Software Reliability Association (MISRA) is an organization that produces guidelines for the software developed for electronic components used in the automotive industry. It is a collaboration between vehicle manufacturers, component suppliers and engineering consultancies. In 2021, the loose consortium restructured as The MISRA Consortium Limited.

The Centre for Software Reliability (CSR) is a distributed British organisation concerned with software reliability, including safety-critical issues. It consists of two sister organisations based at Newcastle University and City, University of London.

TRAK, or The Rail Architecture Framework, is a general enterprise architecture framework aimed at systems engineers. It is based on MODAF 1.2.

The UK Large-Scale Complex IT Systems (LSCITS) Initiative is a research and graduate education programme focusing on the problems of developing large-scale, complex IT systems. The initiative is funded by the EPSRC, with more than ten million pounds of funding awarded between 2006 and 2013.

Nexor Limited is a privately held company based in Nottingham, providing products and services to safeguard government, defence and critical national infrastructure computer systems. It was originally known as X-Tel Services Limited.

The United Kingdom has a diverse cyber security community, interconnected in a complex network.

The Trustworthy Software Foundation (TSFdn) is a UK not-for-profit organisation with the stated aim of improving software.

References

  1. "Safety-Critical Systems Club website". UK. Retrieved 21 October 2016.
  2. Bowen, Jonathan P. (1993). "Formal methods in safety-critical standards". Proceedings of the 1993 Software Engineering Standards Symposium. Brighton, UK: IEEE Computer Society Press. pp. 168–177. doi:10.1109/SESS.1993.263953.
  3. "Safety-Critical Systems Club". NationalRural. 1 May 2007. Archived from the original on 23 October 2016.
  4. "SCSC About the club". SCSC. Retrieved 31 August 2022.
  5. "Safety-critical Systems Symposium". Google Books. Retrieved 1 September 2022.
  6. Clinical Risk Management Data Safety, NHS Digital and SCSC, 27 September 2018, retrieved 13 October 2022
  7. Dstl (12 October 2021), Crumbs! Understanding Data: a Dstl biscuit book, Ministry of Defence
  8. Dstl (6 October 2021), Assurance of Artificial Intelligence and Autonomous Systems: a Dstl biscuit book, Ministry of Defence
  9. DStan (28 July 2021), Defence Standard 00-055 Part 01/Issue 5 - Requirements for Safety of Programmable Elements (PE) in Defence Systems, Ministry of Defence
  10. Structured Assurance Case Metamodel (SACM), Object Management Group, April 2022
  11. Malcolm, Bob (February 1991), Safety Critical Systems Research Programme, DTI, London: DTI
  12. "The National CSR – Structure and History". Centre for Software Reliability. City University of London. Archived from the original on 23 October 2016.
  13. "Safety Critical Systems" (PDF). Briefing Note. Vol. 20. UK: Parliamentary Office of Science and Technology. January 1991. Retrieved 21 October 2016.
  14. "SERC Critical Systems R&D Programme". SERC. 1991.
  15. Safety-related Systems, Professional Brief, IEE, October 1991
  16. "Safety-Critical Systems Club". The Computer Bulletin. BCS. July 2001. Archived from the original on 17 January 2019.
  17. Redmill, Felix; Anderson, Tom, eds. (1997). "The Safety-Critical Systems Club". Safer Systems: Proceedings of the Fifth Safety-critical Systems Symposium. Brighton: Springer. p. ix. doi:10.1007/978-1-4471-0975-4.
  18. Redmill, Felix (May 2016). "25 Years of Safety Systems and of the Safety-Critical Systems Club". Safety Systems. Vol. 25, no. 3. SCSC.
  19. "Safety Critical Systems Club C.I.C." Companies House. Retrieved 31 August 2022.