Quality and Reliability Engineering International

Related Research Articles

Engineering statistics combines engineering and statistics, using scientific methods to analyze data. Engineering statistics involves data concerning manufacturing processes such as component dimensions, tolerances, type of material, and fabrication process control. Many methods are used in engineering analysis, and results are often displayed as histograms to give a visual view of the data rather than a purely numerical one. Examples of methods are:

  1. Design of Experiments (DOE) is a methodology for formulating scientific and engineering problems using statistical models. The protocol specifies a randomization procedure for the experiment and specifies the primary data analysis, particularly in hypothesis testing. In a secondary analysis, the statistical analyst further examines the data to suggest other questions and to help plan future experiments. In engineering applications, the goal is often to optimize a process or product rather than to subject a scientific hypothesis to a test of its predictive adequacy. The use of optimal designs reduces the cost of experimentation (a factorial-design sketch follows this list).
  2. Quality control and process control use statistics as a tool to manage conformance to specifications of manufacturing processes and their products.
  3. Time and methods engineering use statistics to study repetitive operations in manufacturing in order to set standards and find optimum manufacturing procedures.
  4. Reliability engineering, which measures the ability of a system to perform its intended function and provides tools for improving performance.
  5. Probabilistic design, which involves the use of probability in product and system design.
  6. System identification uses statistical methods to build mathematical models of dynamical systems from measured data. System identification also includes the optimal design of experiments for efficiently generating informative data for fitting such models (see the least-squares sketch below).
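As a small illustration of the design-of-experiments idea from item 1, the sketch below enumerates a two-level full factorial design and randomizes the run order. The factors and levels are hypothetical, chosen only for illustration; a real protocol would also specify replication and the planned analysis.

```python
import random
from itertools import product

# Hypothetical two-level factors for a machining process; the names
# and levels are illustrative, not taken from the article.
factors = {
    "feed_rate":   [0.1, 0.3],    # mm/rev
    "spindle_rpm": [800, 1200],
    "coolant":     ["off", "on"],
}

# Enumerate every factor-level combination: a 2^3 full factorial design.
names = list(factors)
runs = [dict(zip(names, levels)) for levels in product(*factors.values())]

random.shuffle(runs)  # randomized run order, as a DOE protocol specifies
for i, run in enumerate(runs, 1):
    print(f"run {i}: {run}")
```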
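And for item 6, a minimal system-identification sketch: it simulates a first-order discrete-time system, then recovers the model coefficients from the measured input-output data by least squares. The true coefficients (0.8, 0.5) and the noise level are assumptions made for the demonstration.

```python
import numpy as np

# Simulate y[k] = a*y[k-1] + b*u[k-1] + noise, with a=0.8, b=0.5.
rng = np.random.default_rng(0)
n = 200
u = rng.standard_normal(n)
y = np.zeros(n)
for k in range(1, n):
    y[k] = 0.8 * y[k - 1] + 0.5 * u[k - 1] + 0.05 * rng.standard_normal()

# Stack the regressors and solve the least-squares problem for (a, b).
Phi = np.column_stack([y[:-1], u[:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
print("estimated (a, b):", theta)  # should be close to (0.8, 0.5)
```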

Safety engineering is an engineering discipline which assures that engineered systems provide acceptable levels of safety. It is strongly related to industrial engineering and systems engineering, and to their subset, system safety engineering. Safety engineering assures that a life-critical system behaves as needed, even when components fail.

In reliability engineering, the term availability refers to the degree to which a system is in a specified operable and committable state at the start of a mission, and, more generally, to the proportion of time a system is in a functioning condition.
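One widely used quantitative expression is the steady-state (inherent) form A = MTBF / (MTBF + MTTR); the sketch below computes it. The figures are illustrative assumptions.

```python
def steady_state_availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Inherent availability: A = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Illustrative unit: mean time between failures 1000 h, mean repair 10 h.
print(f"A = {steady_state_availability(1000, 10):.4f}")  # ~0.9901
```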

In engineering, maintainability is the ease with which a product can be maintained to correct defects or their causes, repair or replace faulty or worn-out components, prevent unexpected failures, maximize useful life, and meet new requirements.

Genichi Taguchi was an engineer and statistician. From the 1950s onwards, Taguchi developed a methodology for applying statistics to improve the quality of manufactured goods. Taguchi methods have been controversial among some conventional Western statisticians, but others have accepted many of the concepts introduced by him as valid extensions to the body of knowledge.
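The concept most closely associated with Taguchi methods is the quadratic loss function, under which any deviation from the target value incurs cost, not just deviations outside specification limits. A minimal sketch, with numbers assumed for illustration:

```python
def taguchi_loss(y: float, target: float, k: float) -> float:
    """Quadratic loss L(y) = k * (y - target)**2."""
    return k * (y - target) ** 2

# Assumed example: 10.0 mm target dimension, cost coefficient k = 2.0.
for y in (9.8, 10.0, 10.3):
    print(f"y = {y}: loss = {taguchi_loss(y, target=10.0, k=2.0):.3f}")
```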

In systems engineering, dependability is a measure of a system's availability, reliability, maintainability, and in some cases other characteristics such as durability, safety, and security. In real-time computing, dependability is the ability to provide services that can be trusted within a given time period. The service guarantees must hold even when the system is subject to attacks or natural failures.

Design for Six Sigma (DFSS) is an engineering design process and business process management method related to traditional Six Sigma. It is used in many industries, such as finance, marketing, basic engineering, process industries, waste management, and electronics. It is based on the use of statistical tools like linear regression and enables empirical research similar to that performed in other fields, such as social science. While the tools and order used in Six Sigma require a process to be in place and functioning, DFSS has the objective of determining the needs of customers and the business and driving those needs into the product solution created. It is used for product or process design, in contrast with process improvement. Measurement is the most important part of most Six Sigma or DFSS tools, but whereas in Six Sigma measurements are made on an existing process, DFSS focuses on gaining a deep insight into customer needs and using these to inform every design decision and trade-off.
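As a sketch of the kind of statistical tool the paragraph mentions, the snippet below fits an ordinary least-squares line to hypothetical process data; the temperatures and yields are invented for illustration.

```python
import numpy as np

# Hypothetical data: process temperature (°C) vs. measured yield (%).
temp = np.array([150.0, 160.0, 170.0, 180.0, 190.0])
yld = np.array([71.2, 74.8, 78.1, 80.9, 83.7])

# Ordinary least-squares fit of yield = b0 + b1 * temp.
b1, b0 = np.polyfit(temp, yld, deg=1)
print(f"yield ≈ {b0:.2f} + {b1:.3f} * temp")
```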

Failure mode and effects analysis (FMEA) is the process of reviewing as many components, assemblies, and subsystems as possible to identify potential failure modes in a system and their causes and effects. For each component, the failure modes and their resulting effects on the rest of the system are recorded in a specific FMEA worksheet. There are numerous variations of such worksheets. An FMEA can be a qualitative analysis, but may be put on a quantitative basis when mathematical failure rate models are combined with a statistical failure mode ratio database. It was one of the first highly structured, systematic techniques for failure analysis. It was developed by reliability engineers in the late 1950s to study problems that might arise from malfunctions of military systems. An FMEA is often the first step of a system reliability study.
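A minimal sketch of the quantitative step just described: a component's overall failure rate is apportioned across its failure modes using mode ratios, as a failure-mode ratio database would supply. All numbers here are assumed for illustration.

```python
# Assumed component failure rate and mode ratios (alpha); in practice
# these come from a failure-rate model and a mode-ratio database.
component_failure_rate = 25e-6  # failures per hour
failure_modes = {
    "open circuit":    0.55,
    "short circuit":   0.30,
    "parameter drift": 0.15,
}

for mode, alpha in failure_modes.items():
    mode_rate = component_failure_rate * alpha
    print(f"{mode:16s} lambda = {mode_rate:.2e} failures/h")
```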

In the context of software engineering, software quality refers to two related but distinct notions: functional quality, which reflects how well the software complies with or conforms to a given design based on functional requirements or specifications, and structural quality, which refers to how the software meets non-functional requirements that support the delivery of the functional requirements, such as robustness or maintainability.


ISO/IEC 9126 (Software engineering — Product quality) was an international standard for the evaluation of software quality. It has been replaced by ISO/IEC 25010:2011.

Methods engineering is a subspecialty of industrial engineering and manufacturing engineering concerned with human integration in industrial production processes.

Reliability engineering is a sub-discipline of systems engineering that emphasizes the ability of equipment to function without failure. Reliability describes the ability of a system or component to function under stated conditions for a specified period of time. Reliability is closely related to availability, which is typically described as the ability of a component or system to function at a specified moment or interval of time.
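Under the common constant-failure-rate assumption, reliability over a mission of length t is R(t) = exp(-λt). A minimal sketch, with an assumed failure rate:

```python
import math

def reliability(t_hours: float, failure_rate: float) -> float:
    """R(t) = exp(-lambda * t), the constant-failure-rate model."""
    return math.exp(-failure_rate * t_hours)

# Assumed rate: 1e-4 failures/hour over a 1000-hour mission.
print(f"R(1000 h) = {reliability(1000, 1e-4):.4f}")  # ~0.9048
```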

The cleanroom software engineering process is a software development process intended to produce software with a certifiable level of reliability. The cleanroom process was originally developed by Harlan Mills and several of his colleagues including Alan Hevner at IBM. The focus of the cleanroom process is on defect prevention, rather than defect removal. The name "cleanroom" was chosen to evoke the cleanrooms used in the electronics industry to prevent the introduction of defects during the fabrication of semiconductors. The cleanroom process first saw use in the mid to late 1980s. Demonstration projects within the military began in the early 1990s. Recent work on the cleanroom process has examined fusing cleanroom with the automated verification capabilities provided by specifications expressed in CSP (Communicating Sequential Processes).

Product engineering refers to the process of designing and developing a device, assembly, or system such that it can be produced as an item for sale through some product manufacturing process. Product engineering usually entails activities dealing with issues of cost, producibility, quality, performance, reliability, serviceability, intended lifespan, and user features. These product characteristics are generally all sought in the attempt to make the resulting product attractive to its intended market and a successful contributor to the business of the organization that intends to offer the product to that market. It includes design, development, and transitioning to manufacturing of the product. The term encompasses developing the concept of the product and the design and development of its hardware and software components. After initial design and development, transitioning the product to volume manufacturing is considered part of product engineering.

Probabilistic design is a discipline within engineering design. It deals primarily with the consideration of the effects of random variability upon the performance of an engineering system during the design phase. Typically, these effects are related to quality and reliability, so probabilistic design is mostly used in areas concerned with quality and reliability, such as product design, quality control, systems engineering, machine design, civil engineering, and manufacturing. It differs from the classical approach to design by assuming a small probability of failure instead of using a safety factor.
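A common way to make the "small probability of failure" concrete is stress-strength interference evaluated by Monte Carlo simulation: the part fails whenever the applied stress exceeds its strength. The normal distributions and their parameters below are assumptions made for illustration.

```python
import random

random.seed(1)
n = 100_000
failures = 0
for _ in range(n):
    strength = random.gauss(500.0, 40.0)  # MPa, assumed distribution
    stress = random.gauss(350.0, 50.0)    # MPa, assumed distribution
    if stress > strength:
        failures += 1

print(f"estimated probability of failure: {failures / n:.4f}")  # ~0.01
```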

Quality engineering is the discipline of engineering concerned with the principles and practice of product and service quality assurance and control. In software development, it is the management, development, operation and maintenance of IT systems and enterprise architectures with a high quality standard.


Project commissioning is the process of assuring that all systems and components of a building or industrial plant are designed, installed, tested, operated, and maintained according to the operational requirements of the owner or final client. A commissioning process may be applied not only to new projects but also to existing units and systems subject to expansion, renovation or revamping.

Site reliability engineering (SRE) is a set of principles and practices that incorporates aspects of software engineering and applies them to IT infrastructure and operations. The main objectives are to create highly reliable and scalable software systems. Site reliability engineering has been described as a specific implementation of DevOps.
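One practice commonly associated with SRE is managing against an availability service-level objective (SLO) and the error budget it implies; a minimal sketch, with illustrative SLO values:

```python
def monthly_error_budget_minutes(slo: float, days: int = 30) -> float:
    """Downtime allowed per month at a given availability SLO."""
    return (1.0 - slo) * days * 24 * 60

for slo in (0.99, 0.999, 0.9999):
    print(f"{slo:.2%} SLO -> {monthly_error_budget_minutes(slo):6.1f} min/month")
```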


École polytechnique de l'université d'Angers is a French engineering college created in 1991.