Taguchi loss function

The Taguchi loss function is a graphical depiction of loss developed by the Japanese business statistician Genichi Taguchi to describe a phenomenon affecting the value of products produced by a company. Praised by Dr. W. Edwards Deming (the business guru of the 1980s American quality movement),[1] it made clear the concept that quality does not suddenly plummet when, for instance, a machinist exceeds a rigid blueprint tolerance. Instead, 'loss' in value progressively increases as variation increases from the intended condition. This was considered a breakthrough in describing quality, and it helped fuel the continuous improvement movement.

The concept of Taguchi's quality loss function stands in contrast with the American concept of quality, popularly known as the goal post philosophy, put forward by the American quality guru Philip Crosby. The goal post philosophy holds that if a product feature does not meet the designed specification, the product is of poor quality and is rejected, irrespective of the amount of deviation from the target value (the mean of the tolerance zone). The concept is similar to scoring a 'goal' in the game of football or hockey: a goal counts as 'one' regardless of where the ball strikes within the goal posts, whether in the center or toward a corner. This means that as soon as a product dimension goes outside the tolerance limits, the quality of the product drops suddenly.
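
Under the goal post view, the implied loss is a step function: zero anywhere inside the tolerance band and the full cost of rejection just outside it. A minimal Python sketch of this view (the function name, measurements and rejection cost are illustrative, not from any standard library):

    def goalpost_loss(y, target, tol, reject_cost):
        # Goal-post view: a part is either fully acceptable or rejected.
        # y: measured size; spec is target +/- tol; reject_cost: scrap cost.
        return 0.0 if abs(y - target) <= tol else reject_cost

    # Two parts on either side of a 10.00 +/- 0.05 specification:
    print(goalpost_loss(10.049, 10.00, 0.05, 12.0))  # 0.0: counted as perfect
    print(goalpost_loss(10.051, 10.00, 0.05, 12.0))  # 12.0: sudden, total loss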

Through his concept of the quality loss function, Taguchi explained that, from the customer's point of view, this drop in quality is not sudden. The customer experiences a loss of quality the moment the product specification deviates from the 'target value'. This 'loss' is depicted by a quality loss function that follows a parabolic curve, given mathematically by L = k(y − m)², where m is the theoretical 'target value' or 'mean value', y is the actual size of the product, k is a constant and L is the loss. This means that the larger the difference between the 'actual size' and the 'target value', i.e. (y − m), the greater the loss, irrespective of the tolerance specifications. In Taguchi's view, tolerance specifications are given by engineers and not by customers; what the customer experiences is 'loss'. This equation holds for a single product; if the 'loss' is to be calculated for multiple products, the loss function is given by L = k[S² + (ȳ − m)²], where S² is the variance of the product sizes and ȳ is the average product size.
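
Both formulas are straightforward to compute. The sketch below assumes, as is common in practice, that k is calibrated so that a part exactly at the tolerance limit carries the full cost of rejection; all numbers are illustrative:

    def taguchi_loss(y, m, k):
        # Loss for a single item: L = k * (y - m)^2
        return k * (y - m) ** 2

    def taguchi_loss_batch(sizes, m, k):
        # Average loss per item for a batch: L = k * (S^2 + (ybar - m)^2),
        # where S^2 is the (population) variance and ybar the mean size.
        n = len(sizes)
        ybar = sum(sizes) / n
        s2 = sum((y - ybar) ** 2 for y in sizes) / n
        return k * (s2 + (ybar - m) ** 2)

    # Calibrate k so the loss at the tolerance limit (deviation 0.05)
    # equals the rejection cost of 12.0: k = 12.0 / 0.05^2 = 4800.
    k = 12.0 / 0.05 ** 2
    print(taguchi_loss(10.02, 10.00, k))   # 1.92: a loss well inside tolerance
    print(taguchi_loss_batch([9.98, 10.01, 10.03, 10.00], 10.00, k))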

Overview

The Taguchi loss function is important for a number of reasons, primarily to help engineers better understand the importance of designing for variation.

See also

Taguchi also focused on robust design (see Taguchi methods).

Related Research Articles

W. Edwards Deming: American professor, author, and consultant

William Edwards Deming was an American engineer, statistician, professor, author, lecturer, and management consultant. Educated initially as an electrical engineer and later specializing in mathematical physics, he helped develop the sampling techniques still used by the U.S. Census Bureau and the Bureau of Labor Statistics.

Pearson correlation coefficient

In statistics, the Pearson correlation coefficient, also referred to as Pearson's r, the Pearson product-moment correlation coefficient (PPMCC), or the bivariate correlation, is a statistic that measures linear correlation between two variables X and Y. It has a value between +1 and −1. A value of +1 is total positive linear correlation, 0 is no linear correlation, and −1 is total negative linear correlation.
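
As a quick illustration, Pearson's r can be computed with NumPy's corrcoef, which returns the correlation matrix of the two variables (the data below are made up):

    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])  # nearly linear in x
    r = np.corrcoef(x, y)[0, 1]              # off-diagonal entry is r
    print(r)  # close to +1: strong positive linear correlation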

Six Sigma (6σ) is a set of techniques and tools for process improvement. It was introduced by American engineer Bill Smith while working at Motorola in 1986. Jack Welch made it central to his business strategy at General Electric in 1995. A six sigma process is one in which 99.99966% of all opportunities to produce some feature of a part are statistically expected to be free of defects.
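
The 99.99966% figure corresponds to 3.4 defects per million opportunities: a six-sigma specification limit combined with the conventional 1.5-sigma allowance for long-term process drift leaves a one-sided 4.5-sigma tail, as this SciPy sketch verifies:

    from scipy.stats import norm

    # One-sided tail beyond 6 sigma minus the conventional 1.5 sigma shift:
    dpmo = norm.sf(6 - 1.5) * 1e6
    print(dpmo)                      # ~3.4 defects per million opportunities
    print(100 * (1 - norm.sf(4.5)))  # ~99.99966% defect-free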

Control chart: statistical process control tool

Control charts, also known as Shewhart charts or process-behavior charts, are a statistical process control tool used to determine whether a manufacturing or business process is in a state of control. It is more appropriate to say that control charts are the graphical device for statistical process monitoring (SPM). Traditional control charts are mostly designed to monitor process parameters when the underlying form of the process distribution is known. However, more advanced techniques available in the 21st century allow incoming data streams to be monitored without any knowledge of the underlying process distribution. Distribution-free control charts are becoming increasingly popular.
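
As one concrete example, the limits of a Shewhart individuals chart can be computed from the average moving range. The sketch below assumes an approximately normal, in-control process, and the data are made up; the constant 2.66 is 3/d2 with d2 = 1.128 for moving ranges of size 2:

    data = [10.02, 9.98, 10.01, 10.05, 9.97, 10.00, 10.03, 9.99]

    center = sum(data) / len(data)
    # Average moving range of consecutive observations:
    mr_bar = sum(abs(a - b) for a, b in zip(data[1:], data)) / (len(data) - 1)
    ucl = center + 2.66 * mr_bar
    lcl = center - 2.66 * mr_bar
    print(lcl, center, ucl)  # points outside [lcl, ucl] signal special causes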

Genichi Taguchi was an engineer and statistician. From the 1950s onwards, Taguchi developed a methodology for applying statistics to improve the quality of manufactured goods. Taguchi methods have been controversial among some conventional Western statisticians, but others have accepted many of the concepts introduced by him as valid extensions to the body of knowledge.

Taguchi methods are statistical methods, sometimes called robust design methods, developed by Genichi Taguchi to improve the quality of manufactured goods, and more recently also applied to engineering, biotechnology, marketing and advertising. Professional statisticians have welcomed the goals and improvements brought about by Taguchi methods, particularly by Taguchi's development of designs for studying variation, but have criticized the inefficiency of some of Taguchi's proposals.

In mathematical optimization and decision theory, a loss function or cost function is a function that maps an event or values of one or more variables onto a real number intuitively representing some "cost" associated with the event. An optimization problem seeks to minimize a loss function. An objective function is either a loss function or its negative, in which case it is to be maximized.

Geometric dimensioning and tolerancing: standardized code for engineers to specify parts for manufacturing

Geometric Dimensioning and Tolerancing (GD&T) is a system for defining and communicating engineering tolerances. It uses a symbolic language on engineering drawings and computer-generated three-dimensional solid models that explicitly describe nominal geometry and its allowable variation. It tells the manufacturing staff and machines what degree of accuracy and precision is needed on each controlled feature of the part. GD&T is used to define the nominal geometry of parts and assemblies, to define the allowable variation in form and possible size of individual features, and to define the allowable variation between features.

Engineering tolerance: permissible limit or limits of variation in value or dimension

Engineering tolerance is the permissible limit or limits of variation in:

  1. a physical dimension;
  2. a measured value or physical property of a material, manufactured object, system, or service;
  3. other measured values;
  4. in engineering and safety, a physical distance or space (tolerance), as in a truck (lorry), train or boat under a bridge, as well as a train in a tunnel;
  5. in mechanical engineering, the space between a bolt and a nut or a hole, etc.

Statistical process control (SPC) is a method of quality control which employs statistical methods to monitor and control a process. This helps to ensure that the process operates efficiently, producing more specification-conforming products with less waste. SPC can be applied to any process where the "conforming product" output can be measured. Key tools used in SPC include run charts, control charts, a focus on continuous improvement, and the design of experiments. An example of a process where SPC is applied is manufacturing lines.

A tolerance interval is a statistical interval within which, with some confidence level, a specified proportion of a sampled population falls. "More specifically, a 100×p%/100×(1−α) tolerance interval provides limits within which at least a certain proportion (p) of the population falls with a given level of confidence (1−α)." "A tolerance interval (TI) based on a sample is constructed so that it would include at least a proportion p of the sampled population with confidence 1−α; such a TI is usually referred to as a p-content, (1−α)-coverage TI." "An upper tolerance limit (TL) is simply a 1−α upper confidence limit for the 100p percentile of the population."
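
For a distribution-free illustration, Wilks's classical result gives the confidence that the interval from the sample minimum to the sample maximum covers at least a proportion p of the population; the sketch below uses it to find the smallest sample size for a 90%-content, 95%-confidence tolerance interval:

    # Confidence that [min, max] of an i.i.d. sample of size n covers
    # at least a proportion p of the population (Wilks's formula):
    def coverage_confidence(n, p):
        return 1 - n * p ** (n - 1) + (n - 1) * p ** n

    n = 2
    while coverage_confidence(n, 0.90) < 0.95:
        n += 1
    print(n)  # 46: with 46 observations, [min, max] is a 90%/95% TI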

Statistical learning theory is a framework for machine learning drawing from the fields of statistics and functional analysis. Statistical learning theory deals with the problem of finding a predictive function based on data. Statistical learning theory has led to successful applications in fields such as computer vision, speech recognition, and bioinformatics.

In the context of software engineering, software quality refers to two related but distinct notions: functional quality (how well the software complies with or conforms to a given design, based on functional requirements or specifications) and structural quality (how it meets non-functional requirements that support the delivery of the functional requirements, such as robustness or maintainability).

Quality management ensures that an organization, product or service is consistent. It has four main components: quality planning, quality assurance, quality control and quality improvement. Quality management is focused not only on product and service quality, but also on the means to achieve it. Quality management, therefore, uses quality assurance and control of processes as well as products to achieve more consistent quality. What a customer wants and is willing to pay for determines quality. It is a written or unwritten commitment to a known or unknown consumer in the market. Thus, quality can be defined as fitness for intended use or, in other words, how well the product performs its intended function.

The following is a glossary of terms used in the mathematical sciences of statistics and probability.

In process improvement efforts, the process capability index or process capability ratio is a statistical measure of process capability: the ability of a process to produce output within specification limits. The concept of process capability only holds meaning for processes that are in a state of statistical control. Process capability indices measure how much "natural variation" a process experiences relative to its specification limits and allow different processes to be compared with respect to how well an organization controls them.
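
The two most common indices follow directly from the specification limits and the process mean and standard deviation: Cp = (USL − LSL)/6σ measures potential capability, and Cpk = min(USL − μ, μ − LSL)/3σ penalizes an off-center mean. A minimal sketch with illustrative numbers:

    def cp_cpk(mu, sigma, lsl, usl):
        # Cp ignores centering; Cpk accounts for an off-center mean.
        cp = (usl - lsl) / (6 * sigma)
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)
        return cp, cpk

    # Spec 10.00 +/- 0.05, process slightly off target:
    print(cp_cpk(mu=10.01, sigma=0.01, lsl=9.95, usl=10.05))
    # Cp ~ 1.67 but Cpk ~ 1.33: off-centering reduces effective capability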

ANOVA gauge repeatability and reproducibility is a measurement systems analysis technique that uses an analysis of variance (ANOVA) random effects model to assess a measurement system.

In business, engineering, and manufacturing, quality has a pragmatic interpretation as the non-inferiority or superiority of something; it is also defined as being suitable for its intended purpose while satisfying customer expectations. Quality is a perceptual, conditional, and somewhat subjective attribute and may be understood differently by different people. Consumers may focus on the specification quality of a product/service, or how it compares to competitors in the marketplace. Producers might measure the conformance quality, or degree to which the product/service was produced correctly. Support personnel may measure quality in the degree that a product is reliable, maintainable, or sustainable.

Lean dynamics is a business management practice that emphasizes the same primary outcome as lean manufacturing or lean production: eliminating wasteful expenditure of resources. However, it is distinguished by its focus on creating a structure for accommodating the dynamic business conditions that cause these wastes to accumulate in the first place.

SDI Tools: commercial software

SDI Tools is a set of commercial software add-in tools for Microsoft Excel developed and distributed by Statistical Design Institute, LLC, a privately owned company located in Texas, United States.

References

  1. Deming, W. Edwards (1993). The New Economics: For Industry, Government, Education. MIT Press. ISBN 0-911379-05-3.