Design for Six Sigma

Design for Six Sigma (DFSS) is a collection of best practices for the development of new products and processes. It is sometimes deployed as an engineering design process or a business process management method. DFSS originated at General Electric, building on the success the company had with traditional Six Sigma; but where Six Sigma targets process improvement, DFSS was created to target new product development. It is used in many industries, including finance, marketing, basic engineering, the process industries, waste management, and electronics. It is based on statistical tools such as linear regression and enables empirical research similar to that performed in other fields, such as social science. Whereas the tools and sequence used in Six Sigma require a process to already be in place and functioning, DFSS has the objective of determining the needs of customers and the business and driving those needs into the product solution so created; it is used for product or process design, in contrast with process improvement. [1] Measurement is central to most Six Sigma and DFSS tools, but whereas in Six Sigma measurements are made from an existing process, DFSS focuses on gaining a deep insight into customer needs and using that insight to inform every design decision and trade-off.

There are different options for the implementation of DFSS. Unlike Six Sigma, which is commonly driven via DMAIC (define, measure, analyze, improve, control) projects, DFSS has spawned a number of stepwise processes, all in the style of the DMAIC procedure. [2]

DMADV (define, measure, analyze, design, verify) is sometimes referred to synonymously as DFSS, although alternatives such as IDOV (identify, design, optimize, verify) are also used. The traditional DMAIC Six Sigma process, as usually practiced, focuses on evolutionary and continuous improvement of manufacturing or service processes and usually occurs after initial system or product design and development have been largely completed. DMAIC Six Sigma as practiced is usually concerned with solving existing manufacturing or service process problems and removing the defects and variation associated with them. Manufacturing variation clearly can impact product reliability, so a clear link should exist between reliability engineering and Six Sigma (quality). In contrast, DFSS (or DMADV and IDOV) strives to generate a new process where none existed, or where an existing process is deemed inadequate and in need of replacement. DFSS aims to create a process with the end in mind of optimally building the efficiencies of the Six Sigma methodology into the process before implementation; traditional Six Sigma seeks continuous improvement after a process already exists.

DFSS as an approach to design

DFSS seeks to avoid manufacturing/service process problems by using advanced techniques to prevent them at the outset (i.e., fire prevention rather than fire fighting). Combined, these methods capture the true needs of the customer and derive engineering system parameter requirements that increase product and service effectiveness in the eyes of the customer and other stakeholders, yielding products and services that provide high customer satisfaction and increased market share. These techniques also include tools and processes to predict, model, and simulate the product delivery system (the processes/tools, personnel and organization, training, facilities, and logistics needed to produce the product/service). In this way, DFSS is closely related to operations research (e.g., solving the knapsack problem) and workflow balancing. DFSS is largely a design activity requiring tools including quality function deployment (QFD), axiomatic design, TRIZ, Design for X, design of experiments (DOE), Taguchi methods, tolerance design, robustification, and response surface methodology for single- or multiple-response optimization. While these tools are sometimes used in the classic DMAIC Six Sigma process, DFSS uses them to analyze new and unprecedented products and processes, as a concurrent analysis directed at manufacturing optimization related to the design.
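
As an illustration of the response-surface tool named above, the following minimal sketch (with hypothetical data, factor settings, and design) fits a quadratic model to the results of a small two-factor designed experiment and locates the estimated optimum inside the experimental region:

```python
# Sketch: quadratic response-surface fit over a two-factor face-centered
# central composite design (all values hypothetical).
import numpy as np

X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],      # factorial points
              [-1, 0], [1, 0], [0, -1], [0, 1],        # axial points
              [0, 0]])                                 # center point
y = np.array([52.1, 63.0, 55.4, 60.2, 56.8, 64.1, 59.5, 58.7, 65.3])

def quad_terms(X):
    """Model matrix for y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

beta, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)

# Grid search for the fitted optimum within the coded experimental region.
g = np.linspace(-1, 1, 201)
grid = np.array([[u, v] for u in g for v in g])
pred = quad_terms(grid) @ beta
best = grid[np.argmax(pred)]
print(f"estimated optimum (coded units): x1 = {best[0]:.2f}, x2 = {best[1]:.2f}; "
      f"predicted response = {pred.max():.1f}")
```

In practice the fitted surface would be checked with residual diagnostics and confirmation runs before acting on the predicted optimum.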

Criticism

Response surface methodology and other DFSS tools use statistical (often empirical) models, so practitioners need to be aware that even the best statistical model is only an approximation to reality. In practice, both the models and the parameter values are unknown and subject to uncertainty, on top of ignorance about the true underlying process. An estimated optimum point therefore need not be optimal in reality, because of estimation error and the inadequacies of the model. These uncertainties can be handled via a Bayesian predictive approach, which treats the uncertainty in the model parameters as part of the optimization: rather than optimizing a fitted model for the mean response, E[Y], one maximizes the posterior probability, given the available experimental data, that the responses satisfy their specifications. [3]
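
The following sketch illustrates the predictive idea in miniature. It is not Peterson's published method: the data, the specification window, and the normal approximation to the posterior are all assumptions made for this example.

```python
# Sketch: choose the factor setting that maximizes the (approximate)
# posterior predictive probability of a future response landing in spec.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-factor experiment.
x = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
y = np.array([3.1, 3.9, 5.2, 5.8, 6.4])
A = np.column_stack([np.ones_like(x), x])           # design matrix
beta_hat, res, *_ = np.linalg.lstsq(A, y, rcond=None)
s2 = float(res[0]) / (len(x) - 2)                   # residual variance
cov = s2 * np.linalg.inv(A.T @ A)                   # approx. posterior cov of beta

LOWER, UPPER = 5.0, 6.0                             # hypothetical spec window

def p_in_spec(x_new, n_draws=20_000):
    """Monte Carlo estimate of P(LOWER <= Y <= UPPER) at setting x_new."""
    betas = rng.multivariate_normal(beta_hat, cov, size=n_draws)
    mu = betas @ np.array([1.0, x_new])
    y_draw = mu + rng.normal(0.0, np.sqrt(s2), size=n_draws)
    return np.mean((y_draw >= LOWER) & (y_draw <= UPPER))

candidates = np.linspace(-1, 1, 41)
best = max(candidates, key=p_in_spec)
print(f"setting maximizing P(in spec): x = {best:.2f}, p = {p_in_spec(best):.2f}")
```

Note that the optimization criterion is a probability of conformance rather than a point prediction of the mean, which is the essence of the predictive approach described above.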

Nonetheless, response surface methodology has an effective track record of helping researchers improve products and services. For example, George Box's original response-surface modeling enabled chemical engineers to improve a process that had been stuck at a saddle point for years. [4]

Distinctions from DMAIC

Proponents of DMAIC, DDICA (design, develop, initialize, control, and allocate) and Lean techniques might claim that DFSS falls under the general rubric of Six Sigma or Lean Six Sigma (LSS). Both methodologies focus on meeting customer needs and business priorities as the starting point for analysis. [5] [1]

The tools used for DFSS techniques often vary widely from those used for DMAIC Six Sigma. In particular, DMAIC and DDICA practitioners often use new or existing mechanical drawings and manufacturing process instructions as the originating information for their analysis, while DFSS practitioners often use simulations and parametric system design/analysis tools to predict both the cost and the performance of candidate system architectures. While the two processes can be argued to be similar, in practice the working medium differs enough that DFSS requires a different tool set to perform its design tasks. DMAIC, IDOV and Six Sigma may still be used during depth-first plunges into system architecture analysis and for "back end" Six Sigma processes, while DFSS provides the system design processes used in front-end complex system design; hybrid front/back-end approaches are also used. Done well, this yields on the order of 3.4 defects per million design opportunities.

Traditional Six Sigma methodology, DMAIC, has become a standard process optimization tool for the chemical process industries. However, it has become clear that the promise of Six Sigma, specifically 3.4 defects per million opportunities (DPMO), is simply unachievable after the fact. Consequently, there has been a growing movement to implement Six Sigma design, usually called Design for Six Sigma (DFSS), together with DDICA tools. This methodology begins with defining customer needs and leads to the development of robust processes to deliver those needs. [6]
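
For reference, the much-quoted 3.4 DPMO figure is a normal-tail calculation that assumes the conventional 1.5-sigma long-term shift, leaving 6 - 1.5 = 4.5 sigma between the process mean and the nearer specification limit:

```python
# Where 3.4 DPMO comes from: the one-sided normal tail beyond 4.5 sigma.
from scipy.stats import norm

dpmo = norm.sf(4.5) * 1_000_000
print(f"{dpmo:.1f} defects per million opportunities")  # prints 3.4
```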

Design for Six Sigma emerged from the Six Sigma and DMAIC (define, measure, analyze, improve, control) quality methodologies, which were originally developed by Motorola to systematically improve processes by eliminating defects. Unlike its traditional Six Sigma/DMAIC predecessors, which are usually focused on solving existing manufacturing issues (i.e., "fire fighting"), DFSS aims to avoid manufacturing problems by taking a more proactive approach to problem solving and engaging the company's efforts at an early stage to reduce the problems that could occur (i.e., "fire prevention"). The primary goal of DFSS is to achieve a significant reduction in the number of nonconforming units and in production variation. It starts from an understanding of customer expectations, needs, and critical-to-quality characteristics (CTQs) before a design can be completed. Typically in a DFSS program only a small portion of the CTQs are reliability-related (CTRs), and therefore reliability does not get center-stage attention in DFSS. DFSS rarely looks at the long-term (post-manufacturing) issues that might arise in the product (e.g., complex fatigue issues, electrical wear-out, chemical issues, cascade effects of failures, and system-level interactions). [7]

Similarities with other methods

Arguments about what makes DFSS different from Six Sigma demonstrate the similarities between DFSS and other established engineering practices such as probabilistic design and design for quality. In general, Six Sigma with its DMAIC roadmap focuses on improving an existing process or processes, while DFSS focuses on creating new value with inputs from customers, suppliers, and business needs. While traditional Six Sigma may also use those inputs, its focus is again on improvement rather than on designing a new product or system. This reflects the engineering background of DFSS. However, like other methods developed in engineering, there is no theoretical reason why DFSS cannot be used in areas outside of engineering. [8] [9]

Software engineering applications

Although the first successful Design for Six Sigma projects, in 1989 and 1991, predate the establishment of the DMAIC process-improvement methodology, Design for Six Sigma (DFSS) gained acceptance in part because Six Sigma organisations found that they could not optimise products past three or four sigma without fundamentally redesigning the product, and because improving a process or product after launch is considered less efficient and effective than designing quality in from the start. 'Six Sigma' levels of performance have to be built in.

DFSS for software is essentially a non-superficial modification of "classical DFSS", since the character and nature of software differ from those of other fields of engineering. The methodology describes the detailed process for successfully applying DFSS methods and tools throughout the software product design, covering the overall software development life cycle: requirements, architecture, design, implementation, integration, optimization, verification and validation (RADIOV). The methodology explains how to build predictive statistical models for software reliability and robustness, and shows how simulation and analysis techniques can be combined with structural design and architecture methods to effectively produce software and information systems at Six Sigma quality levels.

DFSS in software acts as a glue to blend the classical modelling techniques of software engineering, such as object-oriented design or Evolutionary Rapid Development, with statistical, predictive models and simulation techniques. The methodology provides software engineers with practical tools for measuring and predicting the quality attributes of the software product, and also enables them to include software in system reliability models.
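
One concrete, hypothetical example of such a predictive model is a software reliability growth curve. The sketch below fits a Goel-Okumoto style curve, m(t) = a(1 - e^(-bt)), to invented cumulative defect counts from system test in order to estimate the latent defects remaining; the data and starting values are illustrative assumptions, not prescribed by the methodology.

```python
# Sketch: fit a Goel-Okumoto reliability-growth curve to hypothetical
# cumulative defect counts and predict the latent defects remaining.
import numpy as np
from scipy.optimize import curve_fit

weeks = np.arange(1, 13)                                  # test weeks 1..12
cum_defects = np.array([12, 22, 30, 37, 42, 46, 50, 52, 54, 56, 57, 58])

def goel_okumoto(t, a, b):
    """Expected cumulative defects by time t: a = total defects, b = detection rate."""
    return a * (1.0 - np.exp(-b * t))

(a, b), _ = curve_fit(goel_okumoto, weeks, cum_defects, p0=[60.0, 0.2])
remaining = a - goel_okumoto(weeks[-1], a, b)
print(f"estimated total defects a = {a:.1f}, detection rate b = {b:.3f}")
print(f"predicted latent defects remaining after week 12: {remaining:.1f}")
```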

Data mining and predictive analytics application

Many tools used in DFSS consulting, such as response surface methodology, transfer functions via linear and nonlinear modeling, axiomatic design, and simulation, have their origin in inferential statistics, and statistical modeling may overlap with data analytics and data mining.

Despite this, DFSS as a methodology has been used successfully as an end-to-end framework for analytics and data-mining projects, and domain experts have observed the approach to be broadly similar to CRISP-DM.

DFSS is claimed to be better suited for encapsulating and effectively handling a larger number of uncertainties, including missing and uncertain data, both in sharpness of definition and in absolute number, with respect to analytics and data-mining tasks. Six Sigma approaches to data mining are accordingly known as "DFSS over CRISP" (CRISP-DM being the data-mining framework methodology associated with SPSS).

With DFSS, data-mining projects have been observed to have a considerably shortened development life cycle. This is typically achieved by conducting data analysis against pre-designed template match tests via a techno-functional approach, using multilevel quality function deployment on the data set.

Practitioners claim that progressively complex KDD templates are created by multiple DOE runs on simulated complex multivariate data, and that the templates, together with their logs, are extensively documented via a decision-tree-based algorithm, along the lines of the sketch below.
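
A minimal sketch of that workflow, with a simulated response and hypothetical factor names, might look as follows:

```python
# Sketch: a DOE run on simulated multivariate data, documented
# ("templated") with a decision tree. Factors and response are invented.
import numpy as np
from itertools import product
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(1)

# Full-factorial design over three coded factors, replicated 10 times.
levels = [-1.0, 0.0, 1.0]
design = np.array(list(product(levels, repeat=3)))
X = np.repeat(design, 10, axis=0)

# Simulated response with a main effect, an interaction, and noise.
y = 5 + 2 * X[:, 0] - 1.5 * X[:, 1] * X[:, 2] + rng.normal(0, 0.3, len(X))

tree = DecisionTreeRegressor(max_depth=3).fit(X, y)
print(export_text(tree, feature_names=["f1", "f2", "f3"]))  # the documented "template"
```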

DFSS uses quality function deployment and SIPOC for feature engineering of known independent variables, thereby aiding the techno-functional computation of derived attributes.

Once the predictive model has been computed, DFSS studies can also be used to provide stronger probabilistic estimates of how the predictive model will rank in a real-world scenario.

The DFSS framework has been applied successfully to predictive analytics in the field of HR analytics, an application area traditionally considered very challenging due to the peculiar complexities of predicting human behavior.

Related Research Articles

Engineering statistics combines engineering and statistics using scientific methods for analyzing data. Engineering statistics involves data concerning manufacturing processes such as component dimensions, tolerances, type of material, and fabrication process control. Many methods are used in engineering analysis, and results are often displayed as histograms to give a visual representation of the data as opposed to purely numerical summaries. Examples of methods are:

  1. Design of Experiments (DOE) is a methodology for formulating scientific and engineering problems using statistical models. The protocol specifies a randomization procedure for the experiment and specifies the primary data-analysis, particularly in hypothesis testing. In a secondary analysis, the statistical analyst further examines the data to suggest other questions and to help plan future experiments. In engineering applications, the goal is often to optimize a process or product, rather than to subject a scientific hypothesis to test of its predictive adequacy. The use of optimal designs reduces the cost of experimentation.
  2. Quality control and process control use statistics as a tool to manage conformance to specifications of manufacturing processes and their products.
  3. Time and methods engineering use statistics to study repetitive operations in manufacturing in order to set standards and find optimum manufacturing procedures.
  4. Reliability engineering, which measures the ability of a system to perform its intended function and provides tools for improving performance.
  5. Probabilistic design, involving the use of probability in product and system design.
  6. System identification uses statistical methods to build mathematical models of dynamical systems from measured data. System identification also includes the optimal design of experiments for efficiently generating informative data for fitting such models.
Systems engineering

Systems engineering is an interdisciplinary field of engineering and engineering management that focuses on how to design, integrate, and manage complex systems over their life cycles. At its core, systems engineering utilizes systems thinking principles to organize this body of knowledge. The individual outcome of such efforts, an engineered system, can be defined as a combination of components that work in synergy to collectively perform a useful function.

Six Sigma (6σ) is a set of techniques and tools for process improvement. It was introduced by American engineer Bill Smith while working at Motorola in 1986.

Quality assurance (QA) is the term used in both manufacturing and service industries to describe the systematic efforts taken to assure that the product(s) delivered to customer(s) meet with the contractual and other agreed upon performance, design, reliability, and maintainability expectations of that customer. The core purpose of Quality Assurance is to prevent mistakes and defects in the development and production of both manufactured products, such as automobiles and shoes, and delivered services, such as automotive repair and athletic shoe design. Assuring quality and therefore avoiding problems and delays when delivering products or services to customers is what ISO 9000 defines as that "part of quality management focused on providing confidence that quality requirements will be fulfilled". This defect prevention aspect of quality assurance differs from the defect detection aspect of quality control and has been referred to as a shift left since it focuses on quality efforts earlier in product development and production and on avoiding defects in the first place rather than correcting them after the fact.

Automotive engineering, along with aerospace engineering and naval architecture, is a branch of vehicle engineering, incorporating elements of mechanical, electrical, electronic, software, and safety engineering as applied to the design, manufacture and operation of motorcycles, automobiles, and trucks and their respective engineering subsystems. It also includes the modification of vehicles. The manufacturing domain, which deals with the creation and assembly of whole automobile parts, is also included. The automotive engineering field is research-intensive and involves direct application of mathematical models and formulas. The study of automotive engineering covers the design, development, fabrication, and testing of vehicles or vehicle components from the concept stage to the production stage. Production, development, and manufacturing are the three major functions in this field.

Product lifecycle

In industry, product lifecycle management (PLM) is the process of managing the entire lifecycle of a product from its inception through the engineering, design and manufacture, as well as the service and disposal of manufactured products. PLM integrates people, data, processes, and business systems and provides a product information backbone for companies and their extended enterprises.

Kansei engineering aims at the development or improvement of products and services by translating the customer's psychological feelings and needs into the domain of product design. It was founded by Mitsuo Nagamachi, Professor Emeritus of Hiroshima University. Kansei engineering parametrically links the customer's emotional responses to the properties and characteristics of a product or service. In consequence, products can be designed to bring forward the intended feeling.

Quality management ensures that an organization, product or service consistently functions well. It has four main components: quality planning, quality assurance, quality control and quality improvement. Quality management is focused not only on product and service quality, but also on the means to achieve it. Quality management therefore uses quality assurance and control of processes as well as products to achieve more consistent quality. What a customer wants and is willing to pay for determines quality; it is a written or unwritten commitment to a known or unknown consumer in the market. Quality can be defined as how well the product performs its intended function.

A functional software architecture (FSA) is an architectural model that identifies enterprise functions, interactions and corresponding IT needs. These functions can be used as a reference by different domain experts to develop IT-systems as part of a co-operative information-driven enterprise. In this way, both software engineers and enterprise architects can create an information-driven, integrated organizational environment.

Reliability engineering is a sub-discipline of systems engineering that emphasizes the ability of equipment to function without failure. Reliability describes the ability of a system or component to function under stated conditions for a specified period of time. Reliability is closely related to availability, which is typically described as the ability of a component or system to function at a specified moment or interval of time.

Operations management

Operations management is concerned with designing and controlling the production of goods or services, ensuring that businesses are efficient in using resources to meet customer requirements.

Advanced product quality planning (APQP) is a framework of procedures and techniques used to develop products in industry, particularly in the automotive industry. It differs from Six Sigma in that the goal of Six Sigma is to reduce variation, but has similarities to Design for Six Sigma (DFSS).

DMAIC or define, measure, analyze, improve and control refers to a data-driven improvement cycle used for improving, optimizing and stabilizing business processes and designs. The DMAIC improvement cycle is the core tool used to drive Six Sigma projects. However, DMAIC is not exclusive to Six Sigma and can be used as the framework for other improvement applications.

Lean Six Sigma is a process improvement approach that uses a collaborative team effort to improve performance by systematically removing operational waste and reducing process variation. It combines Lean Management and Six Sigma to increase the velocity of value creation in business processes.

Quality by design (QbD) is a concept first outlined by quality expert Joseph M. Juran in publications, most notably Juran on Quality by Design. Designing for quality and innovation is one of the three universal processes of the Juran Trilogy, in which Juran describes what is required to achieve breakthroughs in new products, services, and processes. Juran believed that quality could be planned, and that most quality crises and problems relate to the way in which quality was planned.

Lean integration is a management system that emphasizes creating value for customers, continuous improvement, and eliminating waste as a sustainable data integration and system integration practice. Lean integration has parallels with other lean disciplines such as lean manufacturing, lean IT, and lean software development. It is a specialized collection of tools and techniques that address the unique challenges associated with seamlessly combining information and processes from systems that were independently developed, are based on incompatible data models, and remain independently managed, to achieve a cohesive holistic operation.

SDI Tools

SDI Tools is a set of commercial software add-in tools for Microsoft Excel developed and distributed by Statistical Design Institute, LLC., a privately owned company located in Texas, United States.

Product cost management (PCM) is a set of tools, processes, methods, and culture used by firms who develop and manufacture products to ensure that a product meets its profit target.

Industrial and production engineering (IPE) is an interdisciplinary engineering discipline that includes manufacturing technology, engineering sciences, management science, and the optimization of complex processes, systems, or organizations. It is concerned with the understanding and application of engineering procedures in manufacturing processes and production methods. Industrial engineering dates back to the Industrial Revolution of the 1700s and was shaped by figures such as Adam Smith, Henry Ford, Eli Whitney, Frank and Lillian Gilbreth, Henry Gantt, and F.W. Taylor. After the 1970s, industrial and production engineering developed worldwide and began to make wide use of automation and robotics. Industrial and production engineering spans three areas: mechanical engineering, industrial engineering, and management science.

pSeven

pSeven is a design space exploration (DSE) software platform developed by DATADVANCE that features design, simulation and analysis capabilities and assists in design decisions. It provides integration with third-party CAD and CAE software tools, multi-objective and robust optimization algorithms, data analysis, and uncertainty quantification tools.

References

  1. Chowdhury, Subir (2002). Design for Six Sigma: The Revolutionary Process for Achieving Extraordinary Profits. Prentice Hall. ISBN 9780793152247.
  2. Hasenkamp, Torben; Ölme, Annika (2008). "Introducing Design for Six Sigma at SKF". International Journal of Six Sigma and Competitive Advantage. 4 (2): 172–189. doi:10.1504/IJSSCA.2008.020281.
  3. Peterson, John J. (2004). "A Posterior Predictive Approach to Multiple Response Surface Optimization". Journal of Quality Technology. 36 (2): 139–153. doi:10.1080/00224065.2004.11980261. ISSN 0022-4065. S2CID 116581405.
  4. "Response Surfaces, Mixtures, and Ridge Analyses, 2nd Edition". Wiley.com. Retrieved 2022-04-09.
  5. Bertels, Thomas (2003). Rath & Strong's Six Sigma Leadership Handbook. John Wiley and Sons. pp. 57–83. ISBN 0-471-25124-0.
  6. Lee, Sunggyu, ed. (2012). Encyclopedia of Chemical Processing, Vol. 1. Taylor & Francis. pp. 2719–2734. doi:10.1081/E-ECHP. ISBN 978-0-8247-5563-8.
  7. "Design for Reliability: Overview of the Process and Applicable Techniques". www.reliasoft.com.
  8. Javier Lloréns-Montes, F.; Molina, Luis M. (May 2006). "Six Sigma and management theory: Processes, content and effectiveness". Total Quality Management & Business Excellence. 17 (4): 485–506. doi:10.1080/14783360500528270. ISSN 1478-3363.
  9. "Six Sigma roadmap for product and process development". Six Sigma for Medical Device Design. CRC Press. pp. 35–63. 2004-11-15. Retrieved 2023-10-15.
