Wayne Nelson (statistician)

For the American musician, see Wayne Nelson.

Wayne Nelson is an American statistician. His main contributions to reliability theory are the Nelson–Aalen estimator for lifetime data, various statistical procedures for accelerated life testing, and both nonparametric and parametric procedures for the analysis of recurrent event data.

Early life and education

Nelson was born in Chicago in 1936. He studied physics at Caltech, graduating with a Bachelor of Science in 1958. He obtained a Master of Science in physics from the University of Illinois in 1959 and a Ph.D. in statistics from the same university in 1965. [1]

Career

Nelson was employed from 1965 to 1989 at General Electric R&D. [2] He was also an adjunct professor teaching graduate courses on applications of statistics at Union College and Rensselaer Polytechnic Institute. [1] Nelson now works as a private consultant and legal expert witness in statistical analysis and modeling of data in many industries, including automotive, aviation, electric power, electronics, materials, medical devices, microelectronics, military hardware, nuclear power, railroad, software, and transportation. [2]

Work

His research focuses on collecting and analyzing reliability data, laboratory tests, accelerated tests, quality control, measurement error analysis, planned experiments, sampling, and data analysis. The Nelson–Aalen estimator, [3] [4] [5] a non-parametric estimate of the cumulative hazard function that accounts for both failure and censored data, is named after Nelson, who introduced it for engineering reliability data, and Odd Aalen, who later placed it in a counting-process framework. Nelson also developed a method for estimating a Weibull distribution with few or no failures for products with an evolutionary design (that is, a common shape parameter β). [6]

In the late 1960s, Nelson developed cumulative hazard analysis, a method for nonparametric estimation of a population's cumulative life distribution; the resulting estimate is most conveniently displayed and interpreted on a probability plot. Until Nelson developed his method, practitioners relied on crude approximations for such analyses. His paper "Hazard Plotting for Incomplete Failure Data", [3] in the inaugural issue of the Journal of Quality Technology, received the ASQ's Brumbaugh Award as the 1969 paper that made the greatest contribution to the development of industrial applications of quality control. [7] His paper "Theory and Applications of Hazard Plotting for Censored Failure Data" was reprinted in the 40th-anniversary issue of Technometrics (2000) [8] as one of the "Two Classics in Reliability Theory."

Nelson also developed software that is widely used in reliability analysis. STATPAC was the first complete package for the analysis of reliability and accelerated-test data, including censored and interval data. It was the first to provide probability plots, confidence limits, maximum likelihood fitting of many models (including accelerated life test models), proper analysis of step-stress data, residuals and their analyses, and a simple user interface. Its versatile reliability features stimulated imitations in S-PLUS, SAS, JMP, ReliaSoft, WinSmith, and other packages. [9] A second program, POWNOR, fits the power-(log)normal distribution to censored life data on specimens of differing sizes; it was developed during his NSF-NIST-ASA senior research fellowship at NIST to build better statistical models for electromigration failures of microcircuits. [10]
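The cumulative hazard estimate at the heart of Nelson's hazard-plotting method sums, at each observed failure time, the number of failures divided by the number of units still at risk, with censored units simply leaving the risk set. A minimal illustrative sketch (the function name and data are hypothetical; this is not Nelson's STATPAC code):

```python
def nelson_aalen(times, events):
    """Nelson-Aalen estimate of the cumulative hazard H(t).

    times  : observed times (failure or right-censoring)
    events : 1 if the unit failed at that time, 0 if it was censored
    Returns a list of (t, H_hat) pairs at each distinct failure time.
    """
    data = sorted(zip(times, events))  # process times in increasing order
    n_at_risk = len(data)              # all units start in the risk set
    H = 0.0
    estimate = []
    i = 0
    while i < len(data):
        t = data[i][0]
        failures = 0   # failures observed exactly at time t
        leaving = 0    # failures + censorings leaving the risk set at t
        while i < len(data) and data[i][0] == t:
            failures += data[i][1]
            leaving += 1
            i += 1
        if failures:
            H += failures / n_at_risk  # increment: d_i / n_i
            estimate.append((t, H))
        n_at_risk -= leaving           # censored units drop out here too
    return estimate

# Four units: failures at t=1, 2, 3 and one censoring at t=2.
print(nelson_aalen([1, 2, 2, 3], [1, 1, 0, 1]))
```

Plotting `H_hat` against `t` on hazard paper (or transforming it to a probability scale) yields the probability plot on which, as noted above, the estimate is most conveniently interpreted.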

Selected publications

Books

Papers

Awards


References

  1. "Wayne Nelson Biography".
  2. Rodgers, Tim (January 2017). "Wayne Nelson Interview".
  3. Nelson, W. (1969). "Hazard plotting for incomplete failure data". Journal of Quality Technology. 1: 27–52. doi:10.1080/00224065.1969.11980344.
  4. Nelson, W. (1972). "Theory and applications of hazard plotting for censored failure data". Technometrics. 14 (4): 945–965. doi:10.1080/00401706.1972.10488991.
  5. Aalen, Odd (1978). "Nonparametric inference for a family of counting processes". Annals of Statistics. 6 (4): 701–726. doi:10.1214/aos/1176344247. JSTOR 2958850.
  6. Nelson, Wayne (1985). "Weibull Analysis of Reliability Data with Few or No Failures". Journal of Quality Technology. 17 (3): 140–146. doi:10.1080/00224065.1985.11978953.
  7. "Brumbaugh Award Winners".
  8. Nelson, W. (2000). "Theory and applications of hazard plotting for censored failure data". Technometrics. 42 (1, Special 40th Anniversary): 12–25. doi:10.2307/1271428. JSTOR 1271428.
  9. Meeker, William; Escobar, Luis (August 2001). "Software for Reliability Data Analysis and Test Planning". Iowa State University Digital Repository: 5.
  10. Nelson, Wayne; Doganaksoy, Necip (February 2017). "A Computer Program POWNOR for Fitting the Power-Normal and -Lognormal Models to Life or Strength Data from Specimens of Various Sizes" (PDF). NISTIR 4760 (National Bureau of Standards).
  11. "Fulbright Scholar Directory".
  12. "Shewhart Medalists".
  13. "Wayne Nelson Receives Lifetime Achievement Award" (PDF). Summer Current Source Newsletter. IEEE. Summer 2005.
  14. "Shainin Medalists".
  15. "Gerald J. Hahn Q&P Achievement Award".