Katrina Groth

Born: 1982 (age 41–42)
Education: University of Maryland (BS, MS, PhD)
Occupations:
  • Engineer
  • Professor
Title: Associate Professor
Thesis: A data-informed model of performance shaping factors and their interdependencies for use in human reliability analysis (2009)
Website: enme.umd.edu/clark/faculty/807/Katrina-Groth

Katrina Groth (born 1982) is an American mechanical engineer and professor. Groth is an associate professor in Mechanical Engineering at the University of Maryland, College Park, where she is the associate director for research for the Center for Risk and Reliability and the director of the Systems Risk and Reliability Analysis lab (SyRRA). Groth previously served as the Principal Research & Development Engineer at Sandia National Laboratories. [1]

Biography

Groth received a Bachelor of Science in Nuclear Engineering from the University of Maryland in 2004. [1] She received a Master of Science in Reliability Engineering in 2008 and a Ph.D. in Reliability Engineering in 2009, both from the University of Maryland. [1]

From 2009 to 2017, Groth worked at Sandia National Laboratories. [1] While at Sandia, Groth developed the Hydrogen Plus Other Alternative Fuels Risk Assessment Models (HyRAM+), a software toolkit integrating publicly available hydrogen storage data and models. [2] HyRAM+ was used to develop both the American and international safety standards for hydrogen fueling stations, NFPA 2 [3] and ISO 19880-1. [4] [5]

In 2017, Groth joined the University of Maryland's School of Engineering. There, Groth is the associate director of the Center for Risk & Reliability. Groth is also the director of the Systems Risk & Reliability Analysis (SyRRA) laboratory. [5]

In 2021, Groth received the NSF CAREER award for Modernizing Risk Assessment Through Systematic Integration of Probabilistic Risk Assessment (PRA) and Prognostics and Health Management (PHM). [6]

Groth serves on the board of the National Museum of Nuclear Science & History. [7]

Honors and awards

Notable works

Related Research Articles

Safety engineering is an engineering discipline which assures that engineered systems provide acceptable levels of safety. It is strongly related to industrial engineering/systems engineering, and the subset system safety engineering. Safety engineering assures that a life-critical system behaves as needed, even when components fail.

Fault tree analysis (FTA) is a type of failure analysis in which an undesired state of a system is examined. The method is used mainly in safety engineering and reliability engineering to understand how systems can fail, to identify the best ways to reduce risk, and to estimate the rates of safety accidents or of particular system-level (functional) failures. FTA is used in the aerospace, nuclear power, chemical and process, pharmaceutical, petrochemical, and other high-hazard industries, but also in fields as diverse as risk factor identification for social service system failure. FTA is also used in software engineering for debugging purposes and is closely related to the cause-elimination technique used to detect bugs.
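As a minimal illustration of the quantification FTA supports, a top-event probability can be computed from basic-event probabilities combined through AND and OR gates. All numbers below are hypothetical, and the events are assumed independent:

```python
def or_gate(probs):
    """P(at least one event) = 1 - prod(1 - p_i), assuming independence."""
    result = 1.0
    for p in probs:
        result *= (1.0 - p)
    return 1.0 - result

def and_gate(probs):
    """P(all events) = prod(p_i), assuming independence."""
    result = 1.0
    for p in probs:
        result *= p
    return result

# Hypothetical fault tree: the top event occurs if the pump fails
# OR both redundant valves fail.
pump_failure = 1e-3
valve_a = 1e-2
valve_b = 1e-2

top_event = or_gate([pump_failure, and_gate([valve_a, valve_b])])
```

The redundant valve pair contributes only 1e-4 to the top event, so the single pump dominates the result, which is the kind of insight a fault tree is meant to expose.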

Prognostics is an engineering discipline focused on predicting the time at which a system or a component will no longer perform its intended function. This lack of performance is most often a failure beyond which the system can no longer be used to meet desired performance. The predicted time then becomes the remaining useful life (RUL), an important concept in decision making for contingency mitigation. Prognostics predicts the future performance of a component by assessing the extent of deviation or degradation of a system from its expected normal operating conditions. The science of prognostics is based on the analysis of failure modes, detection of early signs of wear and aging, and fault conditions. An effective prognostics solution is implemented when there is sound knowledge of the failure mechanisms that are likely to cause the degradations leading to eventual failures in the system. Initial information on the possible failures in a product is therefore necessary; such knowledge is important for identifying the system parameters to be monitored. A potential use for prognostics is condition-based maintenance. The discipline that links studies of failure mechanisms to system lifecycle management is often referred to as prognostics and health management (PHM), sometimes also system health management (SHM) or, in transportation applications, vehicle health management (VHM) or engine health management (EHM). Technical approaches to building models in prognostics can be categorized broadly into data-driven, model-based, and hybrid approaches.
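A simple data-driven sketch of RUL estimation: fit a linear trend to an observed health index and extrapolate to a failure threshold. The data, threshold, and linear-degradation assumption below are all illustrative:

```python
def estimate_rul(times, health, failure_threshold):
    """Fit a least-squares line to (time, health) data and extrapolate
    to the failure threshold; RUL is the time remaining from the last
    observation. Assumes roughly linear degradation."""
    n = len(times)
    mean_t = sum(times) / n
    mean_h = sum(health) / n
    slope = (sum((t - mean_t) * (h - mean_h) for t, h in zip(times, health))
             / sum((t - mean_t) ** 2 for t in times))
    intercept = mean_h - slope * mean_t
    t_fail = (failure_threshold - intercept) / slope  # when the line hits the threshold
    return t_fail - times[-1]

# Hypothetical health-index observations degrading over time
times = [0, 10, 20, 30]
health = [1.0, 0.9, 0.8, 0.7]
rul = estimate_rul(times, health, failure_threshold=0.5)
```

Real PHM methods replace the straight line with physics-based or learned degradation models and attach uncertainty to the RUL estimate, but the extrapolate-to-threshold structure is the same.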

In the field of human factors and ergonomics, human reliability is the probability that a human performs a task to a sufficient standard. Reliability of humans can be affected by many factors such as age, physical health, mental state, attitude, emotions, personal propensity for certain mistakes, and cognitive biases.

Reliability engineering is a sub-discipline of systems engineering that emphasizes the ability of equipment to function without failure. Reliability describes the ability of a system or component to function under stated conditions for a specified period. Reliability is closely related to availability, which is typically described as the ability of a component or system to function at a specified moment or interval of time.

Probabilistic risk assessment (PRA) is a systematic and comprehensive methodology to evaluate risks associated with a complex engineered technological entity or the effects of stressors on the environment.
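A toy PRA-style calculation multiplies an initiating-event frequency by branch probabilities along an event-tree path to obtain end-state frequencies. All values below are made up for illustration:

```python
# Hypothetical event tree: initiating event, then a safety system demand,
# then an operator recovery action.
init_freq = 1e-2         # initiating events per year (illustrative)
p_safety_fails = 1e-3    # probability the safety system fails on demand
p_operator_fails = 1e-1  # probability the operator fails to recover

# End-state frequencies (per year) along the failure branches
freq_severe = init_freq * p_safety_fails * p_operator_fails
freq_mitigated = init_freq * p_safety_fails * (1 - p_operator_fails)
```

In a full PRA the branch probabilities would themselves come from fault trees and human reliability analysis, and each end state would be paired with an assessed consequence.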

In science, engineering, and research, expert elicitation is the synthesis of opinions of authorities of a subject where there is uncertainty due to insufficient data or when such data is unattainable because of physical constraints or lack of resources. Expert elicitation is essentially a scientific consensus methodology. It is often used in the study of rare events. Expert elicitation allows for parametrization, an "educated guess", for the respective topic under study. Expert elicitation generally quantifies uncertainty.

Quantification of Margins and Uncertainty (QMU) is a decision support methodology for complex technical decisions. QMU focuses on the identification, characterization, and analysis of performance thresholds and their associated margins for engineering systems that are evaluated under conditions of uncertainty, particularly when portions of those results are generated using computational modeling and simulation. QMU has traditionally been applied to complex systems where comprehensive experimental test data is not readily available and cannot be easily generated for either end-to-end system execution or for specific subsystems of interest. Examples of systems where QMU has been applied include nuclear weapons performance, qualification, and stockpile assessment. QMU focuses on characterizing in detail the various sources of uncertainty that exist in a model, thus allowing the uncertainty in the system response output variables to be well quantified. These sources are frequently described in terms of probability distributions to account for the stochastic nature of complex engineering systems. The characterization of uncertainty supports comparisons of design margins for key system performance metrics to the uncertainty associated with their calculation by the model. QMU supports risk-informed decision-making processes where computational simulation results provide one of several inputs to the decision-making authority. There is currently no standardized methodology across the simulation community for conducting QMU; the term is applied to a variety of different modeling and simulation techniques that focus on rigorously quantifying model uncertainty in order to support comparison to design margins.
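The core QMU comparison of a design margin to its associated uncertainty can be reduced to a confidence ratio, sketched here with made-up numbers:

```python
# Illustrative QMU-style margin-to-uncertainty comparison
requirement = 100.0    # required performance threshold
best_estimate = 130.0  # simulated performance metric
uncertainty = 10.0     # total quantified uncertainty in the estimate

margin = best_estimate - requirement     # how far performance exceeds the requirement
confidence_ratio = margin / uncertainty  # ratios well above 1 suggest the margin
                                         # comfortably exceeds the uncertainty
```

As the paragraph above notes, there is no single standardized QMU recipe; how `uncertainty` is aggregated (and whether a simple ratio is even adequate) varies across practitioners.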

Human Cognitive Reliability Correlation (HCR) is a technique used in the field of Human Reliability Assessment (HRA) to evaluate the probability of a human error occurring throughout the completion of a specific task. From such analyses, measures can then be taken to reduce the likelihood of errors occurring within a system and therefore improve the overall level of safety. There are three primary reasons for conducting an HRA: error identification, error quantification, and error reduction. The techniques used for such purposes can be split into one of two classifications: first-generation techniques and second-generation techniques. First-generation techniques work on the basis of the simple dichotomy of 'fits/doesn't fit' in matching the error situation in context with related error identification and quantification, while second-generation techniques are more theory-based in their assessment and quantification of errors. HRA techniques have been utilised in a range of industries, including healthcare, engineering, nuclear, transportation, and business sectors; each technique has varying uses within different disciplines.

Tecnica Empirica Stima Errori Operatori (TESEO) is a technique in the field of Human Reliability Assessment (HRA) that evaluates the probability of a human error occurring throughout the completion of a specific task. From such analyses, measures can then be taken to reduce the likelihood of errors occurring within a system and therefore improve the overall level of safety. There are three primary reasons for conducting an HRA: error identification, error quantification, and error reduction. The techniques used for such purposes can be split into one of two classifications: first-generation techniques and second-generation techniques. First-generation techniques work on the basis of the simple dichotomy of 'fits/doesn't fit' in matching the error situation in context with related error identification and quantification, while second-generation techniques are more theory-based in their assessment and quantification of errors. HRA techniques have been utilised in a range of industries, including healthcare, engineering, nuclear, transportation, and business sectors; each technique has varying uses within different disciplines.

The Technique for human error-rate prediction (THERP) is a technique that is used in the field of Human Reliability Assessment (HRA) to evaluate the probability of human error occurring throughout the completion of a task. From such an analysis, some corrective measures could be taken to reduce the likelihood of errors occurring within a system. The overall goal of THERP is to apply and document probabilistic methodological analyses to increase safety during a given process. THERP is used in fields such as error identification, error quantification and error reduction.
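THERP quantifies a task by combining per-step human error probabilities along an HRA event tree. A simplified sketch, assuming independent steps and illustrative HEPs (the real method also applies dependence and recovery adjustments):

```python
# Illustrative per-step human error probabilities for a three-step task
step_heps = [1e-3, 5e-3, 2e-3]

# Probability the whole task succeeds is the product of per-step
# success probabilities (independence assumed for this sketch).
p_success = 1.0
for hep in step_heps:
    p_success *= (1.0 - hep)

task_hep = 1.0 - p_success  # overall probability of at least one error
```

Note how the overall task HEP (about 0.008) is close to, but slightly less than, the simple sum of the step HEPs, since simultaneous errors are counted only once.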

Human error assessment and reduction technique (HEART) is a technique used in the field of human reliability assessment (HRA) to evaluate the probability of a human error occurring throughout the completion of a specific task. From such analyses, measures can then be taken to reduce the likelihood of errors occurring within a system and therefore improve the overall level of safety. There are three primary reasons for conducting an HRA: error identification, error quantification, and error reduction. The techniques used for such purposes can be split into one of two classifications: first-generation techniques and second-generation techniques. First-generation techniques work on the basis of the simple dichotomy of 'fits/doesn't fit' in matching the error situation in context with related error identification and quantification, while second-generation techniques are more theory-based in their assessment and quantification of errors. HRA techniques have been used in a range of industries including healthcare, engineering, nuclear, transportation, and business sectors. Each technique has varying uses within different disciplines.
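HEART's assessed-effect calculation multiplies a generic nominal error probability by a factor for each error-producing condition (EPC), weighted by its assessed proportion of affect. The numbers below are illustrative stand-ins, not values taken from the published HEART tables:

```python
def heart_hep(nominal_hep, epcs):
    """HEART assessed human error probability.

    nominal_hep: generic task-type error probability (from HEART tables
                 in practice; illustrative here).
    epcs: list of (max_effect_multiplier, assessed_proportion_of_affect)
          pairs, one per applicable error-producing condition.
    """
    hep = nominal_hep
    for multiplier, proportion in epcs:
        # Standard HEART weighting: (EPC - 1) * APOA + 1
        hep *= (multiplier - 1.0) * proportion + 1.0
    return min(hep, 1.0)  # probabilities are capped at 1

# Illustrative: nominal HEP 0.003 with two EPCs partially in effect
hep = heart_hep(0.003, [(11, 0.4), (3, 0.5)])
```

Each EPC factor interpolates between no effect (weight 1) and the condition's full multiplier, which is what the assessed proportion of affect controls.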

Influence Diagrams Approach (IDA) is a technique used in the field of Human Reliability Assessment (HRA) to evaluate the probability of a human error occurring throughout the completion of a specific task. From such analyses, measures can then be taken to reduce the likelihood of errors occurring within a system and therefore improve the overall level of safety. There are three primary reasons for conducting an HRA: error identification, error quantification, and error reduction. The techniques used for such purposes can be split into one of two classifications: first-generation techniques and second-generation techniques. First-generation techniques work on the basis of the simple dichotomy of 'fits/doesn't fit' in matching the error situation in context with related error identification and quantification, while second-generation techniques are more theory-based in their assessment and quantification of errors. HRA techniques have been utilised in a range of industries, including healthcare, engineering, nuclear, transportation, and business sectors; each technique has varying uses within different disciplines.

A Technique for Human Event Analysis (ATHEANA) is a technique used in the field of human reliability assessment (HRA). The purpose of ATHEANA is to evaluate the probability of human error while performing a specific task. From such analyses, preventative measures can then be taken to reduce human errors within a system and therefore lead to improvements in the overall level of safety.

A probability box is a characterization of uncertain numbers consisting of both aleatoric and epistemic uncertainties that is often used in risk analysis or quantitative uncertainty modeling where numerical calculations must be performed. Probability bounds analysis is used to make arithmetic and logical calculations with p-boxes.

Probability bounds analysis (PBA) is a collection of methods of uncertainty propagation for making qualitative and quantitative calculations in the face of uncertainties of various kinds. It is used to project partial information about random variables and other quantities through mathematical expressions. For instance, it computes sure bounds on the distribution of a sum, product, or more complex function, given only sure bounds on the distributions of the inputs. Such bounds are called probability boxes, and constrain cumulative probability distributions.
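One simple way to form a probability box is to take pointwise envelopes over a set of candidate CDFs; the lower and upper envelopes then bound any distribution in the set. A sketch assuming a few normal candidates (the parameters are illustrative):

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of a normal distribution via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Grid of x values from -4 to 4
grid = [i * 0.5 for i in range(-8, 9)]

# Candidate (mu, sigma) pairs representing epistemic uncertainty
# about the true distribution
candidates = [(0.0, 1.0), (0.5, 1.0), (0.0, 1.5)]

# The p-box: pointwise min/max of the candidate CDFs on the grid
lower_cdf = [min(normal_cdf(x, m, s) for m, s in candidates) for x in grid]
upper_cdf = [max(normal_cdf(x, m, s) for m, s in candidates) for x in grid]
```

Full probability bounds analysis then propagates such bounds through arithmetic operations (sums, products, and so on), yielding guaranteed bounds on the output distribution rather than a single curve.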

Wolfgang Kröger has been full professor of Safety Technology at ETH Zurich since 1990 and, simultaneously, director of the Laboratory of Safety Analysis. Before being elected Founding Rector of the International Risk Governance Council (IRGC) in 2003, he headed research in nuclear energy and safety at the Paul Scherrer Institut (PSI). After his retirement in early 2011 he became the Executive Director of the newly established ETH Risk Center. He has both Swiss and German citizenship and lives in Kilchberg, Zürich. His seminal work lies in the general area of reliability, risk, and vulnerability analysis of large-scale technical systems, initially single complicated systems such as nuclear power plants of different types, and later complex engineered networks such as power supply systems, the latter coupled to other critical infrastructure and controlled by cyber-physical systems. He is known for his continuing efforts to advance related frameworks, methodologies, and tools; to communicate results, including their uncertainties; and to stimulate trans-boundary cooperation to improve governance of emerging systemic risks. His contributions to shaping and operationalizing the concept of sustainability and, more recently, the concept of resilience are highly valued. He is also engaged in the evaluation of smart, clean, secure, and affordable energy systems and future technologies, including new ways of exploiting nuclear energy. The development and certification of cooperative automated vehicles, regarded as a cornerstone of future mobility concepts, is a matter of growing interest to him.

Mohammad Modarres is an Iranian American scientist and educator in the fields of nuclear and reliability engineering. He is a Distinguished Scholar-Teacher and Nicole Y. Kim Eminent Professor at the University of Maryland. Within the University of Maryland A. James Clark School of Engineering, Modarres founded the world's first graduate curriculum in reliability engineering, which has become a leading academic program both nationally and internationally, with over 400 master's and PhD graduates. As director of the UMD Center for Risk and Reliability, Modarres serves as an international expert on reliability and risk analysis for various commercial and government organizations, including the US DOE, US NRC, and NASA. A PhD graduate of the Massachusetts Institute of Technology, from the scientific school of Norman C. Rasmussen, he has authored numerous books and hundreds of scholarly papers in the fields of nuclear and reliability engineering.

A domino effect accident is an accident in which a primary undesired event sequentially or simultaneously triggers one or more secondary undesired events in nearby equipment or facilities, leading to secondary accidents more severe than the primary event. Thus, a domino effect accident is actually a chain of multiple events, which can be likened to a falling row of dominoes. The term knock-on accident is also used.

Dr. Alan D. Swain III was a human factors engineer who specialized in weapons systems and nuclear power plants. He was a Distinguished Member of Technical Staff at Sandia National Laboratories, where he developed the technique for human error-rate prediction (THERP). According to a bibliometrics analysis performed in 2020, Swain is the most highly cited author in the field of human reliability analysis.

References

  1. "Groth, Katrina". Faculty Directory. University of Maryland. Retrieved 13 October 2022.
  2. Groth, Katrina M.; Hecht, Ethan S. (March 2017). "HyRAM: A methodology and toolkit for quantitative risk assessment of hydrogen systems". International Journal of Hydrogen Energy. 42 (11): 7485–7493. doi:10.1016/j.ijhydene.2016.07.002.
  3. "Hydrogen Technologies Code". NFPA 2. National Fire Protection Association. Retrieved 14 October 2022.
  4. "ISO 19880-1". International Organization for Standardization. 2020. Retrieved 14 October 2022.
  5. "Katrina Groth". People. SyRRA Lab. Retrieved 13 October 2022.
  6. "Modernizing Risk Assessment Through Systematic Integration of Probabilistic Risk Assessment (PRA) and Prognostics and Health Management (PHM)". Awards. National Science Foundation. Retrieved 14 October 2022.
  7. "Board of Trustees". Connect. The National Museum of Nuclear Science & History. Retrieved 13 October 2022.
  8. "Junior Faculty Outstanding Research Award | A. James Clark School of Engineering, University of Maryland". eng.umd.edu. Retrieved 10 May 2024.
  9. "Award Recipients". Honors and Awards. American Nuclear Society. Retrieved 13 October 2022.
  10. "Award Recipients". Honors and Awards. American Nuclear Society. 2021. Retrieved 14 October 2022.
  11. "2016 Annual Merit Review Awards". Hydrogen Program. Department of Energy. Retrieved 14 October 2022.
  12. Koning, Patti (27 November 2015). "Bringing it home: Katrina Groth and Ethan Hecht win inaugural Robert Schefer Best Paper award". Sandia LabNews. Sandia National Laboratories. Retrieved 14 October 2022.
  13. Groth, Katrina M.; Al-Douri, Ahmad (1 January 2023). "Chapter 15 - Hydrogen safety, risk, and reliability analysis". Hydrogen Economy (Second Edition). Academic Press: 487–510. doi:10.1016/B978-0-323-99514-6.00012-1. ISBN 9780323995146. Retrieved 17 May 2023.
  14. Groth, Katrina M.; Mosleh, Ali (December 2012). "A data-informed PIF hierarchy for model-based Human Reliability Analysis". Reliability Engineering & System Safety. 108: 154–174. doi:10.1016/j.ress.2012.08.006. Retrieved 13 October 2022.
  15. Groth, Katrina M.; Swiler, Laura P. (July 2013). "Bridging the gap between HRA research and HRA practice: A Bayesian network version of SPAR-H". Reliability Engineering & System Safety. 115: 33–42. doi:10.1016/j.ress.2013.02.015. Retrieved 13 October 2022.
  16. Groth, Katrina M.; Mosleh, Ali (August 2012). "Deriving causal Bayesian networks from human reliability analysis data: A methodology and example model". Proceedings of the Institution of Mechanical Engineers, Part O: Journal of Risk and Reliability. 226 (4): 361–379. doi:10.1177/1748006X11428107. S2CID 112729968. Retrieved 13 October 2022.
  17. Moradi, Ramin; Groth, Katrina M. (3 May 2019). "Hydrogen storage and delivery: Review of the state of the art technologies and risk and reliability analysis". International Journal of Hydrogen Energy. 44 (23): 12254–12269. doi:10.1016/j.ijhydene.2019.03.041. ISSN 0360-3199. S2CID 133375970. Retrieved 15 October 2022.