Technology readiness level

[Figure: NASA technology readiness levels]

Technology readiness levels (TRLs) are a method for estimating the maturity of technologies during the acquisition phase of a program. TRLs enable consistent and uniform discussions of technical maturity across different types of technology. [1] TRL is determined during a technology readiness assessment (TRA) that examines program concepts, technology requirements, and demonstrated technology capabilities. TRLs are based on a scale from 1 to 9 with 9 being the most mature technology. [1]


TRL was developed at NASA during the 1970s. The US Department of Defense has used the scale for procurement since the early 2000s. By 2008 the scale was also in use at the European Space Agency (ESA). [2] The European Commission advised EU-funded research and innovation projects to adopt the scale in 2010. [1] TRLs were consequently used in 2014 in the EU Horizon 2020 program. In 2013, the TRL scale was further canonized by the International Organization for Standardization (ISO) with the publication of the ISO 16290:2013 standard. [1]

A comprehensive approach and discussion of TRLs has been published by the European Association of Research and Technology Organisations (EARTO). [3] Extensive criticism of the adoption of TRL scale by the European Union was published in The Innovation Journal, stating that the "concreteness and sophistication of the TRL scale gradually diminished as its usage spread outside its original context (space programs)". [1]

Definitions

TRL | NASA usage [4] | European Union [5]
1 | Basic principles observed and reported | Basic principles observed
2 | Technology concept and/or application formulated | Technology concept formulated
3 | Analytical and experimental critical function and/or characteristic proof of concept | Experimental proof of concept
4 | Component and/or breadboard validation in laboratory environment | Technology validated in lab
5 | Component and/or breadboard validation in relevant environment | Technology validated in relevant environment (industrially relevant environment in the case of key enabling technologies)
6 | System/subsystem model or prototype demonstration in a relevant environment (ground or space) | Technology demonstrated in relevant environment (industrially relevant environment in the case of key enabling technologies)
7 | System prototype demonstration in a space environment | System prototype demonstration in operational environment
8 | Actual system completed and "flight qualified" through test and demonstration (ground or space) | System complete and qualified
9 | Actual system "flight proven" through successful mission operations | Actual system proven in operational environment (competitive manufacturing in the case of key enabling technologies; or in space)
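For illustration, the EU column of the definitions above amounts to a simple lookup table keyed by level. A minimal sketch in Python (the function name is a hypothetical example, not part of any standard):

```python
# EU TRL descriptions (Commission Decision C(2014)4995), keyed by level 1-9.
EU_TRL = {
    1: "Basic principles observed",
    2: "Technology concept formulated",
    3: "Experimental proof of concept",
    4: "Technology validated in lab",
    5: "Technology validated in relevant environment",
    6: "Technology demonstrated in relevant environment",
    7: "System prototype demonstration in operational environment",
    8: "System complete and qualified",
    9: "Actual system proven in operational environment",
}

def describe_trl(level: int) -> str:
    """Return the EU description for a TRL, validating the 1-9 range."""
    if level not in EU_TRL:
        raise ValueError(f"TRL must be between 1 and 9, got {level}")
    return EU_TRL[level]
```

For example, `describe_trl(4)` returns "Technology validated in lab", while an out-of-range level raises an error, reflecting that the scale is defined only from 1 to 9.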


Assessment tools

[Figure: DAU Decision Point / TPMM Transition Mechanisms]

A Technology Readiness Level Calculator was developed by the United States Air Force. [6] This tool is a standard set of questions implemented in Microsoft Excel that produces a graphical display of the TRLs achieved. This tool is intended to provide a snapshot of technology maturity at a given point in time. [7]
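The actual Air Force tool is an Excel workbook, but its core idea of scoring a per-level checklist of questions can be sketched as follows. The questions and the scoring rule below are invented for illustration only and are not taken from the AFRL calculator:

```python
# Hypothetical per-level checklists: a technology "achieves" a TRL only when
# every question for that level, and for all lower levels, is answered yes.
CHECKLIST = {
    1: ["Basic principles reported in the literature?"],
    2: ["Practical application formulated?"],
    3: ["Critical function proof of concept demonstrated?"],
}

def achieved_trl(answers: dict[int, list[bool]]) -> int:
    """Return the highest TRL whose checklist, and all checklists below it,
    are fully satisfied; 0 means no level has been achieved."""
    level = 0
    for trl in sorted(CHECKLIST):
        given = answers.get(trl, [])
        if len(given) == len(CHECKLIST[trl]) and all(given):
            level = trl
        else:
            break  # a gap at any level caps the achieved TRL
    return level
```

With these invented checklists, `achieved_trl({1: [True], 2: [True], 3: [False]})` yields 2: the proof-of-concept question fails, so TRL 3 is not credited even though lower levels pass, which mirrors the "snapshot of maturity at a point in time" that the real calculator produces.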

The Defense Acquisition University (DAU) Decision Point (DP) Tool, originally named the Technology Program Management Model (TPMM), was developed by the United States Army [8] and later adopted by the DAU. The DP/TPMM is a TRL-gated, high-fidelity activity model that provides a flexible management tool to assist technology managers in planning, managing, and assessing their technologies for successful transition. The model provides a core set of activities, including systems engineering and program management tasks, that are tailored to the technology development and management goals. This approach is comprehensive, yet it consolidates the complex activities relevant to the development and transition of a specific technology program into one integrated model. [9]

Uses

The primary purpose of using technology readiness levels is to help management in making decisions concerning the development and transitioning of technology. It is one of several tools that are needed to manage the progress of research and development activity within an organization. [10]

Among the advantages of TRLs: [11]

They provide a common understanding of technology status.
They support risk management.
They inform decisions concerning technology funding and technology transition.

Some characteristics of TRLs limit their utility: [11]

Readiness does not necessarily correspond to appropriateness or technology maturity; a mature product may possess a greater or lesser degree of readiness for use in a particular system context than one of lower maturity.
Numerous factors must be considered, including the relevance of the product's operational environment to the system at hand, as well as product-system architectural mismatch.

TRL models tend to disregard negative and obsolescence factors. There have been suggestions made for incorporating such factors into assessments. [12]

For complex technologies that incorporate various development stages, a more detailed scheme called the Technology Readiness Pathway Matrix has been developed, spanning from basic units to applications in society. This tool aims to show that the readiness of a technology follows not a single linear process but a more complex pathway through its application in society. [13]

History

Technology readiness levels were conceived at NASA in 1974 and formally defined in 1989. The original definition included seven levels, but in the 1990s NASA adopted the nine-level scale that subsequently gained widespread acceptance. [14]

Original NASA TRL Definitions (1989) [15]

Level 1 – Basic Principles Observed and Reported
Level 2 – Potential Application Validated
Level 3 – Proof-of-Concept Demonstrated, Analytically and/or Experimentally
Level 4 – Component and/or Breadboard Laboratory Validated
Level 5 – Component and/or Breadboard Validated in Simulated or Real Space Environment
Level 6 – System Adequacy Validated in Simulated Environment
Level 7 – System Adequacy Validated in Space

The TRL methodology was originated by Stan Sadin at NASA Headquarters in 1974. [14] Ray Chase was then the JPL Propulsion Division representative on the Jupiter Orbiter design team. At the suggestion of Stan Sadin, Chase used this methodology to assess the technology readiness of the proposed JPL Jupiter Orbiter spacecraft design.[ citation needed ] Later Chase spent a year at NASA Headquarters helping Sadin institutionalize the TRL methodology. Chase joined ANSER in 1978, where he used the TRL methodology to evaluate the technology readiness of proposed Air Force development programs. He published several articles during the 1980s and 90s on reusable launch vehicles utilizing the TRL methodology. [16]

These documented an expanded version of the methodology that included design tools, test facilities, and manufacturing readiness on the Air Force Have Not program.[ citation needed ] The Have Not program manager, Greg Jenkins, and Ray Chase published the expanded version of the TRL methodology, which included design and manufacturing.[ citation needed ] Leon McKinney and Chase used the expanded version to assess the technology readiness of the ANSER team's Highly Reusable Space Transportation (HRST) concept. [17] ANSER also created an adapted version of the TRL methodology for proposed Homeland Security Agency programs. [18]

The United States Air Force adopted the use of technology readiness levels in the 1990s.[ citation needed ]

In 1995, John C. Mankins, NASA, wrote a paper that discussed NASA's use of TRL, extended the scale, and proposed expanded descriptions for each TRL. [1] In 1999, the United States General Accounting Office produced an influential report [19] that examined the differences in technology transition between the DOD and private industry. It concluded that the DOD takes greater risks and attempts to transition emerging technologies at lesser degrees of maturity than does private industry. The GAO concluded that use of immature technology increased overall program risk. The GAO recommended that the DOD make wider use of technology readiness levels as a means of assessing technology maturity prior to transition. [20]

In 2001, the Deputy Under Secretary of Defense for Science and Technology issued a memorandum that endorsed use of TRLs in new major programs. Guidance for assessing technology maturity was incorporated into the Defense Acquisition Guidebook. [21] Subsequently, the DOD developed detailed guidance for using TRLs in the 2003 DOD Technology Readiness Assessment Deskbook.

To address habitability requirements and design aspects, a group of NASA engineers (Jan Connolly, Kathy Daues, Robert Howard, and Larry Toups) formulated Habitation Readiness Levels (HRLs). HRLs were created in correlation with standards already established and widely used by different agencies, including NASA TRLs. [22] [23]

More recently, Ali Abbas, Professor of Chemical Engineering and Associate Dean of Research at the University of Sydney, and Mobin Nomvar, a chemical engineer and commercialisation specialist, developed the Commercial Readiness Level (CRL), a nine-point scale synchronised with TRL. As part of a critical innovation path, it is used to rapidly assess and refine innovation projects to ensure market adoption and avoid failure. [24]

In the European Union

The European Space Agency [1] adopted the TRL scale in the mid-2000s. Its handbook [2] closely follows the NASA definition of TRLs. In 2022, the ESA TRL Calculator was released to the public. Universal usage of TRL in EU policy was proposed in the final report of the first High Level Expert Group on Key Enabling Technologies [25] and was implemented in the subsequent EU framework programme, Horizon 2020, which ran from 2014 to 2020. [1] TRLs thus apply not only to space and weapons programs but to everything from nanotechnology to information and communication technology.

See also

Systems engineering
Iterative and incremental development
Technology transfer
Verification and validation
ISO/IEC 15504
Capability Maturity Model Integration
Software assurance
Performance engineering
Software development process
Integrated master plan
Manufacturing readiness level
Marine Corps Systems Command
Under Secretary of Defense for Acquisition and Sustainment
Adaptive Vehicle Make
Transition management
Big data maturity models
Lynx X-ray Observatory
John C. Mankins

References

  1. Mihaly, Heder (September 2017). "From NASA to EU: the evolution of the TRL scale in Public Sector Innovation" (PDF). The Innovation Journal. 22: 1–23. Archived from the original (PDF) on October 11, 2017.
  2. "Technology Readiness Levels Handbook for Space Applications" (PDF) (1 revision 6 ed.). ESA. September 2008. TEC-SHS/5551/MG/ap.
  3. "The TRL Scale as a Research & Innovation Policy Tool, EARTO Recommendations" (PDF). European Association of Research & Technology Organisations. 30 April 2014.
  4. "Technology Readiness Level Definitions" (PDF). nasa.gov. Retrieved 6 September 2019. This article incorporates text from this source, which is in the public domain.
  5. "Technology readiness levels (TRL); Extract from Part 19 - Commission Decision C(2014)4995" (PDF). ec.europa.eu. 2014. Retrieved 11 November 2019. Material was copied from this source, which is available under a Creative Commons Attribution 4.0 International License.
  6. Nolte, William L.; et al. (20 October 2003). "Technology Readiness Level Calculator, Air Force Research Laboratory, presented at the NDIA Systems Engineering Conference". Archived from the original on 13 May 2015.
  7. "Technology Assessment Calculator".
  8. Craver, Jeffrey T. (28 Dec 2020). "Decision Point / Technology Program Management Model, DAU". Defense Acquisition University.
  9. Jeff, Craver. "Decision Point / TPMM - Technology Program Management Model (only available to DOD components)".
  10. Christophe Deutsch; Chiara Meneghini; Ozzy Mermut; Martin Lefort. "Measuring Technology Readiness to improve Innovation Management" (PDF). INO. Archived from the original (PDF) on 2012-06-02. Retrieved 2011-11-27.
  11. Ben Dawson (31 October 2007). "The Impact of Technology Insertion on Organisations" (PDF). Human Factors Integration Design Technology Centre. Archived from the original (PDF) on 26 April 2012.
  12. Ricardo Valerdi; Ron J. Kohl (March 2004). An Approach to Technology Risk Management (PDF). Engineering Systems Division Symposium, MIT, Cambridge, MA, March 29-31, 2004. CiteSeerX 10.1.1.402.359. [dead link]
  13. Vincent Jamier; Christophe Aucher (April 2018). "Demystifying Technology Readiness Levels for Complex Technologies". Leitat Projects Blog. Archived from the original on 2021-02-03. Retrieved 2018-08-28.
  14. Banke, Jim (20 August 2010). "Technology Readiness Levels Demystified". NASA.
  15. Sadin, Stanley R.; Povinelli, Frederick P.; Rosen, Robert (October 1, 1988). The NASA technology push towards future space mission systems. International Astronautical Congress, 39th, Bangalore, India, Oct. 8-15, 1988.
  16. Chase, R.L. (26 June 1991). Methodology for Assessing Technological and Manufacturing Readiness of NASP-Technology Enabled Vehicles. 27th Joint Propulsion Conference, June 24-26, 1991, Sacramento CA. doi:10.2514/6.1991-2389. AIAA 91-2389.
  17. R. L. Chase; L. E. McKinney; H. D. Froning, Jr.; P. Czysz; et al. (January 1999). "A comparison of selected air-breathing propulsion choices for an aerospace plane". AIP Conference Proceedings. Vol. 458. American Institute of Physics. pp. 1133–8. doi:10.1063/1.57719. Archived from the original on 2016-03-11. Retrieved 2018-08-28.
  18. "Department of Homeland Security Science and Technology Readiness Level Calculator (Ver. 1.1) - Final Report and User's Manual" (PDF). Homeland Security Institute. September 30, 2009. Archived from the original (PDF) on August 26, 2010.
  19. "Best Practices: Better Management of Technology Can Improve Weapon System Outcomes" (PDF). General Accounting Office. July 1999. GAO/NSIAD-99-162. Archived from the original (PDF) on 2021-02-24. Retrieved 2018-08-28.
  20. Defense Acquisition Guidebook Archived 2012-04-25 at the Wayback Machine
  21. Defense Acquisition Guidebook Archived 2012-04-25 at the Wayback Machine
  22. Häuplik-Meusburger and Bannova (2016). Space Architecture Education for Engineers and Architects. Springer. ISBN   978-3-319-19278-9.
  23. Cohen, Marc (2012). Mockups 101: Code and Standard Research for Space Habitat Analogues. AIAA Space 2012 Conference Pasadena, California.
  24. "Better Management of Commercialisation and Innovation". Australian Manufacturing.
  25. "High-Level Expert Group on Key Enabling Technologies – Final Report". June 2011. p. 31. Retrieved March 16, 2020.
