
Risk and System Maturity: TRLs and SRLs in Risk Management

Abstract

Technology can be broadly defined as the invention, use, and knowledge of tools and techniques for executing tasks or processes. Within systems engineering, the term technology identifies the elements of a system that are continually being updated, improved, and then introduced into that system. In theory, technology and system development follow similar evolution or maturation paths, and a technology is inserted into a system based on its maturity, functionality, environmental readiness, and ability to interoperate with the intended system. Unfortunately, these maturation paths do not always align, and systems fail at the point of integration. To address this issue, some organizations use the Technology Readiness Level (TRL) to assess the maturity of evolving technologies (materials, components, devices, etc.) before incorporating them into a system or subsystem. While TRLs have been described as a method for making maturity decisions, they do not address risk, integration, or lifecycle maturity at the system level, which ultimately determines whether the technology will result in successful development of the system.

Based on these observations, a more comprehensive set of concerns becomes relevant when TRL is abstracted from an individual technology to a system context, which may involve the interplay of multiple technologies, and even more relevant when these technologies are integrated throughout the life cycle of a system acquisition process. To address these concerns, this chapter explains the development and implementation of a System Readiness Level (SRL) index that incorporates both the maturity of specific components and the interoperability of the entire system. This is achieved by combining the current TRL scale with an Integration Readiness Level (IRL). Using TRL to measure the maturity of each component technology and IRL to measure the integration between any two TRL-assessed technologies, an SRL can be determined as the product of a function of TRL and IRL. The resulting SRL can assess overall system development and underscore potential areas requiring further development. This maturity assessment can then be correlated with decisions regarding the potential acquisition of systems, which involve the dependency and interplay between performance, availability (reliability, maintainability, and supportability), process efficiency (system operations, maintenance, and logistics support), system lifecycle cost, and system maturity (as measured by SRL).
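To make the SRL computation concrete, the sketch below follows the normalized matrix formulation published in Sauser et al. (2008) (publication 15 below): each technology's TRL and each pairwise IRL are scaled to [0, 1], the IRL matrix is multiplied by the TRL vector, and each row is averaged over its number of integrations. The three-technology example, its readiness values, and the function name system_readiness_level are hypothetical and purely illustrative.

import numpy as np

def system_readiness_level(trl, irl):
    """Compute component and composite SRL values.

    trl : length-n sequence of Technology Readiness Levels (1-9).
    irl : n x n symmetric matrix of Integration Readiness Levels (1-9),
          with 0 where two technologies are not integrated and 9 on the
          diagonal (a technology is fully "integrated" with itself).
    """
    trl = np.asarray(trl, dtype=float) / 9.0   # normalize TRL to [0, 1]
    irl = np.asarray(irl, dtype=float)
    irl_n = irl / 9.0                          # normalize IRL to [0, 1]

    # Weight each technology's maturity by the maturity of its integrations,
    # then average over the number of integrations (nonzero entries) in its row.
    weighted = irl_n @ trl
    integrations = np.count_nonzero(irl, axis=1)
    srl_components = weighted / integrations

    # The composite SRL is the mean of the component values.
    return srl_components, srl_components.mean()

# Hypothetical three-technology system: technologies 1 and 3 are not
# directly integrated, so their IRL entry is 0.
trl = [7, 5, 8]
irl = [[9, 4, 0],
       [4, 9, 6],
       [0, 6, 9]]

components, composite = system_readiness_level(trl, irl)
print("Component SRLs:", np.round(components, 2))
print("Composite SRL: %.2f" % composite)

In this sketch, a low component SRL flags a technology whose own maturity or whose integrations lag the rest of the system, which is the sense in which the index can underscore areas requiring further development.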


Leads

Brian Sauser

University of North Texas

Publications

  1. Akritas, M.G. (1990). "The rank transform method in some two-factor designs." Journal of the American Statistical Association 85, 73–78.

  2. Austin, M. & York, D. (2015). "System readiness assessment (SRA): an illustrative example." Procedia Computer Science 44, 486–496.

  3. DoD (2012). The DoDAF Architecture Framework Version 2.02. Chief Information Officer, US Department of Defense. Washington, DC: US Department of Defense.

  4. Dowling, T. & Pardoe, T. (2005). TIMPA – Technology Insertion Metrics, CR050825, Ministry of Defence – QinetiQ, 60.

  5. Fan, C.-F. & Yih, S. (1994). Prescriptive metrics for software quality assurance. First Asia-Pacific Software Engineering Conference, Tokyo, Japan (7–9 December 1994), pp. 430–438.

  6. GAO (1999). Best Practices: Better Management of Technology Development Can Improve Weapon System Outcomes. Washington, DC: U.S. Government Accountability Office. GAO/NSIAD-99-162.

  7. GAO (2002). DOD Faces Challenges in Implementing Best Practices. Washington, DC: U.S. Government Accountability Office. GAO-02-469T.

  8. GAO (2006). Best Practices: Stronger Practices Needed to Improve DoD Transition Process. Washington, DC: U.S. Government Accountability Office. GAO-06-883.

  9. GAO (2020). Technology Readiness Assessment Guide: Best Practices for Evaluating the Readiness of Technology for Use in Acquisition Programs and Projects. Washington, DC: U.S. Government Accountability Office. GAO-20-48G.

  10. Harjumaa, L., Tervonen, I., & Salmela, S. (2008). Steering the inspection process with prescriptive metrics and process patterns. Eighth International Conference on Quality Software, Oxford, UK (12–13 August 2008), pp. 285–293.

  11. Lee, M. & Shin, W. (2000). "An empirical analysis of the role of reference point in justice perception in R&D settings in Korea." Journal of Engineering and Technology Management 17(2), 175–191.

  12. Lord, F. (1953). "On the statistical treatment of football numbers." American Psychologist 8, 750–751.

  13. Magnaye, R.B., Sauser, B.J., & Ramirez-Marquez, J.E. (2010). "System development planning using readiness levels in a cost of development minimization model." Systems Engineering 13(4), 311–323.

  14. Sauser, B., Verma, D., Ramirez-Marquez, J., & Gove, R. (2006). From TRL to SRL: the concept of systems readiness levels. Conference on Systems Engineering Research (CSER), Los Angeles, CA (7 April 2006).

  15. Sauser, B., Ramirez-Marquez, J.E., Henry, D., & DiMarzio, D. (2008). "A system maturity index for the systems engineering life cycle." International Journal of Industrial and Systems Engineering 3(6), 673–691.

  16. Sauser, B., Gove, R., Forbes, E., & Ramirez-Marquez, J.E. (2010). "Integration maturity metrics: development of an integration readiness level." Information Knowledge Systems Management 9(1), 17–46.

  17. Shah, D.A. & Madden, L.V. (2004). "Nonparametric analysis of ordinal data in designed factorial experiments." Phytopathology 94(1), 33–43.

  18. Tervonen, I. & Iisakka, J. (1996). Monitoring software inspections with prescriptive metrics. Proceedings of the Fifth European Conference on Software Quality, Dublin, Ireland (16–20 September 1996), pp. 105–114.

  19. Yan, S., Xu, Y., Yang, M., Zhang, Z., Peng, M., Yu, X., & Zhang, H. (2006). "A subjective evaluation study on human-machine interface of marine meter based on RBF network." Journal of Harbin Engineering University 27, 560–567.



