Costs of Achieving Software Technology Readiness
Software & IT Track
Technology is advancing with lightning speed. Many of the complex systems required by the Department of Defense (DoD) and NASA depend on technological innovations to achieve sophisticated mission objectives and state-of-the-art accomplishments. Technology readiness assessments for complex systems and subsystems are important for both the government and its contractors to ensure that proposed technologies meet program requirements. Without sound technology assessments, programs may be funded that have little or no chance of success because their success depends on technologies whose feasibility has not yet been confirmed. The defense acquisition community recognizes this and has developed detailed guidelines for Technology Readiness Assessments (TRAs).
Wikipedia defines Technology Readiness Level (TRL) as a measure used to assess the maturity of evolving technologies prior to incorporating them into a system or subsystem. The DoD, NASA, and many large corporations use TRLs to determine whether a system's design is technically feasible prior to the start of any significant design or development activities. Both the DoD and NASA have developed TRL scales from 1 to 9 to facilitate technology maturity assessment. A level 1 represents technology that still resides on cocktail napkins and in laboratory experiments, while a level 9 represents technology that has been deployed successfully in currently fielded systems.
As new technologies are required and proposed, the cost community must become an important contributor to program success. A good cost estimate for a program relying on currently immature or non-existent technology should include the costs associated with bringing that technology to a maturity level of 6 or greater. The literature offers substantial guidance on how to do this in the hardware estimation world.
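The TRL scale and maturity bar described above can be sketched in code. This is a minimal illustration, not an official DoD or NASA artifact: the level descriptions are paraphrased from the published DoD/NASA scales, and the default threshold of 6 reflects the maturity bar mentioned above.

```python
# Paraphrased TRL descriptions (hypothetical encoding for illustration only).
TRL_DESCRIPTIONS = {
    1: "Basic principles observed and reported",
    2: "Technology concept and/or application formulated",
    3: "Analytical and experimental proof of concept",
    4: "Component validation in a laboratory environment",
    5: "Component validation in a relevant environment",
    6: "System/subsystem prototype demonstrated in a relevant environment",
    7: "System prototype demonstrated in an operational environment",
    8: "Actual system completed and qualified through test and demonstration",
    9: "Actual system proven through successful mission operations",
}

def is_mature_enough(trl: int, threshold: int = 6) -> bool:
    """Return True if a technology at the given TRL meets the assumed
    maturity bar (level 6 or greater, per the discussion above)."""
    if trl not in TRL_DESCRIPTIONS:
        raise ValueError(f"TRL must be 1-9, got {trl}")
    return trl >= threshold
```

A cost analyst could use such a lookup to flag which proposed technologies still carry maturation costs, for example: `is_mature_enough(4)` returns `False`, signaling that the estimate should include the cost of advancing that technology to TRL 6.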
One problem frequently cited with the definition and application of TRLs (as defined by both the DoD and NASA) is that they are heavily focused on hardware, where their application has indeed seen much success. Software is different from hardware. It is not as obvious what software technology means or how it is best assessed. Software brings almost infinite possibilities for advancing the state of the art, but these possibilities require the right mix of hardware, tools, people, and processes. Assessing the state of software technology becomes a 'softer' exercise. Consequently, the issues associated with estimating the costs of transcending Software TRLs become less obvious as well.
Ms. Minkiewicz leads the Cost Research Department as Chief Scientist at PRICE Systems. In this role, she is responsible for the research and analysis necessary to keep the suite of PRICE Estimating products responsive to current cost trends. She works with industry leaders to collect and maintain cost research data and offers analyses of this data to the cost estimating community through the PRICE products.
Arlene’s most recent accomplishments include the development of a catalog of cost estimating relationships for systems and system-of-systems projects that will be delivered to the cost estimating community as part of the TruePlanning suite. Her most recent research on software sizing was published in the March 2009 issue of Crosstalk.
Arlene frequently publishes articles on estimation and measurement in publications such as Software Development Magazine and Crosstalk. She speaks frequently on these topics at conferences such as STC, ISPA, SCEA, IEEE Aerospace Conference, SEPG, and many others. Her paper “The Real Costs of COTS-Based Software Systems” was recognized in 2004 by ISPA and SCEA as Best Paper in the Software Track. Her paper “A Case Study and Assessment of a COTS Upgrade for a Satellite Ground System”, co-authored with Marilee Wheaton of the Aerospace Corporation, received Best Paper in the Software Track from SCEA in 2006, and her paper “The Evolution of Hardware Estimating” received Best Paper in the Hardware and EVM Track at the 2007 ISPA/SCEA joint conference.