2010-SOF05

Improving Software Cost Estimates Using the Univariate Model

Software & IT Track

Downloadable Files:

SOF05-Opaska

Abstract:

Despite its drawbacks, the univariate (first-order) model remains a widely used software estimating technique throughout the government and private sector. This presentation will provide three techniques that can be applied to improve the consistency and accuracy of this method: using a reliable code-counting sizing metric, thoroughly analyzing the actual SLOC count, and normalizing the SLOC.
The univariate model is a simple linear method for estimating software development effort/cost: a productivity factor is multiplied by the software product size. Software product size is frequently measured in source lines of code (SLOC) or function points. The productivity factor is usually either a composite factor based on historical actuals from multiple languages or one of a group of individual factors based on actuals from a particular language. The method is popular because it quickly and easily generates cost estimates that can be traced back to historical actuals. However, steps can be taken to further improve the estimates obtained from this method.
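To make the arithmetic concrete, here is a minimal sketch of the univariate calculation; the productivity factor and product size below are hypothetical placeholders, not values from the presentation.

```python
# Minimal sketch of the univariate (first-order) model:
#   effort = productivity factor x software product size
# Both numbers below are hypothetical placeholders.

def univariate_effort(size_sloc: float, hours_per_sloc: float) -> float:
    """Return estimated development effort in hours."""
    return size_sloc * hours_per_sloc

effort = univariate_effort(40_000, 2.5)  # 40,000 SLOC at 2.5 hours/SLOC
print(f"Estimated effort: {effort:,.0f} hours")  # 100,000 hours
```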
First, the technique used for collecting actual SLOC counts should be sound. SLOC counts are typically used as the sizing metric in the univariate model, and code counters should be used so that actual SLOC counts are collected consistently. Physical SLOC should be used as the sizing metric because virtually all code counters produce similar physical counts, whereas logical counts vary greatly across counters and are unavailable for certain languages.
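To illustrate why physical counts are the more consistent choice, the toy sketch below counts non-blank, non-comment physical lines for a C-style language. Production counters such as USC's CodeCount also handle block comments, line continuations, and many languages; this simplified example deliberately omits all of that.

```python
# Toy physical SLOC counter: counts non-blank physical lines that are
# not whole-line comments. Logical SLOC, by contrast, requires parsing
# statements, which is why logical counts vary across counters.

def physical_sloc(source_text: str, line_comment: str = "//") -> int:
    count = 0
    for line in source_text.splitlines():
        stripped = line.strip()
        if stripped and not stripped.startswith(line_comment):
            count += 1
    return count

sample = "int main() {\n    // entry point\n    return 0;\n}\n"
print(physical_sloc(sample))  # 3 physical SLOC
```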
Second, the actual SLOC counts need to be thoroughly analyzed to identify duplicate, pre-existing, and COTS/GOTS/FOSS code. These types of code should not be included in the actual SLOC count when computing productivity; otherwise, the amount of code actually developed would be overstated.
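A sketch of this adjustment, using a hypothetical breakdown of an actual count:

```python
# Hypothetical breakdown of an actual SLOC count. Only newly developed
# code should feed the productivity calculation; duplicate, pre-existing,
# and COTS/GOTS/FOSS code is subtracted out first.

actual = {
    "total_physical_sloc": 120_000,
    "duplicate":           10_000,  # code repeated within the baseline
    "pre_existing":        25_000,  # carried over from a prior release
    "cots_gots_foss":      30_000,  # commercial / government / open-source
}

developed = actual["total_physical_sloc"] - (
    actual["duplicate"] + actual["pre_existing"] + actual["cots_gots_foss"]
)
print(f"Developed SLOC: {developed:,}")  # 55,000
```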
Finally, SLOC normalization should occur to account for productivity variances related to the development language(s) used. Because historical productivity data is typically not available by language, actual SLOC counts should be normalized to a ‘standard’ language. The resulting productivity factor is then based on normalized SLOC counts.
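A minimal sketch of the normalization, assuming C++ is the chosen ‘standard’ language; the conversion factors and the effort figure are hypothetical, not values from the presentation.

```python
# Hypothetical conversion factors: equivalent standard-language (C++)
# SLOC per actual SLOC of each development language. In practice these
# might be derived from published gearing factors (SLOC per function point).

TO_STANDARD = {"C++": 1.00, "Java": 1.04, "Ada": 0.77}

def normalize(counts_by_language: dict[str, int]) -> float:
    """Convert per-language physical SLOC to equivalent standard SLOC."""
    return sum(sloc * TO_STANDARD[lang]
               for lang, sloc in counts_by_language.items())

actuals = {"C++": 30_000, "Java": 20_000, "Ada": 5_000}
eq_sloc = normalize(actuals)   # 54,650 normalized SLOC
factor = 110_000 / eq_sloc     # 110,000 actual hours (hypothetical)
print(f"Productivity: {factor:.2f} hours per normalized physical SLOC")
```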
Future software estimates using this method must include the same normalization step to stay consistent with the basis used to develop the factor. Once the software product size is determined, the estimate should be normalized/converted to the ‘standard’ language tied to the productivity factor, and the resulting normalized software product size multiplied by the productivity factor. Additionally, the units of the software product size should match the units used to count the historical actuals (physical SLOC).
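Applying the factor to a new estimate mirrors the same steps, sketched below with the same hypothetical conversion factors: the new size is normalized first, then multiplied, and both sides stay in physical SLOC.

```python
# Hedged sketch of a future estimate using the normalized factor.
# The conversion factors and productivity factor reuse the hypothetical
# values above; the estimated size is likewise hypothetical.

TO_STANDARD = {"C++": 1.00, "Java": 1.04}

new_program = {"Java": 45_000}  # estimated physical SLOC
eq_sloc = sum(s * TO_STANDARD[l] for l, s in new_program.items())
factor = 2.01                   # hours per normalized physical SLOC
print(f"Estimated effort: {eq_sloc * factor:,.0f} hours")  # ~94,068 hours
```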

Author:

Brian Opaska
OPS Consulting
Brian S. Opaska is currently a Senior Cost Analyst for OPS Consulting. He performs cost and risk analysis on programs within the Intelligence Community and is leading an internal research project focused on collecting, normalizing, and analyzing program technical data and historical costs that will be used to calibrate parametric software tools and update software productivity factors. Brian also supports the development/refinement of internal cost tools/processes and software sizing efforts. Prior to working for OPS Consulting, Mr. Opaska was a Principal Affordability Systems Engineer for Northrop Grumman. He led the affordability efforts on multiple programs and supported the development of multiple affordability tools, including a cross-sector cost model and a Design to Cost (DTC) cost management tool that was estimated to save the enterprise $414,400 annually. Additionally, he conducted quantitative cost risk analysis using Monte Carlo simulation, performed cost driver sensitivity analysis, and supported proposal efforts to ensure that they were affordable, best value, and low risk to customers. Mr. Opaska is a Certified Cost Estimator/Analyst (CCEA). He earned a Bachelor of Science in Industrial and Systems Engineering from Virginia Tech in 2001 and an MBA from Johns Hopkins University in 2006.