The Percentile Problem: How Much Is Enough?
It is common practice for federal government agencies to require cost estimates at the 80th percentile. The Air Force, for example, requires 80th percentile cost estimates as a matter of routine, and recently completed work for the Coast Guard likewise mandated the use of 80th percentile estimates to support budgeting. This practice is widely thought to reduce risk by making cost overruns less likely. The difference between the 80th percentile estimate and the point estimate (which may be very large, because the point estimate likely resides at a very low percentile of the true cost distribution) is often defined as “risk dollars.”
But when it comes to risk dollars, we must ask, as Alain Enthoven famously did when developing the Planning, Programming, and Budgeting System (PPBS), the forerunner of today’s Planning, Programming, Budgeting and Execution (PPBE) process, “how much is enough?”
A portfolio of programs, each estimated at the 80th percentile, will yield an estimate of total cost that, under certain assumptions, lies at roughly the 98th percentile! When aggregated across multiple DoD programs, Services, and fiscal years, the implications of this problem can be significant.
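To see why, consider a minimal Monte Carlo sketch. Everything here is illustrative: the six program means and standard deviations, the normality assumption, and the independence assumption are all hypothetical, chosen only to demonstrate the aggregation effect.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical portfolio: six independent programs with normally
# distributed costs (all parameter values are illustrative only).
means = np.array([100.0, 250.0, 80.0, 150.0, 300.0, 120.0])
sigmas = np.array([20.0, 50.0, 15.0, 30.0, 60.0, 25.0])

n_trials = 200_000
costs = rng.normal(means, sigmas, size=(n_trials, len(means)))

# Budget each program at its own 80th percentile...
p80_each = np.quantile(costs, 0.80, axis=0)
budget = p80_each.sum()

# ...then ask where that total budget falls on the distribution
# of the portfolio's total cost.
totals = costs.sum(axis=1)
portfolio_pct = (totals <= budget).mean()
print(f"Sum of 80th-percentile budgets covers {portfolio_pct:.1%} of outcomes")
```

For a portfolio like this one, the sum of the individual 80th percentile budgets lands in the high-90s percentile of the total-cost distribution: because the programs are independent, it is rare for all of them to overrun at once.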
Compounding the problem is the fact that most programs’ costs are not independent. Indeed, even within a single program, most work breakdown structure (WBS) elements are clearly not independent. Correlation among programs, or among the WBS elements within them, makes defining a distribution of enterprise-wide cost considerably more difficult.
The author argues that if the relationship between two programs or WBS elements is not known, one is better off assuming they are independent than risking a misspecified joint distribution. The independence assumption, in these cases, is “unbiased,” in that it presupposes neither a positive nor an inverse correlation among the elements. It also makes estimating the distribution of enterprise-wide total cost much simpler.
When the relationship between two such elements is known, the author argues for modeling that relationship directly into the cost estimate. When the dependencies are intricate, simulation techniques such as Monte Carlo and Latin Hypercube do an excellent job of approximating the distribution of enterprise-wide total cost.
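When a dependency is known, one way to carry it into a simulation is through a correlated draw. The sketch below is hypothetical (two WBS elements with assumed means, standard deviations, and a 0.6 correlation, not taken from the paper) and uses a multivariate normal draw to show how positive correlation widens the distribution of total cost relative to the independence assumption.

```python
import numpy as np

rng = np.random.default_rng(11)

# Two WBS elements with a known positive correlation
# (all values are hypothetical, for illustration only).
means = np.array([50.0, 75.0])
sigmas = np.array([10.0, 20.0])
corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])

# Build the covariance matrix from the standard deviations
# and the correlation, then draw correlated cost samples.
cov = np.outer(sigmas, sigmas) * corr
n_trials = 100_000
costs = rng.multivariate_normal(means, cov, size=n_trials)
totals = costs.sum(axis=1)

print(f"Simulated standard deviation of total cost: {totals.std():.1f}")
```

Under independence, the standard deviation of the total would be sqrt(10^2 + 20^2), about 22.4; the 0.6 correlation raises it to roughly 27, fattening the upper tail that a percentile-based budget must cover.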
Using this combination of techniques, the author proposes a method for determining the appropriate percentile at which each program should budget, such that the sum of those estimates lies at roughly the 80th percentile of the distribution of enterprise-wide total cost.
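One simple way to implement such a search, sketched here under stated assumptions rather than as the author’s own method, is to simulate the portfolio, then bisect on a common program-level percentile until the summed program budgets match the simulated enterprise-wide 80th percentile. The lognormal cost distributions and their parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative portfolio of four independent lognormal program
# costs (parameters are hypothetical, chosen only for the sketch).
mu = np.array([4.6, 5.0, 4.2, 4.8])
sigma = np.array([0.3, 0.4, 0.25, 0.35])

n_trials = 200_000
costs = rng.lognormal(mu, sigma, size=(n_trials, len(mu)))
totals = costs.sum(axis=1)
target = np.quantile(totals, 0.80)  # enterprise-wide 80th percentile

# Bisect on the common program-level percentile p so that the sum
# of each program's p-th percentile budget hits the target.
# (The sum of per-program quantiles is nondecreasing in p, so
# bisection converges.)
lo, hi = 0.01, 0.99
for _ in range(50):
    p = 0.5 * (lo + hi)
    if np.quantile(costs, p, axis=0).sum() < target:
        lo = p
    else:
        hi = p

print(f"Budget each program at about the {100 * p:.0f}th percentile")
```

For a portfolio like this one, the common percentile comes out well below 80, which is precisely the point: budgeting every program at the 80th percentile overshoots the portfolio-level 80th percentile.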
Kevin Cincotta is a Research Fellow at LMI, formerly the Logistics Management Institute. His primary areas of expertise are cost analysis, database creation and management, and statistics. Mr. Cincotta has led numerous cost analysis tasks for the Departments of the Army and Air Force since joining LMI in September 2003. Most notably, he led the subtask associated with development of the APES database.
From 2001 to 2003, Mr. Cincotta served as a Senior Cost Analyst at MCR, LLC. He worked closely with government clients at the Missile Defense Agency (MDA) to develop a radar cost model, which was presented by MCR at the 2004 Society of Cost Estimating and Analysis (SCEA) conference.
Mr. Cincotta also led several cost analysis tasks at New Vectors (formerly Vector Research, Incorporated, and the Altarum Institute) from 1997 to 2001. As a Senior Cost Analyst and Systems Developer, he assisted in creating life cycle cost estimates (LCCEs) for myriad DoD projects, including the Standard Procurement System (SPS), the Defense Occupational Health Readiness System (DOHRS), and the Simplified Tax and Wage Reporting System (STAWRS).
Mr. Cincotta is an SCEA-certified analyst and has presented at both SCEA and DODCAS on several other occasions. He holds a master’s degree in economics and philosophy from the London School of Economics and Political Science, and a bachelor’s degree in the same fields from the University of Virginia.