2014 Workshop Risk Papers

Excel Based Schedule Risk and Cost Estimates (RI-1)

William Evans – Associate, Booz Allen Hamilton

As the cost estimating community has moved toward integrating schedule and cost risk into a single resource, multiple tools have been developed that combine Program Life Cycle Cost Estimate (PLCCE) information with a Program Integrated Master Schedule (IMS). Integrating the two acquisition artifacts can be effort intensive and may require the purchase of additional software packages. Additionally, using black box tools to perform schedule risk eliminates the auditability and traceability inherent in Excel based PLCCEs. To minimize integration effort among separate estimating tools and improve PLCCE accuracy, traceability, and auditability, an alternative methodology performs schedule risk organically in MS Excel as part of PLCCE development.

Booz Allen Hamilton PLCCE Schedule Risk Methodology:
As part of PLCCE development and cost estimation within the Space and Naval Warfare Enterprise (SPAWAR), Booz Allen has successfully developed a method for integrating schedule and cost risk in Excel. By eliminating macros that interact with MS Project and removing the need for an external scheduling tool, the PLCCE uses Excel based schedule risk and Monte Carlo-based risk analysis to create Joint Confidence Levels. Combining schedule and cost risk in Excel enables a unified and auditable methodology for creating Joint Confidence Levels without a separate black box tool or macros. This methodology has helped the client adjust contract delivery dates so that estimated costs better align with the program budget.
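For illustration, the following minimal Python sketch mirrors the kind of calculation the Excel-based methodology performs natively with worksheet formulas and Crystal Ball or Argo: task durations and costs are drawn from assumed triangular distributions (all values here are hypothetical), and a Joint Confidence Level is read off as the fraction of trials that meet both a schedule threshold and a cost threshold.

```python
# Minimal sketch (hypothetical values): joint cost-and-schedule Monte Carlo of the
# kind the Excel-based PLCCE implements with native formulas and Crystal Ball/Argo.
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # Monte Carlo iterations

# Schedule uncertainty: task duration in months, triangular(low, likely, high)
duration = rng.triangular(10, 12, 18, size=N)

# Cost uncertainty: level-of-effort cost scales with duration,
# material cost has its own uncertainty
monthly_loe_cost = rng.triangular(0.8, 1.0, 1.3, size=N)   # $M per month
material_cost = rng.triangular(4.0, 5.0, 7.5, size=N)      # $M

total_cost = duration * monthly_loe_cost + material_cost

# Joint Confidence Level: probability of finishing within both the
# schedule threshold and the cost threshold
sched_threshold, cost_threshold = 15.0, 20.0
jcl = np.mean((duration <= sched_threshold) & (total_cost <= cost_threshold))
print(f"P(cost <= ${cost_threshold}M and duration <= {sched_threshold} mo) = {jcl:.2f}")
print(f"70th percentile cost: ${np.percentile(total_cost, 70):.1f}M")
```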

Benefits:
Utilizing Excel based schedule risk allows cost modelers to improve PLCCE/MS Excel based estimates without the aid of an external scheduling risk/uncertainty tool. Incorporating the schedule risk methodology directly into Excel based estimates eliminates the black box methodology used by MS Project style schedule risk models. Organically performing cost and schedule risk assessments in Excel creates more robust cost estimates than estimates that rely on cost risk alone. Additionally, when the schedule risk methodology is paired with form controls in Excel, it can be used to dynamically re-phase the cost profile of a program, enabling the impacts of programmatic decisions to be discerned immediately. The organic schedule and cost risk methodology is used in conjunction with either Oracle Crystal Ball or Booz Allen Hamilton’s Argo tool to develop auditable Joint Confidence Levels and more robust cost estimates.

Disadvantages:
Performing schedule risk in Excel can prove cumbersome for cost modelers who are novices in MS Excel. Additionally, the methodology must be customized for each PLCCE. This method has only been prototyped on one ACAT III program with a limited subset of task level estimates.

Summary:
The emphasis on robust cost and schedule estimating has produced multiple solutions for analysts and clients; Excel based integrated cost and schedule risk is only one methodology for solving client problems. Incorporating cost and schedule risk in Excel makes the schedule and cost risk methodology easier to audit and trace throughout an Excel based PLCCE, improving the confidence and robustness of the estimate. While there are hurdles to implementing an Excel based schedule risk solution, when combined with form controls, the benefits to PLCCE auditability and usability are immense.


Using Bayesian Belief Networks with Monte Carlo Simulation Modeling (RI-2)

Marina Dombrovskaya – Senior Consultant, Booz Allen Hamilton

One of the main aspects of creating a Monte Carlo simulation cost estimate is accuracy in defining the uncertainty and risk parameters associated with the cost components of the model. It is equally important to assess and accurately represent inter-dependencies between uncertain variables and risks, which are measured via correlation. Since historical data is often insufficient for rigorous statistical analysis, both probability distributions and correlations are commonly estimated via subject matter expert opinion. However, the inherent complexity of variable inter-dependencies is often overlooked in such estimates, which can significantly affect the results of a Monte Carlo simulation model. Bayesian belief networks offer an alternative methodology for estimating probabilities and correlation between variables in a complex cost estimating model. Bayesian belief networks are graphical probabilistic models that represent random variables (cost components or risks) and their conditional dependencies with assigned Bayesian probabilities. They provide a visual representation of inter-dependencies among random variables and estimate probabilities of events that lack direct data. This talk will discuss the benefits and various methods of applying Bayesian belief networks within a Monte Carlo simulation cost estimating model and explore these methods through hands-on examples.
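As a hedged illustration of the idea, the sketch below wires a tiny two-node belief network (the node names, priors, conditional probabilities, and cost impacts are all hypothetical) into a Monte Carlo cost model; the conditional probability structure, rather than an analyst-supplied correlation coefficient, induces the dependency between risk impacts.

```python
# Hedged sketch: a tiny two-node Bayesian belief network driving a Monte Carlo
# cost model. Node names, probabilities, and cost impacts are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
N = 50_000

# Parent node: requirements are unstable with prior probability 0.30
req_unstable = rng.random(N) < 0.30

# Child node: P(software rework | requirements unstable) = 0.60,
#             P(software rework | requirements stable)   = 0.10
p_rework = np.where(req_unstable, 0.60, 0.10)
rework = rng.random(N) < p_rework

# Cost model: baseline cost plus conditional risk impacts ($M, illustrative)
baseline = rng.normal(50.0, 5.0, size=N)
cost = baseline + req_unstable * 3.0 + rework * 8.0

# The conditional structure induces correlation between the two risk impacts
# without the analyst estimating a correlation coefficient directly.
print(f"P(rework) = {rework.mean():.2f}")
print(f"Mean cost = ${cost.mean():.1f}M, 80th percentile = ${np.percentile(cost, 80):.1f}M")
```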


Expert Elicitation of a Maximum Duration Using Risk Scenarios (RI-3)

Marc Greenberg

As acquisition programs become less mature, more advanced and more complex, there is an ever-increasing burden on the cost analyst to employ methods of eliciting requirements, schedule and cost uncertainties from one or more subject matter experts (SMEs). Arguably, the most common technique a cost analyst uses today to elicit such data is to ask each SME for the lowest, most likely and highest value, which produces a triangular distribution.
Eliciting and using a triangular distribution has its advantages: getting the SME to provide the three input values takes only a few minutes, the SME can provide a reasonable basis for his or her inputs, and the distribution represents the SME’s first-order approximation of the uncertainty. However, this common process of depicting uncertain input parameters typically produces optimistic estimates. More specifically, the structural limitations inherent to the triangular distribution, coupled with the optimistic bias of the SME, tend to produce optimistic estimates.
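A minimal sketch of that common practice, using made-up low / most-likely / high inputs, is shown below.

```python
# Minimal sketch of the common practice described above: turn an SME's
# low / most-likely / high values (illustrative numbers) into a triangular
# distribution and sample it for a Monte Carlo estimate.
import numpy as np

rng = np.random.default_rng(5)
low, likely, high = 8.0, 10.0, 14.0       # SME inputs, e.g. task duration in weeks

samples = rng.triangular(low, likely, high, size=100_000)
analytic_mean = (low + likely + high) / 3.0   # mean of a triangular distribution

print(f"Sampled mean:    {samples.mean():.2f} weeks (analytic {analytic_mean:.2f})")
print(f"80th percentile: {np.percentile(samples, 80):.2f} weeks")
```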

This paper provides a brief review of a current method to elicit a most-likely commute time, a “practical maximum” commute time, and the risk factors that contribute to commute delays. It continues by showing how these risk factors can be organized into an objective hierarchy, leading to the creation of a customized risk work breakdown structure (WBS). The cost estimator (i.e., interviewer) uses this risk WBS as a reference for interviewing the SME as follows (a numerical sketch of these steps appears after the list):

1. Describe the practical worst commute case for each individual risk factor.
2. Estimate the risk-adjusted commute time associated with each individual risk factor.
3. Estimate the annual frequency associated with #1. Calculate the probability of occurrence.
4. Multiply risk-adjusted commute time by the probability of occurrence to get expected value.
5. Rank individual risk cases from highest expected value to lowest expected value.
6. Specify feasible combinations of worst case risks that could occur during the SME’s commute. Note: Each feasible combination is described as a “risk scenario.”
7. With results from #6, calculate the probability of each risk scenario.
8. With results from #6, calculate the risk-adjusted commute time of each risk scenario.
9. Select the commute time that has the lowest probability. This is the adjusted practical maximum.
10. Using most-likely and adjusted practical maximum durations, solve for maximum commute time.
11. Iterate from #1-#10 as needed. Provide a graphical representation to aid the SME.
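The following Python sketch walks steps 2 through 9 with illustrative commute-risk numbers (none are taken from the paper); independence among risk factors is assumed purely to keep the arithmetic short, whereas the interview itself would screen out infeasible combinations.

```python
# Worked sketch of steps 2-9 above, using illustrative commute-risk numbers.
# Independence between risk factors is assumed here only to keep things short.
from itertools import combinations

MOST_LIKELY = 30.0   # most-likely commute time, minutes
DAYS = 250           # commuting days per year

# risk factor: (risk-adjusted commute time in minutes, annual frequency)
risks = {
    "heavy_rain":     (55.0, 25),
    "major_accident": (70.0, 10),
    "road_work":      (45.0, 40),
}

# Steps 3-5: probability of occurrence, expected value, rank by expected value
ranked = sorted(
    ((name, t, f / DAYS, t * (f / DAYS)) for name, (t, f) in risks.items()),
    key=lambda r: r[3], reverse=True)
for name, t, p, ev in ranked:
    print(f"{name:15s} time={t:5.1f}  p={p:.3f}  EV={ev:5.1f}")

# Steps 6-9: combinations of worst case risks ("risk scenarios")
best = None
for k in range(2, len(ranked) + 1):
    for combo in combinations(ranked, k):
        p_scen = 1.0
        t_scen = MOST_LIKELY
        for _, t, p, _ in combo:
            p_scen *= p                    # step 7: scenario probability
            t_scen += t - MOST_LIKELY      # step 8: add each factor's delay
        if best is None or p_scen < best[0]:
            best = (p_scen, t_scen, [c[0] for c in combo])

p_min, t_max, names = best
print(f"Adjusted practical maximum (step 9): {t_max:.0f} min "
      f"from scenario {names} with probability {p_min:.4f}")
```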

Due to the likely time intensiveness of such an interview process, this approach is intended primarily for estimating durations of critical path activities and/or costs of high dollar items. The steps described not only help prevent SMEs from anchoring on a most-likely estimate, but also produce a maximum value that the cost estimator can quickly describe in terms of a feasible worst case scenario.


Quantifying the Necessity of Risk Mitigation Strategies (RI-4)

James Northington – Analyst, Tecolote Research Inc.
Christopher Schmidt – Senior Consultant, Cobec Consulting Inc.
Chuck Knight – Consultant, Deloitte Consulting

A project’s risk management plan is a three step process that involves identifying risks, formulating risk mitigation strategies, and analyzing the cost/schedule impact of those strategies. Each risk is assessed for its likelihood of occurring and the impact it would have on the program should the risk become an issue. These two parameters are plotted on a risk cube to show which program risks are of higher priority.

The assessments of these parameters tend to suffer from a high level of subjectivity. While that is necessary early in a program, given the lack of data and program-specific information, the program will evolve and generate additional data. This data, if incorporated correctly into the risk process, can increase the accuracy of the measurements of program impacts and, therefore, of the significance of risk mitigation strategies. With a small amount of additional, focused effort, programs can reduce subjectivity in the risk management process throughout the remainder of the program, thereby providing an accurate and defendable position for the incorporation of risk mitigation strategies.

This paper will begin by highlighting flaws in the current risk management process, walk through the newly proposed methodology for risk mitigation, and provide a quantitative example of the process in action using raw data. In the end, the proposed methodology will provide a greater understanding of program risks, a measurement of the importance of implementing a risk mitigation strategy, a measurement of the mitigation strategy’s subsequent impact, and a quantitative measurement of benefit that Program Managers can use to defend their risk mitigation strategies.
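One hedged way to express such a quantitative measurement of benefit, with entirely illustrative probabilities, impacts, and mitigation costs, is to compare each risk's expected impact before mitigation against its expected impact after mitigation plus the cost of mitigating:

```python
# Hedged sketch: quantify the benefit of a mitigation strategy as the reduction
# in expected impact net of the mitigation's cost. All values are illustrative.
risks = [
    # (name, p_before, impact_$M, p_after_mitigation, impact_after_$M, mitigation_cost_$M)
    ("vendor_slip",     0.40, 12.0, 0.15, 10.0, 1.5),
    ("integration_bug", 0.25,  8.0, 0.10,  4.0, 0.8),
    ("staffing_gap",    0.30,  3.0, 0.25,  3.0, 0.5),
]

for name, p0, i0, p1, i1, cost in risks:
    exposure_before = p0 * i0                # expected impact with no action
    exposure_after = p1 * i1 + cost          # expected impact plus mitigation cost
    net_benefit = exposure_before - exposure_after
    verdict = "worth mitigating" if net_benefit > 0 else "not worth mitigating"
    print(f"{name:16s} before=${exposure_before:4.1f}M after=${exposure_after:4.1f}M "
          f"net benefit=${net_benefit:+4.1f}M -> {verdict}")
```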


Improved Decision Making with Sensitivity Analysis (RI-5)

Blake Boswell – Analytic Tool Developer, Booz Allen Hamilton

In constrained budget environments, Project Managers are often faced with tough decisions on how to balance project requirements with available funding. Therefore, it is critical for estimating models to not only serve as accurate predictors of future cost and schedule outcomes, but also to provide Project Managers the ability to explore trade-off scenarios, measure the effectiveness of potential decision strategies, and gain a greater understanding of what actions can improve the likelihood of project success.

To provide decision makers actionable intelligence, the technique of Sensitivity Analysis (SA) is often applied in the field of project estimating. SA methods apply to probabilistic estimating models based upon Monte Carlo simulation or similar techniques that combine distributions for uncertainty in model inputs in order to estimate uncertainty in model outputs. Proper application of SA methods can provide insight into what is causing poor project performance and what action is required by decision makers to ensure program success. However, shortcomings exist in conventional SA applications: SA metrics are often esoteric and become lost in translation between the analyst and program managers, reducing their ability to provide the information needed by decision makers; standard SA practices have not kept pace with the increasing complexity of estimating techniques, leading to misapplication and misinterpretation of results; and powerful SA techniques that have proven effective in other estimating fields are often overlooked because they are not yet part of the standard project estimating lexicon.

In this study, we review common applications of SA methods to project estimation, including a description of each method as well as its advantages and disadvantages. Additionally, we explore the topic of Global Sensitivity Analysis (GSA), a process for measuring the overall contribution of uncertain model inputs to variation in model outputs and a popular technique for model validation in engineering and the life sciences. GSA techniques are applicable to a robust class of estimating models, including the models that currently dominate the field of Integrated Cost and Schedule Risk Analysis. This study seeks to improve the ability of estimating models to serve as decision-informing tools that help project managers make the right choices to improve the likelihood of program success.
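As a hedged sketch of one GSA-style measure, the example below estimates the first-order sensitivity index S_i = Var(E[Y | X_i]) / Var(Y) by binning each input of a toy three-input cost model (all distributions and values are illustrative, not drawn from the study):

```python
# Hedged sketch of a global sensitivity measure: the first-order index
# S_i = Var(E[Y | X_i]) / Var(Y), estimated by binning each input.
# The three-input cost model below is illustrative only.
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Uncertain inputs of a toy cost model
labor_rate = rng.triangular(90, 110, 150, size=N)                 # $/hour
hours = rng.lognormal(mean=np.log(20_000), sigma=0.3, size=N)     # labor hours
material = rng.triangular(0.5e6, 1.0e6, 2.5e6, size=N)            # $

cost = labor_rate * hours + material                              # model output

def first_order_index(x, y, bins=50):
    """Variance of the bin-wise conditional mean of y given x, over Var(y)."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

for name, x in [("labor_rate", labor_rate), ("hours", hours), ("material", material)]:
    print(f"S_{name} ~ {first_order_index(x, cost):.2f}")
```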


Affordability Engineering for Better Alternative Selection and Risk Reduction (RI-6)

Marlena McWilliams
Bob Koury – Chief Solution Architect, PRICE Systems LLC

The affordability engineering approach is based on the simple premise that the system design and architecture should define the system cost. Developing an affordable, accurate estimate means you must obtain data. Affordability is one of the government’s primary focus areas, due to the current budget crisis and a complex, uncertain security environment. The current Deputy Secretary of Defense defined affordability as “cost effective capability”. Additionally, the Deputy Secretary of Defense chartered the Defense Systems Affordability Council (DSAC) to develop and guide the implementation of an integrated DOD strategy for better, faster, cheaper modernization. In this leadership role, the DSAC has enumerated three top level goals for the Department:

• Field high-quality defense products quickly; support them responsively.
• Lower the total ownership cost of defense products.
• Reduce the overhead cost of the acquisition and logistics infrastructure.

In order to accomplish these goals, the pricing and engineering communities must become more than just two organizations that support the effort; they must become a joint hybrid organization that bridges the gap between technical and cost performance. Second, they need models that speak an “engineering” language, enabling the rapid translation of design concepts into program and fiscal impacts. Third, they need access to actual data so they can crosswalk, calibrate, and map estimates into any format required for quick what-if analysis and capability requirement justification.
This paper will outline the process and steps for implementing affordability in your estimating environment in order to understand system requirements versus system costs and affordability, and to provide best value by identifying and accepting the most affordable, feasible, and effective system or alternative. Evaluating and assigning a best value is essential to both the government (DoD) and the contractors supplying systems or alternatives to the government.


Risk Adjusted Inflation Indices (RI-7)

James Black – Cost Analysis Division, NASA

It is often observed that Office of the Secretary of Defense (OSD) inflation rates differ from prime contractor-specific inflation rates seen in Forward Pricing Rate Agreements/Proposals (FPRAs/FPRPs) and in commodity group composite rates (e.g., Global Insight indices).
Yet it is standard practice in many cost estimating organizations to use OSD inflation rates to escalate costs in estimates without considering a range of different possible inflation rates. This can result in cost estimates that underestimate the effects of inflation, especially for programs with many years of procurement and/or operations & support (where the compounding effects of inflation are significant).
This paper proposes an approach to create risk adjusted inflation indices based on defined risk distributions, thus giving consideration to a range of different inflation rate possibilities.

As an example, consider the following comparison between the current approach to calculating future-year weighted indices and the proposed risk adjusted approach (using a Monte Carlo simulation and a triangular distribution as an example). Also, this example uses a hypothetical appropriation type titled “ABC”; in practice this would be Weapons Procurement Navy (WPN), Aircraft Procurement Navy (APN), etc.

Current approach to calculating future-year weighted indices:

Static OSD Inflation Rates for ABC * Outlays for ABC = OSD Weighted Indices for ABC;

Proposed risk adjusted approach to calculating future-year weighted indices:

Simulation Output * Outlays for ABC = Risk Adjusted Weighted Indices for ABC;

Where,
• Simulation Output = Monte Carlo Simulation with Triangular(Minimum, Mode, Maximum);
• Minimum = Smallest ABC Inflation Rate Observed Over Previous ‘X’ Years;
• Mode = Static future-year OSD Inflation Rate for ABC;
• Maximum = Largest ABC Inflation Rate Observed Over Previous ‘X’ Years;

In this example, the analyst would select the ‘X’ years over which the minimum and maximum are taken; defining the minimum and maximum functions in this way would be at the discretion of the analyst. Also, the choice of distribution would not be limited to the triangular; other distributions (e.g., lognormal, beta, etc.) may be considered more appropriate. Additionally, the selection of simulation type would not be limited strictly to Monte Carlo. Care would also need to be taken with the assignment of correlation coefficients between the aforementioned inflation distribution and any other distributions.
In the above example, the “Risk Adjusted Weighted Indices” would be used in place of the “OSD Weighted Indices” when performing escalation on cost elements that use the appropriation type “ABC”. Using this approach to generate Risk Adjusted Weighted Indices would enable cost estimates to consider a range of different possible inflation rates, rather than assuming a single static rate is representative of all future-year inflation.
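A hedged numerical sketch of the proposed calculation for the hypothetical appropriation “ABC” follows; the rates, outlay profile, and ‘X’-year bounds are illustrative placeholders.

```python
# Hedged sketch of the proposed approach for the hypothetical appropriation "ABC".
# All rates and outlay fractions are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(7)
N = 20_000
years = 5                                               # future years to escalate across

min_rate, mode_rate, max_rate = 0.012, 0.020, 0.035     # historical min/max, static OSD mode
outlays = np.array([0.30, 0.40, 0.20, 0.10, 0.00])      # ABC outlay profile, sums to 1

# Simulate an inflation rate per future year and build compound raw indices
rates = rng.triangular(min_rate, mode_rate, max_rate, size=(N, years))
raw_index = np.cumprod(1.0 + rates, axis=1)             # index relative to base year

# Weighted index = sum over outlay years of (outlay share * raw index)
weighted = raw_index @ outlays                          # one weighted index per trial

static_raw = np.cumprod(np.full(years, 1.0 + mode_rate))
static_weighted = static_raw @ outlays
print(f"Static OSD-style weighted index:      {static_weighted:.4f}")
print(f"Risk adjusted weighted index, mean:   {weighted.mean():.4f}")
print(f"Risk adjusted weighted index, 80th %: {np.percentile(weighted, 80):.4f}")
```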


Critique of Cost-Risk Analysis and Frankenstein Spacecraft Designs: A Proposed Solution (RI-8)

Mohamed Elghefari – Pasadena Applied Physics
Eric Plumer – NASA

When using the parametric method to estimate the cost of a spacecraft, cost analysts typically use the most likely value or best estimate for each technical input parameter required by the Cost Estimating Relationship (CER). The technical input parameters describe the physical, performance, and engineering characteristics of spacecraft subsystems; examples include mass, power requirements, data rate, memory capacity, solar array area, specific impulse, etc. These parameters are not typically known with sufficient precision to perfectly predict the cost of the system, particularly in the early stages of development. To produce some measure of cost risk, cost analysts go one step further, treating these parameters as random input variables and subjectively adopting probability distributions to model their uncertainties.

However, the various spacecraft subsystems are interdependent, and their designs are governed by key physical relationships, such as the Stefan-Boltzmann Law and the Rocket Equation (for missions requiring chemical propulsion). These key relationships analytically and implicitly relate the technical input variables of the various subsystems to one another and, yet, they are generally not upheld when cost analysts perform their cost-risk simulations. As a result, the generated spacecraft point designs (i.e., simulated sets of CER input variables) may be neither technically feasible nor buildable (i.e., “Frankenstein” designs), and the corresponding spacecraft cost estimates and program cost probability distribution are invalid.

In this paper, we present a historical data driven probabilistic cost growth model for adjusting spacecraft cost Current Best Estimate (CBE), for both earth orbiting and deep space missions. The model is sensitive to when, in the mission development life cycle, the spacecraft cost CBE is generated. The model is based on historical spacecraft data obtained from the NASA Cost Analysis Data Requirements (CADRe) database. This alternative cost-risk modeling approach encompasses the uncertainties of underlying design parameters of the spacecraft (i.e., cost drivers) without violating laws of physics or the theory of probability. In addition, it promotes realism in estimating NASA project costs by providing traceable and defensible data-derived measures of cost risk reflecting NASA’s historical cost-estimating performance.
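As a hedged sketch of how such a model might be applied (the milestone-specific growth-factor parameters below are placeholders, not the CADRe-derived values the paper develops), one could draw a phase-dependent growth factor and apply it to the CBE:

```python
# Hedged sketch: apply a phase-dependent probabilistic cost growth factor to a
# spacecraft CBE. The lognormal parameters per milestone are placeholders, not
# the CADRe-derived values the paper develops.
import numpy as np

rng = np.random.default_rng(3)
N = 50_000

cbe = 450.0                     # current best estimate, $M (illustrative)
milestone = "PDR"               # point in the development life cycle when the CBE was made

# Assumed growth-factor distributions (median growth, spread) by milestone
growth_params = {"SRR": (1.35, 0.30), "PDR": (1.25, 0.22), "CDR": (1.12, 0.15)}
median, sigma = growth_params[milestone]

growth = rng.lognormal(mean=np.log(median), sigma=sigma, size=N)
cost = cbe * growth

print(f"CBE at {milestone}: ${cbe:.0f}M")
print(f"Mean adjusted cost: ${cost.mean():.0f}M")
print(f"70th percentile:    ${np.percentile(cost, 70):.0f}M")
```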