2014 Workshop Handouts & Presentations

 

Below are copies of all of the presentations and handouts from the 2014 Workshop. We encourage you to share them with colleagues or others who may find them of use, but ask that the files not be distributed to large groups or used for demonstration without the author’s permission. If you are unsure whether an intended use of these files is appropriate, please email us at iceaa@iceaaonline.org.

Authors and presenters: many presentations were revised just before the Workshop, or different versions were presented. If the file posted here is not the latest version, please email the new version to us and we will post it here.

Click the links below to jump to that track.

Papers:

The Business and Art of Cost Estimating
Earned Value Management
Information Technology
Life Cycle Costing
Methods and Models
Cost Management
Parametrics
Risk
Space

 


Business & Art of Cost Estimating Papers:

NPS and AFIT Masters Degree in Cost Estimating and Analysis (BA-1)

Daniel Nussbaum – Visiting Professor, Naval Postgraduate School
Greg Mislick – Lecturer, Naval Postgraduate School

This presentation provides an update on the Joint, all-Distance Learning Masters Degree in Cost Estimating and Analysis offered at the Naval Postgraduate School (NPS) and Air Force Institute of Technology (AFIT).

The Navy and the Air Force are collaborating to meet a need for a distance learning educational program by providing a Master’s Degree in Cost Estimating and Analysis. The two schools have designed and developed the content, have graduated two cohorts, and recently started their fourth cohort. Further information is available at http://www.nps.edu/DL/Degree_Progs/MCEA.asp

The presentation covers program details and requirements, achievements to date, research undertaken by current students, and lessons learned so far from this innovative program.

BA-1 – Presentation – NPS and AFIT Masters Degree in Cost Estimating and Analysis

BA-1 – Handout – NPS and AFIT Masters Degree in Cost Estimating and Analysis


A Comprehensive CES and BCA Approach for Lifelong Learning (BA-3)

Kevin Cincotta – Technical Director, ICF International
Darcy Lilley – Chief Learning Officer, Air Force Air Mobility Command

The Air Force Air Mobility Command (AMC) Enterprise Learning Office (ELO) mission is to transform AMC into a premier Air Force learning organization, achieve learning through optimum approaches, and develop Mobility Airmen into life-long learners who demonstrate institutional Air Force competencies and take a positive approach to managing their own learning. In this context, learning has three main components: training, education, and experience. The re-engineering of learning to develop and deploy optimum approaches focuses on all three components. AMC ELO is initially focusing on training.

Training is generally represented as only one line within a cost estimate. At most, there are cost elements for Initial Training (as Investment) and Recurring Training (as Operations and Support). Very little cost research has been published regarding training as the service to be procured. Leveraging prior work done by the North Atlantic Treaty Organization (NATO), the Office of the Secretary of Defense (OSD) Program Analysis and Evaluation (PA&E) Economic Analysis (EA) Guide, and the Department of Homeland Security (DHS) Information Technology (IT) Life Cycle Cost (LCC) Work Breakdown Structure (WBS), we developed a first-of-its-kind comprehensive Cost Element Structure (CES) for training. The CES decomposes training into its core elements, broken out according to 1.0 Investment, 2.0 Operating and Support, and 3.0 Decommissioning. Important distinctions are made between training hardware and software and those pieces of hardware and software that are ancillary to core training efforts. Training is a complex endeavor that includes elements of both IT and non-IT acquisitions, so a comprehensive, mutually exclusive, and completely exhaustive CES, with an accompanying data dictionary, is essential.

Finally, and perhaps counter-intuitively, training (and more generally, learning) is an abstract concept whose outputs are generally not “defense materiel items.” The structure is therefore properly service-oriented rather than product-oriented, and several principles of MIL-STD-881-C do not directly apply.
AMC ELO is using this CES to evaluate several alternatives with respect to re-engineering training of airmen, and their general learning experience, in the context of a business case analysis (BCA).

This paper presents the training CES, conveys its value in the broader context of transforming learning, and outlines an approach for using the CES in the context of a BCA. Finally, preliminary results of the BCA are presented and interpreted.

BA-3 – Presentation – A Comprehensive CES and BCA Approach for Lifelong Learning *Best Paper: Business & Art of Cost Estimating Track

BA-3 – Handout – A Comprehensive CES and BCA Approach for Lifelong Learning


BOE Development: Scope Evaluation and Criteria (BA-4)

Michael Butterworth – West Coast Cost Analysis Lead, TASC Inc.
Demetrius Prado – Cost Analyst, TASC Inc.

Current bases of estimate (BOEs) do not succinctly identify how the scope of the estimate was derived or how the actuals used for build-up, analogous, and parametric methodologies fit the technology and product under consideration. We believe that one of the many problems leading to poor BOE development is the failure to identify the scope of work and to understand the effort, and the particular skill mix, needed to perform the task.

When coming up with a solution to this problem, we must ask ourselves, “How easy is it for evaluators to determine price realism vs. price reasonableness in justifying the estimate?”

We focus on establishing grading criteria for 3 categories during BOE development:
-Hardware
-Software
-Services

The contracting, pricing, and cost estimating communities are being challenged to develop defensible Bases of Estimate during proposal Task Assessment creation. The Basis of Estimate provides the cost estimating rationale and resulting cost estimate for a specific cost element. BOEs should be prepared by those most technically knowledgeable and skilled at the specific task, and should be subject to an independent review to improve win probability by increasing cost credibility and transparency.

While developing BOE grading criteria, we take into consideration other problems we normally face during estimate creation:
-Grading subjectively vs. objectively
-Lack of documentation
-Time consuming (time limitations)
-Lack of Subject Matter Experts to write Task Assessments

This paper presents a process that solves this problem – a process where someone can create valid, credible, and transparent Bases of Estimate by means of 1) understanding the criteria by which BOEs are evaluated (grading scale), 2) providing the criteria by which to identify the scope of the estimate being built, and 3) assuring that objective criteria are used to evaluate and grade the Basis of Estimate in an independent review. In addition, we will tailor the grading criteria to allow the BOE writer to easily identify the scope of the task assessment and BOE at varying levels, even if the proposal team is faced with limited knowledge and resources.

BA-4 – Presentation – BOE Development and Evaluation and Criteria

BA-4 – Handout – BOE Development and Evaluation and Criteria


Long Term Affordability through Knowledge Based Bid & Proposal (BA-5)

Zachary Jasnoff – Vice President, Professional Services, PRICE Systems LLC

According to a 2013 GAO report, “positive acquisition outcomes require the use of a knowledge-based approach to product development that demonstrates high levels of knowledge before significant commitments are made. In essence, knowledge supplants risk over time.” Acquisition proposals that are not knowledge-based often introduce significant risk and, while seeming reasonable in the short term, cannot sustain long term affordability. These proposals are often based on grassroots engineering judgment and subjective subject matter expert opinion, and may be overly optimistic. Drawing on a recent study with a major defense corporation, this paper presents the evolution to knowledge-based bid and proposal, demonstrating best practices using sophisticated toolsets.

Evolving toward long term affordability starts with credible cost estimating at the bid and proposal stage of a project. As the project evolves, it is critical to capture historical information over the course of the program to support a knowledge-based bid and proposal system that is accurate and credible for predicting future outcomes. At the heart of a knowledge-based bid and proposal system is the ability to collect, normalize, and analyze data from prior programs. According to the GAO, the building of this knowledge occurs when a) resources match requirements, b) the product design is stable, and c) manufacturing processes are mature. In addition, it is critical that any knowledge-based bid & proposal system include both a top-down (parametric) and bottom-up (detailed build-up) crosswalk comparison and validation of estimates. A crosswalk helps ensure that all programmatic areas are estimated and discrepancies resolved. Taken together, knowledge-based bids and proposals provide a sound foundation for long term affordability while balancing price-to-win and market considerations.

To support long term affordability, estimators require new and sophisticated toolsets to capture and analyze a myriad of information and to both produce and implement actionable knowledge. These tools provide the ability to quickly capture and normalize large amounts of data and produce knowledge-based findings that lay the foundation for long term affordability.

BA-5 – Presentation – Long Term Affordability through Knowledge Based Bid & Proposal

BA-5 – Handout – Long Term Affordability through Knowledge Based Bid & Proposal


What Happens to a Cost Estimate When It’s Done? (BA-6)

William Barfield – Executive Cost Estimator Associate, Quantech Services, Inc.
David Bach – Director of Business Analytics, Quantech Services, Inc.

What happens to a cost estimate when it is done, or “finished”? Cost estimates are used to support a wide variety of financial decisions. When estimates are done and the decision has been made, is the estimate still useful, or does it become “shelf-ware”? We surveyed the international cost community to determine how we develop, document, use, and archive our various kinds of cost estimates. We analyze what happens to completed cost estimates by describing the types that are typically developed, the types of decisions they support, who makes these estimates, the various storage methods, and the (potential) reuse of the cost information. We describe and compile real-life observations of practitioners and compare them to various cost guidelines (e.g., GAO, DoD, FAA, MOD, CEBoK) that stipulate maintenance, archival, and disposition recommendations for cost estimates. Survey summary statistics are reviewed, and we conclude that a well-prepared, well-documented, and properly stored cost estimate can be the difference between wasted effort and informed decision making.

BA-6 – Presentation – What Happens to a Cost Estimate When It’s Done

BA-6 – Handout – What Happens to a Cost Estimate When It’s Done


Update on The Cost FACTS (Factors, Analogies, CER’s & Tools/Studies) Group – Enhancing KM by leveraging Enterprise Social Networking (BA-8)

Daniel Harper – MITRE Corporation
Ruth Dorr – MITRE Corporation

Cost FACTS is a community-driven initiative to bring together the cost estimating community across MITRE, Government agencies, the contractor community, and academia to share reusable, non-restricted cost information and to encourage dialogue throughout the worldwide cost community. Cost FACTS provides an easily accessible repository of reusable cost estimating FACTS, i.e., Factors, Analogies, Cost Estimating Relationships (CERs), and Tools/Studies. These FACTS may have been originally gathered as the basis for a cost estimate in one agency, but have applicability across many agencies. Cost FACTS members within the MITRE community have also been working to gather basic, non-restricted information from completed cost estimates to build a repository of cost estimating structures and factors for use as a basis for future estimates. Since its inception 18 months ago, Cost FACTS has increased its membership exponentially, reaching across all centers within MITRE, government agencies supported by MITRE, and others in industry supporting those agencies. Cost FACTS is currently being recommended in several Government agencies (e.g., Commerce, DHS, CMS) as a tool to assist the cost estimating community and improve cost estimates. The authors have put over 400 hours into Cost FACTS in FY 2013 alone, gathering “the FACTS,” managing the Handshake site itself, and presenting the idea internally within MITRE as well as externally at cost estimating group meetings and professional society conferences. This is an independent, MITRE employee-initiated effort that has been partially funded through the use of department developmental funds, with a significant amount of work conducted on an unfunded basis.

BA-8 – Presentation – Getting (and sharing!) the FACTS Factors, Analogies, CERs and Tools Studies

BA-8 – Handout – Getting (and sharing!) the FACTS Factors, Analogies, CERs and Tools Studies


Space Shuttle Cost Analysis: A Success Story? (BA-9)

Humboldt Mandell – Research Fellow, The University of Texas Center for Space Research

In the aftermath of the highly successful Apollo lunar program, NASA struggled for a few years to find a meaningful program that would satisfy long range national space strategies while reflecting the realities of the rapidly changing political environment in the nation. The Space Shuttle emerged from the need to lower the costs of orbital cargo delivery for construction of a space station and for enabling Mars exploration, but it was also highly constrained by DOD requirements for cargo mass, cargo size, and once-around cross range.

A key part of the program initiation process was to find a vehicle which would fit the ever-changing budget profiles that were emerging from the Nixon Administration Office of Management and Budget, not only for total cost, but also peak annual funding.

Early in his presidency, Nixon appointed a Space Task Group, chaired by Vice President Spiro Agnew. This body made recommendations for both budgets and program timing for a Space Shuttle, a Space Station, and in some options, a human Mars mission. Nixon chose the single program, the Shuttle.

Cost analysts struggled with (at first primitive) methods to follow the rapidly-changing budgetary constraints and technical configurations, to help the design force find solutions to the cost and peak funding constraints. In one summer over 40 configurations were costed, forcing rapid changes in the sophistication of the cost estimation techniques.

This paper tells that story, and describes the major success of meeting the original cost and peak funding commitments, and why the success was attained. It also admits to the failure of the estimates for operations costs, and presents the reasons for the failure.

BA-9 – Presentation – Space Shuttle Cost Analysis A Success Story

BA-9 – Handout – Space Shuttle Cost Analysis A Success Story


EVM Papers:

Big Data Meets Earned Value Management (EV-1)

Glen Alleman – Program Planning and Controls Lead, Niwot Ridge, LLC.
Thomas Coonce – Adjunct Research Staff Member, Institute for Defense Analyses

When the result of an action is of consequence, but cannot be known in advance with precision, forecasting may reduce decision risk by supplying additional information about the possible outcomes.

Data obtained from observations collected sequentially over time are common. Earned Value Management is an example where project performance data (BCWP) is collected from the status reports of planned work (BCWS) and a forecast of future performance is needed to manage the program.

With this periodic data, cumulative Cost Performance Index (CPIcum) and Schedule Performance Index (SPIcum) are produced. During the accumulation of BCWP and ACWP, variances on a period-by-period basis are washed out, leaving only the cumulative data. This cumulative past and current period point value data is used to calculate an Estimate At Completion (EAC), Estimate To Complete (ETC), and a To Complete Performance Index (TCPI), using algebraic formulas. None of the past statistical behavior of the program is used for these calculations.
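For readers unfamiliar with these algebraic calculations, a minimal sketch in Python with illustrative numbers (not data from any program) shows the standard point formulas the authors contrast with statistical forecasting:

# Standard algebraic EVM point forecasts from cumulative data (values are notional).
bcws, bcwp, acwp = 950.0, 900.0, 1000.0   # cumulative planned, earned, and actual cost
bac = 2000.0                              # Budget At Completion

cpi = bcwp / acwp                         # cumulative Cost Performance Index
spi = bcwp / bcws                         # cumulative Schedule Performance Index
eac = acwp + (bac - bcwp) / cpi           # Estimate At Completion (CPI-based)
etc = eac - acwp                          # Estimate To Complete
tcpi = (bac - bcwp) / (bac - acwp)        # efficiency required to finish on budget

print(f"CPI={cpi:.2f} SPI={spi:.2f} EAC={eac:.1f} ETC={etc:.1f} TCPI={tcpi:.2f}")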

Earned Value Management System engines maintain period-by-period data in their underlying databases. With this time series performance information, analysis of trends, cost and schedule forecasts, and confidence levels of these performance estimates can be calculated using probabilistic techniques based on the Autoregressive Integrated Moving Average (ARIMA) model. Statistical forecasting techniques can be univariate (one driving variable) or multivariate (more than one driving variable). The three components of ARIMA can be tuned. Autoregression (AR) models each observation as a linear combination of previous observed values up to a maximum lag, plus an error term. The Moving Average (MA) component models the observed values as a linear combination of previous random error terms up to a maximum lag. Integration (I) differences the series to remove trends, joining AR and MA into a powerful tool to forecast future behavior from past behavior.

Using ARIMA in place of cumulative and single point values provides a statistically informed EAC and ETC to the program in ways not available using standard Earned Value Management calculations. The observed behaviors might appear as random, orderly, or noisy processes. Using ARIMA reveals underlying trends not available through standard EVM engine calculations. The simple algebraic forecasting of EAC fails to recognize the underlying statistical nature of a project’s performance measures. Using a simple moving average of past cumulative observations equally hides the underlying statistical nature of the performance numbers. ARIMA can adjust the autoregression and moving average attributes to reveal future performance not available with simple algebraic or linear smoothing.
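As an illustration of the forecasting approach described above, the sketch below fits an ARIMA model to a notional monthly BCWP series using the statsmodels library; the data and the ARIMA(1,1,1) order are placeholders, not values from the paper:

# Sketch: forecast future periodic earned value (BCWP) with ARIMA rather than
# relying only on cumulative point formulas. The series is notional.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

monthly_bcwp = pd.Series(
    [40, 42, 45, 43, 48, 52, 50, 55, 54, 58, 60, 57],
    index=pd.period_range("2013-01", periods=12, freq="M"),
)

fit = ARIMA(monthly_bcwp, order=(1, 1, 1)).fit()   # (AR lag, differencing, MA lag)

forecast = fit.get_forecast(steps=6)               # next six reporting periods
print(forecast.predicted_mean)                     # expected BCWP per period
print(forecast.conf_int(alpha=0.20))               # 80% confidence bounds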

With these leading indicators of cost and schedule performance, the program manager can take action to keep the program GREEN before it is too late.

EV-1 – Presentation – Big Data Meets Earned Value Management

EV-1 – Paper – Big Data Meets Earned Value Management

EV-1 – Handout – Big Data Meets Earned Value Management


Don’t Let EVM Data Mislead You (EV-2)

Steve Sheamer – Herren Associates

EVM data is a popular data source for cost estimators and for good reason; in theory, it should provide most of the data needed to develop an estimate for a program. For completed programs, it provides historical costs by WBS and for programs that are in work it provides a measure of the work completed, work remaining, and forecast of the work remaining. But during a period of frequent cost overruns, estimates built using EVM data often fail to forecast the extent of program overruns. Leveraging real world examples and first-hand experience with several of DoD’s largest acquisition programs, this paper will discuss common mistakes that cost estimators make when using EVM data to develop a cost forecast. The author will also provide recommendations to help those in the cost community better utilize EVM data and clearly understand potential pitfalls that could lead to significant estimating errors.

EV-2 – Presentation – Don’t Let EVM Data Mislead You

EV-2 – Handout – Don’t Let EVM Data Mislead You


Trust but Verify – An Improved Estimating Technique Using the Integrated Master Schedule (IMS) (EV-3)

Eric Lofgren – Cost Analyst, Technomics, Inc.

It has long been the wonder of management why the Integrated Master Schedule (IMS) fails to give advance warning of impending schedule delays. The oft-touted Government Accountability Office (GAO) 14-Point Check for Schedule Quality analyzes schedule health using key metrics, leading one to assume that such a test authenticates schedule realism. Why, then, do practitioners find themselves caught off guard by slips when their IMS appears in good health? Answers to this question follow when one attempts to independently trace IMS development over time. This paper presents the results, including a significantly improved new metric for independently estimating final schedule duration, as well as a startling conclusion about project planning and schedule maintenance.

As “living documents,” schedules evolve with the circumstances affecting a program. This implies that while looking at historical schedules may garner some additional context, only the current schedule incorporates the relevant information necessary for calculating the project end date. However, all maturing schedules descend from an original baseline agreed to by both client and performer. The metric proposed in this paper takes such a baseline and independently tracks near-term activities from the original schedule through subsequent schedules. In theory, the process mirrors what scheduling software does. A key difference is that the metric ignores behavior often associated with risk: baseline changes; task re-sequencing; insertion of hard-constraints, leads, and lags; and so on.

This paper relies on completed contracts for which schedules were available throughout the duration. With dozens of contract schedules analyzed across several Major Defense Acquisition Programs (MDAPs), all complete contracts showed remarkably consistent results: before contracts reached their schedule mid-points, the independent metric had quickly jumped up and stabilized near the true eventual end date (in most cases, a significant schedule slip). At that same point, while the contractor IMS deviated on average 16 months from the true end date, the independent metric found the value to within 2 months. Composite measures of accuracy abstract away from the power of this new metric, which is demonstrated by individual case studies.

Because the independent metric far outperformed the IMS in predicting contract schedule, one may conclude that the performance of early, well-defined tasks compared to the initial baseline makes a good leading indicator of where the final schedule will end up. It also implies that the schedule is unlikely to be saved by managers working around issues, schedulers entering constraints, or CAMs planning optimistically, even if the reshuffling works in theory. This metric finds a strong place with decision makers for its early warning capabilities and its ease of visual comprehension. It also affords sufficient detail for the analyst to have pointed discussions with the scheduler, focusing attention on the sources of potential risk. Such a form of analysis becomes more important as project complexity grows, ensuring cross-IMS sanity and providing a second assessment of contract schedule.

EV-3 – Presentation- Trust but Verify – An Improved Estimating Technique Using the Integrated Master Schedule (IMS) *Best Paper: Earned Value Management Track

EV-3 – Paper – Trust but Verify – An Improved Estimating Technique Using the Integrated Master Schedule (IMS)

EV-3 – Handout – Trust but Verify – An Improved Estimating Technique Using the Integrated Master Schedule (IMS)


A Cure For Unanticipated Cost and Schedule Growth (EV-4)

Thomas Coonce – Adjunct Research Staff Member, Institute for Defense Analyses
Glen Alleman – Program Planning and Controls Lead, Niwot Ridge, LLC.

Federal programs (DoD and civilian) often fail to deliver all that was promised, frequently cost more than estimated, and are often late.

Delivering programs with less capability than promised while exceeding the planned cost and schedule undermines the Federal government’s credibility with taxpayers and erodes public support for these programs.

Many reasons for cost and schedule growth have been hypothesized and documented. The authors propose that government and contractors use the historical variability of past programs to establish cost and schedule estimates at the outset, and periodically update these estimates with up-to-date risks, to increase the probability of program success. For this to happen, the authors suggest a number of changes to the estimating, acquisition, and contracting processes.

For government program offices, these include:

• Develop top-level probabilistic cost and schedule estimates based on the statistical variability of past programs with similar risks.
• Propose cost and schedule targets that have a Joint Confidence Level (JCL) of at least 70 percent.
• Develop a draft Integrated Master Plan (IMP) to achieve the desired capabilities and performance measures at this JCL.
• Through Requests for Information (RFIs), seek advice from industry on the likelihood of delivering the stated capabilities within the proposed cost and schedule estimates, using a draft IMP and initial risk registers.
• Using this industry feedback, revise the needed capabilities, cost, and schedule to be included in the request for proposals (RFP).
• Include realism of technical, cost, schedule, and risk assumptions as a criterion for awarding contracts.
• Do not award contracts that have less than a 50% JCL of meeting both cost and schedule targets.
• Ensure contractors have initial Performance Measurement Baselines (PMBs) that have a JCL greater than 35%.

The authors also suggest improved processes for contractors: submitting an updated IMP and schedule distributions based on their historical data in response to RFPs, and applying Technical Performance Measures (TPMs) to objectively assess Earned Value Management performance (BCWP).
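As context for the JCL thresholds above, a Joint Confidence Level is the probability that a program meets both its cost and schedule targets; a minimal Monte Carlo sketch with assumed, correlated lognormal distributions (all parameters are illustrative) is:

# Sketch: Joint Confidence Level (JCL) of finishing at or under BOTH the cost
# and schedule targets. Distributions, correlation, and targets are notional.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

sigma_cost, sigma_sched, corr = 0.25, 0.15, 0.6
cov = [[sigma_cost**2, corr * sigma_cost * sigma_sched],
       [corr * sigma_cost * sigma_sched, sigma_sched**2]]
z = rng.multivariate_normal(mean=[np.log(500), np.log(48)], cov=cov, size=n)
cost, sched = np.exp(z[:, 0]), np.exp(z[:, 1])      # $M and months

cost_target, sched_target = 560.0, 52.0
jcl = np.mean((cost <= cost_target) & (sched <= sched_target))
print(f"JCL at ${cost_target}M and {sched_target} months: {jcl:.0%}")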

EV-4 – Presentation – A Cure For Unanticipated Cost and Schedule Growth

EV-4 – Paper – A Cure For Unanticipated Cost and Schedule Growth

EV-4 – Handout – A Cure For Unanticipated Cost and Schedule Growth


Unleashing the Power of MS Excel as an EVM Analysis Tool (EV-5)

Allen Gaudelli – Herren Associates
Steve Sheamer – Herren Associates

What do you do if you need to analyze or report on EVM data and you don’t have access to (or can’t afford) the latest industry software? Nearly everyone has a very powerful analysis and reporting tool on their desktop with the flexibility and capability to consolidate cost, schedule, and risk drivers into a single view. In this presentation, we will show you how to leverage the inherent capabilities of Microsoft Excel to build interactive EVM dashboards that rival the reporting capabilities of many industry-leading software tools. The flexibility of Microsoft Excel also makes the dashboards customizable, so the specific requirements of any program can be satisfied at no additional cost. The use of these dashboards improves a program manager’s ability to identify faulty contractor reporting, address contractor performance issues, and gain insight into lower level details of cost, schedule, and risk drivers. To complement the EVM dashboard, we will also demonstrate a VBA tool that expedites the process of transferring the content of the EVM dashboard to PowerPoint. The use of this VBA tool eases the burden of copying, pasting, and positioning multiple tables and charts into a standard brief. These tools, when used together, have a synergistic effect resulting in superior program management.

Files not available


Design to Cost: Misunderstood and misapplied (EV-6)

Erin Barkel – Canadian Parliamentary Budget Office
Tolga Yalkin – Canadian Parliamentary Budget Office

The Canadian Department of Defence maintains that concerns over cost overruns are overstated because it adopts a design to cost approach. According to the US Government, design to cost “embodies the early establishment of realistic but rigorous cost targets and a determined effort to achieve them.” From the beginning of a project to its completion, “[c]ost is addressed on a continuing basis as part of a system’s development and production process.”

In numerous projects, the statements, actions, and omissions of the Canadian Department of Defence suggest that it views design to cost as involving the second but not the first step. In other words, while it says that it addresses cost on a continuing basis, it fails to set “realistic but rigorous” budgets.

The failure to set realistic budgets can have serious consequences for programs. If the initial budget is not realistic, it may become necessary to re-baseline the program, starting with a completely new design in the hopes that the new design will fit within the budget. This phenomenon has been witnessed in a number of Canadian defence procurements, resulting in unnecessary expense and delay.

This paper explores a handful of Canadian procurements, illustrating the pitfalls of failing to set realistic budgets at the outset. It suggests that the Canadian Department of Defence’s refrain that budgetary concerns are minimized by design to cost does not hold. Finally, it argues for more rigorous cost analysis prior to the setting of budgets, and provides suggestions on how this might be accomplished.

EV-6 – Presentation – Design to Cost Misunderstood and Misapplied

EV-6 – Handout – Design to Cost Misunderstood and Misapplied


Information Technology Papers:

Testing Benford’s Law with Software Code Counts (IT-2)

Chuck Knight – Consultant, Deloitte Consulting
Chris Kaldes – Deloitte Consulting

When analyzing a data set, common thinking may lead one to suspect that the leading digit of each data point would follow a uniform distribution, where each digit (1 through 9) has an equal probability of occurrence. Benford’s law, to the contrary, states that the first digit of each data point will conform to a nonuniform distribution. More specifically, it states that a data point is more likely to begin with a one than a two, a two more likely than a three, a three more likely than a four, and so on. In a real-world example, forensic accountants have identified patterns in financial fraud cases where criminals used fabricated numbers whose leading digits do not follow Benford’s law.

One of the largest cost drivers of a software development project is size. Counting source lines of code (SLOC) is one of the most common ways to estimate the size of software. Cost estimators typically use estimated SLOC counts to determine the labor required to develop the respective software and ultimately estimate the cost. When SLOC estimates are provided to a cost estimator (such as in a CARD, an ICBD, or input from a SME), should the estimator accept them? Most estimators are not software developers or engineers, and it is therefore reasonable to accept the SLOC estimates as the best data available. This paper presents estimators with a quick cross-check using Benford’s law. If an estimator’s SLOC estimates do not pass this test, the paper also discusses suggestions to mitigate some of the associated risks.
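The cross-check the authors describe can be scripted in a few lines; the sketch below compares the leading digits of a set of SLOC estimates to the Benford distribution, P(d) = log10(1 + 1/d), with a chi-square test (the SLOC values are made up for illustration):

# Sketch: test whether leading digits of SLOC estimates follow Benford's law.
import math
from collections import Counter
from scipy.stats import chisquare

sloc_estimates = [12500, 9800, 1340, 27600, 3100, 18900, 2200, 1150, 46000,
                  8300, 1900, 2750, 14200, 1020, 3900, 7600, 23000, 1600]

leading = [int(str(s)[0]) for s in sloc_estimates if s > 0]
observed = [Counter(leading).get(d, 0) for d in range(1, 10)]
expected = [len(leading) * math.log10(1 + 1 / d) for d in range(1, 10)]

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")
# A very small p-value suggests the estimates deviate from Benford's law and may
# merit a closer look; with only a handful of estimates the test has little power.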

IT-2 – Presentation – Testing Benford’s Law with Software Code Counts

IT-2 – Handout – Testing Benford’s Law with Software Code Counts


Improved Method for Predicting Software Effort and Schedule (IT-3)

Wilson Rosa – AIS/C4ISR Branch Head, Naval Center for Cost Analysis
Barry Boehm – Professor Emeritus, University of Southern California
Ray Madachy – Associate Professor, Naval Postgraduate School
Brad Clark – Vice-President, Software Metrics Incorporated
Joseph P. Dean – Operating Location Chief, Hanscom AFB Air Force Cost Analysis Agency

This paper presents a set of effort and schedule estimating relationships for predicting software development using empirical data from 317 very recent US DoD programs. The first set predicts effort as a function of size and application type. The second predicts duration using size and staff level. The models are simpler and more viable to use for early estimates than traditional parametric cost models. Practical benchmarks are also provided to guide analysts in normalizing data.
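The abstract does not give the model forms, but estimating relationships of this kind are often fit as log-linear regressions; a minimal sketch of that technique on notional data (not the 317-program dataset) is:

# Sketch: fit a simple effort CER of the form Effort = a * Size^b by ordinary
# least squares in log space. Data points are notional.
import numpy as np

size_ksloc = np.array([10, 25, 40, 60, 90, 120, 200, 350])
effort_hrs = np.array([8e3, 2.1e4, 3.0e4, 5.2e4, 7.0e4, 9.5e4, 1.7e5, 2.6e5])

b, log_a = np.polyfit(np.log(size_ksloc), np.log(effort_hrs), deg=1)
a = np.exp(log_a)
print(f"Effort ~ {a:.0f} * KSLOC^{b:.2f}")

predicted = a * size_ksloc**b
mape = np.mean(np.abs(predicted - effort_hrs) / effort_hrs)
print(f"mean absolute percentage error: {mape:.1%}")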

Keywords: Cost Model; Effort Estimation; Schedule Estimation; Software Engineering

IT-3 – Presentation – Improved Method for Predicting Software Effort and Schedule  *Best Paper: Information Technology Track

IT-3 – Paper – Improved Method for Predicting Software Effort and Schedule

IT-3 – Handout – Improved Method for Predicting Software Effort and Schedule


Costs of Migration and Operation in the Cloud (IT-4)

Arlene Minkiewicz – Chief Scientist, PRICE Systems

At one level, cloud computing is just Internet-enabled time sharing. Instead of organizations investing in all the Information Technology (IT) assets such as hardware, software, and infrastructure they need to meet business needs, cloud computing technology makes these resources available through the Internet. Cloud computing allows an organization to adopt a different economic model for meeting IT needs by reducing capital investments and increasing operational investments. Gartner has predicted that “the bulk of new IT spending by 2016 will be for cloud computing platforms and applications with nearly half of large enterprises having cloud deployments by 2017” [1]. McKinsey and Company predicts that the total economic impact of cloud technology could be $1.7 trillion to $6.2 trillion by 2025 [2].

“Cloud computing embraces cyber-infrastructure and builds upon decades of research in virtualization, distributed computing, grid computing and more recently networking, web, and software services.”[3] In other words, although the term cloud computing is relatively new, the concepts and technologies behind cloud computing have been emerging and evolving for some time. Consumers of cloud computing access hardware, software, and networking capabilities from third party providers in much the same way they get electricity or water from their utility companies.

The utility computing model offered in the cloud clearly brings benefits – especially to small and medium-sized enterprises and any sort of startup business. In addition to the cost savings from not having to purchase all the hardware, software, and infrastructure associated with running a business, cloud solutions bring agility, scalability, portability, and on-demand availability as well. So what’s the downside?

While the potential for cost savings is real, as with all things, getting there is not free. A company with firmly entrenched legacy systems needs to think about the trade-offs of migrating from the status quo into the cloud. Migration into the cloud could spur a host of activities, including installation and configuration, possible changes to code to adapt to the cloud host’s operational environment, possible changes to database queries and schemas, and adaptation to changes in the way applications interface with legacy applications or other applications in the cloud. Companies also need to identify the cloud solution providers, understand their pricing models, and determine a strategy to wisely and affordably move to the cloud.

This paper reports on ongoing research into the costs and benefits of cloud computing. It begins with a discussion of cloud computing – what it is, what the different types of cloud computing are, and how it is being used by businesses and the government. It then delves into the cost issues associated with moving to and operating in the cloud. Following this, there is a discussion of the various pricing models and options currently offered by cloud providers. Finally, a methodology and model are presented for using this information to understand the total cost of moving capabilities into the cloud.

IT-4 – Presentation – Costs of Migration and Operation in the Cloud

IT-4 – Paper – Costs of Migration and Operation in the Cloud

IT-4 – Handout – Costs of Migration and Operation in the Cloud


How I Continued to Stop Worrying and Love Software Resource Data Reports (IT-5)

Nicholas Lanham

This presentation highlights the trends and cost estimating relationships derived from detailed analysis of the August 2013 Office of the Secretary of Defense (OSD) Software Resource Data Report (SRDR) data. This analysis was conducted by Nicholas Lanham and Mike Popp and provides a follow-on to the August 2012 SRDR brief developed and previously presented by Mike Popp, AIR 4.2. As described within the August 2013 presentation, the Government’s unprecedented view of the Department of Defense’s (DoD) most comprehensive software-specific database has continued to expand, from 1,890 individual records in 2012 to 2,546 records within the 2013 analysis. This expansion has also allowed previously developed software growth algorithms to be updated, including an increased number of paired (initial and final) records, expanding from 142 paired data points within the 2012 presentation to 212 paired data points within the 2013 analysis. In addition, the latest 2013 SRDR data analysis has driven a more comprehensive understanding of the relationships among experience level, requirements volatility, CMMI level, development process, code count type, new development, upgrade development, language type(s), and software productivity (Hours/ESLOC). As initially highlighted within the 2012 analysis, the latest analysis of the 2013 dataset further indicates that contractor-reported requirements volatility ratings, CMMI levels, and staffing experience levels show little influence on software productivity, due to inconsistent SRDR reporting definitions between contractors. Considering the significant increase in data records since 2012, this presentation further supports the previously derived relationship between initial ESLOC and percent change in software development hours, and increases the number of records supporting the previously derived software productivity-rate relationships to software language type(s).

IT-5 – Presentation – How I Continued to Stop Worrying and Love Software Resource Data Reports

IT-5 – Handout – How I Continued to Stop Worrying and Love Software Resource Data Reports


Mobile Applications, Functional Analysis and Cost Estimation (IT-6)

Tammy Preuss

The use and popularity of mobile applications have increased exponentially in the past seven years with the introduction of Apple’s iPhone, Google’s Android operating system, and mobile gaming platforms such as Microsoft’s Xbox One. This increase in applications and the data they use has challenged communication service providers to provide the needed bandwidth and has led to the rapid deployment of high speed cellular networks such as LTE. New business models, revenue models, and even companies have been created on the success of one mobile application or a new piece of functionality.

Customers experience mobile applications differently than applications on computers. In addition to their portability, customers interact with mobile applications through different interfaces, using multi-touch screens, voice, rotation/alignment, camera interfaces, and even blowing air on the screen. These applications are changing our communication methods and allowing customers to personalize their interactions.

Functional Analysis, as defined by ISO/IEC 14143-1:2007 and documented in the IFPUG Counting Practices Manual (CPM 4.3.1) can quickly identify the functionality provided to the customer by documenting data and transactional functions. This method can be used to estimate costs at any time during the lifecycle of a mobile application.

This presentation will demonstrate how to derive cost estimates at different stages in a project’s lifecycle by using function points and the advantages of using an FP based size estimate over a SLOC based estimate. The intended audience is software cost estimators, project managers, and anyone who is interested in software measurement.

Keywords: Function Points, Software Estimation, Agile, Mobile Applications, Mobility, Project Management, Software Measurement

IT-6 – Presentation – Mobile Applications, Functional Analysis and Cost Estimation


In Pursuit of the One True Software Resource Data Reporting (SRDR) Database (IT-7)

Zachary McGregor-Dorsey – Cost Analyst, Technomics, Inc.

For many years, Software Resource Data Reports, collected by the Defense Cost and Resource Center (DCARC) on Major Defense Acquisition Programs (MDAPs), have been widely acknowledged as an important source of software sizing, effort, cost, and schedule data to support estimating. However, using SRDRs presents a number of data collection, normalization, and analysis challenges, which would in large part be obviated by a single robust relational database. The authors set out to build just such a database, and this paper describes their journey, pitfalls encountered along the way, and success in bringing to fruition a living artifact that can be of tremendous utility to the defense software estimating community.

SRDRs contain a wealth of data and metadata, and various attempts have been made by such luminaries in the field as Dr. Wilson Rosa and Mr. Mike Popp to excerpt and summarize the “good” data from SRDRs and make them available to the community. Such summaries typically involve subjective interpretations of the raw data, and by their nature are snapshots in time and may not distinguish between final data and those for which updates are expected.

The primary goal of this project was to develop an Access database, which would both store the raw source data in its original form at an atomic level, exactly as submitted by WBS element and reporting event, and allow evaluations, interpretations, and annotations of the data, including appropriate pairing of Initial and Final reports; mapping of SLOC to standard categories for the purposes of determining ESLOC; normalization of software activities to a standard set of activities; and storage of previous assessments, such as those of the aforementioned experts. The database design not only provides flexible queries for quick, reliable access to the desired data to support analysis, it also incorporates the DCARC record of submitted and expected SRDRs in order to track missing past data and anticipate future data.

The database is structured by Service, Program, Contract, Organization, CSDR Plan, and Reporting Event, and is flexible enough to include non-SRDR data. Perhaps its most innovative feature is the implementation of “movable” entities, wherein quantities such as Requirements, Effort, and SLOC, and qualities such as Language, Application Type, and Development Process can be reported at multiple levels and “rolled up” appropriately using a sophisticated set of queries. These movable entities enable the database to easily accommodate future changes made to the suggested format or reporting requirement found in the SRDR Data Item Description (DID).

This work was sponsored by the Office of the Deputy Assistant Secretary of the Army for Cost and Economics, and represents a continuation of the effort that produced the ICEAA 2013 Best Paper in the IT track, “ODASA-CE Software Growth Research.” A key motivation of the database is to be able to provide real-time updates to both that Software Growth Model and ODASA-CE’s Software Estimating Workbook. We are also collaborating with the SRDR Working Group on continual improvements to the database and how best to make it available to the broader community.

IT-7 – Presentation – In Pursuit of the One True Software Resource Data Reporting (SRDR) Database

IT-7 – Handout – In Pursuit of the One True Software Resource Data Reporting (SRDR) Database


Optimizing Total Cost of Ownership for Best Value IT Solutions: A Case Study using Parametric Models for Estimates of Alternative IT Architectures and Operational Approaches (IT-8)

Denton Tarbet – Senior Consultant, Galorath Incorporated
Kevin Woodward – Fellow, Lockheed Martin Corporation
Reggie Cole – Senior Fellow, Lockheed Martin Corporation

Because of a variety of architectures and deployment models, Information Technology (IT) has become more and more complex for organizations to manage and support. Current IT system architectures range from server-based local systems to implementations of a Private Cloud to utilization of the Public Cloud. Determining a “best value architecture” for IT systems requires the ability to effectively understand not only the cost, but also the relative performance, schedule, and risk associated with alternative solutions. The search for best value changes the “price-only” focus to one of Total Cost of Ownership (TCO). To optimally select a “best value” approach for an information systems (IS) architecture, the IT organization must have a method to develop high confidence performance, cost, schedule, and risk estimates for each alternative. In order to assess TCO, it is critical to be able to effectively estimate the cost of ongoing operations provided by an in-house data center technical support team vs. a Managed Service Contractor, and the risks associated with each model.

This paper presents IT project management support methods that incorporate parametric effort estimation models into the process of establishing IT architectures, solutions, and ongoing support to optimize TCO relative to capability. A case study of applying a parametric information technology estimate model to the development of estimates for Managed Service, or the cost of ongoing operations, for complex IT systems is presented. IT estimates in the case study include analysis of alternative operational approaches to maintain multiple data centers located globally and a widely distributed user community. The estimates in the case study include systems engineering for architecture and system design, the IT infrastructure, end user support, service desk, documentation, software and database services, development and maintenance of custom applications software, training, purchased hardware, purchased software, and facilities. In addition, ongoing support or Managed Service estimates incorporate requirements for multiple Service Level Agreements which must be satisfied.

Parametric model results for the case study are provided to demonstrate the decision support process. Utilizing a proven optimization approach, it is demonstrated that, with the support of an effective estimation model to develop effort estimates for alternative approaches, it is possible to optimize TCO, and thus establish a “best value” IT solution.

IT-8 – Presentation – Optimizing Total Cost of Ownership for Best Value IT Solutions

IT-8 – Handout – Optimizing Total Cost of Ownership for Best Value IT Solutions


Estimating Hardware Storage Costs (IT-9)

Jenny Woolley – Scitor Corporation
William Black – Scitor Corporation

Estimating Commercial-off-the-Shelf (COTS) hardware storage volume and cost requirements can be challenging. Factors such as storage type, speed, configuration, and changing costs can potentially lead to estimating difficulties. This is especially true when a Redundant Array of Independent Disks (RAID) configuration is implemented. Due to the multiple attributes that can vary within each RAID level, as well as other factors that may influence the total storage volume needed, developing relationships for estimating long-term storage costs can become complicated.

This research examines the estimation of RAID storage costs. Through the evaluation of historical procurement data, we examine the costs associated with several common disk drive standards and how those costs may change over time. Other areas of consideration include storage needs for different RAID levels, the need for open storage, and changing storage needs within the government. By obtaining a better understanding of storage variations, analysts will be better able to predict storage volume needs and estimate the potential cost impacts of different storage requirements.
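As one illustration of why RAID level matters to a storage cost estimate, the sketch below converts a required usable capacity into raw drive counts and hardware cost for several common RAID levels; the overhead rules are the textbook ones, and the drive size, price, and group width are placeholder assumptions:

# Sketch: drives and cost needed to deliver a usable capacity under common RAID levels.
import math

usable_tb_required = 100.0
disk_tb = 4.0              # capacity per drive (assumed)
disk_cost = 250.0          # $ per drive (assumed)
group_size = 8             # drives per RAID group (assumed)

def usable_fraction(level, n):
    """Fraction of raw capacity usable for a RAID group of n drives."""
    return {"RAID 0": 1.0,
            "RAID 1": 0.5,
            "RAID 5": (n - 1) / n,
            "RAID 6": (n - 2) / n,
            "RAID 10": 0.5}[level]

for level in ("RAID 0", "RAID 1", "RAID 5", "RAID 6", "RAID 10"):
    raw_tb = usable_tb_required / usable_fraction(level, group_size)
    drives = math.ceil(raw_tb / disk_tb)
    print(f"{level:7s}: {drives:3d} drives, ~${drives * disk_cost:,.0f}")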

IT-9 – Presentation – Estimating Hardware Storage Costs

IT-9 – Handout – Estimating Hardware Storage Costs


Relating Cost to Performance: The Performance-Based Cost Model (IT-10)

Michael Jeffers – Senior Cost Analyst, Technomics, Inc.
Robert Nehring – Cost Analyst, Technomics, Inc.
Jean-Ali Tavassoli – Cost Analyst, Naval Surface Warfare Center Carderock Division
Kelly Meyers – Surface Combatant Team Lead, Naval Surface Warfare Center Carderock Division
Robert Jones – Senior Cost Analyst, Technomics, Inc.

For decades, in order to produce a cost estimate, estimators have been heavily reliant on the technical characteristics of a system, such as weight for hardware elements or source lines of code (SLOC) for software elements, as specified by designers and engineers. Quite often, a question will arise about the cost of adding additional performance requirements to a system design (or in a design-to-cost scenario, the savings to be achieved by removing requirements). Traditionally, the engineers will then have to undertake a design cycle to determine how the shift in requirements will change the system. The resultant technical outputs are finally given to the cost estimators, who will run them through their cost model to arrive at the cost impact. However, what if a single model could estimate the cost from the performance of the system alone? A Performance Based Cost Model (PBCM) can do just that.

First introduced in 1996, a PBCM is an early-stage rough-order-of-magnitude (ROM) cost estimating tool that is focused on relating cost to performance factors. PBCMs are parametric cost models that are integrated with a parametric engineering model so that they estimate cost as a function of performance by simultaneously estimating major physical characteristics. They are derived from historical data and engineering principles, consistent with experience. PBCMs are quick, flexible, and easy to use and have proven to be a valuable supplement to standard, detailed concept design and costing methods.

In this paper we explain essential PBCM concepts, including:
• A discussion of the interplay of capabilities, effectiveness, performance characteristics, and cost.
• How to identify the most meaningful cost drivers (i.e., performance characteristics, technology factors, and market conditions).
• How to identify the most meaningful output variables (i.e., those variables of prime interest to the PBCM user).
• How to create the mathematical structure that integrates cost drivers with cost and physical characteristics.
• How to obtain and normalize historical performance data, cost data, and technical data (physical characteristics).
• How to generate cost and physical characteristic equations.
• How to implement a PBCM.
• How to use a PBCM.

Files not available


Lessons Learned from the International Software Benchmark Standards Group (ISBSG) Database (IT-11)

Arlene Minkiewicz – Chief Scientist, PRICE Systems

As corporate subscribers and partners of the International Software Benchmarks Standards Group (ISBSG) Database, PRICE has access to a wealth of information about software projects. The ISBSG was formed in 1997 with the mission “To improve the management of IT resources by both business and government through the provision and exploitation of public repositories of software engineering knowledge that are standardized, verified, recent and representative of current technologies.” This database contains detailed information on close to 6000 development and enhancement projects and more than 500 maintenance and support projects. To the best of this author’s knowledge, this database is the largest, most trusted source of publicly available software data that has been vetted and quality checked.

The data cover many industry sectors and types of businesses, though they are weak on the aerospace and defense industries. Nevertheless, there are many things we can learn from analysis of this data. The Development and Enhancement database contains 121 columns of project information for each project submitted, including the type of business and application, the programming language(s) used, Functional Size of the project in one of many Functional Measures available in the industry (IFPUG, COSMIC, NESMA, etc.), project effort normalized based on the project phases the report contains, Project Delivery Rate (PDR), elapsed project time, etc.

At PRICE we have used this data in many ways, both to improve our estimating guidance and to improve our software CERs. One of the projects we accomplished with this data was the creation of a series of data-driven cost modeling templates across industry sector and application type. These templates are pre-filled with relevant values for input parameters along with a risk range determined by the statistical goodness of the values as predictors within the data set studied.

This paper will introduce the ISBSG and the database available from the ISBSG. It then provides details of the data-driven approach applied to develop these templates, discussing research approach, methodology, tools used, findings, and outcomes. This is followed by a discussion of lessons learned, including the strengths and weaknesses of the database and of the solutions derived from it. While particularly relevant to software estimators, this paper should be valuable to any estimator who lacks data or has data but is not quite sure what to do with it.

IT-11 – Presentation – Lessons Learned from the International Software Benchmark Standards Group (ISBSG) Database

IT-11 – Paper – Lessons Learned from the International Software Benchmark Standards Group (ISBSG) Database

IT-11 – Handout – Lessons Learned from the International Software Benchmark Standards Group (ISBSG) Database


Software Maintenance: Recommendations for Estimating and Data Collection (IT-12)

Shelley Dickson – Operations Research Analyst, Naval Center for Cost Analysis
Bruce Parker – Naval Center for Cost Analysis
Alex Thiel – Operations Research Analyst, Naval Center for Cost Analysis
Corinne Wallshein – Technical Advisor, Naval Center for Cost Analysis

The software maintenance study reported at ICEAA in 2012 and 2013 continued to progress in 2013 in spite of the high data variability. This presentation summarizes the past years’ software maintenance data collection structure, categorizations, normalizations, and analyses. Software maintenance size, defect, cost, and effort data were collected from Fiscal Years (FY) 1992 – 2012. Parametric analyses were performed in depth on available variables included in or derived from this U.S. Department of Defense software maintenance data set. This description of the team’s decision making, derivations, and analyses may assist others in building cost estimating methodologies. Effort Estimating Relationships (EERs) presented may support future software maintenance estimation, including uncertainty distribution characterizations based on collected historical data. Recommended EERs include using an industry formula to compute Equivalent Source Lines of Code (ESLOC) software size and using the computed ESLOC to estimate an annual number of Full Time Equivalent (FTE) personnel during the software maintenance period, or combining yearly defects fixed and source lines of code to estimate annual software maintenance effort hours. Ultimately, the goal is to routinely collect and analyze data to develop defensible software maintenance cost estimating methodologies. A synopsis of the current phase of the study will be presented.
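The abstract does not state which industry ESLOC formula is recommended; purely for illustration, the sketch below uses one commonly cited weighting of modified and reused code and notional productivity rates to show how ESLOC might be converted to annual maintenance FTEs:

# Sketch: compute ESLOC and convert to annual maintenance staffing. All weights
# and rates are illustrative assumptions, not the study's recommended values.
new_sloc, modified_sloc, reused_sloc = 50_000, 120_000, 400_000
w_modified, w_reused = 0.5, 0.1          # assumed equivalence weights

esloc = new_sloc + w_modified * modified_sloc + w_reused * reused_sloc

hours_per_esloc_per_year = 0.05          # assumed annual maintenance hours per ESLOC
hours_per_fte_year = 1_800               # assumed productive hours per FTE-year

annual_hours = esloc * hours_per_esloc_per_year
annual_fte = annual_hours / hours_per_fte_year
print(f"ESLOC = {esloc:,.0f}; ~{annual_hours:,.0f} hours/yr; ~{annual_fte:.1f} FTEs")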

IT-12 – Presentation – Software Maintenance Recommendations for Estimating and Data Collection

IT-12 – Handout – Software Maintenance Recommendations for Estimating and Data Collection


An Update to the Use of Function Points in Earned Value Management for Software Development (IT-13)

Michael Thompson – Director, Cost and Performance Management, Cobec Consulting, Inc.
Daniel French – Cobec Consulting Inc.

In this follow up presentation to their 2013 ICEAA presentation, the authors detail their successful implementation of an EVM methodology for a government software development project utilizing the International Function Point Users Group (IFPUG) function point software sizing metric.

Traditionally it has been difficult to apply Earned Value Management (EVM) criteria to software development projects, as no tangible value is earned until the software is delivered to production. The process developed by the authors successfully addresses this by applying the International Standards Organization (ISO) approved function point software sizing methodology to measure earned value during the course of the software development lifecycle. Using SLOC to determine progress in software development is difficult, if not impossible, as there are no standard SLOC counting rules and SLOC counts are heavily dependent upon language, platform, and individual developer skills.
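One simple way to credit earned value from function point counts, shown here only as an illustration and not necessarily the authors’ implementation, is to treat the budgeted share of delivered function points as the value earned:

# Sketch: earned value from function points delivered to date. Counts, budget,
# and actuals are notional; this is one possible crediting rule.
total_fp = 1_200                 # function points planned for the release
fp_delivered = 450               # function points completed and accepted to date
fp_planned_to_date = 540         # function points scheduled to be complete by now

bac = 3_000_000.0                # Budget At Completion ($)
acwp = 1_250_000.0               # actual cost of work performed to date ($)

bcwp = bac * fp_delivered / total_fp          # earned value
bcws = bac * fp_planned_to_date / total_fp    # planned value

cpi, spi = bcwp / acwp, bcwp / bcws
print(f"BCWP=${bcwp:,.0f} BCWS=${bcws:,.0f} CPI={cpi:.2f} SPI={spi:.2f}")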

This presentation describes the opportunity that was presented to the team and how the recently completed pilot program was developed and implemented to address it. The authors will address how effective the pilot program was as far as identifying and resolving issues, measuring earned value, as well as the challenges and lessons learned with the development, implementation, and sustainment of the FP based EVM process.

IT-13 – Presentation – An Update to the Use of Function Points in Earned Value Management for Software Development

IT-13 – Handout – An Update to the Use of Function Points in Earned Value Management for Software Development


The Federal IT Dashboard: Potential Application for IT Cost & Schedule Analysis (IT-14)

Daniel Harper – MITRE Corporation

Federal agencies have experienced a growing demand for rapid-turnaround cost and schedule estimates. This need is increasing as the pressure to deploy systems rapidly mounts, and the push for Agile software development compounds the problem.

A critical component of cost estimating is collecting cost data for the various elements within the estimate. Analogous programs constitute a robust source for credible estimates. The problem is how to find analogous programs and how to capture the costs of elements within those programs at a sufficiently detailed level to use in a cost estimate, and in a timely manner so that the cost data are still relevant.

The data for analogous programs already exists within government. The Open Government initiative has provided a way to overcome some of the problems of obtaining this data rapidly.

One example of a source of data for analogous programs is the IT Spending Dashboard. The IT Dashboard is an open-source, publicly available site that provides insight into over $80 billion of IT spending for over 700 major programs across 27 major agencies. Data from government sources such as the IT Dashboard can be used as a source of current, analogous cost data.

Cost and schedule data for these programs are provided directly by agencies and can be accessed and exported into Microsoft Excel for analysis. Analysis of the cost and schedule elements for these programs can provide insight into historical spending and give program managers an important tool for developing cost and schedule estimates for current programs.
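
A minimal sketch of the kind of analysis described, assuming a CSV export from the dashboard; the file name and column names used here are assumptions, and the real export schema may differ.

```python
# Sketch: summarizing an IT Dashboard export. The file name and column
# names ("Agency", "Planned Cost ($M)", "Actual Cost ($M)") are assumptions;
# the real export schema may differ.
import pandas as pd

df = pd.read_csv("it_dashboard_export.csv")
df["Cost Variance ($M)"] = df["Planned Cost ($M)"] - df["Actual Cost ($M)"]

by_agency = (df.groupby("Agency")["Cost Variance ($M)"]
               .agg(["count", "mean", "sum"])
               .sort_values("sum"))
print(by_agency.head(10))
```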

IT-14 – Presentation – The Federal IT Dashboard Potential Application for IT Cost & Schedule Analysis

IT-14 – Paper – The Federal IT Dashboard Potential Application for IT Cost & Schedule Analysis

IT-14 – Handout – The Federal IT Dashboard Potential Application for IT Cost & Schedule Analysis


Trends in Enterprise Software Pricing from 2002 to 2011 (IT-17)

Ian Anderson – Senior Cost Analyst, Naval Center for Cost Analysis
Dara Logan – Operations Research Analyst, Naval Center for Cost Analysis

One of the biggest challenges in the cost estimating community is data collection. In the Information Technology (IT) cost community, technology is always evolving, while the data capturing it tend to be scarce and difficult to use in building solid cost models. Fortunately, NCCA learned that the Department of the Navy (DON) Chief Information Officer (CIO) has been collecting benchmarking measures, including pricing, since 2002 under the Enterprise Software Initiative (ESI) Blanket Purchase Agreements (BPAs). DON CIO shared its data with NCCA so that various data analyses could be conducted, and NCCA generated statistical trends and pricing factors from the ESI IT Commercial Off-the-Shelf (COTS) products and services. Although this is a start, and the benefits of this initial analysis primarily assist the IT cost estimator, NCCA plans to continue its relationship with DON CIO by providing continued database updates, so that DON CIO too will reap benefits from forecast models developed in support of various cost estimating efforts. Currently, multiple programs use IT COTS, many use ESI BPAs to procure IT, and NCCA expects these programs to continue using the same or similar products over their lifecycles. Therefore, the results of this continued analysis will benefit the community's efforts to produce credible estimating tools.

IT-17 – Presentation – Trends in Enterprise Software Pricing from 2002 to 2011


Estimating Cloud Computing Costs: Practical Questions for Programs (IT-18)

Kathryn Connor – Cost Analyst, RAND

Cloud computing has garnered the attention of the Department of Defense (DoD) as data and computer processing needs grow and budgets shrink. In the meantime, reliable literature on the costs of cloud computing in the government is still limited, but programs are interested in any solution that has the potential to control growing data management costs. We found that cloud provider costs can be more or less expensive than traditional information system alternatives because of cost structure variations. RAND looked at the cost drivers for several data management approaches for one acquisition program to develop structured cost considerations for analysts approaching new cloud investments. These considerations can help analysts be comprehensive in their analysis until the DoD develops more official guidance on cloud computing cost analysis.
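
The point about cost structures can be illustrated with a deliberately simple toy comparison of cumulative cost under a capital-heavy on-premises approach versus a pay-as-you-go cloud service; every number below is made up, and a real analysis would also include migration, refresh, bandwidth, and security costs.

```python
# Toy comparison of cost structures (all numbers are made up for illustration).
years = range(1, 11)
onprem_capex, onprem_annual_ops = 2_000_000, 300_000   # buy hardware, then sustain
cloud_annual_usage = 550_000                           # pay-as-you-go charges

onprem_cum = [onprem_capex + onprem_annual_ops * y for y in years]
cloud_cum = [cloud_annual_usage * y for y in years]

for y, op, cl in zip(years, onprem_cum, cloud_cum):
    cheaper = "cloud" if cl < op else "on-prem"
    print(f"Year {y:2d}: on-prem ${op:,.0f}  cloud ${cl:,.0f}  -> {cheaper}")
```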

IT-18 – Presentation – Estimating Cloud Computing Costs Practical Questions for Programs

IT-18 – Handout – Estimating Cloud Computing Costs Practical Questions for Programs


The Agile PM Tool: The Trifecta for Managing Cost, Schedule, and Scope (IT-19)

Blaze Smallwood – Cost Analyst, Booz Allen Hamilton
Omar Mahmoud – Lead Associate, Booz Allen Hamilton
complexity that can be unique to the development team of the specific project.

The growing number of DoD software projects that are adopting an “Agile” development philosophy requires cost estimators to not only adapt the methodologies and metrics they use to estimate software development costs, but also re-think their models to give PMs the information they need to effectively manage these programs. The Agile PM Tool is one manifestation of this trend as it provides a logical, dynamic approach for helping the government effectively manage the cost, schedule, and scope of their “Agile” projects.

IT-19 – Presentation – The Agile PM Tool The Trifecta for Managing Cost, Schedule, and Scope

IT-19 – Handout – The Agile PM Tool The Trifecta for Managing Cost, Schedule, and Scope

IT-20 – Presentation – Avoid Software Project Horror Stories – Check the Reality Value of the Estimate First!

IT-20 – Handout – Avoid Software Project Horror Stories – Check the Reality Value of the Estimate First!

 


Life Cycle Costing Papers:

Which Escalation Rate Should I Use? (LC-1)

Nathan Honsowetz – Senior Consultant, Cobec Consulting Inc.

Conducting life cycle cost estimates requires time frames of 10, 20, even 30 years, and with such long time frames it is important to use appropriate escalation indices. Escalation can have a significant impact on cost estimates, especially estimates with longer time frames. However, cost estimators often insert a "standard" escalation index into their models without considering whether that index is appropriate for their estimate. In addition, risk is rarely applied to escalation, and little consideration is given to whether the escalation factors used are appropriate. This presentation will explain the common escalation indices used in cost estimation, including how they are developed, the purpose they are designed to serve, and how best to use them. Common mistakes will also be covered, such as mistaking nominal rates for real rates and using default escalation factors without further investigation. Escalation factors discussed include OMB, Global Insight labor, FAA labor, escalation in the private sector, and cases of special escalation such as energy.
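
A small worked sketch of the nominal-versus-real distinction and of compounding an index over a long time frame; the rates below are placeholders, not official OMB or Global Insight factors.

```python
# Illustrative only: the rates below are placeholders, not official indices.
nominal_rate = 0.032   # assumed nominal escalation per year
inflation = 0.021      # assumed general inflation per year

# Real rate via the Fisher relation rather than simple subtraction.
real_rate = (1 + nominal_rate) / (1 + inflation) - 1

base_year_cost = 10.0  # $M in base-year dollars
years_out = 20
then_year_cost = base_year_cost * (1 + nominal_rate) ** years_out

print(f"Real rate: {real_rate:.3%}")
print(f"Base-year $10.0M escalated {years_out} years: ${then_year_cost:.1f}M")
```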

LC-1 – Presentation – Which Escalation Rate Should I Use

LC-1 – Handout – Which Escalation Rate Should I Use


Ground Vehicle Reliability Analysis Using the Mean Cumulative Function (LC-2)

Caleb Fleming – Cost Analyst, Kalman & Company, Inc.

The primary focus of this paper is to demonstrate the significance of the non-parametric Mean Cumulative Function (MCF) as a comparative and predictive estimating tool for historical and future recurrent maintenance event costs.
Ground vehicle data is presented to demonstrate and address fundamental concepts, algorithmic computations, and potential shortcomings. This paper offers guidelines for identifying recurrent behaviors and outlines the MCF’s value and application to federal government and commercial cost estimating.

The MCF is a non-parametric estimator for determining repair costs and quantities as a function of time or age. Each unit within a population follows its own staircase-shaped cumulative curve that is flat between events and steps up at the ages when repairs occur. The pointwise average of these cumulative curves at a specific age is the MCF, giving an identifiable cumulative repair cost or quantity up to and at that point in time.

In addition to presenting cumulative costs and quantities, MCF plots are valuable tools for developing historical baseline repair rates by component. The repair rate at a particular age is the derivative of the MCF, calculated through differentiation; an MCF whose slope increases with age indicates an increasing repair rate, while a slope that decreases with age indicates a decreasing repair rate.
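
A minimal sketch of the MCF calculation, assuming every unit is observed over the same age window; a full Nelson-style estimator would also handle staggered entry and censoring, and the repair histories below are hypothetical.

```python
# Minimal MCF sketch: pointwise mean of cumulative repair cost across units.
from bisect import bisect_right

# Hypothetical per-unit repair histories: (age_in_hours, repair_cost)
units = {
    "veh_01": [(150, 800), (620, 1200), (900, 450)],
    "veh_02": [(300, 500), (880, 2100)],
    "veh_03": [(75, 950), (400, 300), (700, 700), (950, 1600)],
}

def mcf(units, age):
    """Mean cumulative repair cost per unit up to the given age."""
    totals = []
    for events in units.values():
        ages = [a for a, _ in events]
        k = bisect_right(ages, age)             # events at or before this age
        totals.append(sum(cost for _, cost in events[:k]))
    return sum(totals) / len(totals)

for age in (250, 500, 750, 1000):
    print(f"MCF at {age} h: ${mcf(units, age):,.0f} per vehicle")
```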

Summative plots obtained from the analysis of MCF outputs have the potential to reveal statistically significant maintenance event tendencies towards constant recurrence, increasing recurrence, decreasing recurrence, and “bathtub effect” recurrence. Further, results generated from the MCF are comparable across analogous systems to reveal potential effects of varying engineering designs and maintenance concepts.

Applied specifically to the cost industry for federal government and commercial clients, the MCF is an alternative to parametric estimating and a slew of loosely defensible assumptions. The MCF generates precise point estimate values for component failure events, enabling the estimator to forecast repair parts demand, manpower requirements, and potential deadlining effects of particular component failures.

This paper surveys and outlines the fundamental MCF methodologies and explanations detailed in-depth in Wayne Nelson’s Recurrent Events Data Analysis for Product Repairs, Disease Recurrences, and Other Applications. Applying the MCF to vehicle maintenance data reveals recurring component failure behaviors, develops new guidelines for interpretation, and assists in data normalization and validation.

LC-2 – Presentation – Ground Vehicle Reliability Analysis Using the Mean Cumulative Function

LC-2 – Handout – Ground Vehicle Reliability Analysis Using the Mean Cumulative Function


Cost Overruns and Their Precursors: An Empirical Examination of Major Department of Defense Acquisition Programs (LC-3)

Alan Gideon – Senior Systems Engineer, Booz Allen Hamilton
Enrique Campos-Nanez – Senior Software Engineer, Epsilon Group
Pavel Fomin – Aerospace Engineer, United States Air Force
James Wasek – Senior Enterprise Architect, Science & Technology Solutions, Inc. (eSTS)

This paper proposes a model of acquisition program future cost for two specific acquisition domains – aircraft and ships – that takes into account the non-recurring developmental costs defined at program approval and each domain’s historic tendencies to exceed planned program cost. Technical and non-technical reasons for these observations are discussed. We begin with an exploratory analysis of trends in cost, schedule, and performance from the 1960s to the present. We use those results, the level of Research and Development funding assigned to each program, and a platform-specific characterization of technical risk as inputs to calculate the likely variance of an acquisition program’s future cost.

The root causes of all acquisition program risks can be categorized as programmatic/business, cost, schedule, or technical. In turn, these lead to impacts on a program's final cost, schedule, and/or technical performance. Clearly, inadequate funding of challenging programs increases risk, and providing greater R&D funding for high-risk programs reduces the degree of risk attached to a given program. Thus, contracted performance specifications at the limits of current technology and inadequate RDT&E budgets can be seen as underestimated risk that may, in turn, drive a program to deliver late and over the budget established at program commitment. The authors believe that when a program's technical risks are seen in their historical perspective, program outcomes can be better managed. We investigate the part played by initial R&D funding to improve program risk impact estimates, and extend that analysis to examine the effect that these program cost vulnerabilities can have on enterprise portfolio risk profiles.

Department of Defense policy is to calculate a "most probable cost" for each acquisition program at a specified level of confidence, and then fund each program at its most probable cost. Portfolios of programs, either at the service level or the departmental level, are simply the sum of the individual funding levels. This approach is sometimes referred to as percentile funding and is based on Markowitz's portfolio theory, in which bounding cost envelopes are defined by standard Gaussian marginal cost probabilities of the subject data. If the outcomes of activities within a given program, or the outcomes of programs within a particular portfolio, followed Gaussian distributions, Markowitz's theory would apply. However, Smart (2010) shows that acquisition program outcomes are better described by lognormal distributions, which have "fatter" tails. The authors' model demonstrates that Smart's assessment was optimistic for some product lines. The initial results of the authors' proposed model are compared to the present policy (Herzberg's expected value-at-risk) and to Smart's Conditional Tail Expectation for these data.
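
The "fatter tail" point can be illustrated with a simple simulation comparing an 80th-percentile funding level and the conditional tail expectation under normal and lognormal assumptions with matched mean and standard deviation; the parameters are invented and this is not the authors' model.

```python
# Illustrative only: made-up mean/sigma; compares an 80th-percentile funding
# level and the conditional tail expectation (mean cost given cost exceeds
# that percentile) under normal vs. lognormal assumptions with matched moments.
import numpy as np

rng = np.random.default_rng(0)
mean, sd, p = 1000.0, 300.0, 0.80          # $M, $M, confidence level

# Lognormal parameters matched to the same mean and standard deviation.
sigma2 = np.log(1 + (sd / mean) ** 2)
mu = np.log(mean) - sigma2 / 2

normal = rng.normal(mean, sd, 1_000_000)
lognormal = rng.lognormal(mu, np.sqrt(sigma2), 1_000_000)

for name, draws in (("normal", normal), ("lognormal", lognormal)):
    q = np.quantile(draws, p)
    cte = draws[draws > q].mean()
    print(f"{name:9s}: 80th pct = {q:7.1f}  CTE beyond it = {cte:7.1f}")
```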

Data were drawn from the Defense Acquisition Management Information Retrieval (DAMIR) database and a number of open sources. The data were stripped of specific identifiers as required to protect proprietary data and normalized to increase the universality of the results.

LC-3 – Presentation – Cost Overruns and Their Precursors An Empirical Examination of Major Department of Defense Acquisition Programs  *Best Paper: Life Cycle Costing Track

LC-3 – Handouts – Cost Overruns and Their Precursors An Empirical Examination of Major Department of Defense Acquisition Programs


System Utilization: An In-depth Method of Modeling and Measuring Military Manpower Costs (LC-4)

Omar Mahmoud – Lead Associate, Booz Allen Hamilton

Establishing defendable cost estimating methodologies for capturing military manpower costs is a key component of any Program Life Cycle Cost Estimate. With a proven and systematic approach to estimating military manpower costs, a program can confidently select a proper course of action among competing alternatives when conducting an EA, appropriately designate its ACAT level, avoid the pitfalls that lead to over- or under-estimating or double counting costs, and, above all, obtain a high level of confidence from its resource sponsor and Milestone Decision Authority.

LC-4 – Presentation – System Utilization An Indepth Method of Modeling and Measuring Military Manpower Costs

LC-4 – Handout – System Utilization An Indepth Method of Modeling and Measuring Military Manpower Costs


Integrating Sustainability into Weapon System Acquisition within the Department of Defense (DoD) (LC-5)

Walt Cooper – Senior Cost Analyst, Technomics, Inc.
Remmie Arnold – Cost Analyst, Technomics, Inc.

DoD acquisition and logistics professionals use the term sustainment to describe the support needed to operate and maintain a system over its lifetime. In the context of the DoD acquisition process, sustainability involves using resources to minimize mission, human health, and environmental impacts and associated costs during the life cycle. This paper will present a draft version of "DoD Guidance – Integrating Sustainability into DoD Acquisitions," initial findings from pilot studies, and the challenges and road ahead.

The DoD acquires weapon systems that must be sustained for 30 years or more. Resources are at a premium and in many cases dwindling. Acquisition personnel must fully understand the life cycle impacts and costs of systems; otherwise, they could inadvertently "push downstream" significant impacts and associated costs to the operational, logistics, and installation management communities.

While sustainability is not a new topic, it is now, more than ever, an area of emphasis for the DoD. Executive Order 13514, Federal Leadership in Environmental, Energy and Economic Performance (05 Oct 2009), establishes an integrated strategy for sustainability in the federal government. In accordance with the EO, the DoD developed a Strategic Sustainability Performance Plan (SSPP), updated annually. The SSPP includes goals for efficiency and reductions in energy, water, solid waste, and the use of hazardous chemicals and materials. Further, reducing life cycle costs by acquiring more sustainable systems directly supports the Better Buying Power initiative and design for affordability goal established by USD(AT&L).

The Guide introduces the concept of Sustainability Analysis and provides guidance on how to complete such analyses and use the results to better inform tradeoff, design, and supportability decisions. A Sustainability Analysis consists of a Life Cycle Assessment (LCA), which assesses a system’s impacts to human health and the environment, and Life Cycle Costing (LCC), which attempts to capture relevant costs associated with the system throughout its life cycle. While this paper will briefly discuss LCA, the focus will be on LCC and providing additional detailed guidance on cost elements and techniques for identifying and quantifying sustainability-related costs that are often not included in the current acquisition cost structure. It describes how to use existing data from legacy systems or proxy data from similar systems to conduct an SLCA and estimate relevant sustainability-related costs.

Our discussion of findings from pilot studies will focus on the quantification of cost and environmental impacts-related differences between the use of chrome and non-chrome primer on two Navy aircraft. The paper will identify and explain the procedures used to identify specific sustainability-related cost elements for the two aircraft as well as the estimated cost differences between the two scenarios.

This paper will also enumerate the significant challenges associated with the analysis of sustainability-related costs. These challenges include the lack of standardized reporting procedures for, and documentation of, sustainability costs; the shortage of empirical data to be used as a foundation for developing cost estimating relationships and cost factors; and the requirement to establish and implement procedures to gather necessary sustainability data without creating onerous reporting requirements.

LC-5 – Presentation – Integrating Sustainability Into Weapon System Acquisition Within The Department Of Defense (DoD)

LC-5 – Handout – Integrating Sustainability Into Weapon System Acquisition Within The Department Of Defense (DoD)


Cost Analysis & Optimization of Repair Concepts Using Marginal Analysis (LC-6)

Justin Woulfe – EVP, Technical Services, WPI

OPRAL is an analytical model for determining the optimal repair locations and spares allocations in a multi-level hierarchical support organization in order to optimize life cycle cost. With this model, the analyst can either treat repair decisions as fixed input parameters or, using the OPRAL algorithm, evaluate several different repair strategies in order to find the optimal one, considering all aspects of life cycle cost. The dependence between repair and spares allocation decisions is strong, which is why it is necessary to integrate the optimization of the two.

We have developed a model for simultaneous spares optimization and optimal location of repair facilities.

Perhaps the most fundamental property of OPRAL is that the model allows a simultaneous treatment of two problems that are central in the design of support or maintenance systems:

• What repair strategy should be used for items of a given type?
• What sparing strategy should be used for items of a given type?

The choice of repair strategy concerns whether to discard or repair faulty items of a given type and, if the item is to be repaired, where the repair should take place. The sparing strategy concerns the number of spares to stock at each warehouse within the organization, when to reorder, and how much to reorder.
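
The flavor of this joint optimization can be sketched with a generic greedy marginal-analysis allocation that, at each step, buys the spare giving the largest expected-backorder reduction per dollar. This is not OPRAL itself, and the demand rates, unit costs, and budget below are made up.

```python
# Not OPRAL: a generic greedy marginal-analysis sketch for spares allocation.
# Each step buys the spare with the best expected-backorder reduction per dollar.
from math import exp, factorial

def expected_backorders(demand, stock):
    """EBO for Poisson pipeline demand with a given number of spares on hand."""
    return sum((k - stock) * exp(-demand) * demand**k / factorial(k)
               for k in range(stock + 1, 60))

sites = {"depot": (4.0, 25_000), "base_a": (1.5, 25_000), "base_b": (0.8, 25_000)}
stock = {name: 0 for name in sites}
budget = 300_000

while True:
    best, best_ratio = None, 0.0
    for name, (demand, cost) in sites.items():
        if cost > budget:
            continue
        delta = expected_backorders(demand, stock[name]) - \
                expected_backorders(demand, stock[name] + 1)
        if delta / cost > best_ratio:
            best, best_ratio = name, delta / cost
    if best is None:
        break
    stock[best] += 1
    budget -= sites[best][1]

print("Allocation:", stock, "remaining budget:", budget)
```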

The problem of determining the optimum repair strategy is sometimes called level-of-repair analysis, or LORA. Unfortunately, and as the name suggests, a common assumption is that the repair strategy is limited by levels or echelons within the support organization. In some cases the restriction to levels is not severe, while in some asymmetric cases it is highly questionable. In any case, the limitation is artificial, which is why we suggest removing it. Thus, we propose that the acronym LORA should instead read Location Of Repair Analysis.

LC-6 – Presentation – Cost Analysis & Optimization of Repair Concepts Using Marginal Analysis

LC-6 – Handout – Cost Analysis & Optimization of Repair Concepts Using Marginal Analysis

 


Cost Management Papers:

A Balanced Approach to Meeting Fiscal Constraints (MG-1)

Steve Green – Department Head, Department of Management, United States Air Force Academy
Kevin Davis – Associate Professor and Director of Assessment, United States Air Force Academy
Kurt Heppard – Professor of Management and Deputy for Strategic Planning, United States Air Force Academy

The effective and systematic use of cost and budgeting information is a critical component of strategic planning and decision making in most organizations. The Department of Defense's (DoD) current operational environment, scarce resources, and conflicting stakeholder expectations are resulting in extreme fiscal constraints. The result is the need to reconsider missions and goals, reassess priorities, entertain force structure alternatives, and ultimately reduce budgets. Linking this cost and budgeting information with financial goals, as well as goals for customer satisfaction, internal processes, and organizational growth and learning, is a daunting challenge for many organizational leaders. In many cases, including various DoD organizations, managers have adopted a Balanced Scorecard approach and methodology focused on linking organizational goals and objectives with benchmarks and metrics that are based on cost and budget estimates as well as actual expenditures. This cost- and performance-driven strategic framework allows organizational leaders and strategic planners to systematically consider and envision cost-effective ways to meet the overall strategic goals of the organization, including the tough decisions associated with budget cuts. Similar to the DoD, universities and other institutions of higher learning are facing the same increasing pressure to meet the strategic expectations of their many constituencies while effectively managing costs. Since current spending levels may be unsustainable for many universities and institutions, the careful analysis and estimation of costs is of particular importance to strategic planners and decision makers. This paper reviews the current literature as it relates to cost-based strategic planning and strategy mapping at universities and institutions of higher learning. Applying concepts from the Balanced Scorecard approach to cost analysis and budgeting, this paper offers effective cost metrics and performance measures for the critical implementation of strategy maps, which link many of the intangible measures of higher education effectiveness with tangible cost, budget, and performance measures. The paper develops an integrated and aligned system of strategic goals, performance metrics, and cost parameters based on cost and performance data already collected at many universities and institutions of higher learning. We feel there may be lessons learned for DoD organizations.

MG-1 – Presentation – A Balanced Approach to Meeting Fiscal Constraints

MG-1 – Handout – A Balanced Approach to Meeting Fiscal Constraints


Cost Overruns and Cost Growth: A Three Decades Old Cost Performance Issue within DoD’s Acquisition Environment (MG-2)

Leone Young

For the last three decades, the US Department of Defense (DoD) has encountered program performance issues such as inaccurate and unrealistic estimates for its acquisition programs, and its efforts to eliminate cost overruns and cost growth have likewise been deemed ineffective. This paper examines and consolidates multiple government reports and studies pertaining to DoD cost performance that has been viewed as unsatisfactory and problematic. The commonly accepted cost estimation methodologies and insightful processes and techniques initiated by NASA, RAND, and the USAF are also presented and analyzed. The research finds that the reliability and accuracy of any cost estimation method depend heavily on rigid and mature system designs and low fluctuation in requirements changes, which can only be accomplished through disciplined systems engineering practices. Performing multiple appropriate cost estimates early in the pre-Milestone A phase, particularly with the Engineering Builds method as knowledgeable data on system designs and requirements become available, can offer substantial cost savings as well as uncertainty and risk reductions for systems programs. The research results summarize the overall strengths and weaknesses of the current state of DoD acquisition programs, which is beneficial to professionals in the defense industry, particularly cost estimators and program and project managers whose programs constantly face financial challenges.

MG-2 – Presentation – Cost Overruns and Cost Growth A Three Decade Old Cost Performance Issue within DoD’s Acquisition Environment

MG-2 – Handout – Cost Overruns and Cost Growth A Three Decade Old Cost Performance Issue within DoD’s Acquisition Environment


Supplier Cost/Price Analyses – Best Practices for Evaluating Supplier Proposals and Quotes (MG-3)

David Eck – Director, Dixon Hughes Goodman
Mike Mardesich

During proposal planning, preparation, and review, an important but often overlooked aspect is the evaluation of a supplier's proposal or quote. Requirements in the Federal Acquisition Regulation (FAR) provide for certain visibility into suppliers' proposals and quotes, depending on the value of the proposal or quote. Additionally, it is critical to note that the FAR puts the onus of supplier cost (or price) analysis on the prime contractor. The expectation is that the same level of detail the prime includes in its own proposal should be included as support for a supplier's proposal. Effectively, the prime contractor must perform supplier cost/price analysis in accordance with the FAR. The actual scope of the prime contractor's review will depend on the particular situation, such as the type of product or service, dollar value, purchase method, and subcontract type.

Material and subcontract costs are a significant cost element in most proposals. Estimating professionals are challenged to evaluate supplier proposals and quotes and to include the best estimate of supplier prices in their own proposals. Inadequate analysis of supplier quotes and proposals may result in your proposal being returned by the government and may cost your company a potential contract. This presentation will cover best practices for obtaining adequate information from suppliers and techniques for evaluating that information. We will present:
• Techniques for obtaining adequate information and proposals from the supplier for evaluation
• Documentation required to support acquisition of commercial items and competitive awards
• Guidelines for evaluating supplier quotes and proposals
• Examples of working papers to document the evaluations of the suppliers’ quotes and proposals
• Best practices for evaluating supplier direct labor, material, forward pricing rates, and other costs
• Techniques for evaluating the technical aspects of supplier information such as direct labor hours and bill of material quantities
• Examples of analytical procedures to use in evaluating supplier information, including sampling techniques, learning curves, and regression analyses (a learning-curve sketch follows this list)
• Examples of information to include in Cost Estimating Manuals
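
As referenced above, here is a minimal sketch of the kind of learning-curve check an evaluator might apply to a supplier's proposed hours, using Crawford unit theory; the 85% slope, first-unit hours, and proposed lot hours are assumptions for the example.

```python
# Illustrative unit learning curve check (Crawford unit theory).
# The 85% slope and first-unit hours are assumptions for the example.
from math import log

first_unit_hours = 1_000.0
slope = 0.85                # each doubling of quantity -> 85% of prior unit hours
b = log(slope) / log(2)     # learning exponent

def unit_hours(unit_number):
    return first_unit_hours * unit_number ** b

proposed_hours_units_1_to_50 = 38_000.0
expected = sum(unit_hours(u) for u in range(1, 51))
print(f"Expected lot hours: {expected:,.0f} vs proposed: {proposed_hours_units_1_to_50:,.0f}")
```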

Your presenters have performed supplier cost/price analyses for many years, both with the government and in industry. If you would like to attend a discussion where hands-on, experienced presenters communicate real-time examples and current best practices to fit your needs, plan on attending.

MG-3 – Presentation – Supplier Cost Price Analyses – Best Practices for Evaluating Supplier Proposals and Quotes

MG-3 – Handout – Supplier Cost Price Analyses – Best Practices for Evaluating Supplier Proposals and Quotes


Innovative Business Agreements and Related Cost & Pricing Methods at NASA in Support of New Commercial Programs (MG-4)

Torrance Lambing – Lead in the Business and Cost Analysis Office, John F. Kennedy Space Center, NASA
James Roberts – Senior Cost Analyst, NASA Kennedy Space Center

On April 15, 2010, President Obama delivered a speech at Kennedy Space Center in which he outlined his new vision for the U.S. space program. Emphasis was placed on enabling the exploration of Space by Commercial entities instead of by the Government. Since that time, NASA's role has in many instances changed from being a program manager – overseeing development of space launch hardware and conducting space exploration missions – to one of supporting and providing space-related facilities and infrastructure. In particular, since the end of the Space Shuttle Program in mid-2011, many of NASA's high-value facilities, which were designed to support Shuttle launches and landings, have become underutilized, and the need was recognized to find alternative uses that support Commercial Space activities.

At the 2011 ISPA/SCEA National Conference, during a time in which NASA was struggling with change to some of its old methods of doing business, the authors presented a discussion of the pricing methods available under the various legislative and legal authorities that allow NASA to enter into Agreements, and related cost estimating considerations. Since that time, NASA has updated and redefined some of its policies and procedures on pricing, as well as its methods of estimating costs of conducting business under these new Commercial Agreements.

This paper and presentation, focusing on Kennedy Space Center, will discuss changes and new methods of pricing and estimating the costs of NASA facilities and services to be provided to outside entities for use in new Commercial Space endeavors. It will also give an overview of new NASA regulations and documents that establish policy and guidance for entering into Agreements, and how they are priced under the various types of Agreement currently being used at NASA. (These include Space Act Agreements, Commercial Space Launch Act Agreements, Use Permits, Enhanced-Use Leases, Memoranda of Agreement, etc., and include full-cost, actual-cost-recovery, direct-cost-only, and market-based pricing.) The presentation will also provide some insight into the decisions and cost/pricing problems that NASA has faced in making its facilities, equipment, and services available under these various Agreements.

This discussion will also work through some actual examples of Agreement pricing and related cost estimating exercises. The presentation will hopefully provide some valuable insight into current activities and changes at NASA, as well as into how cost and pricing policies are being impacted and developed to support this "new way of doing business" for NASA.

MG-4 – Presentation – Innovative Business Agreements and Related Cost & Pricing Methods at NASA in Support of New Commercial Programs *Best Paper: Cost Management Track

MG-4 – Handout – Innovative Business Agreements and Related Cost & Pricing Methods at NASA in Support of New Commercial Programs


The Other RCA: Restaurant Cost Analysis (MG-5)

Peter Braxton – Senior Cost Analyst and Technical Officer, Technomics, Inc.

The Weapon Systems Acquisition Reform Act (WSARA) of 2009 highlighted the importance of Root Cause Analysis, or RCA, but its conduct remains shrouded in mystery. In illustrating the central role of risk and uncertainty analysis in cost, Dick Coleman often made a provocative pronouncement to the effect of, "You can't stand outside a restaurant with a menu, and the people you'll be dining with, and a calculator and get within 10% of the final bill, so what makes you think you can estimate a complex multi-billion-dollar acquisition program with that precision?!" In a perennially popular training session, Eric Druker uses dinner out with his boss as an evocative example of Monte Carlo simulation. Shamelessly borrowing from those two, this paper presents the accessible analogy of restaurant cost analysis using a readily available source of real data, namely the author's extensive (much to his wife's chagrin!) collection of restaurant receipts, to clearly explicate the principles and conduct of RCA.
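
In the spirit of the dinner-out analogy, a tiny Monte Carlo simulation of a restaurant bill shows how input uncertainty propagates to a distribution of the final check; the menu prices, tax, and tip below are made up.

```python
# A tiny Monte Carlo dinner bill in the spirit of the restaurant analogy.
# Menu prices, order probabilities, tax, and tip are all made up.
import random

random.seed(1)

def one_dinner(diners=4):
    bill = 0.0
    for _ in range(diners):
        bill += random.uniform(18, 42)            # entree
        if random.random() < 0.6:
            bill += random.uniform(8, 15)         # drink
        if random.random() < 0.3:
            bill += random.uniform(7, 12)         # dessert
    return bill * 1.08 * 1.20                     # tax, then tip

trials = sorted(one_dinner() for _ in range(10_000))
p50, p80 = trials[len(trials) // 2], trials[int(0.8 * len(trials))]
print(f"Median bill ${p50:,.2f}; 80th percentile ${p80:,.2f}")
```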

RCA aims to separate deterministic from probabilistic causes for variation in cost (usually growth) from initial estimates. More than just a “post mortem,” it seeks to infer lessons learned (or Dr. Tzee-Nan Lo’s more apt “lessons to be learned”), which can possibly be translated into mitigation strategies for future risk management. (In the spirit of the Serenity Prayer, program managers must know which decisions they can make, which external decisions they can lobby to influence, and which factors are simply beyond their, or perhaps anyone’s, control.) Effective RCA requires access to the cost model itself, preferably incorporating uncertainty and as it evolved over time, and the inputs thereto. Thus, we need to know not just what the diners ordered, but the entire menu, capturing the range of possible inputs and outputs. This relates to Dr. Christian Smart’s notion of a progression of conditional estimates. Also essential for RCA are well-defined growth categories and an accompanying order of operations.

The potential for analogies within this framework is virtually limitless. The courses of the meal are life-cycle phases. The basic commodity is the type of meal (breakfast, brunch, lunch, happy hour, dinner), and the sub-type is context (family meal, date, group travel, solo travel). The type of restaurant represents the stringency of requirements. The number of diners is quantity, and acquisition strategy is reflected in a la carte vs. prix fixe and in the use of coupons or frequent-diner programs. And so on.

No analogy is perfect, and the paper will briefly touch on the dissimilarities between the two RCAs. Primarily, restaurants reflect more of a fixed-price environment, where the multitude of meals and diners enables invoking the law of large numbers, with variations in cost priced into the offerings, assuming financial solvency, which is hardly a certainty in the restaurant business! By contrast, defense acquisition is typified by a cost-reimbursable environment and a specialized industrial base often verging on monopoly/monopsony. Still, if this level of insight can be gained by an individual analyst using his own personal data, it is certainly achievable by a well-funded acquisition program.

MG-5 – Presentation – The Other RCA Restaurant Cost Analysis

MG-5 – Handout – The Other RCA Restaurant Cost Analysis


Intelligence Mission Data Cost Methodology Guidebook (MG-6)

Eugene Cullen, III – Senior Cost Analyst, Booz Allen Hamilton
Matthew Schumacher – Booz Allen Hamilton

In 2013, Booz Allen Hamilton developed and authored the Intelligence Mission Data Cost Methodology Guidebook (IMD CMGB) for the Defense Intelligence Agency's (DIA) Intelligence Mission Data Center (IMDC). This guidebook is the official costing manual for life-cycle mission data planning, required by Department of Defense Directive (DoDD) 5250.01 "Management of IMD within DoD Acquisitions," and defines costing methodologies, procedures, and processes that result in OSD Cost Assessment & Program Evaluation (CAPE) and Government Accountability Office (GAO) compliant cost estimates that are required by DoD Acquisition Systems. The guidebook specifically applies these standards to the IMD environment to enable diverse elements across the Defense Intelligence and Acquisition Enterprises to develop and evaluate life-cycle costs for IMD as a component of DoD acquisition program costs. The objectives of the guidebook are threefold:

• To provide a detailed and consistent costing methodology based upon US government best practices (aligned with the GAO’s 2009 Cost Estimating and Assessment Guide) to be used in preparation and evaluation of DoD acquisition program cost estimates;
• To establish standards that integrate IMD cost estimates into the Defense Acquisition Life Cycle Management System and the Lifecycle Mission Data Planning process; and
• To serve as the principal reference for current and future intelligence cost reporting.

The guidebook outlines commonly used costing methodologies (i.e., analogy, parametric, engineering, expert opinion, and extrapolation from actual costs); explains how to conduct sensitivity, risk, and uncertainty analysis; and identifies O&S cost elements, common cost drivers, trends, and considerations specifically within the IMD life-cycle management process. Further, the guidebook includes several tailored case studies and vignettes designed to exemplify how IMD cost analysis is being conducted at Intelligence Production Centers, Program Offices, and across five IMD functional disciplines (i.e., Signatures, GEOINT, Electronic Warfare Integrated Reprogramming, Order of Battle, and Characteristics & Performance).

To substantiate content and generate support, the Booz Allen team championed an integrated project team (IPT), chartered over a six-month period, to develop the guidebook. The IPT included over 75 participants representing 25 USG agencies across the defense, intelligence, acquisition, and cost communities, as well as costing experts. The Booz Allen team led the IPT's data gathering, analysis, and stakeholder consultations, and reviewed over 150 costing artifacts including cost guidance, policies, and resource material from the DoD Acquisition Community, Service Cost Centers, NASA, and private industry. IPT sessions provided ample opportunity for stakeholders and action officers to raise issues, resolve differences, and de-conflict objectives throughout the content development process. The guidebook creates value for a variety of stakeholders by allowing the DoD to more efficiently manage acquisition program costs related to IMD availability requirements, and it facilitates capability decision-making based upon a reliable and consistent costing methodology framework.

The IMD CMGB was approved in April 2013, and is available via the Defense Acquisition University (DAU) Portal to DoD Major Defense Acquisition Programs (MDAPs), Major Acquisition Information Systems (MAISs), Intelligence Production Centers, Program Offices, and other organizations qualified to prepare or evaluate IMD cost estimates.

MG-6 – Presentation – Intelligence Mission Data Cost Methodology Guidebook

MG-6 – Handout – Intelligence Mission Data Cost Methodology Guidebook


Achieving Affordable Programs: NRO Cost Analysis Improvement Group (CAIG) Support of Cost Driver Identification (MG-7)

Linda Williams – Program Manager, Wyle
Pat Kielb
Eileen DeVillier – Associate, Booz Allen Hamilton
Jay Miller – Specialist Leader, Deloitte Consulting

The NRO CAIG has significantly expanded its role beyond independent cost estimating of large acquisition programs to focus on enterprise decision support. One area that is proving invaluable to senior decision makers is cost driver identification.

In an era of looming budget reductions and the possibility of sequestration, the NRO CAIG was tasked to identify cost drivers and provide a framework for additional cost reduction opportunities.

To accomplish this task the NRO CAIG leveraged prior studies and the data collected over 30 years on the organization’s very large space programs. Additionally, the NRO CAIG collected detailed budget data to help focus beyond large acquisition contracts and increase visibility into total organization costs.

This paper will describe the approach taken over the last several months to identify cost drivers and summarize findings that senior leaders found helpful to the decision making process.

In summary, as budget reductions trigger asset affordability reviews, the NRO CAIG has been able to assist in identifying total organizational cost drivers and to provide a framework for future cost reduction opportunities.

MG-7 – Presentation – Achieving Affordable Programs NRO Cost and Acquisition Assessment Group (CAAG) Support of Cost Driver Identification


Gödel’s Impact on Hilbert’s Problems Or Cost Consistency and Completeness as an Impossible Exercise (MG-8)

David Peeler – Director of Risk Management, Amgen, Inc.

In a previous set of papers, the author used Hilbert's Problems as a construct to propose, and recently to revisit the status of, a list of Hilbert's Problems for Cost Estimating. This paper similarly applies Gödel's insights with respect to Hilbert's program to the cost estimating community. What can we learn about ourselves as estimators, and where can we exert the greatest impact with respect to the use of our estimates? Using Gödel's two incompleteness theorems as a catalyst, we will explore the effect and utility of exacting math and other notions on cost estimates specifically and programmatics generally.

Files not available


A New Cost Management Policy for Your Organization: An Integrated Approach? (MG-9)

Tom Dauber – Principal, Booz Allen Hamilton
Woomi Chase – Lead Associate, Booz Allen Hamilton
Ken Odom – Booz Allen Hamilton

Developing a robust Cost Management Policy is a key driver of success for any organization, regardless of size or industry. The policy should ensure that cost control measures are valid and effective, risks are mitigated, solutions are delivered on time, and profits/ROIs are maximized. The Cost Management Policy should be a systematic approach to managing cost throughout the life cycle of a program through the application of cost engineering and cost management principles. The policy should include the characteristics of a credible cost estimate with a repeatable process, schedule health check metrics, integrated cost and schedule linked to requirements, approaches supported by historical and statistically sound data that allow for sensitivity analysis, and the institution of a robust data collection culture. The policy should state explicitly that cost, schedule, and risk management activities are to be integrated on a program at a stated confidence level for target, execution, and contingency estimates. Mapping the cost estimate and risk register to the schedule ensures that the cost and schedule plans are compatible, risks are identified, and opportunities to mitigate cost and schedule growth are proactively managed. Instituting this policy goes beyond just identifying, collecting, measuring, and reporting information to decision-makers to determine the cost of programs, projects, products, facilities, services, and/or systems; it also rewards program managers (PMs) for cost containment, continuous improvement, and efficiencies. PMs are incentivized to identify, quantify, and manage risks throughout the life cycle of the program. Incentives include both non-monetary and monetary awards. Planning and applying this policy early in the life cycle, and periodically throughout the program, ensures that decision-makers' cost surveillance controls are optimized and the impacts of near-term decisions are understood in the long run. This paper describes an approach for this type of integrated cost management policy.

MG-9 – Presentation – A New Cost Management Policy for Your Organization An Integrated Approach

MG-9 – Handout – A New Cost Management Policy for Your Organization An Integrated Approach


Right Sizing Earned Value Management for Your Project (MG-11)

Gordon Kranz – Deputy Director for Earned Value Management Performance Assessments and Root Cause Analyses, Office of the Assistant Secretary of Defense for Acquisition

Earned Value Management (EVM) is a program management tool that provides data indicators that can be used on all programs to enable proactive decision making throughout the program lifecycle and facilitate communication across the program team. Each program has unique attributes that should be considered when determining program management and reporting requirements, including, but not limited to, contract size and type, scope of work, complexity, risk, technology maturity, and resource requirements. In program management, one size does not fit all; integrated program management techniques, including performance measurement and scheduling can be tailored to what makes sense for each program. In this workshop, participants will discuss various contract types and work scope and share experiences in determining the right fit for their program when applying EVM and measuring performance.

Files not available


Methods & Models Papers:

Military Construction Cost Estimating (MM-1)

Nicole Barmettler – Cost Analyst, BCF Solutions

An informative presentation on construction cost estimating, specifically dealing with military facilities. Within this topic, the author will identify, define, and explain the cost methodologies and cost adjustment factors considered when developing construction cost estimates for general military facilities. Project costs will be illustrated through a breakdown and walkthrough of the process. The author will specifically discuss the five-year facility acquisition timeline usually required for a typical major military construction effort, which is defined by a project cost exceeding $750,000.

Keywords: MILCON, Military Construction, Facility Acquisition, Timeline, Design

MM-1 – Presentation – Military Construction Cost Estimating

MM-1 – Handout – Military Construction Cost Estimating


Cost and Performance Trades and Cost-Benefits Analysis (MM-2)

Steven Ikeler – Operations Research Analyst, United States Army

The results from a Cost, Performance and Schedule Trades Analysis are extremely useful when performing Cost-Benefits Analysis (C-BA). A Trades Analysis involves direct participation from all stakeholders and compares the effects of different performance and schedule goals on cost and risk. Trades Analysis products are useful in all cost estimates, not only C-BAs. The Army C-BA is discussed since it provides simple examples. The Army C-BA identifies the most cost-effective solutions from among different alternatives. Every C-BA also includes a cost risk analysis. We will discuss the multiple ways that Trades Analysis enhances the C-BA. For example, it results in new C-BA alternatives that enhance the effectiveness of the C-BA. It also provides essential cost risk and sensitivity insights.

This paper will discuss the basics of Trades Analysis and the C-BA. It will include what the cost analyst should do to prepare and what information to collect during the Trades Analysis. One observation is that non-traditional Work Breakdown Structures need to be considered and that the analyst should model potential second- and third-order effects beforehand. Another key observation is that many participants translate risk to cost on their own, with differing results. As a result, the analyst needs to be prepared to explain both cost and cost-risk implications and second- and third-order effects. Ideally, the analyst should understand the rationale behind the performance and schedule objectives.

We will discuss an example with obvious second-order effects by including a high-priority weight limit. The example will demonstrate a cost estimate model that facilitates the Trades Analysis. We will discuss recommendations for conducting the Trades Analysis. The example will explain the results of the Trades Analysis. Finally, we demonstrate how the results inform the C-BA through new alternatives, risk and cost-risk information, and sensitivity analysis.

Keywords: C-BA, Cost-Benefit, Trades, Tradespace

MM-2 – Presentation – Cost and Performance Trades and Cost-Benefits Analysis

MM-2 – Handout – Cost and Performance Trades and Cost-Benefits Analysis


Lessons Learned from Estimating Non-Standard Equipment Returning from Overseas Operations (MM-3)

Michael Metcalf – Cost Analyst, Technomics, Inc.

Throughout the execution of Operation Iraqi Freedom (OIF) and Operation Enduring Freedom (OEF), the Department of Defense (DoD) purchased myriad items to support the warfighter through non-traditional acquisition processes in order to provide them quickly to theater. These items provided critical capabilities, advantages, and safety improvements that helped reduce casualties. Now that these overseas operations are ending, the DoD must determine what equipment should be retained and how to transition its operations and support (O&S) to peacetime processes. Overseas Contingency Operations (OCO) supplementary funding is expiring, and most actions taken will require funding from the DoD base budget going forward. This paper examines this critical issue.

Our discussion focuses on the return of two families of items to the United States Army (USA): a set of small ground robots under the purview of the Robotics Systems Joint Program Office (RSJPO), and the family of Mine Resistant Ambush Protected (MRAP) troop vehicles. All of these items were purchased as Non-Standard Equipment (NSE) from a variety of contractors, either as Commercial Off-the-Shelf (COTS) items or through rapid wartime development, and were often maintained during wartime using Contractor Logistics Support (CLS) and program office support. The USA desires to preserve the capabilities provided by the small robots and MRAPs, for use in potential future conflicts as well as in current peacetime operations. However, both programs must operate in new affordability and budget environments and establish traditional acquisition and sustainment processes.

This paper explores challenges in estimating the cost of retaining this equipment. Items that present unique challenges include: the type classification and full materiel release processes; repair, reset, and upgrade; short- and long-term storage; and knowledge retention and loss of wartime experience. We will also explore funding challenges; moving targets in number and configuration of retained systems; the transition of O&S from wartime contractor-based to peacetime organic; training; and system disposal and divestiture.

Keywords: Non-Standard Equipment, NSE, Commercial Off the Shelf, COTS, MRAP, rapid procurement, contractor support, sustainment, reset, training

MM-3 – Presentation – Lessons Learned from Estimating Non-Standard Equipment Returning from Overseas Operations

MM-3 – Handout – Lessons Learned from Estimating Non-Standard Equipment Returning from Overseas Operations


Weibull Analysis Method (MM-4)

Erik Burgess – President, Burgess Consulting Inc.
James Smirnoff – Wyle
Brianne Wong – Consultant, Booz Allen Hamilton

The NRO Cost and Acquisition Assessment Group (CAAG) develops time-phased estimates for space systems in support of milestone decisions, budget formulation, and other program assessment tasks. CAAG relies on parametric budget-spreading models based on historical data to provide objective analysis and credible information before contract award or early in a program when there is little or no execution history available. However, in today's environment of increased oversight and budget scrutiny, programs are evaluated annually in hopes of finding excess margin and balancing budget risk across the portfolio. The Weibull Analysis Method (WAM) is an improved approach that estimates budget requirements for programs that are already underway by using their actual execution history and by focusing on accuracy in the near years. WAM builds on work previously published by LMI and the Center for Naval Analyses, is tailored to the satellite acquisition and contracting practices at the NRO, and has been validated using historical data from 37 completed satellite contracts. This briefing describes the analytical basis for WAM, how its accuracy metric was developed from the historical data, and how it is applied in program assessments. A spreadsheet tool that implements WAM, computes accuracy metrics in each year, and compares the results to a parametric budget-spreading model is also presented.
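
A sketch of the general idea of fitting a Weibull-shaped cumulative spend profile to execution history and reading off an at-completion estimate; the synthetic "actuals" and fitted parameters below are illustrative and do not reflect WAM's calibrated accuracy metrics.

```python
# Sketch of a Weibull-shaped cumulative spend profile fit to execution history.
# The expenditure data and fitted parameters are illustrative, not WAM's.
import numpy as np
from scipy.optimize import curve_fit

def weibull_cum_spend(t, total, scale, shape):
    """Cumulative dollars spent by month t under a Weibull spreading profile."""
    return total * (1 - np.exp(-(t / scale) ** shape))

months = np.arange(1, 25)                                  # two years of actuals
actuals = 900 * (1 - np.exp(-(months / 30) ** 1.8))        # synthetic "history" ($M)
actuals += np.random.default_rng(0).normal(0, 5, months.size)

(total, scale, shape), _ = curve_fit(
    weibull_cum_spend, months, actuals, p0=[1000, 24, 2])

print(f"Estimated at-completion cost: ${total:.0f}M "
      f"(scale={scale:.1f} months, shape={shape:.2f})")
```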

MM-4 – Presentation – Weibull Analysis Method

MM-4 – Handout – Weibull Analysis Method


Study of Cost Estimating Methodology of Engineering Development (MM-5)

Myung-Yul Lee – Estimator, The Boeing Company

Introduction: On July 22, 1982, the Initial Full Scale Engineering Development (FSED) contract was awarded to McDonnell Douglas. Along with the C-17 production contract, the Boeing Company initiated a Producibility Enhancement/Performance Improvement (PE/PI) program and the Globemaster III Integrated Sustainment Program (GISP).

The PE/PI effort incorporates new designs, modifies the aircraft systems, and updates the aircraft with new technology. The PE/PI projects, therefore, operate separately from the production of aircraft, and the outcomes of PE/PI projects applied to the C-17 aircraft are incorporated into production. In this paper, FSED and PE/PI are examined from an estimating methodology perspective because these programs operate in an engineering development environment rather than in production and sustaining engineering (GISP).

FSED Program in View of Estimating Methodology: The parametric estimating method is one of the most desirable and most credible estimating methods because it applies actual hours from the same or a similar previously performed project to estimate weapon system development cost. For that reason, when a new weapon system requires a cost estimate and no such actuals are available, a discrete estimating method based on the engineering requirements for the new weapon system is utilized.
PE/PI Project in View of Estimating Methodology: Along with the C-17 production contract, the Boeing Company initiated a Producibility Enhancement/Performance Improvement (PE/PI) program. Producibility Enhancements (PE) are efforts to correct obsolescence and safety/operational deficiencies. Performance Improvements (PI) are efforts to increase C-17 weapon system performance or support capability, or to decrease C-17 cost of ownership. The PE/PI effort incorporates new designs, modifies the aircraft systems, and updates the aircraft with new technology.

Purpose of the Study: The purpose of this paper is to study the actual engineering labor hours for the FSED program and the PE/PI project and to create a parametric estimating model for a weapon system, especially a cargo aircraft. The C-17 final assembly facility in Long Beach, California, closes in 2015, so it is worthwhile to review historical data for the C-17 FSED program and PE/PI project; both are engineering development programs and are therefore relevant to estimating methodology research.

Files not available


Validation and Improvement of the Rayleigh Curve Method (MM-6)

Michael Mender – Cost Analyst, Naval Center for Cost Analysis

The Rayleigh Curve is a popular method for estimating both software development and R&D project durations and costs by extrapolating from earned value data available from the early periods of a project. Many of the existing studies supporting the Rayleigh method rely on R-squared as a measure of goodness-of-fit. Because the Rayleigh function is non-linear and not obviously linearizable, most tools implementing the method rely on numerical means to fit a curve without linearizing. Because R-squared requires linearity to hold meaning with regard to goodness-of-fit, we questioned the validity of the existing support for the Rayleigh method and sought to conduct our own series of tests. Additionally, we had some concerns about the "freshness" of the data used to validate the method, as many of the original studies were done decades ago and/or used data from projects of a similar vintage. Because we had access to data for more recent projects, we also sought to test whether the Rayleigh method is still applicable to modern projects.

While doing our research, we uncovered a means by which the Rayleigh function could be linearized, thereby allowing the use of standard linear regression methods. For ease of use, we have taken to calling this method “Rayleigh Regression.” We built a tool using this method, and then proceeded to evaluate its performance over a data set consisting of completed R&D projects. For each project, we generate a series of fitted curves, starting with the first three data points, then iteratively expanding our data set to include successive data points (i.e., the first four, the first five, etc.), through the end of the contract. By doing this, we are able to evaluate the quality of the Rayleigh predictions as more data becomes available to the estimator. We tested the Rayleigh Regression results both by evaluating the fit of the curve within the bounds of each sample, and by comparing the forecasted cost and schedule against the actuals from each data set.

The presentation begins with a brief overview of the Rayleigh method, followed by an explanation of R-squared and the issues with non-linear functions. We then discuss the method for linearizing the Rayleigh function and then using it for linear regression. We then provide a brief demo of the tool we developed and then move on to a discussion of our results pertaining to the performance of the Rayleigh method. Ultimately, we found that the Rayleigh method is reasonably effective, but that the support some studies have found does not appear to be as certain when the issues pertaining to R-squared are addressed and the method is tested against more current data.

Keywords: Optimization, EVM Analysis, Linear Regression
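
The presentation does not reproduce its exact linearization here, so the sketch below is only one standard way (an assumption on my part) that the Norden-Rayleigh staffing curve can be rearranged for ordinary least squares: if effort per period is y(t) = 2·K·a·t·exp(-a·t²), then ln(y/t) = ln(2Ka) - a·t², which is linear in t². Function and data names are illustrative.

```python
import numpy as np

def fit_rayleigh(t, effort_per_period):
    """Fit y(t) = 2*K*a*t*exp(-a*t**2) by linear regression on the
    transform ln(y/t) = ln(2*K*a) - a*t**2.  Returns (K, a)."""
    t = np.asarray(t, dtype=float)
    y = np.asarray(effort_per_period, dtype=float)
    x = t ** 2                      # regressor after the transform
    z = np.log(y / t)               # response after the transform
    slope, intercept = np.polyfit(x, z, 1)
    a = -slope
    K = np.exp(intercept) / (2.0 * a)
    return K, a

# Example: simulate monthly EV data from a known curve, then recover the
# curve from only the first six observations.
rng = np.random.default_rng(0)
K_true, a_true = 1000.0, 0.02
months = np.arange(1, 25)
effort = 2 * K_true * a_true * months * np.exp(-a_true * months**2)
noisy = effort * rng.lognormal(0.0, 0.05, size=effort.size)

K_hat, a_hat = fit_rayleigh(months[:6], noisy[:6])
print(f"estimated total effort = {K_hat:.0f}, shape = {a_hat:.3f}")
```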

MM-6 – Presentation – Validation and Improvement of the Rayleigh Curve Method

MM-6 – Handout – Validation and Improvement of the Rayleigh Curve Method


Rotorcraft Cost Model Enhancements for Future Concepts (MM-7)

F. Gurney Thompson III – Cost Research Analyst, PRICE Systems

The Future Vertical Lift (FVL) program will develop a family of rotorcraft intended to meet the future needs of the U.S. Armed Forces. The precursor programs to develop the needed science and technology improvements are already underway. Much of the U.S. Armed Forces’ fleet of helicopters and rotorcraft is approaching a point where decisions must be made to extend the life of, retire, or replace these aircraft. As these decision points approach and new technologies are being created, enhancements are needed to the toolset for future rotorcraft cost estimation.

This paper will discuss the ongoing research efforts to improve upon existing rotorcraft cost estimation capabilities, from both a cost estimating relationship (CER) update and a model development perspective. We will share our approach, many of our findings, and any lessons learned. Efforts currently underway include data collection, updates to existing cost models and their CERs, adding support for new aircraft types and technologies, and the addition of new analysis capabilities to better understand total ownership cost.

MM-7 – Presentation – Rotorcraft Cost Model Enhancements for Future Concepts

MM-7 – Handout – Rotorcraft Cost Model Enhancements for Future Concepts


Kill Vehicle Work Breakdown Structure (MM-8)

Jennifer Tarin – Operations Research Analyst, Missile Defense Agency
Christian Smart – Director of Cost Estimating and Analysis, Missile Defense Agency
Paul Tetrault – Technical Director, Missile Defense Agency

This paper provides an alternative to Appendix C: Missile Systems for inclusion in MIL-STD-881C, the Department of Defense standard for Work Breakdown Structures (WBSs). The Missile Defense Agency (MDA) produces interceptors that are similar to missiles with the exception of the payload element. Where Appendix C defines the payload element with a limited set of WBS elements, the MDA interceptor payload, referred to as a kill vehicle, includes a large collection of significant WBS elements. A kill vehicle is a guided weapon that utilizes hit-to-kill technology after separation from a boosting vehicle. Often described as “hitting a bullet with a bullet,” its purpose is the destruction of a ballistic missile threat and/or a threat re-entry vehicle. MDA’s kill vehicles do not contain any explosives; instead, the kill vehicles use kinetic energy from the engagement velocities to provide the destructive forces. Additionally, MDA kill vehicles operate autonomously as short-lived space vehicles. Based on the number of significant WBS elements for MDA kill vehicles, we determined that the current MIL-STD-881C Appendix C Missile Systems payload WBS is insufficient. An analysis of MDA’s currently produced kill vehicles (the Ground-Based Midcourse Defense Exo-atmospheric Kill Vehicle, the Aegis Ballistic Missile Defense Kinetic Warhead, and the Terminal High Altitude Area Defense Kill Vehicle) was performed to establish commonality. As a result, we created three alternatives based on the Appendix F Space Systems WBS and the Appendix C Missile Systems WBS from MIL-STD-881C. The proposed KV WBS, a hybrid of Appendix F and Appendix C, will support existing and future kill vehicle designs.

MM-8 – Presentation – Kill Vehicle Work Breakdown Structure

MM-8 – Paper – Kill Vehicle Work Breakdown Structure

MM-8 – Handout – Kill Vehicle Work Breakdown Structure


Meet the Overlapping Coefficient: A Measure for Elevator Speeches (MM-9)

Brent Larson – Senior Cost Analyst, Infinity Systems Engineering

You’ve seen this picture before . . . a plot of two overlapping probability distributions. You may have created one with an overlay chart. Typically this graphic contrasts two cost distributions so as to illustrate similarity, difference or change. However, seldom seen is a number that quantifies the overlap, or area shared by both distributions. This area common to both densities is known as the Overlapping Coefficient (OVL) and is an intuitive, unitless measure of agreement ranging from zero to unity. The literature reveals that obtaining the OVL from data, such as the output of a Monte Carlo simulation, can be non-trivial. It typically requires further assumptions and intermediate steps that come with distribution fitting or kernel density estimation. Consideration of large-sample theory and some calculus, along with exploitation of an existing statistical test, suggests that a reasonable approximation of the OVL may be made readily accessible to analysts. This presentation will introduce the OVL, its historical background and applications. It will then demonstrate how to obtain the measure, derived from the totality of both distributions, that quantifies their similarity. Let’s say a year has elapsed between an estimate and its major update. Methods have changed. Data have changed. Tools have changed. Some element uncertainties have been reduced. Some have increased. In the few moments between the 12th and 14th floors, you will be able to state specifically the combined effect of all that change.

Keywords: Overlapping Coefficient, Empirical Cumulative Distribution Function, S-Curve, KS Two Sample Test, R
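
The abstract does not spell out its approximation, so the following is only one plausible reading (an assumption on my part): when the two densities cross only once, the OVL equals one minus the two-sample Kolmogorov-Smirnov statistic, which can be read straight off the empirical CDFs of the simulation outputs. Names and sample data are illustrative.

```python
import numpy as np
from scipy.stats import ks_2samp

def ovl_from_samples(x, y):
    """Approximate the Overlapping Coefficient of two samples as 1 - D,
    where D is the two-sample KS statistic.  Exact when the underlying
    densities cross only once (e.g. two unimodal cost distributions)."""
    d_stat = ks_2samp(x, y).statistic
    return 1.0 - d_stat

# Example: last year's estimate vs. this year's update (simulated outputs).
rng = np.random.default_rng(1)
old_estimate = rng.lognormal(mean=4.6, sigma=0.30, size=10_000)
new_estimate = rng.lognormal(mean=4.7, sigma=0.22, size=10_000)
print(f"OVL ~ {ovl_from_samples(old_estimate, new_estimate):.2f}")
```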

MM-9 – Presentation – Meet the Overlapping Coefficient A Measure for Elevator Speeches

MM-9 – Handout – Meet the Overlapping Coefficient A Measure for Elevator Speeches


Excel Based Cost Roll Up Method (MM-10)

Matthew Leezer – Senior Cost Analyst, Honda Aircraft Company

Corporations are managing the cost of new products earlier in the development cycle, creating a need for timely and accurate product cost roll-ups of complex systems. These systems can consist of hundreds or thousands of unique parts, beyond the capability of manual management. Many large organizations manage new product development using engineering Product Life Cycle Management (PLM) software; these organizations often utilize Enterprise Resource Planning (ERP) software to manage the cost of production products. As these companies move to manage the cost of new products earlier in the development cycle, there is a need to merge the data that exist in these two systems into a meaningful and reliable cost management tool. Many companies are developing solutions to this problem that offer a more robust management capability. The scope of this paper is to detail a Microsoft Excel based method that can quickly and accurately roll up engineering bill of material costs. This method can also be used to roll up the weight of the system, to validate the results.

This paper will show the user how to create a custom function (sumlowerlevel) using Visual Basic and apply this function in Excel to generate a report that will save time and increase the accuracy of estimates. The method uses the lookup function to capture the cost of purchased components and subsystems and uses the custom function for all make assemblies to roll up the cost of the purchased parts. The end product is a sortable and searchable bill of materials with accurate material cost roll ups at all levels of the BOM. This method can be used to create a costed bill of materials in hours instead of months. This method can be applied to simple assemblies, complex assemblies and very complex assemblies of assemblies.

Once mastered, this method can also be used to roll up the weight of an assembly and to validate that all parts in the BOM have been accounted for. It can likewise be used to roll up assembly labor hours to arrive at a total product cost. A comparison of the weight and part count to the product description will provide the user with a level of confidence in the roll up.

This report will show two unique methods for rolling up product cost in Excel, using data available in PLM systems and ERP databases, into a simple Excel structure that can be built in minimal time and with minimal effort but provides a high level of accuracy and efficiency. The results of the two methods can be compared and used to validate one another. Each method has benefits and issues that will be identified.
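
The paper’s implementation is Excel and Visual Basic (the custom sumlowerlevel function); the sketch below is only a rough Python analogue of the same roll-up logic, assuming a flat BOM export that carries a part ID, a parent ID, and ERP unit costs for purchased parts. The field names and figures are illustrative, not from the paper.

```python
from collections import defaultdict

# Illustrative flat BOM: (part_id, parent_id, purchased_unit_cost or None
# for make assemblies), roughly what a PLM export joined to ERP might hold.
bom = [
    ("AIRCRAFT", None,       None),
    ("WING",     "AIRCRAFT", None),
    ("SPAR",     "WING",     1200.0),
    ("RIB",      "WING",      300.0),
    ("FUSELAGE", "AIRCRAFT", None),
    ("FRAME",    "FUSELAGE",  450.0),
    ("SKIN",     "FUSELAGE",  800.0),
]

children = defaultdict(list)
cost = {}
for part, parent, unit_cost in bom:
    cost[part] = unit_cost
    if parent is not None:
        children[parent].append(part)

def rolled_up_cost(part):
    """'Sum lower level' style roll-up: purchased parts return their ERP
    cost, make assemblies return the sum of their children."""
    if cost[part] is not None:          # purchased part
        return cost[part]
    return sum(rolled_up_cost(c) for c in children[part])

for part, parent, _ in bom:
    print(f"{part:9s} rolled-up cost = {rolled_up_cost(part):8.2f}")
```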

Files not available


The Role of Cost Estimating in Source Selection (MM-11)

Annette Barlia – Cobec Consulting, Inc.

One of the most interesting purposes a cost estimate serves is the evaluation of vendor proposals. From the development of an Independent Government Cost Estimate (IGCE) to Source Selection, many players come together to ensure the vendor selected will meet program acquisition goals. Numerous decisions must be made: 1) When should the cost team begin to interface with the technical team? 2) How should the cost model be organized: in accordance with the Statement of Work (SOW) or the CLIN structure? 3) What level of detail is required; should the estimate go down to the CDRL level? 4) Why should the cost model correlate to the price evaluation tool and the program schedule? This paper will answer such questions.

The analysis will focus on the process of developing an IGCE and then utilizing it to evaluate vendor proposals for the acquisition of new technology. It will demonstrate that a strong IGCE facilitates source selection. If the cost estimate is developed the right way, organizations will have more leverage during contract negotiations with vendors, and the acquisition will run smoothly and meet program goals. Both junior cost estimators looking to understand real-world applications of estimating as well as government program office personnel facing an acquisition will benefit from this analysis.

MM-11 – Presentation – The Role of Cost Estimating in Source Selection

MM-11 – Paper – The Role of Cost Estimating in Source Selection

MM-11 – Handout – The Role of Cost Estimating in Source Selection


Automated Costing to Identify Opportunities to Reduce Direct Material Spend (MM-13)

Julie Driscoll – Vice President of Marketing Strategy & Product Management, aPriori, Dresser Rand
Dale Martin – Senior Manager Cost Estimating, Dresser-Rand

This session will cover how technology is automating the costing process by integrating costing solutions with CAD, PLM and ERP; pulling information about components directly from CAD files; and using an intelligent cost engine that evaluates manufacturing routings for feasibility and cost effectiveness. We will look at how these solutions enable automated batch costing of components and are used by manufacturers to support cost reduction projects. As part of the presentation, we will review a spend analytics methodology that combines process and technology to identify, analyze and take action on potential outliers for cost reduction, driving significant results in short time frames. We will also examine a case study of a leading manufacturer where the solution was applied to a program that needed its cost reduced prior to launch. Wrapping up the presentation, we’ll hear from a cost engineer who uses the solution in his organization and who will provide a comparison of automated and traditional costing approaches and a summary of best practice tips.
MM-13 – Presentation – Automated Costing to Identify Opportunities to Reduce Direct Material Spend

MM-13 – Handout – Automated Costing to Identify Opportunities to Reduce Direct Material Spend

MM-16 – Presentation – 1.1 Requirements for Estimation Purposes

MM-16 – Handout – 1.1 Requirements for Estimation Purposes

MM-24 – Presentation – Relating Cost to Performance The Performance-Based Cost Model *Best Paper: Methods & Models Track

MM-24 – Paper – Relating Cost to Performance The Performance-Based Cost Model

MM-24 – Handout – Relating Cost to Performance The Performance-Based Cost Model


Parametrics Papers:

Moving Beyond Technical Parameters in our CERs (PA-1)

Eric Druker – Senior Associate, Booz Allen Hamilton
Charles Hunt – Galorath Incorporated

One of the frequent criticisms of JCL analysis (integrated cost and schedule risk analysis) has been that the results typically exhibit coefficients of variation (CV) that are orders of magnitude less than those seen in parametric estimates of similar scope. During a recent NASA research task examining how parametric estimates can be linked to program management artifacts, the research team stumbled upon a characteristic of our Cost Estimating Relationships (CERs) that almost certainly leads our parametric estimates to have higher than necessary CVs. In particular, today’s CERs, with their focus on technical parameters, tend to ignore programmatic attributes likely to drive cost. This presentation will focus on how this feature of CERs, and the fact that they likely use samples from multiple populations representing different programmatic attributes, likely drives higher than necessary CVs in our parametric estimates. The presentation will review previous research and current best practices on including programmatic attributes, investigate the challenges of incorporating programmatic attributes, and then propose possible solution spaces to the parametric estimating community to increase focus on modeling key programmatic attributes. Including programmatic attributes not only has the opportunity to reduce CVs, but also could make our estimates more valuable to the program management community by giving them the ability to see how their decisions impact cost.

Files not available


Using Dummy Variables in CER Development (PA-2)

Shu-Ping Hu – Chief Statistician, Tecolote Research, Inc.
Alfred Smith – General Manager, Tecolote Research, Inc.

Dummy variables are commonly used in developing cost estimating relationships (CERs). It has become more popular in recent years to stratify data into distinct categories by using dummy variables. However, many analysts specify dummy variables in their CERs without properly analyzing the statistical validity of using them. For example, the dummy variable t-test should be applied to determine the relevance of using dummy variables, but this test is often neglected. Consequently, the fit statistics can be misleading.

The dummy variable t-test is useful for determining whether the slope (or exponent) coefficients in different categories are significantly different. This is directly applicable to the dummy variable CER where we assume distinct categories in the data set share the same sensitivity for the ordinary independent variable; the only difference is in the response levels.

This paper explains the reasons for using dummy variables in regression analysis and how to use them effectively when deriving CERs. Specific guidelines are proposed to help analysts determine if the application of dummy variables is appropriate for their data set. This paper also demonstrates some common errors in applying dummy variables to real examples. An application using dummy variables in splines (to derive the fitted equation as well as the intersection) is also discussed.
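
As a minimal illustration of the kind of check described above (mine, not taken from the paper), the sketch below fits a log-linear CER with a category dummy on synthetic data and inspects the t-statistic and p-value on the dummy coefficient before accepting the stratification.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)

# Synthetic data: two categories share the same weight exponent but differ
# in level -- the situation a dummy-variable CER assumes.
n = 30
weight = rng.uniform(100, 2000, n)
category = rng.integers(0, 2, n)                 # 0 = category A, 1 = category B
cost = 5.0 * weight**0.7 * np.where(category == 1, 1.4, 1.0)
cost *= rng.lognormal(0.0, 0.10, n)              # multiplicative noise

# Log-space CER:  ln(cost) = b0 + b1*ln(weight) + b2*dummy
X = sm.add_constant(np.column_stack([np.log(weight), category]))
fit = sm.OLS(np.log(cost), X).fit()

b2_t, b2_p = fit.tvalues[2], fit.pvalues[2]
print(f"dummy coefficient t = {b2_t:.2f}, p = {b2_p:.4f}")
if b2_p < 0.05:
    print("Categories differ significantly in level: keep the dummy variable.")
else:
    print("No significant difference: the dummy variable is not justified.")
```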

PA-2 – Paper – Using Dummy Variables in CER Development

PA-2 – Handout – Using Dummy Variables in CER Development

PA-2 – Presentation – Using Dummy Variables in CER Development


Bayesian Parametrics: Developing a CER with Limited Data and Even Without Data (PA-3)

Christian Smart – Director of Cost Estimating and Analysis, Missile Defense Agency

When I was in college, my mathematics and economics professors were adamant in telling me that I needed at least two data points to define a trend. You may have been taught this same dogma. It turns out this is wrong. You can define a trend with one data point, and even without any data at all. A cost estimating relationship (CER), which is a mathematical equation that relates cost to one or more technical inputs, is a specific application of trend analysis. The purpose of this paper is to discuss methods for applying parametrics to small data sets, including the case of one data point and the case of no data.

The only catch is that you need some prior information on one or more of the CER’s parameters. For example, consider a power CER with one explanatory variable: Y = aX^b. The slope of the equation, b, can be interpreted as an economies of scale factor. As such, it is typically between 0 and 1. When using weight as the explanatory variable, rules of thumb are 0.5 for development cost, and 0.7 for production cost. Bayes’ Theorem can be applied to combine the prior information with the sample data to produce CERs in the presence of limited historical data.

This paper discusses Bayes’ Theorem, and applies it to linear and nonlinear CERs, including ordinary least squares and log-transformed ordinary least squares.
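
A heavily simplified sketch of the idea, and not the paper’s full derivation: treat the log-space OLS estimate of the exponent b as the data, assume a normal prior (for example, centered on the 0.7 rule of thumb for production), and combine the two by precision weighting. All numbers here are illustrative.

```python
import numpy as np

def posterior_slope(b_prior, sd_prior, b_ols, se_ols):
    """Normal-normal update: combine a prior belief about the CER exponent b
    with the log-space OLS estimate, weighting each by its precision."""
    w_prior, w_data = 1.0 / sd_prior**2, 1.0 / se_ols**2
    b_post = (w_prior * b_prior + w_data * b_ols) / (w_prior + w_data)
    sd_post = (w_prior + w_data) ** -0.5
    return b_post, sd_post

# Rule-of-thumb prior for a production CER: b ~ Normal(0.7, 0.15).
# Suppose a very small sample gives b_hat = 0.95 with standard error 0.40.
b_post, sd_post = posterior_slope(0.7, 0.15, 0.95, 0.40)
print(f"posterior exponent = {b_post:.2f} +/- {sd_post:.2f}")
# With no data at all, the estimate is simply the prior: Y = a * X**0.7.
```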

PA-3 – Presentation – Bayesian Parametrics Developing a CER with Limited Data and Even Without Data *Best Paper: Parametrics Track, 2014 Best Paper Overall

PA-3 – Paper – Bayesian Parametrics Developing a CER with Limited Data and Even Without Data

PA-3 – Handout – Bayesian Parametrics Developing a CER with Limited Data and Even Without Data


Tactical Vehicle Cons & Reps Cost Estimating Relationship (CER) Tool (PA-4)

Cassandra Capots – Cost Analyst, Technomics, Inc.
Jeffery Cherwonik – Cost Analyst, Technomics, Inc.
Adam James – Cost Analyst, Technomics, Inc.
Leonard Ogborn – Cost Analyst, Technomics, Inc.

When estimating Operating and Support (O&S) costs, it is reasonable to assume that as reliability increases, consumable and reparable parts (“cons and reps”) cost should decrease (less frequent repairs), while as vehicle price increases, parts cost should increase (more expensive parts). While these assumptions are certainly logical hypotheses, the topic poses several challenges for cost analysts, and developing a dataset to support cost estimating relationships (CERs) for the Army’s tactical vehicle fleet is a significant challenge in itself. Therefore, rather than supplying a single CER for all tactical vehicle parts cost estimating, this study sought to deliver an Excel-based tool that allows cost analysts to select data relevant to their specific vehicle and build tailored CERs. This paper will discuss these challenges and detail a three-step process for quantifying the relationship between tactical vehicle reliability and the costs of cons and reps.

A lack of consistent data sources and definitions for the leveraged data types posed a challenge in the data definition phase. These data types are vehicle reliability, average unit price (AUP), and average annual parts cost. Quantifying reliability was complicated, as various organizations use different metrics with varying definitions, making meaningful comparison difficult. Additionally, obtaining a consistent vehicle AUP posed an issue, as it was initially difficult to find data from the same source and life cycle phase. Lastly, selecting a consistent source for parts costs was a challenge, as different sources collect these data in different ways.

Additional challenges were experienced in the data collection phase. The Army Materiel Systems Analysis Activity (AMSAA) Sample Data Collection (SDC) was targeted for reliability metrics; specifically, the study focused on mean miles between non-mission-capable visits (MMBNMC Visits). Tactical vehicle production price and corresponding quantities were pulled from the Wheeled and Tracked Vehicle (WTV) Automated Cost Database (ACDB) and used to calculate vehicle AUP, while the Operating and Support Management Information System (OSMIS) was the source of parts costs. Upon investigation, it was seen that these three sources contained varying amounts of data, making it necessary to determine a subset of vehicles with the critical amount of information to support CER development.

Additional challenges were met during data analysis. As the data and ensuing relationships were analyzed, it was noted that the data experienced an inherently large amount of variability, even when analyzing within-series relationships. Therefore, as opposed to developing a single CER to be used for all tactical vehicles, an Excel-based tool was developed to allow for optimal flexibility in the creation of CERs. In addition to outputting uniquely-developed CERs, the tool provides appropriate statistics to diagnose and assess the level of fit for the selected CERs.

Because any selection (and, therefore, the resulting equations and statistics) can be changed easily, users may quickly analyze various relationships and perform a variety of in-depth analyses. The result of this study is a robust tool allowing cost analysts to effectively quantify the relationship between a tactical vehicle’s reliability and parts cost.

PA-4 – Presentation – Tactical Vehicle Cons & Reps Cost Estimating Relationship (CER) Tool

PA-4 – Paper – Tactical Vehicle Cons & Reps Cost Estimating Relationship (CER) Tool

PA-4 – Handout – Tactical Vehicle Cons & Reps Cost Estimating Relationship (CER) Tool


Unmanned Aerial Vehicle Systems Database and Parametric Model Research (PA-5)

Bruce Parker – Naval Center for Cost Analysis
Rachel Cosgray – Cost Analyst, Technomics, Inc.
Anna Irvine – Technomics, Inc.
Brian Welsh – Technomics, Inc.
Patrick Staley – Naval Center for Cost Analysis
Praful Patel – Operation Research Analyst, Naval Center for Cost Analysis

This handbook documents the first two years of research sponsored by NCCA and ODASA-CE. With the inclusion of unmanned aerial systems (UAS) in the United States’ (U.S.) military arsenal, the government has a desire to understand the components of a UAS, including the air vehicle, ground control station (GCS) and payloads; the development and production process; and the O&S implications of these systems. The goal of this research was to support early stage cost estimating for UAS programs where there are limited data and immature designs. Equations include data from Army, Navy, and Air Force programs, and reflect as broad a range of UAV types, with varied propulsion, mission, size, and shape, as was available for this study. The CERs are intended to support Analysis of Alternatives (AoA), Independent Cost Analysis (ICA), and similar analyses.

PA-5 – Presentation – Unmanned Aerial Vehicle Systems Database and Parametric Model Research

PA-5 – Handout – Unmanned Aerial Vehicle Systems Database and Parametric Model Research


Building a Complex Hardware Cost Model for Antennas (PA-6)

David Bloom – Senior Engineering Manager, Raytheon Space and Airborne Systems
Danny Polidi – Raytheon

This paper discusses the development of a Complex Antenna Cost Model based on quantifiable sizing mechanisms which are designed to quickly and accurately calculate the “top-down” cost for all engineering and operations disciplines and functions required for antenna development and test.

Previous methods of antenna cost estimation were not based on key sizing metrics (KSMs). So, although cost estimates were based on historical cost, the scaling factors used to determine cost for new programs were frequently engineering estimates. Often, previous methods used a bottom-up approach in which each discipline independently bids its contribution to the program. With that method, each independently determined contribution would need to be added together to obtain the total antenna development and test cost, with a high likelihood of overlap (double-counted cost) or omissions (missing costs). Previous methods would refer back to a parametric cost model, rather than a “similar-to” program, to provide rationale for new costs. Previous methods of cost estimation would require an independent test of reasonableness, and would not provide any indication of cost drivers or sensitivity factors.

The new cost estimation tool uses historical data; through analytical comparison of requirements and specifications, quantitative effective size factors (key size metrics) were determined. The KSMs are used as scaling factors, along with actual cost for a specific historical program, to calculate costs for new programs. The new cost estimation tool uses a top-down approach in which all costs for a prior program are considered, so all disciplines are inherently included in the new estimate. This ensures there is no overlap or omission of cost. The new cost estimation tool graphically displays all loaded data to allow the user to select the most “similar-to” program. The new program cost can be related through KSMs to any of the loaded data. Because all loaded data are graphed along with the new program cost, the tool provides a test of reasonableness. Each KSM in the new cost estimation tool contributes some amount of impact to the total cost. That amount of impact, or sensitivity, is displayed in the tool so that the user has the opportunity to make technology trade-offs and provide the customer with the available cost options.
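
The paper does not publish its scaling form, so the following is only a guess at what a KSM-based top-down adjustment might look like: scale a “similar-to” program’s actual cost by ratios of key size metrics, each raised to a sensitivity exponent. The metric names, weights, and costs are entirely hypothetical.

```python
# Hypothetical KSM-based top-down scaling from a "similar-to" antenna program.
similar_to_cost = 42.0e6          # actual cost of the reference program ($)

# Key size metrics: (reference value, new-program value, sensitivity exponent)
ksms = {
    "element_count":  (1024, 2048, 0.60),
    "bandwidth_GHz":  (2.0,  3.0,  0.35),
    "tr_module_type": (1.0,  1.2,  0.50),   # normalized complexity factor
}

scaled_cost = similar_to_cost
for name, (ref, new, weight) in ksms.items():
    factor = (new / ref) ** weight          # ratio raised to its sensitivity
    scaled_cost *= factor
    print(f"{name:14s} scaling factor = {factor:.3f}")

print(f"top-down estimate for the new program: ${scaled_cost/1e6:.1f}M")
```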

What makes this cost estimating tool significant is that in any Radar development, the antenna is often the most expensive piece of hardware and it is also the least well characterized in terms of development costs. Many in the Radar industry have described antenna development in terms of a “secret sauce”. This tool removes the “secret sauce” recipe to antenna development and allows the user and the customer the ability to make meaningful cost benefit trade-offs.

PA-6 – Presentation – Building a Complex Hardware Cost Model for Antennas


ESA Project Office Cost Model (PA-7)

Herve Joumier – Head of ESA Cost Engineering, European Space Agency

Abstract: The Project Office price is traditionally one of the most difficult negotiation areas between space agencies and industrial contractors and covers a significant part of the project cost. It is therefore a domain that requires all the attention of the estimators to better support the negotiations.

Space project costs consist mainly of manpower costs aggregated through contractual layers, ranging from simple structures, such as a single layer for a small low-cost project, up to highly entwined multiple layers for very large projects involving many international partners. The Project Office traditionally covers Management, Product Assurance and Engineering. In our case, the model is expanded to cover Assembly, Integration and Testing activities.

This paper describes the definition and implementation of a Project Office parametric cost model aimed at defining a reference manpower allocation based on fair judgement and rational modelling. It has been developed to improve the cost estimating capability of ESA, providing outputs that are used by the agency for comparison with contractors’ proposals. In particular, the model focuses on Project Office cost for all the main industrial actors involved in the design and development of a satellite, taking into account various possible scenarios depending on the quality standards applied to the project and the sub-contracting approach retained. A wide variety of sub-contracting approaches have been observed, not simply driven by technical considerations but usually resulting from political demands to include some participating countries at a level of participation in line with their level of financial contribution.

This paper describes the steps and approach of the model development, with the intention of serving as an inspiring source for any organization willing to develop such competencies.

PA-7 – Presentation – ESA Project Office Cost Model

PA-7 – Paper – ESA Project Office Cost Model

PA-7 – Handout – ESA Project Office Cost Model


Improving the Accuracy of Cost Estimating Relationship (CER) for Software Systems (PA-8)

David Wang – Director of Integrated Program Management, The Aerospace Corporation

Software plays an important role in many aspects of a modern space system (e.g., real-time software control for onboard subsystems, networked connectivity, multi-source data fusion, pre-processing and post-processing of information, telemetry, tracking & control (TT&C) software, mission management, mission data, integrated mission applications and services, ground control, etc.). In order to develop useful and predictive cost estimating relationships (CERs) for space systems, it is necessary to develop a predictive CER for software systems.

A CER is a parametric cost estimating methodology often used by cost analysts to quantify the development cost of a software system. A CER expresses cost as a function of one or more independent variables (i.e., cost drivers). In space system software and ground software, the key cost driver is the size of the software (measured in the number of lines of code). The difference between actual and predicted cost represents the estimation error in the parametric cost model. Sophisticated mathematical models for prediction interval (PI) analysis have been proposed to analyze and bound the predictive error. The PI equation can then be used to generate an S-curve to predict the cumulative probability of the cost of a system. Numerous studies using actual cost performance data have shown that CER predictions using the traditional technique are much more optimistic than actual cost performance.

In this paper, we leverage recently published results on the statistical characterization of schedule and cost risks to analyze the prediction accuracy of CERs for software systems. Our analytical and empirical statistical analyses of actual code size growth data suggest that the statistics of code size estimates can also be characterized by fat-tailed distributions. This suggests that the predictive error of CERs for large software development programs may be significantly larger than predicted by conventional PI analyses. We show in this paper a practical method for improving the accuracy of the prediction interval estimate, and thereby improving the prediction accuracy of the resulting S-curve.
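
As a small illustration of why the error-distribution assumption matters (my own example, not the paper’s model), the same log-space standard error yields a noticeably higher upper prediction bound when residuals are modeled with a heavy-tailed Student-t instead of a normal distribution. The point estimate and standard error below are invented.

```python
import numpy as np
from scipy.stats import norm, t

# Log-space point estimate and standard error from a notional software CER
# (cost driven by estimated SLOC); values are illustrative only.
log_cost_hat = np.log(50.0)   # point estimate: $50M
se_log = 0.35                 # prediction standard error in log space

for label, q in [("normal errors   ", norm.ppf(0.90)),
                 ("Student-t (df=3)", t.ppf(0.90, df=3))]:
    upper = np.exp(log_cost_hat + q * se_log)
    print(f"{label}: 90th-percentile cost ~ ${upper:.1f}M")
# The heavier tail pushes the upper bound out, widening the S-curve.
```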

PA-8 – Presentation – Improving the Accuracy of Cost Estimating Relationship (CER) for Software Systems

PA-8 – Handout – Improving the Accuracy of Cost Estimating Relationship (CER) for Software Systems


Hybrid Parametric Estimation for Greater Accuracy (PA-9)

William Roetzheim – CEO, Level 4 Ventures, Inc.

When discussing early stage estimation, estimation by analogy and parametric estimation are often compared and contrasted. But a new hybrid parametric approach that combines these two approaches typically yields significantly greater accuracy. With hybrid parametric estimation, a high-level-object, or HLO, catalog is created based on historic data to represent estimation components at different levels of granularity. At the most abstract level, this catalog may represent an entire project, in which case the HLO catalog results will match traditional estimation by analogy results. However, the HLO catalog will also support a much more granular representation of the items to be delivered, supporting representations all the way down to extremely fine representations such as a line of code (SLOC models) or something like an External Output (EO) or Internal Logical File (ILF) in a function point based environment. The real power of an HLO catalog based approach is in between these two extremes, where we have better granularity and accuracy than a project, but we require less specificity than that required by function points or SLOC based models.

Parametric estimation typically applies a cost estimating relationship (CER) that maps a parameter often only incidentally related to the item being delivered (e.g., satellite weight) to cost. In this example, the goal is normally not simply lifting a certain amount of weight into orbit, but rather accomplishing some specific mission. The fact that weight and cost are sufficiently related to allow prediction may be regarded as a fortunate coincidence. With hybrid parametric estimation we apply the statistical analysis and modeling techniques used for parametric estimation, but we look specifically for functional outcomes as our independent variables. These hybrid parametric CERs are, in fact, derived from our HLO catalog.

This talk will discuss hybrid parametric estimation based on HLO catalogs, and give examples of the application and accuracy of this technique within organizations including the State of California, Halliburton, IBM, Procter and Gamble, and multiple top 25 financial institutions.

PA-9 – Presentation – Hybrid Parametric Estimation for Greater Accuracy

PA-9 – Handout – Hybrid Parametric Estimation for Greater Accuracy


Linking Parametric Estimates to Program Management Artifacts (LPEPM) (PA-10)

Mike Smith – Booz Allen Hamilton
Ted Mills – Operations Research Analyst, NASA
John Swaren – Solutions Architect, PRICE Systems

A common fate of parametric cost and schedule estimates is that they fall into disuse as a project’s own artifacts (e.g. Work Breakdown Structure (WBS), budget, schedule, risk lists, etc.) are created and mature. Parametric estimates typically do not map cleanly to WBS or schedule-derived artifacts, allowing a sense among Project Managers (PMs), rightly or wrongly, that “parametric estimates are fine, but they don’t reflect my project.” As a result of this bias, parametric estimates and the estimators that generate them find themselves relegated to obscurity after passing the first project milestone. The problem is that, by dismissing parametric estimates on these grounds, PMs lose the benefit of the historical realities captured in the Cost Estimating Relationships (CERs) that drive the models. Conversely, cost estimators have observed that the recent Joint Confidence Level (JCL) analyses required by NASA policy to occur at PDR/KDP-B have yielded suspiciously narrow Coefficients of Variation in JCL cost S-curves. This gives rise to concerns within the cost community that projects, overly reliant on their own SMEs to provide uncertainty ranges, are missing opportunities to incorporate significant uncertainties into their estimates.

NASA’s Cost Analysis Division (CAD), Booz Allen Hamilton and PRICE Systems collaborated to conduct research into linking parametric estimates to programmatic artifacts in a manner that would elevate parametric estimates and allow programs and projects to apply the historical lessons that make parametric estimates so powerful and accurate. This research brought together parametric and programmatic cost estimators, model developers, software developers, schedulers, risk analysts and practitioners ranging from junior analysts to Ph.D. thought leaders to think through and articulate a process by which parametric cost estimates could be linked to programmatic artifacts in a manner that takes maximum advantage of the best each has to offer. Specifically, the collaborative research evaluated the feasibility of a parametric cost model “informing” a JCL model, and vice versa, via an iterative methodology. This research resulted in a practical, clearly articulated process for performing this cross-informing linkage, as well as the development of standardized enabling tools (data collection templates and a dashboard tool) through which to visualize and perform iterative comparative analyses. The research used as a test case a contemporary, real-world NASA project that needed only to meet two conditions: a recent parametric estimate had been performed, and the project had been through a JCL analysis. This ensured that a requisite set of comparable programmatic and parametric products existed. With those paired data sets, the LPEPM research team deconstructed the models, developed a process for linking parametrics to programmatic artifacts, and proved that the concept can be executed and has merit. The team encountered challenges resulting in lessons learned designed to benefit any analyst in the field attempting such a linkage.

PA-10 – Paper – Linking Parametric Estimates to Program Management Artifacts (LPEPM)

PA-10 – Presentation – Linking Parametric Estimates to Program Management Artifacts (LPEPM)


Impact of Full Funding on Cost Improvement Rate: A Parametric Assessment (PA-11)

Brianne Wong – Consultant, Booz Allen Hamilton
Erik Burgess – President, Burgess Consulting Inc.

The NRO Cost and Acquisition Assessment Group (CAAG) currently houses data collected from various U.S. Government organizations, including the Department of Defense and NASA. These data points are pooled with NRO data and used in Cost Estimating Relationships for space hardware, which underpin CAAG estimates for major system acquisition programs, aiding in the development of programs and budgets. Various funding rules have been in effect over the years for the different procurement agencies, and these rules may have an impact on cost. This study addresses the DoD policy of Full Funding, in particular, and its impact on recurring cost improvement for multi-unit buys. The NRO, which is not subject to full-funding rules, has historically found much steeper cost-improvement rates (averaging 85% cumulative-average) than the DoD has claimed for its programs. In this study we assess the recurring costs of almost 1,700 unit-level data points dating back to the 1970s and conclude that while funding rules certainly can impact cost in specific cases, the Full Funding rule does not result in a statistically significant difference in cost-improvement rate across the data set.
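
For readers less familiar with cumulative-average learning-curve arithmetic, the short sketch below works through an 85% slope with a notional first-unit cost; the numbers are purely illustrative and are not NRO or DoD data.

```python
import math

def cum_avg_unit_cost(t1, n, slope=0.85):
    """Cumulative-average learning curve: the average cost of the first n
    units is t1 * n**b, where b = log2(slope)."""
    b = math.log(slope, 2)
    return t1 * n ** b

t1 = 100.0                      # notional first-unit cost
for n in (1, 2, 4, 8):
    avg = cum_avg_unit_cost(t1, n)
    total = avg * n
    print(f"first {n} units: cum-avg = {avg:6.1f}, lot total = {total:7.1f}")
# Every doubling of quantity drops the cumulative average to 85% of its
# previous value; a shallower (higher-percentage) slope shows less
# improvement, which is the effect the study tests for under full funding.
```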

PA-11 – Presentation – Impact of Full Funding on Cost Improvement Rate A Parametric Assessment

PA-11 – Handout – Impact of Full Funding on Cost Improvement Rate A Parametric Assessment


Developing R&D and Mass Production Cost Estimating Methodologies for Korean Maneuver Weapon System (PA-12)

Doo Hyun Lee – Korean Defense Acquisition Program Administration
Sung-Jin Kang – Professor Emeritus, Korea National Defense University
Suhwan Kim – Assistant Professor, Korea National Defense University

Today, cost estimates for government acquisition programs are important in supporting decisions about funding, as well as in evaluating resource requirements at key decision points. Parametric cost estimating models have been extensively used to obtain valid cost estimates in the early acquisition phase. However, these models have many restrictions in the Korean defense environment because they were developed for use in the U.S. environment. In order to obtain reliable and valid R&D cost estimates, it has been important for us to develop our own Cost Estimating Relationships (CERs) using historical R&D data. Nevertheless, there has been little research on the development of such a model.

In this research, therefore, we have attempted to establish a CER development process to meet the current need, and have identified cost drivers in the Korean historical maneuver weapon system data using forward selection, stepwise regression and R-squared selection. We have also developed a CER model for production labor costs using learning rates, which are generally applied to estimate valid production labor costs. Learning effects are obtained from repetitive work during the production period under three assumptions: homogeneous production, the same producer, and quantity measured in continuous units.

While developing our own CERs, we have used the Principal Component Regression (PCR) method to avoid multicollinearity and the restriction of an insufficient number of samples. As a result, we are able to overcome the multicollinearity and develop a reliable CER. However, many important results in statistical analysis rest on the assumption that the population being sampled or investigated is normally distributed with a common variance and an additive error structure. So, in this research, we have used the parametric power transformation proposed by Box and Cox (the Box-Cox transformation) in order to reduce anomalies such as non-additivity, non-normality, and heteroscedasticity.
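
A compact sketch of the two named techniques on synthetic data (not the authors’ model or data): Box-Cox transform the response, then regress it on the principal components of deliberately collinear cost drivers.

```python
import numpy as np
from scipy.stats import boxcox
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)

# Synthetic, highly collinear cost drivers (e.g. weight, power, speed).
n = 20
base = rng.uniform(1, 10, n)
X = np.column_stack([base,
                     base * rng.normal(1.0, 0.05, n),
                     base * rng.normal(1.0, 0.05, n)])
cost = 50 * base**1.3 * rng.lognormal(0.0, 0.1, n)

# Box-Cox transform the response to pull it toward normality/additivity.
cost_bc, lam = boxcox(cost)

# Principal Component Regression: regress on the leading components
# instead of the raw, collinear drivers.
pca = PCA(n_components=2)
scores = pca.fit_transform(X)
pcr = LinearRegression().fit(scores, cost_bc)

print(f"Box-Cox lambda = {lam:.2f}")
print(f"variance explained by 2 PCs = {pca.explained_variance_ratio_.sum():.3f}")
print(f"PCR R^2 (transformed space) = {pcr.score(scores, cost_bc):.3f}")
```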

This study is the first attempt to develop a CER for the Korean historical maneuver weapons system data for the Korean defense industry environment. This is significant because it will be an important methodology applied to the CER development for the future Korean weapons system.

PA-12 – Presentation – Developing R&D and Mass Production Cost Estimating Methodologies for Korean Maneuver Weapon System

PA-12 – Handout – Developing R&D and Mass Production Cost Estimating Methodologies for Korean Maneuver Weapon System


Risk Papers:

Excel Based Schedule Risk and Cost Estimates (RI-1)

William Evans – Associate, Booz Allen Hamilton

The emphasis on robust cost and schedule estimating solutions has resulted in the creation of multiple solutions for analysts and clients. Excel based integrated cost and schedule risk is only one methodology for solving client problems. Incorporating cost and schedule risk in Excel leads to an increased ability to audit and trace the schedule and cost risk methodology throughout an Excel based program life cycle cost estimate (PLCCE), improving the confidence and robustness of the estimate. While there are hurdles to implementing an Excel based schedule risk solution, when it is combined with form controls the benefits to PLCCE auditability and usability are immense.

Files not available


Using Bayesian Belief Networks with Monte Carlo Simulation Modeling (RI-2)

Marina Dombrovskaya – Senior Consultant, Booz Allen Hamilton

One of the main aspects of creating a Monte Carlo simulation cost estimate is accuracy in defining the uncertainty and risk parameters associated with the cost components of the model. It is equally important to assess and accurately represent the inter-dependencies between uncertain variables and risks, which are measured via correlation. Since historical data are often insufficient for rigorous statistical analysis, both probability distributions and correlations are commonly estimated via subject matter expert opinion. However, the inherent complexity of variable inter-dependencies is often overlooked during such estimates, which can significantly affect the results of the Monte Carlo simulation model. Bayesian belief networks offer an alternative methodology for estimating probabilities and correlation between variables in a complex cost estimating model. Bayesian belief networks are graphical probabilistic models that represent random variables (cost components or risks) and their conditional dependencies with assigned Bayesian probabilities. They provide a visual representation of inter-dependencies among random variables and estimate probabilities of events that lack direct data. This talk will discuss the benefits and various methods of applying Bayesian belief networks within a Monte Carlo simulation cost estimating model and explore these methods through hands-on examples.
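
A toy sketch of the core idea (mine, not the presenter’s): instead of assigning a correlation coefficient between two risks, encode the dependence as a conditional probability table and sample it inside the Monte Carlo loop. Probabilities and cost impacts are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
n_trials = 100_000

# Tiny "belief network": P(design slip) and P(test failure | design slip).
p_design_slip = 0.30
p_test_fail_given_slip = {True: 0.60, False: 0.10}
cost_impact = {"design": 2.0e6, "test": 3.5e6}

total_cost = np.zeros(n_trials)
for i in range(n_trials):
    slip = bool(rng.random() < p_design_slip)
    fail = rng.random() < p_test_fail_given_slip[slip]   # conditional node
    total_cost[i] = slip * cost_impact["design"] + fail * cost_impact["test"]

# The conditional table induces the correlation instead of an assumed rho.
print(f"mean risk cost  = ${total_cost.mean()/1e6:.2f}M")
print(f"80th percentile = ${np.percentile(total_cost, 80)/1e6:.2f}M")
```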

RI-2 – Presentation – Using Bayesian Belief Networks with Monte Carlo Simulation Modeling

RI-2 – Handout – Using Bayesian Belief Networks with Monte Carlo Simulation Modeling


Expert Elicitation of a Maximum Duration Using Risk Scenarios (RI-3)

Marc Greenberg

As acquisition programs become less mature, more advanced and more complex, there is an ever-increasing burden on the cost analyst to employ methods of eliciting requirements, schedule and cost uncertainties from one or more subject matter experts (SMEs). Arguably, the most common technique a cost analyst uses today to elicit such data is to ask each SME for the lowest, most likely and highest value, which, consequently, produces a triangular distribution.
Eliciting and using a triangular distribution has its advantages: getting the SME to provide the three input values takes only a few minutes; the SME can provide a reasonable basis for his or her input values; and the distribution represents the SME’s first-order approximation of what he or she believes the uncertainty to be. However, this common process of depicting uncertain input parameters typically produces optimistic estimates. More specifically, structural limitations inherent to the triangular distribution, coupled with the optimistic bias of the SME, tend to produce optimistic estimates.

This paper provides a brief review of a current method to elicit a most-likely commute time, a “practical maximum” commute time and the risk factors that contribute to commute delays. The paper continues by showing how these risk factors can be organized into an objective hierarchy, leading to the creation of a customized risk work breakdown structure (WBS). The cost estimator (i.e., interviewer) uses this risk WBS as a reference for interviewing the SME as follows (a small numerical sketch of these steps appears after the list):

1. Describe the practical worst commute case for each individual risk factor.
2. Estimate the risk-adjusted commute time associated with each individual risk factor.
3. Estimate the annual frequency associated with #1. Calculate the probability of occurrence.
4. Multiply risk-adjusted commute time by the probability of occurrence to get expected value.
5. Rank individual risk cases from highest expected value to lowest expected value.
6. Specify feasible combinations of worst case risks that could occur during the SME’s commute. Note: each feasible combination is described as a “risk scenario.”
7. With results from #6, calculate the probability of each risk scenario.
8. With results from #6, calculate the risk-adjusted commute time of each risk scenario.
9. Select commute time that has the lowest probability. This is the adjusted practical maximum.
10. Using most-likely and adjusted practical maximum durations, solve for maximum commute time.
11. Iterate from #1-#10 as needed. Provide a graphical representation to aid the SME.
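
A small numerical sketch of steps 2 through 8, with invented commute risks and independence between risk factors assumed purely for illustration:

```python
from itertools import combinations

# Steps 1-2: practical worst case per risk factor, with risk-adjusted
# commute time (minutes) and days per year it occurs (~230 commutes/year).
risks = {
    "heavy_rain": {"time": 55, "days_per_year": 23},
    "accident":   {"time": 70, "days_per_year": 12},
    "road_works": {"time": 45, "days_per_year": 46},
}
most_likely = 30  # most-likely commute time elicited from the SME

# Steps 3-5: probability of occurrence, expected value, ranking.
for r in risks.values():
    r["p"] = r["days_per_year"] / 230
    r["ev"] = r["p"] * r["time"]
ranked = sorted(risks, key=lambda k: risks[k]["ev"], reverse=True)
print("ranked by expected value:", ranked)

# Steps 6-8: feasible two-risk scenarios, their probability and commute time
# (independence and a +10 minute interaction penalty assumed for illustration).
for a, b in combinations(ranked, 2):
    p = risks[a]["p"] * risks[b]["p"]
    time = max(risks[a]["time"], risks[b]["time"]) + 10
    print(f"scenario {a}+{b}: p = {p:.3f}, time ~ {time} min")
# Step 9: the lowest-probability feasible scenario becomes the adjusted
# practical maximum used, with the most-likely value, to set the distribution.
```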

Due to the likely time-intensiveness of such an interview process, this approach is intended to be used primarily for estimating durations of critical path activities and/or costs of high dollar items. The steps described not only help prevent SMEs from anchoring toward a most-likely estimate, but also produce a maximum value that the cost estimator can quickly describe in terms of a feasible worst case scenario.

RI-3 – Presentation – Expert Elicitation of a Maximum Duration Using Risk Scenarios

RI-3 – Handout – Expert Elicitation of a Maximum Duration Using Risk Scenarios


Quantifying the Necessity of Risk Mitigation Strategies (RI-4)

James Northington – Analyst, Tecolote Research Inc.
Christopher Schmidt – Senior Consultant, Cobec Consulting Inc.
Chuck Knight – Consultant, Deloitte Consulting

A project’s risk management plan is a three-step process that involves identifying risks, formulating risk mitigation strategies, and analyzing the cost/schedule impact of those strategies. Each risk is assessed for its likelihood to occur and the impact it would have on the program should the risk become an issue. These two parameters are plotted on a risk cube to show which program risks are of a higher priority.

The assessments of these parameters tend to suffer greatly from a high level of subjectivity. While such subjectivity is necessary early in a program due to the lack of data and program-specific information, a program will evolve and generate additional data. These data, if incorporated correctly into the risk process, can increase the accuracy of the measurement of program impacts and, in turn, of the significance of risk mitigation strategies. With a small amount of additional, focused effort, programs can reduce subjectivity in the risk management process throughout the remainder of the program, thereby providing an accurate and defendable position for the incorporation of risk mitigation strategies.

This paper will begin by highlighting flaws in the current risk management process, walk through the newly proposed methodology for risk mitigation, and provide a quantitative example of the process in action using raw data. In the end, the proposed methodology will provide a greater understanding of program risks, a measurement of the importance of implementing a risk mitigation strategy, a measurement of the mitigation strategy’s subsequent impact, and a quantitative measurement of benefit for Program Managers to defend their risk mitigation strategies.

RI-4 – Presentation – Quantifying the Necessity of Risk Mitigation Strategies

RI-4 – Handout – Quantifying the Necessity of Risk Mitigation Strategies


Improved Decision Making with Sensitivity Analysis (RI-5)

Blake Boswell – Analytic Tool Developer, Booz Allen Hamilton

In constrained budget environments, Project Managers are often faced with tough decisions on how to balance project requirements with available funding. Therefore, it is critical for estimating models to not only serve as accurate predictors of future cost and schedule outcomes, but also to provide Project Managers the ability to explore trade-off scenarios, measure the effectiveness of potential decision strategies, and gain a greater understanding of what actions can improve the likelihood of project success.

To provide decision makers actionable intelligence, the technique of Sensitivity Analysis (SA) is often applied in the field of project estimating. SA methods relate to probabilistic estimating models based upon Monte Carlo simulation or similar techniques that combine distributions for uncertainty in model inputs in order to estimate uncertainty in model outputs. Proper application of SA methods can provide insight into what is causing poor project performance and what action is required by decision makers to ensure program success. However, shortcomings exist in conventional SA applications: SA metrics are often esoteric and become lost in translation between the analyst and program managers, reducing their ability to provide the information needed by decision makers; standard SA practices have not kept pace with the increasing complexity of estimating techniques, leading to misapplication and misinterpretation of results; and powerful SA techniques that have proven effective in other estimating fields are often overlooked because they are not yet part of the standard project estimating lexicon.

In this study, we review common applications of SA methods to project estimation, including a description of each method as well as its advantages and disadvantages. Additionally, we explore the topic of Global Sensitivity Analysis (GSA), which is a process for measuring the overall contribution of uncertain model inputs to variation in model outputs and is a popular technique for model validation in engineering and the life sciences. GSA techniques are applicable to a robust class of estimating models, including the models that currently dominate the field of integrated cost and schedule risk analysis. This study seeks to improve the ability of estimating models to serve as decision-informing tools that help project managers make the right choices to improve the likelihood of program success.
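
One common SA output for a Monte Carlo cost model is a contribution-to-variance style ranking of the inputs; the sketch below (my own, not the author’s) uses rank correlations between sampled inputs and the simulated total. The cost model and distributions are invented.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(5)
n = 50_000

# Notional cost model: three uncertain inputs with different influence.
labor_rate = rng.triangular(90, 110, 150, n)           # $/hour
hours = rng.lognormal(np.log(20_000), 0.35, n)          # labor hours
material = rng.normal(1.5e6, 0.2e6, n)                  # material cost ($)
total = labor_rate * hours + material

# Rank correlation of each input with the output, with the squared values
# normalized to sum to 1 -- a simple "contribution to variance" style metric.
inputs = {"labor_rate": labor_rate, "hours": hours, "material": material}
rho2 = {}
for name, values in inputs.items():
    rho, _ = spearmanr(values, total)
    rho2[name] = rho**2
norm = sum(rho2.values())
for name in sorted(rho2, key=rho2.get, reverse=True):
    print(f"{name:10s} ~ {100*rho2[name]/norm:5.1f}% of explained variation")
```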

RI-5 – Presentation – Improved Decision Making with Sensitivity Analysis

RI-5 – Handout – Improved Decision Making with Sensitivity Analysis


Affordability Engineering for Better Alternative Selection and Risk Reduction (RI-6)

Marlena McWilliams
Bob Koury – Chief Solution Architect, PRICE Systems LLC

The affordability engineering approach is based on the simple premise that the system design and architecture should define the system cost. Developing an affordable, accurate estimate means you must obtain data. One of the primary focus areas of the government is affordability, due to the current budget crisis and a complex, uncertain security environment. The current Deputy Secretary of Defense defined affordability as “cost effective capability.” Additionally, the Deputy Secretary of Defense chartered the Defense Systems Affordability Council (DSAC) to develop and guide the implementation of an integrated DOD strategy for better, faster, cheaper modernization. In this leadership role, the DSAC has enumerated three top-level goals for the Department:

• Field high-quality defense products quickly; support them responsively.
• Lower the total ownership cost of defense products.
• Reduce the overhead cost of the acquisition and logistics infrastructure.

In order to accomplish these goals, the pricing and engineering communities must become more than just two organizations that support the effort; they must become a joint, hybrid organization that bridges the gap between technical and cost performance. Second, they need models that speak an “engineering” language, enabling the rapid translation of design concepts into programmatic and fiscal impacts. Third, they need access to actual data so that estimates can be crosswalked, calibrated and mapped into any format required for quick what-if analysis and capability requirement justification.
This paper will outline the process and steps for implementing affordability in your estimating environment in order to understand system requirements versus system cost and affordability, and to provide best value by identifying and accepting the most affordable, feasible, and effective system or alternative. The need to evaluate and assign a best value is essential to both the government (DoD) and the contractors supplying systems or alternatives to the government.

RI-6 – Presentation – Affordability Engineering for Better Alternative Selection and Risk Reduction

RI-6 – Handout – Affordability Engineering for Better Alternative Selection and Risk Reduction


Risk Adjusted Inflation Indices (RI-7)

James Black – Cost Analysis Division, NASA

It is often observed that Office of the Secretary of Defense (OSD) inflation rates are different than prime contractor specific inflation rates seen in Forward Pricing Rate Agreements/Proposals (FPRAs/FPRPs) and in commodity group composite rates (e.g. Global Insight indices).
Yet, it is a standard practice in many cost estimating organizations to use OSD inflation rates for escalating costs in estimates without giving consideration to a range of different possible inflation rates. This can result in cost estimates that underestimate the effects of inflation, especially for programs that have many years of procurement and/or operations & support (where the compounding effects of inflation are significant).
This paper proposes an approach to create risk adjusted inflation indices based on defined risk distributions, thus giving consideration to a range of different inflation rate possibilities.

As an example, consider the following comparison between the current approach to calculating future-year weighted indices and the proposed risk adjusted approach (using a Monte Carlo simulation and a triangular distribution as an example). Also, this example uses the hypothetical appropriation type titled “ABC”; in practice this would be Weapons Procurement Navy (WPN), Aircraft Procurement Navy (APN), etc.

Current approach to calculating future-year weighted indices:

Static OSD Inflation Rates for ABC * Outlays for ABC = OSD Weighted Indices for ABC;

Proposed risk adjusted approach to calculating future-year weighted indices:

Simulation Output * Outlays for ABC = Risk Adjusted Weighted Indices for ABC;

Where,
• Simulation Output = Monte Carlo simulation with Triangular(Minimum, Mode, Maximum);
• Minimum = Smallest ABC Inflation Rate Observed Over Previous ‘X’ Years;
• Mode = Static future-year OSD Inflation Rate for ABC;
• Maximum = Largest ABC Inflation Rate Observed Over Previous ‘X’ Years;

In this example, the analyst would select the ‘X’ years used by the minimum and maximum functions; defining those functions this way would remain at the analyst's discretion. The choice of distribution is also not limited to the triangular; other continuous distributions (e.g., lognormal, beta) may be more appropriate, and the simulation need not be strictly Monte Carlo. Care would also be needed when assigning correlation coefficients between the inflation distribution and any other distributions in the model.
In the above example, the “Risk Adjusted Weighted Indices” would be used in place of the “OSD Weighted Indices” when performing escalation on cost elements that use the appropriation type “ABC”. Using this approach to generate Risk Adjusted Weighted Indices would enable cost estimates to consider a range of different possible inflation rates, rather than assuming a single static rate is representative of all future-year inflation.
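To make the arithmetic concrete, the sketch below (not from the paper) implements the proposed calculation for the hypothetical appropriation “ABC” in Python. The outlay profile, static rates, and minimum/maximum observed rates are all illustrative assumptions.

```python
# Minimal sketch (not from the paper): risk-adjusted weighted index for a
# hypothetical appropriation "ABC" using a triangular distribution on the
# annual inflation rate. All numbers below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

outlays = np.array([0.20, 0.45, 0.25, 0.10])          # ABC outlay profile (sums to 1)
mode_rates = np.array([0.020, 0.021, 0.022, 0.022])   # static OSD rates by future year
rate_min, rate_max = 0.010, 0.045                     # smallest/largest observed over 'X' years

n_trials = 20_000
# Sample an annual inflation rate for each future year in each trial.
rates = rng.triangular(rate_min, mode_rates, rate_max, size=(n_trials, len(mode_rates)))

# Raw index for each year = compounded escalation from the base year.
raw_index = np.cumprod(1.0 + rates, axis=1)

# Weighted index per trial = outlay-weighted sum of the raw indices.
weighted_index = raw_index @ outlays

# Compare the static OSD weighted index to the risk-adjusted distribution.
osd_weighted = np.cumprod(1.0 + mode_rates) @ outlays
print(f"OSD weighted index:            {osd_weighted:.4f}")
print(f"Risk-adjusted mean:            {weighted_index.mean():.4f}")
print(f"Risk-adjusted 80th percentile: {np.percentile(weighted_index, 80):.4f}")
```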

RI-7 – Presentation – Risk Adjusted Inflation Indices

RI-7 – Handout – Risk Adjusted Inflation Indices


Critique of Cost-Risk Analysis and Frankenstein Spacecraft Designs: A Proposed Solution (RI-8)

Mohamed Elghefari – Pasadena Applied Physics
Eric Plumer – NASA

When using the parametric method to estimate the cost of a spacecraft, cost analysts typically use the most likely value or best estimate for each technical input parameter required by the Cost Estimating Relationship (CER). The technical input parameters describe the physical, performance, and engineering characteristics of spacecraft subsystems; examples include mass, power requirements, data rate, memory capacity, solar array area, and specific impulse. These parameters are not typically known with enough precision to predict the cost of the system exactly, particularly in the early stages of development. To produce some measure of cost risk, cost analysts go one step further, treating them as random input variables and subjectively adopting probability distributions to model their uncertainties.

However, the various spacecraft subsystems are interdependent, and their designs are governed by key physical relationships, such as the Stefan-Boltzmann Law and the Rocket Equation (for missions requiring chemical propulsion). These key relationships analytically and implicitly relate the technical input variables of the various subsystems to one another and, yet, they are generally not upheld when cost analysts perform their cost-risk simulations. As a result, the generated spacecraft point designs (i.e., simulated sets of CER input variables) may be neither technically feasible nor buildable (i.e., “Frankenstein” designs), and the corresponding spacecraft cost estimates and program cost probability distribution are invalid.

In this paper, we present a historical-data-driven probabilistic cost growth model for adjusting the spacecraft cost Current Best Estimate (CBE), for both Earth-orbiting and deep space missions. The model is sensitive to when, in the mission development life cycle, the spacecraft cost CBE is generated. The model is based on historical spacecraft data obtained from the NASA Cost Analysis Data Requirements (CADRe) database. This alternative cost-risk modeling approach encompasses the uncertainties of the underlying design parameters of the spacecraft (i.e., the cost drivers) without violating the laws of physics or the theory of probability. In addition, it promotes realism in estimating NASA project costs by providing traceable and defensible data-derived measures of cost risk that reflect NASA's historical cost-estimating performance.
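As a simple illustration of the consistency problem the authors critique (this is not their model), the sketch below samples dry mass, delta-v, and specific impulse and then derives propellant mass from the rocket equation rather than drawing it independently. The distributions and values are invented for illustration.

```python
# Minimal sketch (not the authors' model): enforcing the rocket equation when
# sampling CER inputs, so simulated point designs remain physically consistent.
# All distributions and values are illustrative assumptions.
import numpy as np

G0 = 9.80665  # standard gravity, m/s^2

rng = np.random.default_rng(7)
n = 10_000

dry_mass_kg = rng.triangular(800, 1000, 1300, n)   # uncertain dry (final) mass
delta_v_ms  = rng.triangular(1500, 1800, 2200, n)  # uncertain mission delta-v
isp_s       = rng.triangular(310, 320, 330, n)     # uncertain specific impulse

# Tsiolkovsky rocket equation: m0 / mf = exp(delta_v / (Isp * g0)).
# Derive propellant mass from the other inputs instead of sampling it
# independently, which is what produces "Frankenstein" point designs.
mass_ratio = np.exp(delta_v_ms / (isp_s * G0))
prop_mass_kg = dry_mass_kg * (mass_ratio - 1.0)

# The consistent (dry mass, propellant mass, ...) samples then feed the CERs.
print(f"Median propellant load: {np.median(prop_mass_kg):.0f} kg")
```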

RI-8 – Presentation – Critique of Cost-Risk Analysis and Frankenstein Spacecraft Designs A Proposed Solution *Best Paper: Risk Track

RI-8 – Handout – Critique of Cost-Risk Analysis and Frankenstein Spacecraft Designs A Proposed Solution


Space Papers:

A Next Generation Software Cost Model (SP-1)

Jairus Hihn – Principal, NASA Jet Propulsion Laboratory
Tim Menzies – Professor, Computer Science, West Virginia University
James Johnson – NASA

The cost estimation of software development activities is increasingly critical at NASA as the software systems being developed in support of NASA missions become larger and more complex. As an example, MSL (Mars Science Laboratory) launched with over 2 million lines of code. Software development activities are also notorious for cost growth, with NASA flight software averaging over 50% cost growth. Even more important is the threat of a schedule slip that could result in missing a launch date. All across the agency, estimators and analysts are increasingly tasked to develop reliable cost estimates in support of program planning and execution. NASA analysts currently employ a wide variety of models and tools to produce software cost estimates. Some models are available as COTS software applications, such as SEER-SEM(R). Other models, such as COCOMO and COCOMO(R) II, can be readily used in a variety of platforms such as Microsoft Excel.

While extensive literature exists on software cost estimation techniques, industry “best practice” continues to rely on standard regression-based algorithms. These industry-wide models necessarily take a one-size-fits-all approach, which results either in models with large estimation variance or in the need for a large number of inputs that are frequently not known in the early stages of the software lifecycle. One of the more significant advances in cost estimation has been the development of Joint Confidence Level (JCL) methods and models. JCL is working well for NASA at PDR, but there are challenges with applying the method earlier in the lifecycle. The detailed JCL approach is also less driven by parametric models and historical datasets, becoming more of an extension of network scheduling and resource analysis, which makes the approach challenging to use effectively early in the lifecycle.

In this paper we summarize our findings on effort/cost model estimation and model development, based on ten years of software effort estimation research using data mining and machine learning methods. We then describe the methodology being used to develop a NASA Software Cost Model that provides integrated effort, schedule, and risk estimates, and that identifies the changes in project characteristics most likely to improve a given project's cost-schedule performance and risk exposure.
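For readers unfamiliar with the regression-based “best practice” the paper contrasts against, the sketch below shows the classic Basic COCOMO form. The coefficients are the published organic-mode values and are purely illustrative, not NASA-calibrated and not the paper's model.

```python
# Minimal sketch: the Basic COCOMO form referenced in the abstract, the kind of
# one-size-fits-all regression model the paper contrasts with data-mined,
# domain-specific models. Coefficients are the classic organic-mode values
# (illustrative only, not NASA-calibrated).
def basic_cocomo(ksloc: float, a: float = 2.4, b: float = 1.05,
                 c: float = 2.5, d: float = 0.38) -> tuple[float, float]:
    """Return (effort in person-months, schedule in months) for a size in KSLOC."""
    effort = a * ksloc ** b
    schedule = c * effort ** d
    return effort, schedule

effort_pm, schedule_mo = basic_cocomo(150.0)  # e.g., a hypothetical 150 KSLOC flight software build
print(f"Effort ~ {effort_pm:.0f} person-months, schedule ~ {schedule_mo:.1f} months")
```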

SP-1- Presentation – A Next Generation Software Cost Model

SP-1- Handout – A Next Generation Software Cost Model


NASA’s Phasing Estimating Relationships (SP-2)

Chad Krause – Burgess Consulting
Erik Burgess – President, Burgess Consulting Inc.
Darren Elliott – NASA and Commercial Projects Operations Manager, Tecolote Research, Inc.

Cost and schedule estimating in support of budget formulation is limited when cost phasing is not considered. As a result, NASA's Office of Evaluation (OE) Cost Analysis Division (CAD) initiated a review of historical mission funding profiles for the purpose of corroborating current phasing profiles and optimizing future budgeting performance. Actual expenditures by year, technical parameters, and programmatic information were compiled and normalized from NASA's extensive library of CADRe (Cost Analysis Data Requirement) documents for programs since 1990. Regression analysis on the normalized data was used to develop Weibull-based models that estimate expenditures and NASA Obligation Authority as a function of time from SRR to launch. Models for total project cost (excluding launch) and for spacecraft/instrument cost only are presented, and front/back-loading is shown to be a function of total project cost, mission class, foreign participation, and other factors. Accuracy metrics derived from the historical data and the regression models are explained and incorporated in a phasing toolkit available to the cost-estimating community. Application of these models toward understanding phasing's ramifications for cost and schedule is also discussed.
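As a rough illustration of how a Weibull-based phasing model spreads cost over time, the sketch below differences a Weibull CDF over normalized SRR-to-launch time. The shape and scale values, total cost, and duration are invented, not the paper's regression results.

```python
# Minimal sketch (illustrative only, not the paper's regression results):
# spreading a total project cost over time with a Weibull-based phasing curve,
# expressed as a fraction of time from SRR (t=0) to launch (t=1).
import numpy as np

def weibull_phasing(total_cost: float, n_years: int,
                    shape: float = 2.0, scale: float = 0.55) -> np.ndarray:
    """Return yearly expenditures using a Weibull CDF over normalized time 0..1."""
    t = np.linspace(0.0, 1.0, n_years + 1)          # year boundaries
    cdf = 1.0 - np.exp(-(t / scale) ** shape)       # cumulative fraction spent
    cdf = cdf / cdf[-1]                             # force 100% spent by launch
    return total_cost * np.diff(cdf)                # expenditure in each year

profile = weibull_phasing(total_cost=900.0, n_years=6)  # e.g., $900M over 6 years (assumed)
print([f"{x:.0f}" for x in profile])
```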

SP-2 – Presentation – NASA’s Phasing Estimating Relationships

SP-2 – Handout – NASA’s Phasing Estimating Relationships


NASA Instrument Cost Model (NICM) (SP-3)

Hamid Habib-Agahi – Manager, Systems Analysis & Model Development Group, NASA Jet Propulsion Laboratory
Joseph Mrozinski – Systems Engineer, NASA Jet Propulsion Laboratory
George Fox

The NASA Instrument Cost Model (NICM) includes several parametric cost estimating relationships (CERs) used to estimate the development cost of instruments on future NASA spacecraft. This presentation will cover the challenges of building cost models in an environment where data on previously built instruments are 1) sparse, 2) heterogeneous, and 3) book-kept differently by the various NASA centers and support institutions. It will also cover how these challenges were met to create a suitable instrument database, which was then used to develop the CERs using Cluster Analysis, Principal Component Analysis, and Bootstrap Cross Validation for different types of instruments, such as optical instruments, particle detectors, and microwave instruments.

NICM is sponsored by NASA HQ, with the primary NICM team operating at the Jet Propulsion Laboratory (JPL) in Pasadena, California. The first version of NICM was released in 2005. The latest version, NICM VI, was released in January, 2014.
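As a generic illustration of one of the techniques named above, the sketch below bootstrap cross-validates a simple power-law CER on synthetic data. It is not NICM, uses no NICM data, and the cost-versus-mass relationship is invented.

```python
# Minimal sketch (not NICM itself): bootstrap cross-validation of a simple
# power-law CER, cost = a * mass^b, fit in log space. Data are synthetic.
import numpy as np

rng = np.random.default_rng(3)

mass = np.array([12, 20, 35, 50, 80, 120, 160, 220, 300, 410], dtype=float)  # kg
cost = 2.5 * mass ** 0.8 * rng.lognormal(0.0, 0.25, mass.size)               # $M, synthetic

errors = []
for _ in range(2000):
    boot = rng.integers(0, mass.size, mass.size)             # bootstrap sample indices
    oob = np.setdiff1d(np.arange(mass.size), boot)           # out-of-bag points
    if oob.size == 0:
        continue
    b, log_a = np.polyfit(np.log(mass[boot]), np.log(cost[boot]), 1)  # fit CER on the resample
    pred = np.exp(log_a) * mass[oob] ** b
    errors.append(np.mean(np.abs(pred - cost[oob]) / cost[oob]))      # out-of-bag % error

print(f"Bootstrap cross-validated mean abs % error: {100 * np.mean(errors):.1f}%")
```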

SP-3 – Presentation – NASA Instrument Cost Model

SP-3 – Handout – NASA Instrument Cost Model


The NASA Project Cost Estimating Capability (SP-4)

Andy Prince – Manager, Cost Engineering Office, NASA/Marshall Space Flight Center
Brian Alford – Operations Research Analyst, Booz Allen Hamilton
Blake Boswell – Analytic Tool Developer, Booz Allen Hamilton
Matt Pitlyk – Operations Research Analyst, Booz Allen Hamilton

The NASA Air Force Cost Model (NAFCOM) has long been the standard NASA capability for estimating the cost of new spaceflight hardware systems during concept exploration and refinement. The software instantiation of NAFCOM was conceived in the early 1990s, a time of stand-alone programs performing dedicated functions. Despite numerous improvements over the years, the NAFCOM software continued to be failure-prone and to suffer from performance issues. Decreasing Agency resources meant that the NASA cost community could not support the software engineering effort needed to bring NAFCOM up to an acceptable level of performance.

In addition to the software engineering problems, several other model limitations are directly related to the NAFCOM structure and software architecture. Chief among these is the difficulty in aligning the NAFCOM Work Breakdown Structure (WBS) with the NASA Standard WBS. Other issues include concerns with data security, the approach to risk analysis, insight into the functioning of the model, and clarity into the development of the Cost Estimating Relationships (CERs).

Given the issues summarized above and the improvements in Commercial off-the-Shelf (COTS) software over the last 20 years, NASA has decided to move forward with the development of a new estimating environment: the Project Cost Estimating Capability (PCEC). PCEC is an Excel-based architecture that combines a VBA user interface with WBS and CER libraries. This structure provides a high degree of flexibility and openness while reducing the resources required for software maintenance, allowing more effort to be put into improving our models and estimating capabilities. The NASA cost community is also taking advantage of existing Information Technology (IT) systems to provide security. COTS and special-purpose tools now provide capabilities such as risk analysis and cost phasing, functions previously contained in the NAFCOM software.

The paper begins with a detailed description of the capabilities and shortcomings of the NAFCOM architecture. The criteria behind the decision to develop the PCEC are outlined. Then the requirements for the PCEC are discussed, followed by a description of the PCEC architecture. Finally, the paper provides a vision for the future of NASA cost estimating capabilities.

SP-4 – Presentation – The NASA Project Cost Estimating Capability *Best Paper: Space Track

SP-4 – Handout – The NASA Project Cost Estimating Capability


Developing Space Vehicle Hardware Nonrecurring Cost Estimating Relationships at the NRO CAAG (SP-5)

Ryan Timm
Jan Sterbutzel

This paper builds on our 2012 SCEA conference briefing that described the NRO CAAG approach to developing Space Vehicle (SV) hardware Cost Estimating Relationships (CERs) for Nonrecurring (NR) engineering. These CERs are developed from the NRO CAAG's cost database of more than 2,300 space hardware boxes and can stand as alternatives to other popular parametric tools, such as the nonrecurring CERs in USCM or NAFCOM. We will briefly cover our box-level estimating method, CER development approach, and the types of hardware (equipment groups) being estimated. We will describe the functional forms and the different scale and complexity variables selected for each equipment group. We will also highlight some of the issues encountered and lessons learned during CER development, including:

1. Striking a balance between data homogeneity and data quantity in equipment groups
2. Selecting average unit cost (AUC), theoretical first unit cost (T1), or weight as a primary scale variable when developing NR SV CERs
3. Handling incidental nonrecurring costs and points with low % New Design values
4. Determining the impact of production quantity on nonrecurring cost
5. Accounting for cost of prototype units produced such as engineering units, qualification units, and other prototypes
6. Lessons Learned: merits of some statistical measures and methods used to evaluate, compare and select CER candidates
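As a generic illustration of items 2 and 6 above (not CAAG data or methodology), the sketch below fits two candidate power-law CER forms, one scaled by weight and one by theoretical first unit cost (T1), and compares them with a simple percent-error statistic on synthetic data.

```python
# Minimal sketch (illustrative, not CAAG data or methodology): comparing two
# candidate nonrecurring CER forms -- one scaled by weight, one by T1 --
# using a simple mean absolute percent-error fit statistic.
import numpy as np

rng = np.random.default_rng(11)
n = 30
weight = rng.uniform(5, 60, n)                         # kg, synthetic boxes
t1 = 0.9 * weight ** 0.7 * rng.lognormal(0, 0.2, n)    # $M, synthetic T1 costs
nr = 3.0 * t1 ** 0.85 * rng.lognormal(0, 0.3, n)       # $M, synthetic NR costs

def fit_power_cer(x, y):
    """Fit y = a * x^b in log space and return (a, b, mean abs % error)."""
    b, log_a = np.polyfit(np.log(x), np.log(y), 1)
    pred = np.exp(log_a) * x ** b
    return np.exp(log_a), b, np.mean(np.abs(pred - y) / y)

for name, driver in [("weight", weight), ("T1", t1)]:
    a, b, mape = fit_power_cer(driver, nr)
    print(f"NR = {a:.2f} * {name}^{b:.2f}   mean abs % error = {100 * mape:.1f}%")
```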

SP-5 – Presentation – Developing Space Vehicle Hardware Nonrecurring Cost Estimating Relationships at the NRO CAAG

SP-5 – Handout – Developing Space Vehicle Hardware Nonrecurring Cost Estimating Relationships at the NRO CAAG


NASA JCL: Process and Lessons (SP-6)

Steve Wilson – PP&C Analyst, NASA
Mike Stelly – Cost and Schedule Analyst, NASA

From its inception, the Joint Confidence Level (JCL) has proven throughout NASA to be much more than a rote framework of mathematical nuances; rather, it is a mechanism for capturing intra-program/project complexity, program control process synergy, and other disparate effects, all facets of a nascent analytical abstraction whose implications touch almost every salient issue that makes up the picture of program health and trajectory.

Our paper describes JCL implementation, addressing its creation, evolution, inherent benefits and issues, its ultimate place in program management's decision-making toolset, and hard recommendations for organizations hoping to wage successful JCL campaigns. Real-world examples are referenced, including the Constellation, Commercial Crew, and Orion spacecraft development programs.

Issues discussed will include, but may not be limited to:

~The benefits of joint confidence level as a cost-schedule-risk ‘stovepipe merging’ agent within organizations
~The role of risk in JCL and program management, and the challenges that quantifying risk poses for future analysis
~Cost estimating approaches (parametric cost estimating, build-up estimating, etc.) and their varied appropriateness for inclusion in a JCL model
~Schedule Do’s and Don’ts, integration issues and solutions, and an overview of schedule health and confidence level metrics
~The role of uncertainty and its implication on the overlap among cost, schedule, and risk
~Hard recommendations for the future implementation of JCL: consideration of performance, annual risk results, and other process-specific lessons for creating a defensible analysis
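As a minimal numerical illustration of what a joint confidence level measures (this is not NASA's JCL toolchain), the sketch below computes the probability that correlated cost and schedule samples both fall under assumed targets. The distributions, correlation, and targets are illustrative assumptions.

```python
# Minimal sketch (not NASA's JCL toolchain): a joint confidence level from
# correlated cost and schedule samples. All values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(5)
n = 50_000

# Correlated standard normals (cost-schedule correlation assumed to be 0.6).
cov = [[1.0, 0.6], [0.6, 1.0]]
z_cost, z_sched = rng.multivariate_normal([0.0, 0.0], cov, n).T

cost_bn = np.exp(np.log(4.0) + 0.20 * z_cost)      # lognormal cost, median $4.0B (assumed)
sched_months = 60.0 + 9.0 * z_sched                # schedule in months (assumed)

budget_bn, target_months = 4.6, 68.0               # proposed commitment point (assumed)
jcl = np.mean((cost_bn <= budget_bn) & (sched_months <= target_months))
print(f"Joint confidence at ${budget_bn}B and {target_months:.0f} months: {jcl:.0%}")
```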

SP-6 – Presentation – NASA JCL Process and Lessons

SP-6 – Handout – NASA JCL Process and Lessons