|Cost and Competition in U.S. Defense Acquisition
|The cost estimator plays a major role in determining the price, and therefore the value, of major systems acquisitions in the Department of Defense. The two primary costing methodologies are 'should cost' and 'will cost' analysis, both of which are affected by 'must cost' realities. This paper explores the history of these costing methods and places them in a theoretical context, first with respect to the meaning of competition, and second with respect to the nature of cost.
|Best Overall/Acquisition & Operations
|Being Certain About Uncertainty: Part 2
|Andy Prince, Christian B. Smart
|This paper addresses the difficult and pervasive challenge of identifying extreme cost growth early in a project's life cycle and preventing it. The paper examines how DoD and NASA have implemented policies and practices to minimize or eliminate extreme cost growth and why those policies can sometimes fail. Finally, we propose some remedies that could help and identify some warning signs that a project may be headed for trouble.
|Best Overall/Comprehensive Perspectives
|Demand, Recurring Costs, And Profitability
|Douglas K. Howarth
|Customers in all markets collectively abide by their self-imposed demand curves, which dictate their responsiveness to changes in price and the maximum quantities of products they can absorb. Concurrently, producers in all markets face recurring costs, which typically fall over time due to a variety of factors. Producers can effectively model demand and recurring costs before product launch. Understanding how demand curves relate to recurring costs is key to enhancing profitability, which this paper examines.
|Enhancing Risk Calibration Methods
|Christian B. Smart
|Calibration methods such as the Enhanced Scenario-Based Method allow analysts to establish cost risk analyses that are based on objective data. Some methods currently in use rely on the normal and two-parameter lognormal distributions. Empirical data, however, indicates that a three-parameter lognormal is more appropriate for modeling cost risk. We discuss three-parameter lognormals and how to calibrate cost risk using this distribution. We compare the results with traditional calibration to two-parameter normal and lognormal distributions.
|Risk & Uncertainty
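The abstract's three-parameter (shifted) lognormal can be pinned down from a location shift plus two calibration points, such as a median and a 90th percentile. The sketch below is a simplified illustration of that calibration idea, not the Enhanced Scenario-Based Method itself; the shift, median, and percentile values are assumptions for demonstration:

```python
import math

Z90 = 1.2815515655446004  # standard-normal 90th-percentile deviate

def calibrate_shifted_lognormal(shift, median, p90):
    """Recover (mu, sigma) of the underlying normal for a three-parameter
    lognormal X = shift + exp(mu + sigma*Z), given the distribution's
    median and 90th percentile."""
    mu = math.log(median - shift)               # median of the lognormal part
    sigma = (math.log(p90 - shift) - mu) / Z90  # spread from the upper point
    return mu, sigma

def quantile(shift, mu, sigma, z):
    """Quantile of the shifted lognormal at standard-normal deviate z."""
    return shift + math.exp(mu + sigma * z)
```

With shift = 40, median = 100, and 90th percentile = 180, the calibrated distribution reproduces both input points exactly; forcing the shift to zero (the two-parameter case) would require a different sigma to hit the same percentiles, which is the modeling difference the paper examines.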
|A Probabilistic Method for Predicting Software Code Growth: 2018 Update
|Eric M. Sommer, Bopha Seng, David LaPorte, Michael Ross
|Software estimating is challenging. SMC's approach has evolved over time to tackle this challenge. Originally based on Mike Ross's 2011 DSLOC Estimate Growth Model, we've updated our model to include more recent SRDR data and an improved methodology (Orthogonal Distance Regression). Discussions will focus on non-linear relationships between size and growth, unique growth for new, modified, and unmodified DSLOC, as well as correlation between DSLOC types and future efforts to include space flight software data.
|Software & Agile
|Technology Development Cost and Schedule Modeling
|A tangible need exists in the scientific, technology, and financial communities for economic forecast models that improve new or early life-cycle technology development estimating. Industry models, research, technology datasets, modeling approaches, and key predictor variables are first examined. Analysis is then presented, leveraging a robust industry project dataset, applying technology and system-related parameters to deliver higher performing parametric cost and schedule models.
|Best Overall/Data Analysis
|Beyond RIFT: Improved Metrics to Manage Cost and Schedule
|The Risk-Informed Finish Threshold (RIFT) presented an innovative solution to a problem inherent in schedules: risk analysis results (time) cannot be allocated the same way as in cost models (dollars). Developing RIFT validation methods inspired an exploration into analyzing simulation data more meticulously. The methods described here provide unique insight into cost and schedule uncertainty results while introducing powerful new techniques to improve a project's potential to complete on time and on budget.
|EVM & Scheduling
|A History of Thought in Defense Acquisitions
|Eric M. Lofgren
|As Congress debates another round of defense acquisition reform, the necessary role for the cost estimator is affirmed. But how did this role come about and what are future implications? From the famed RAND systems analyses of the 1950s to the introduction of data reporting systems still in use today, this paper will explore the rich history of thought in defense acquisition, giving a special eye to controversies and continuing challenges that affect cost estimators.
|Policy & Standards
|Being Certain About Uncertainty, Part 1
|Doing cost risk analysis is hard because we don't really know what a good cost risk analysis looks like. In this paper we will explore the challenges to doing good cost risk analysis and discuss ways to know if your cost risk analysis is any good. We will also examine the phenomenon of extreme cost growth and lay the groundwork for future work.
|Software Effort Estimation Models for Contract Cost Proposal Evaluation
|Wilson Rosa, Corinne Wallshein
|This study will introduce regression models and benchmarks for predicting software development effort using input variables typically available at contract bidding phase. The analysis is based on 200 DoD projects delivered from 2004 to 2016. The first set predicts effort using initial software requirements along with peak staff, application domain, and other categorical variables. The second set predicts effort specifically for agile software development using data from 16 agile projects.
|3D Printing in the DoD; Employing a Business Case Analysis
|Nicole Santos, Richard Shea, Robert Appleton
|3D printing is a family of technologies that enables users to produce items on demand from CAD files or 3D scanning. The potential benefits to military logistics include cost savings, weight reduction, and responsiveness to the warfighters' needs. To demonstrate and measure these benefits in the Department of Defense (DoD), a rigorous Business Case Analysis (BCA) will identify challenges to implementation and evaluate its costs, risks, and benefits.
|Business Case Analysis
|Beyond Anderlohr: An Alternate Approach To Estimating Production Breaks
|Estimating the cost impacts of production breaks has long been problematic. Use of the Anderlohr method is widely accepted, but requires a significant degree of estimating judgment and can produce substantially different answers based on the individual user's assumptions. This paper suggests an alternate empirical methodology based on recent research on organizational learning and forgetting.
|Methods & Models
|Putting Schedule Quality Checks to the Test
|Eric M. Lofgren
|Analysts often use the 14-Point Assessment for Schedule Quality as a benchmark for determining the overall reliability of a schedule. But how much of the variation in schedule error can be explained by the results of the 14-Point check? This paper will use actual defense contract data to find the correlates of schedule reliability, measured using both the accuracy and the timeliness with which the schedule slip is predicted.
|Management, EVM & Scheduling
|Dangers of Parametrics
|What if our models are not solving our estimating problems, but instead are the source of our problems? The purpose of this paper is to address this question. We will look at what a cost model is, and what it isn't. We will examine how cost models appeal to our need for certainty and help us create a good story for our cost estimate. We will take a look at a simple cost model to see what can go wrong when we trust the model over trusting the data. Finally, we will identify specific actions estimators can take to guard against these dangers.
|Introducing RIFT to Protect Your Uncertain Schedule
|Nicholas DeTore, Peter Frederic
|There are industry-accepted methods for allocating cost risk and uncertainty analysis results to detailed WBS elements; schedule results cannot be allocated the same way since duration behaves differently than cost. We present an innovative solution to this issue. The RIFT algorithm calculates a threshold date, for any task or milestone, that if exceeded puts the probabilistic project finish date in jeopardy. RIFT provides a new tangible metric to guide decision makers.
|Risk & Uncertainty
|Process-Related Effort and Schedule Estimating Relationships for Software Cost Estimating
|Nicholas Lanham, Corinne Wallshein, Wilson Rosa
|The Naval Center for Cost Analysis will present comprehensive, updated findings of software size growth factors, effort estimating relationships (EER), and schedule estimating relationships (SER) with subsets of Department of Defense Computer Software Configuration Item records. This presentation focuses on software size (new, modified, reused, auto-generated, and total code). Subsets include maturity, application and super domain, language, contract type, and operating environment.
|The Psychology of Cost Estimating
|Research into human psychology has yielded amazing insights into how we process information and how we use information to make decisions. Cost estimators can use psychology and behavioral economics to improve not only our cost estimates but also how we communicate and work with our customers. By understanding how our customers think, and more importantly, why they think the way they do, we can have more productive relationships and greater influence. The end result will be better decisions by the decision makers.
|Best Overall/Program Management
|The Navy Modernization Program: Estimating the Cost of Upgrading AEGIS Guided Missile Cruisers
|The Navy is well into its 20-year, $16B (CY10$) plan to modernize 84 AEGIS warships. This cost estimate covers eleven of the planned ships and accounts for maintenance and upgrades to the ship and the combat systems. The estimate leverages recent contract and shipyard performance data and interviews with engineers, resulting in a detailed study and recommended cost savings initiatives. The methods and data in this estimate will assist ship modernization cost efforts across the fleet for the foreseeable future.
|Generalized Degrees of Freedom (GDF)
|The Zero-Percentage Bias/Minimum-Percentage Error (ZMPE) method is commonly used for multiplicative-error models. But ZMPE users do not adjust degrees of freedom (DF) for constraints included in the regression process. This generates misleading ZMPE CER fit statistics and underestimates the variance of the CER error distribution. Hence, ZMPE results are not comparable with those derived without constraints. This paper details why DF should be adjusted and proposes a Generalized Degrees of Freedom measure to compute fit statistics for constraint-driven CERs.
|Methods & Models
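The degrees-of-freedom point can be made concrete with the usual residual-standard-error formula. In the sketch below, each active constraint further reduces the error degrees of freedom, so constrained fits report honestly larger standard errors. This is only a simple penalty in the spirit of the abstract, not the paper's specific GDF measure:

```python
def residual_standard_error(sse, n, n_params, n_constraints=0):
    """Residual standard error with degrees of freedom reduced for both
    estimated parameters and regression constraints.  n_constraints=0
    reproduces the conventional, unadjusted statistic."""
    dof = n - n_params - n_constraints
    if dof <= 0:
        raise ValueError("no error degrees of freedom left")
    return (sse / dof) ** 0.5
```

For the same sum of squared errors, the constrained fit's reported error grows as DF shrinks, which is exactly the comparability issue the abstract raises.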
|Improvement Curves: An Early Production Methodology
|Learning slope selection is a critical parameter in manufacturing labor estimates. Incorrect ex ante predictions lead to over- or understatements of projected hours. Existing literature provides little guidance on ex ante selection, particularly when some actual cost data exists but the program is well short of maturity. A methodology is offered using engineered labor standards and legacy performance to establish a basic learning curve slope to which early performance asymptotically recovers over time.
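For reference, the unit-theory curve underlying such analyses is y = T1 · x^b, with the learning slope defined as 2^b. A minimal log-log least-squares fit, shown below as an assumed illustration rather than the paper's engineered-standards methodology, recovers T1 and the slope from early actuals:

```python
import math

def fit_unit_curve(units, hours):
    """Ordinary least squares in log-log space for the unit-theory
    learning curve y = T1 * x**b.  Returns (T1, slope), where slope is
    a fraction, e.g. 0.85 for an 85% curve."""
    xs = [math.log(u) for u in units]
    ys = [math.log(h) for h in hours]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    t1 = math.exp(ybar - b * xbar)       # intercept back-transformed
    return t1, 2.0 ** b
```

On noiseless 85%-curve data the fit recovers the inputs exactly; with only a few noisy early units, the fitted slope is unstable, which is the ex ante selection problem the abstract describes.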
|Beyond Correlation: Don't Use the Formula That Killed Wall Street
|Risk models in cost estimating almost exclusively rely on correlation as a measure of dependency. Correlation is only one measure (among many) of stochastic dependency. It ignores the effect of tail dependency, when one extreme event affects others. This leads to potential outcomes that do not make sense, such as a program with a large schedule overrun but no cost overrun. This method has been widely blamed for the financial crisis a few years ago. The copula method is introduced as a way to overcome these deficiencies.
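The tail-dependence gap is easy to demonstrate by simulation. The sketch below compares a Gaussian copula (which has no asymptotic tail dependence) against a Student-t copula at the same correlation; the choice of a t-copula as the alternative is our assumption for illustration, since the abstract does not name a specific copula:

```python
import numpy as np

rng = np.random.default_rng(0)
n, rho, df = 200_000, 0.6, 3
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))

# Gaussian copula sample: correlated normals (their ranks form the copula).
z = rng.standard_normal((n, 2)) @ L.T
# Student-t copula sample: the same construction scaled by a shared
# chi-square draw per trial -- that common scaling creates joint tails.
w = np.sqrt(df / rng.chisquare(df, size=(n, 1)))
t = z * w

def joint_tail_frequency(samples, q=0.99):
    """Empirical P(both margins exceed their own q-th quantile)."""
    thresh = np.quantile(samples, q, axis=0)
    return float(np.mean((samples[:, 0] > thresh[0]) & (samples[:, 1] > thresh[1])))

gauss_tail = joint_tail_frequency(z)
t_tail = joint_tail_frequency(t)
```

At the same linear correlation, the t-copula produces noticeably more joint extreme events, i.e., cost and schedule blowing up together, which is the behavior correlation-only risk models miss.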
|Early Phase Software Cost and Schedule Estimation Models
|Software cost estimates are most useful in the early elaboration phase, when source lines of code and Function Points are not yet available. This study introduces effort estimation models using functional requirements as a size predictor and evaluates development schedule using estimated effort, based on data from 40 military and IT programs delivered from 2006 to 2014. Statistical results show that estimated functional requirements and peak staff are significant contributors and that estimated or actual effort is a valid predictor of development duration.
|Bayesian Parametrics: Developing a CER with Limited Data and Even Without Data
|This paper discusses Bayes' Theorem, and applies it to linear and nonlinear CERs, including ordinary least squares and log-transformed ordinary least squares.
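The flavor of the Bayesian approach can be shown with the simplest conjugate case: a normal prior on the slope of a one-parameter linear CER with known noise variance. With no data the posterior equals the prior; as data accumulate, it converges to the least-squares answer. This is a textbook illustration, not the paper's specific derivation:

```python
def posterior_slope(prior_mean, prior_var, xs, ys, noise_var):
    """Conjugate Bayesian update for b in y = b*x + noise, with
    noise ~ Normal(0, noise_var) and prior b ~ Normal(prior_mean, prior_var).
    Returns the posterior mean and variance of b."""
    precision = 1.0 / prior_var + sum(x * x for x in xs) / noise_var
    mean = (prior_mean / prior_var
            + sum(x * y for x, y in zip(xs, ys)) / noise_var) / precision
    return mean, 1.0 / precision
```

This is what makes the "limited data and even without data" case workable: the prior carries the estimate until data arrive to overwhelm it.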
|A Comprehensive CES and BCA Approach for Lifelong Learning
|Kevin Cincotta, Darcy Lilley
|The Air Force Air Mobility Command (AMC) Enterprise Learning Office (ELO) mission is to transform AMC into a premier Air Force learning organization, achieve learning through optimum approaches, and develop Mobility Airmen into lifelong learners who demonstrate institutional Air Force competencies with a positive approach to managing their own learning. In this context, learning has three main components: training, education, and experience.
|Business and Art of Cost Estimating
|Trust but Verify - An Improved Estimating Technique Using the Integrated Master Schedule (IMS)
|Managers have long wondered why the Integrated Master Schedule (IMS) fails to give advance warning of impending schedule delays. The oft-touted Government Accountability Office (GAO) 14-Point Check for Schedule Quality analyzes schedule health using key metrics, leading one to assume that such a test authenticates schedule realism. Why, then, do practitioners find themselves caught off guard by slips when their IMS appears in good health?
|Earned Value Management
|Improved Method for Predicting Software Effort and Schedule
|Wilson Rosa, Barry Boehm, Ray Madachy, Brad Clark, Joseph P. Dean
|This paper presents a set of effort and schedule estimating relationships for predicting software development using empirical data from 317 very recent US DoD programs. The first set predicts effort as a function of size and application type. The second predicts duration using size and staff level. The models are simpler and more viable to use for early estimates than traditional parametric cost models. Practical benchmarks are also provided to guide analysts in normalizing data.
|Cost Overruns and Their Precursors: An Empirical Examination of Major Department of Defense Acquisition Programs
|Alan Gideon, Enrique Campos-Nanez, Pavel Fomin, James Wasek
|This paper proposes a model of acquisition program future cost for two specific acquisition domains - aircraft and ships - that takes into account the non-recurring developmental costs defined at program approval and each domain's historic tendencies to exceed planned program cost. Technical and non-technical reasons for these observations are discussed.
|Life Cycle Costing
|Innovative Business Agreements and Related Cost & Pricing Methods at NASA in Support of New Commercial Programs
|Torrance Lambing, James Roberts
|This paper and presentation, focusing on Kennedy Space Center, will discuss changes and new methods of pricing and estimating the costs of NASA facilities and services to be provided to outside entities for use in new Commercial Space endeavors. It will also give an overview of new NASA regulations and documents that establish policy and guidance for entering into Agreements and how they are priced under the various types of Agreement currently being used at NASA.
|Relating Cost to Performance: The Performance-Based Cost Model
|Michael Jeffers, Robert Nehring, Jean-Ali Tavassoli, Kelly Meyers, Robert Jones
|For decades, in order to produce a cost estimate, estimators have been heavily reliant on the technical characteristics of a system, such as weight for hardware elements or source lines of code (SLOC) for software elements, as specified by designers and engineers. Quite often, a question will arise about the cost of adding additional performance requirements to a system design (or in a design-to-cost scenario, the savings to be achieved by removing requirements). Traditionally, the engineers will then have to undertake a design cycle to determine how the shift in requirements will change the system.
|Critique of Cost-Risk Analysis and Frankenstein Spacecraft Designs: A Proposed Solution
|Mohamed Elghefari, Eric Plumer
|In this paper, we present a historical data driven probabilistic cost growth model for adjusting spacecraft cost Current Best Estimate (CBE), for both earth orbiting and deep space missions. The model is sensitive to when, in the mission development life cycle, the spacecraft cost CBE is generated. The model is based on historical spacecraft data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.
|The NASA Project Cost Estimating Capability
|Andy Prince, Brian Alford, Blake Boswell, Matt Pitlyk
|The paper begins with a detailed description of the capabilities and shortcomings of the NAFCOM architecture. The criteria behind the decision to develop the PCEC are outlined. Then the requirements for the PCEC are discussed, followed by a description of the PCEC architecture. Finally, the paper provides a vision for the future of NASA cost estimating capabilities.
|Touch Labor Estimate Modeling
|Michael Yeager, Lyle Davis
|To support its mission for the F-35, the production cost estimating team developed touch labor estimate models which include the flexibility to run production rate effects, loss of learning, commonality adjustments, affordability initiative and outsourcing impacts, multiple learning curve break point analyses, and estimates for touch labor in hours or by realization. Given the scrutiny of the F-35 Lightning II program by the Department of Defense and Congress, detailed and accurate cost modeling allows for better budgeting and more credibility within the services themselves and with the American public.
|Robust Default Correlation for Cost Risk Analysis
|Correlation is an important consideration in cost risk analysis. Exclusion of correlation from cost risk analysis results in the de facto assumption that all risks are independent. The assumption of independence leads to significant underestimation of total risk. However, determining the correct correlation values between work breakdown structure elements can be challenging. For instance, it is difficult to estimate the exact correlation value between the structures and thermal protection subsystems in a cost risk estimate.
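The underestimation from assuming independence is easy to quantify for a sum of WBS elements. With a single common pairwise correlation rho, the sketch below (an illustrative calculation, not the paper's recommended default values) computes the total-cost standard deviation; note how quickly even a modest rho inflates it:

```python
import math

def total_sigma(sigmas, rho):
    """Standard deviation of a sum of WBS-element costs that share a
    single pairwise correlation rho (rho=0 is the independence case)."""
    var = sum(s * s for s in sigmas)
    var += 2 * rho * sum(sigmas[i] * sigmas[j]
                         for i in range(len(sigmas))
                         for j in range(i + 1, len(sigmas)))
    return math.sqrt(var)
```

For ten equal elements, rho = 0.2 raises the total sigma from sqrt(10) to sqrt(28), roughly a 67% increase over the independence assumption.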
|Utilization of Visual Basic in Cost Estimating and Analysis Tools - Anyone Can Cook
|As collaborative environments become more prevalent in all industries, the cost estimating and analysis industry is no exception to this movement. Increasing amounts of pressure are put upon cost estimators and analysts to develop tools that are low cost, fast, robust in design, built on long-standing methodologies, free of proprietary software or licenses, and easy to use by both the advanced cost estimator looking for maximum control and the novice simply trying to diligently support the early stages of a development program.
|Deciphering JCL: How to use the JCL Scatterplot and Isocurves
|As the use of integrated cost and schedule risk analysis (ICSRA) methodologies rapidly expands across U.S. government programs, there has been increasing confusion about how to interpret its seminal result: the Joint Confidence Level (JCL) scatterplot and its associated isocurves. Over the past few years, the following statements have been made about JCL.
|Back to the Big Easy: Revisiting Hilbert's Problems for Cost Estimating
|Peter Braxton, Richard Coleman
|At the International Congress of Mathematicians at the Sorbonne in Paris in 1900, German mathematician David Hilbert boldly put forth a list of 23 theretofore unsolved problems in mathematics, which subsequently became quite influential in 20th-century research efforts. At the Joint SCEA/ISPA Conference in New Orleans in June 2007, the authors audaciously emulated Hilbert with a list of 21 problems for cost estimating and risk analysis.
|Getting (and sharing!) the FACTS: Factors, Analogies, CERs and Tools/Studies
|Daniel Harper, Ruth Dorr
|One of MITRE's corporate values is "People in partnership": MITRE values "[...] partnership with the government, collaboration within and without [...]"
|We at MITRE have been charged via our corporate goals to "Apply technical and engineering excellence" by bringing to the customer the best thinking possible by "[...] tapping into a deep technical base, both within MITRE and globally, across the breadth of industry and academia."
|Methods & Models
|ODASA-CE Software Growth Research
|Kevin Cincotta, Lauren Nolte, Eric Lofgren, Remmie Arnold
|For several years, the Office of the Deputy Assistant Secretary of the Army for Cost and Economics (ODASA-CE) has used a single growth factor to account for size growth in weapon system software development estimates. This factor is invariant to program characteristics and may, for example, lead to excessive growth estimates for large programs, whereas experience suggests that these grow less in percentage terms than their smaller counterparts. Over the past year, ODASA-CE worked with Technomics, Inc. to research improved methodologies for incorporating growth in software estimates.
|Time is Money: The Importance and Desired Attributes of Schedule Basis of Estimates
|"Time is Money" is a maxim made popular by Benjamin Franklin. It reflects the long understood importance of schedule requirements and their impact on cost. While the importance of cost basis of estimates (BoE) have gained in popularity among industry communities, schedule BoEs are, at least equally, if not more important as cost, but have yet to reach the same level of understanding of importance within the industry.
|Earned Value Management
|Can DoD Inflation Indices and Discounting Processes Be Improved?
|Kathryn Connor, James Dryden
|Currently the DoD is facing an uncertain budget environment. This will have an impact on what the DoD can spend for acquisition programs and sustainment of major weapons systems. Current practices for inflation and discounting skew program affordability, especially during operations and sustainment. In this presentation, we look at how well current inflation indices and discount rates serve programs today and whether there are strategies to improve the accuracy of these estimates. After examining the experience of several major weapons systems we have identified potential policy changes and strategies for cost estimators to employ on inflation and discounting. We believe that these can improve a program's understanding of long run affordability and potential risks associated with inflation and discounting.
|Life Cycle Cost
|Rapid Generation and Optimisation of Ship Compartment Configuration based on Life Cycle Cost and Operational Effectiveness
|Aidan Depetro, Rhyan Hoey
|It has been well established that the majority of Life Cycle Cost (LCC) is incurred during the in-service period. Among other factors, this is strongly linked to the design of the ship and the decisions made during the early design phase. In particular, compartment configuration can have a significant effect on LCC. Poorly considered compartment configuration and hull selection can result in hydrodynamic inefficiencies which significantly increase energy consumption and hence fuel costs. Associated space limitations, inadequate or non-existent removal routes and other accessibility problems may result in expensive equipment overhaul and replacement procedures, invasive removal methods, longer maintenance availabilities and increased maintenance costs. Current design methods and decision analysis techniques focus mainly on the trade-off between operational effectiveness and acquisition cost rather than LCC.
|Methods & Models
|Fit, Rather Than Assume, a CER Error Distribution
|Analysts usually assume a distribution (e.g., normal, log-normal, or triangular) to model the errors of a cost estimating relationship (CER) for cost uncertainty analysis. However, this hypothetical assumption may not be suitable to model the underlying distribution of CER errors. A distribution fitting tool is often used to hypothesize an appropriate distribution for a given set of data. It can also be applied to fit a distribution to CER errors.
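A bare-bones version of the idea is to compare maximized log-likelihoods of candidate error models on the same residuals. The sketch below chooses between normal and lognormal errors by maximum likelihood; it is a hypothetical minimal stand-in for a full distribution-fitting tool, which would also penalize parameter count and apply goodness-of-fit tests:

```python
import math

def loglik_normal(data):
    """Maximized normal log-likelihood (MLE mean and variance)."""
    n = len(data)
    mu = sum(data) / n
    var = sum((x - mu) ** 2 for x in data) / n
    return -0.5 * n * (math.log(2 * math.pi * var) + 1)

def loglik_lognormal(data):
    """Maximized lognormal log-likelihood; requires strictly positive
    data, e.g. multiplicative CER error ratios."""
    logs = [math.log(x) for x in data]
    return loglik_normal(logs) - sum(logs)  # Jacobian term for the transform

def better_fit(data):
    """Pick the error model with the higher maximized log-likelihood."""
    return "lognormal" if loglik_lognormal(data) > loglik_normal(data) else "normal"
```

On strongly right-skewed residuals the lognormal wins clearly, which is the kind of evidence the paper suggests gathering instead of assuming a shape up front.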
|Estimating Relationship Development Spreadsheet and Unit-as-an-Independent Variable Regressions
|Raymond P. Covert, Noah L. Wright
|MCR has constructed an estimating relationship (ER) development spreadsheet based on the zero percent bias, minimum percent error (ZMPE) regression technique to help with more credible and efficient development of cost improvement curves and cost estimating relationships (CERs). The CER-development method accommodates linear, power and triad functions with single and multiple technical and dummy independent variables. Furthermore, the ER development spreadsheet may be modified to accommodate other functional forms that may be of value in particular contexts.
|Enhanced Scenario-Based Method for Cost Risk Analysis: Theory, Application, and Implementation
|Bryan Flynn, Peter Braxton, Paul Garvey, Richard Lee
|There is a growing realization within the cost-analysis community that estimates of cumulative probability distributions of cost, or S-curves, too often understate true, underlying risk and uncertainty. Several organizations cite cases where final program acquisition costs, or actuals, fall at the 99th+ percentile on S-curves estimated years previously. This degree of deviation from the mean is a legitimate possibility for any one acquisition program. After all, there's no such thing as an "average" program. Variation is expected.
|Modeling R&D Budget Profiles
|Prior research sponsored by the National Reconnaissance Office (NRO) and published in the Journal of Parametrics (Summer 2006) has become the principal method for developing space and ground segment budget profiles for NRO programs. Since then, the practice of budgeting to independent cost estimates has been signed into law for the intelligence community. The resulting scrutiny and reliance on budget-phasing models has motivated several improvements that should be relevant to the estimating process for other DoD commodities. The presentation first reviews the mathematical formulation of expenditure profiles that are based on historical data, and how some of the resulting accuracy metrics have been useful in defending annual budgets. New models for space and ground systems are presented. Since the annual budget authority required to support an expenditure stream is heavily dependent on a program's execution rates (i.e., outlay rates), we also present a new approach for estimating what those outlay rates will be. This is a new area of responsibility for most cost estimating organizations, yet its impact on annual budget requests can be significant.
|Galaxy Charts: Depict and Color Your WBS in a Meaningful Way
|Robert Nehring, Katharine Mann, Robert Jones
|For centuries, we have searched for new ways to display our thoughts and ideas in ways that will allow the viewer to easily digest and understand our point of view. Displaying quantitative data in interesting yet meaningful ways is no different. In fact, discerning how to display your entire Work Breakdown Structure (WBS) succinctly and clearly has proven to be very difficult. One answer to this challenge is called the Galaxy Chart, which shows both relationships and magnitudes on a single chart.
|Best Overall/Methods and Models
|A Closed-Form Solution for the Production-Break Retrograde Method
|Darrell Hamilton, Brian Gillespie
|This article explores and discusses concepts surrounding the multi-step retrograde analysis process for learning curve production breaks that was popularized by George Anderlohr in his 1969 Industrial Engineering article "What Production Breaks Cost". Mr. Anderlohr based much of his analysis on the cumulative average curve method, but the basic principles have been widely accepted and extended to the equivalent calculation using unit-theory learning curves.
|Accepted Standards and Emerging Trends in Over Target Baseline (OTB) Contracts
|Over Target Baseline (OTB) projects or programs are those that have run significantly over cost and require formal reprogramming - essentially a complete replanning of the project - in order to help the contractor regain management control over the effort. The OTB process has been well documented and become an established part of Earned Value Management practice. Much of the literature-to-date has focused on OTBs from the contractor perspective, including the steps to take in order to propose and implement an OTB, and the proper channels and occasions for engaging the customer in the process.
|Earned Value Management
|Designing a Conceptual Framework for Estimation and Analysis of Total Ownership Cost
|F. Gurney Thompson III, Robert Koury
|In recent years, the push for greater efficiency and productivity in Defense spending has yielded an increased focus on affordability analysis. Understanding and estimating Total Ownership Costs (TOC) is key in assessing affordability, and the cost community must adapt to support TOC estimation. This paper discusses the development of a conceptual framework for estimating TOC in support of a broader audience, from the acquisition community to program managers and even as a decision support tool for entities such as Congress, DoD Financial / budgetary community, and G-8 Program Analysis & Evaluation.
|Life Cycle Cost
|Will-Cost and Should-Cost Management: It's Not Business As Usual
|In April 2011, Under Secretary of Defense for Acquisition, Technology & Logistics Ashton Carter issued the memorandum "Implementation of Will-Cost and Should-Cost Management". The memo defines implementation of Should-Cost and Will-Cost management for all ACAT I, II and III programs and lists "Selected Ingredients of Should Cost Management". Thus, each organization involved with these programs must successfully deal with the challenges of planning, coordinating and managing Should-Cost/Will-Cost programs and have the necessary tools to quantitatively manage them through their life cycle.
|Methods & Models
|Covered with Oil: Incorporating Realism in Cost Risk Analysis
|When Jimmy Buffett sang the words “All of those tourists covered with oil” in his song Margaritaville he probably never imagined that this phrase might apply to crude oil instead of suntan lotion. Both the cost and the environmental impact from the 2010 oil spill in the Gulf of Mexico were much worse than anyone had expected or could have predicted. It was, in the words of financial writer Nassim Taleb, a “black swan,” an unexpected event with tremendous consequences. These types of events, like Hurricane Katrina in 2005, the giant tsunami in the Indian Ocean in 2004, and the financial crisis that began in 2007, are all examples of events with huge impacts that are hard to predict.
|An Improved Method for Predicting Software Code Growth: Tecolote DSLOC Estimate Growth Model
|This paper describes the Tecolote DSLOC Estimate Growth Model, which provides probabilistic growth adjustment to Technical Baseline single-point Estimates (TBEs) of Delivered Source Lines of Code (DSLOC) for New software and for Pre-Existing software, these estimates being sensitive to the “maturity” of the estimate; i.e., when, in the Software Development Life Cycle (SDLC), the TBE DSLOC estimate is performed. The model is based on Software Resources Data Report (SRDR) data collected by Dr. Wilson Rosa of the U.S. Air Force Cost Analysis Agency (AFCAA). This model provides an alternative to other software code growth methodologies such as Mr. Barry Holchin’s (2003) code growth matrix.
|Trade Space, Product Optimization and Parametric Analysis
|This paper shows how to bound, build and assemble trade spaces for product optimization. The advent of computerized tools that describe available trade spaces has changed not only the nature of optimized product design, but that of parametric cost studies as well. Because these tools allow broader analysis, engineers produce more potential designs and parametricians must produce many more estimates in support of them.
|Methods & Models
|An Application of Data Mining Algorithms for Shipbuilding Cost Estimation
|Bohdan L. Kaluzny
|The North Atlantic Treaty Organization (NATO) Research and Technology Organization (RTO) Systems Analysis and Studies (SAS) 076 Panel (NATO Independent Cost Estimating and its Role in Capability Portfolio Analysis) is a working panel generating independent cost estimates for NATO systems with the aim of standardizing how NATO countries conduct cost estimation. One of the systems analyzed by the SAS-076 Panel in an ex post exercise was Her Netherlands Majesty's Ship (HNLMS) Rotterdam Landing Platform Dock (LPD), an amphibious transport dock ship that was launched in 1997.
|Joint Cost Schedule Model (JCSM) - Recent AFCAA Efforts to Assess Integrated Cost and Schedule Analysis
|Antonio Rippe, Greg Hogan, Darren Elliott
|The Space Division of Air Force Cost Analysis Agency (AFCAA) supports Air Force (AF) and Department of Defense (DoD) Major Space Acquisition programs by providing thorough, effective, independent cost estimates (ICEs) and conducting special studies for decision makers. Recently AFCAA has initiated a research task to assess the potential for developing a joint cost-schedule model and the usability of the model.
|Life Cycle Cost Growth Study - 20 Science Mission Directorate (SMD) Missions
|Previous studies have concentrated on examining development cost growth, excluding launch vehicle, mission operations and data analysis. This study looks at cost growth from a Life Cycle Cost perspective, including launch vehicle and mission operations and data analysis. Costs are separated into major WBS elements to see if any elements are more likely to have cost growth than others. The different WBS elements are tracked over time with snapshots of costs at major milestones to see where in the project life cycle the cost growth occurs.
|Software Cost Estimation Using a Decision Graph Process: A Knowledge Engineering Approach
|Sherry Stukes, Dr. John Spagnuolo
|At the California Institute of Technology/Jet Propulsion Laboratory, methodologies for Flight Software (FSW) cost estimation and documentation have been developed that allow for efficient, concurrent and consistent analysis within a tight schedule constraint. This knowledge is structured, or "engineered", to facilitate the implementation of FSW cost estimation by others who wish to serve as practitioners in the field.
|Here, There Be Dragons: Considering the Right Tail in Risk Management
|The portfolio effect is the reduction of risk achieved by funding multiple projects that are not perfectly correlated with one another. It is relied upon in setting confidence level policy for programs that consist of multiple projects. The idea of a portfolio effect has its roots in modern finance as pioneered by Nobel Memorial Prize winner Harry Markowitz. However, in three recent ISPA-SCEA conference presentations, “The Portfolio Reconsidered” in 2007, “The Fractal Geometry of Cost Risk” in 2008, and “The Portfolio Effect and the Free Lunch” in 2009, the author has demonstrated that the portfolio effect is more myth than fact.
|Estimating Issues Associated with Agile Development
|New software development methodologies combining old and new ideas are getting increasing public attention. These ideas all emphasize close collaboration between the programmer and business experts; face-to-face communication; frequent delivery of software units; and tight, self-organizing teams. Agile is a frequently used development process that follows this paradigm. As the use of Agile Development expands, the cost analyst is faced with the question of how this process affects the basic estimating fundamentals. This paper discusses the best estimating practices that should be applied to agile programs and recommends a set of techniques for agile estimation.
|SMC/PMAG - Control Account Manager (CAM) Notebook Evaluation
|Eddie Hall, Nhung Tran, Mun Kwon
|The Control Account Manager (CAM) notebook is the key document for assisting the CAM and government action officer in the integration and management of the control account. The CAM notebook enables “single thread” analysis of the technical scope, integrated master schedule (IMS), resource loading profile, earned value (EV) performance data, and subcontractor costs. Past and present practice tends to focus on the financial and EV performance data, schedule and program plan data. The technical scope of the control account has largely been ignored.
|EVM & Scheduling
|Estimating Life-cycle Cost of West Virginia Fiber Reinforced Polymer (FRP) Bridge Decks
|The main objective of the research was to study the economic viability of West Virginia Fiber Reinforced Polymer (FRP) bridge decks. The life-cycle cost of those bridge decks was estimated for conducting such analysis. Three main differences distinguish the way the life-cycle cost of the FRP deck was estimated: (1) the manufacturing cost of an FRP bridge deck was estimated using learning curve theory; (2) cost savings in support structures when FRP is chosen as opposed to the alternative bridge deck were modeled; and (3) the service life was estimated based on the factor method to minimize the subjectivity of the estimates. The three case studies for West Virginia FRP deck projects show that, based on the estimated life-cycle cost, FRP decks are financially viable under certain conditions.
|Life Cycle Cost
|Simple Mean, Weighted Mean, or Geometric Mean?
|There are three commonly used methods to calculate an average (i.e., mean): the simple average, weighted average, and geometric average. Analysts often use averages as estimating guidance when predicting nonrecurring (NR) costs using ratios of NR costs to first unit (T1) production costs. For example, when the “average” of the NR to T1 ratios is determined, the NR cost can be estimated as a “factor” of its T1 cost. Here, the simple average is the arithmetic mean of these ratios, while the weighted average is derived by weighting each ratio by its T1 cost. Consequently, deciding which average (i.e., factor) is the most appropriate metric for estimating the nonrecurring cost is frequently debated.
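The three averages the abstract compares can be sketched as follows; a minimal illustration with invented NR and T1 values, assuming the weighted mean weights each ratio by its T1 cost as described above:

```python
# Illustrative sketch (hypothetical data): three ways to average NR/T1 ratios.
nr = [120.0, 300.0, 80.0]   # nonrecurring costs (hypothetical, $K)
t1 = [40.0, 150.0, 20.0]    # first-unit production costs (hypothetical, $K)

ratios = [n / t for n, t in zip(nr, t1)]   # 3.0, 2.0, 4.0

# Simple mean: arithmetic average of the ratios.
simple = sum(ratios) / len(ratios)

# Weighted mean: weight each ratio by its T1 cost; this collapses to
# total NR divided by total T1.
weighted = sum(r * t for r, t in zip(ratios, t1)) / sum(t1)

# Geometric mean: nth root of the product of the ratios.
geometric = 1.0
for r in ratios:
    geometric *= r
geometric **= 1.0 / len(ratios)

print(simple, weighted, geometric)  # 3.0, ~2.38, ~2.88
```

Note that the weighted mean is pulled toward the ratios of the largest-T1 data points, which is one reason the choice of "factor" is debated.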
|Advancing the Art of Technology Cost Estimating- a Collaboration between NASA and Boeing
|Mark Schankman, John Reynolds
|Advancing the art of estimating the cost of developing technologies is a long term need for both government and industry. This paper describes a collaborative effort by Systems Engineers at Boeing and NASA resulting in cost estimating tools and analysis techniques that assist in evaluating the development cost of aerospace technologies.
|Software Cost Estimating Relationships
|Software cost overruns are a common problem for the majority of software development projects. With the ever-increasing amount of software present in current Department of Defense (DOD) programs, it is extremely important to generate accurate software cost estimates. There are many complex models that estimate software development productivity and costs. This paper builds upon the principles of these models to look for a simple regression model that can be used to generate accurate and defendable cost estimates for software development programs.
|Software & IT
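A simple regression model of the kind the abstract above seeks might take a power-law form, Effort = a * KSLOC^b, fit by ordinary least squares in log-log space. The data points and functional form below are assumptions for illustration only, not the paper's actual model:

```python
import math

# Hypothetical sketch: a power-law CER, Effort = a * KSLOC**b,
# fit by ordinary least squares on log-transformed data. Data are invented.
ksloc = [10.0, 25.0, 50.0, 100.0, 200.0]        # size (hypothetical)
effort = [40.0, 85.0, 150.0, 260.0, 470.0]      # person-months (hypothetical)

x = [math.log(v) for v in ksloc]
y = [math.log(v) for v in effort]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n

# OLS slope and intercept in log space
b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
    sum((xi - xbar) ** 2 for xi in x)
a = math.exp(ybar - b * xbar)

def predict(size_ksloc):
    """Predicted effort (person-months) for a given size in KSLOC."""
    return a * size_ksloc ** b

print(round(b, 3), round(predict(75.0), 1))
```

Fitting in log space makes the error multiplicative, which is the usual assumption for CERs of this form.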
|The Portfolio Effect And The Free Lunch
|The portfolio effect is the reduction of risk achieved by funding multiple projects that are not perfectly correlated with one another. It is relied upon in setting confidence level policy for programs that consist of multiple projects. The idea of a portfolio effect has its roots in modern finance as pioneered by Nobel Memorial Prize winner Harry Markowitz. However, in two prior ISPA-SCEA conference presentations, “The Portfolio Reconsidered” in 2007 and “The Fractal Geometry of Cost Risk" in 2008, the author has demonstrated that the portfolio effect is more myth than fact. Additional cost growth data have been collected for an updated study.
|PERFORMING STATISTICAL ANALYSIS ON EARNED VALUE DATA
|Eric Druker, Dan Demangos, Richard Coleman
|Some Earned Value Methods, particularly those described in the equations on the DAU Gold Card, suffer from the shortcomings that they are backwards looking (they do not make a prediction of the final CPI) and do not allow for inferential or descriptive statistics. This leads to the propensity for these estimates to tail-chase as the CPI changes over time, meaning that the EAC for an over running program will systematically lag in predicting the overrun, and the EAC for an under running program will systematically lag in predicting the under run.
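The backward-looking Gold Card estimate the abstract critiques can be sketched as follows (all numbers hypothetical): the EAC depends only on the cumulative CPI to date, so it lags when the CPI trends over time.

```python
# Minimal sketch of the standard CPI-based EAC from the DAU Gold Card:
#   CPI = BCWP / ACWP
#   EAC = ACWP + (BAC - BCWP) / CPI
# All numbers below are hypothetical.
def cpi(bcwp, acwp):
    """Cumulative cost performance index."""
    return bcwp / acwp

def eac_cpi(bac, bcwp, acwp):
    """EAC assuming remaining work is performed at the cumulative CPI."""
    return acwp + (bac - bcwp) / cpi(bcwp, acwp)

bac = 1000.0                 # budget at completion
bcwp, acwp = 400.0, 500.0    # earned value and actual cost to date

print(cpi(bcwp, acwp))            # 0.8 -- running over cost
print(eac_cpi(bac, bcwp, acwp))   # 500 + 600/0.8 = 1250.0
```

Because the formula extrapolates the cumulative CPI, it makes no prediction about where the CPI is heading, which is the "tail-chasing" behavior the paper describes.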
|NASA Productivity Study
|Joe Hamaker, Tom Coonce, Bob Bitten, Henry Hertzfeld
|This study was performed at the request of the NASA Administrator and examined NASA productivity in terms of economic efficiency. The analysis examined historical trends of NASA “bang for the buck” metrics over time, and compared NASA productivity to other organizations including the Air Force, the NRO, ESA and commercial space, taking into account the relative complexity of the missions. Past studies with a similar focus were reviewed and their results summarized. Finally, recommendations for improving NASA productivity were solicited from industry, government and academia and are documented in the study. The results of the study were briefed to the Administrator in late December 2008 and a journal quality paper is being prepared for submission to the Journal of Cost Analysis and Parametrics.
|Methods and Challenges in Early Cost Estimating
|Department of Defense (DoD) leadership is currently making decisions on acquisition programs much earlier in the system’s lifecycle than they have in the past, and leadership demand for cost information to support these decisions is growing rapidly. Since its inception two years ago, the ODASA-CE Early Cost Team’s primary mission has been to find ways to ensure that the cost information to support these early decisions is available and reliable.
|A Distribution-Free Measure of the Significance of CER Regression Fit Parameters Established Using General Error Regression Methods
|Timothy P. Anderson
|General Error Regression Methods (GERM) have earned a strong following in the cost estimating community as a means of establishing cost estimating relationships (CERs) using non-linear functional forms. GERM has given rise to a wide variety of functional forms for CERs, but has so far lacked a means for evaluating the “significance” of the individual regression fit parameters in a way that is analogous to the role played by the t-statistic in ordinary least squares (OLS) regression.
|Illustrative Example of Flight Software Estimation
|The effort to effectively model the development of flight software for manned space is complicated by the unique approaches used by NASA. Data from previous efforts are frequently poorly documented. Also, the infrequency of flight software development activities for new manned vehicles means that techniques from previous projects are not applicable today due to the maturation of software languages. This presentation will explain the challenges faced in developing a parametric model-based software estimate and how these areas were addressed.
|Software & IT
|Adaptive Cost-Estimating Relationships
|Stephen Book, Melvin Broder, Daniel Feldman
|Traditional development of cost estimating relationships (CERs) has been based on “full” data sets consisting of all available cost and technical data associated with a particular class of Products of Interest (PoIs), e.g., components, subsystems or entire systems of satellites, ground systems, etc. In this paper, we extend the concept of “analogy estimating” to parametric estimating by deriving “adaptive” CERs, namely CERs that are based on specific knowledge of individual data points that may not be reflected in the full data set at the time that the original CER was derived.
|Best Overall/Methods & Models
|Software Total Ownership Cost: Development is Only Part of the Equation
|Software development is a costly, often schedule-driven activity, prone to compromises to meet schedule. Many of these compromises have far-reaching impacts on the cost of software maintenance, total ownership costs and software sustainability.
|Recognizing Earned Value Management (EVM) Gaming
|Walt Majerowicz, Dorothy Tiffany
|Why would a contractor or project manager intentionally reduce an Estimate-To-Complete or activity duration? What are the implications of your contractor keeping two sets of Earned Value Management “books” and how would you even know about it? Under what circumstances would a contractor manipulate schedule logic? What is a “rubber baseline?” These questions and others like it raise serious concerns that all project teams using Earned Value Management should recognize. This presentation examines some common and not-so-common gaming, abuse, and data manipulation techniques that some projects or contracts may employ. At best, these techniques are errors or misunderstandings on the part of the contractor, at worst they are outright fraud and could place the entire project outcome in jeopardy.
|Earned Value Management/Schedule
|Cost Overruns and Defense Contracting
|This paper will show that incentive fee contracts in the Department of Defense (DOD) for the development phase are not effective in eliminating the cost overrun problem faced by the DOD. This will be examined first by analyzing the optimal share ratios with the Weitzman model on two specific DOD development programs, then through a comparison of various programs with extreme cost overruns and varying contract types and fee structures. Finally, reasons explaining the cost overruns will be explored.
|How to Capture Discrete Cost Risks in Your Project Cost Model
|Dave Graham, Alfred Smith, Melissa Cyrulik, Robert Kellogg, Robert Bitten, Debra Emmons
|Discrete cost risk analysis is defined in different ways by different cost analysts. It can mean the identification of specific risk under a specific scenario, the estimate of the cost impact due to that specific risk and/or the addition of the cost estimate for the risk into the cost estimate of the element, subsystem or system. Almost always, it can involve more than one risk under a given scenario. However, each scenario requires a risk adjusted cost estimate and there is generally only enough time to develop a few specific scenarios. NASA PA&E/CAD cost analysts are exploring simulation methods to perform discrete cost risk analysis.
|R2 vs. r2
|Cost estimating relationships (CERs) with multiplicative error assumptions are commonly used in cost analysis. Consequently, when developing multiplicative error CERs such as Minimum-Unbiased-Percentage Error (MUPE) and Minimum-Percentage Error under Zero-Percentage Bias (ZMPE) CERs, we need to apply appropriate statistical measures to evaluate a CER’s quality.
|Cost Risk Allocation Objectives, Tendencies, and Limitations
|Allocating cost risk for building a budget is sweeping through the cost estimating community faster than cost estimators can keep up. Few understand the goals, behavior, implementation, or limitations of results generated by popular risk allocation methods. This paper provides perspective on the issue by going beyond the basic mechanics to examine general tendencies, trade-offs, and objectives of cost risk allocation heuristics with focus on the quantification and verification of the allocated results. Some important limitations of existing heuristics are discussed, including the inability to capture program priorities, schedule impacts, and contract vehicles, followed by recommendations on how to capture these in future allocation methods.
|Learning Curves Redux: A New Use for a Familiar Tool
|Evin Stump, Alexandra Minevich
|This paper proposes another use for learning curves, namely the scheduling of production operations. This is not entirely a new idea, but it is usually not formally implemented in the planning of production operations. Learning curves alone can’t do this planning job effectively, but when combined with other appropriate, relatively simple logic, the result can be an automated scheduling process that predicts not only the cost of each produced item, but also the dates when production of each item will start and finish, plus a spread of labor hours and material costs by month, or even by week, if desired.
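The scheduling idea the abstract describes can be sketched with unit-theory learning curves, T_n = T1 * n^b with b = ln(slope)/ln(2): per-unit hours fall along the curve, and accumulating them against a weekly labor capacity predicts when each unit finishes. The first-unit hours, curve slope, and capacity below are hypothetical:

```python
import math

# Hypothetical sketch: unit-theory learning curve T_n = T1 * n**b,
# with b = ln(slope)/ln(2), used to spread labor hours into weekly
# buckets and predict the finish week of each produced unit.
t1 = 1000.0      # first-unit hours (hypothetical)
slope = 0.85     # 85% learning curve: hours drop 15% per doubling
b = math.log(slope) / math.log(2)

hours_per_week = 400.0   # available production labor per week (hypothetical)

cum = 0.0
finish_week = []
for n in range(1, 6):
    cum += t1 * n ** b   # add unit n's hours to the cumulative total
    # unit n finishes in the week its cumulative hours are absorbed
    finish_week.append(math.ceil(cum / hours_per_week))

print(finish_week)  # [3, 5, 7, 9, 11]
```

As learning takes hold, each successive unit consumes fewer hours, so the gap between finish dates narrows over the production run, which is the scheduling behavior the combined logic exploits.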
|Capabilities-Based Costing: Approaches to Pre-Milestone-A Cost Estimating
|The issue of early, rigorous evaluation of program costs is becoming more important as defense funding comes under greater scrutiny. Often at this point in the life cycle, a requirement or desired capability is known, but the manifestation of the solution is unknown or described only at a high level. Can capabilities alone be used to produce a cost estimate? If so, how can we link the proposed solution to existing systems if only a particular solution’s general capability set is known?
|The Evolution of Hardware Estimating
|Just as Nintendo evolves gaming and Apple innovates telecommunications, PRICE Systems examines how hardware estimation evolves as technology improves at breakneck speeds. This paper presents the findings of a study on the impacts of improving technology on the development and delivery of hardware and hardware systems.
|Earned Value Management
|Realities of Cost As an Independent Variable (CAIV) - Stakeholder Perceptions
|Greg Kiviat, Stuart Swalgen, Sebastian Botta
|Cost As an Independent Variable (CAIV) has been shown to be an effective tool to identify and mitigate program risk by balancing effectiveness and affordability. CAIV establishes goals and provides cost feedback to development teams as requirements change and design details emerge. However, depending on the project phase (Concept, Technology Development, SD&D, Production and Operating & Support) and the level of stakeholder support (both customer and contractor), there is great variability in the level and scope of CAIV application and investment in cost analysis capability.
|Best Overall/Management & Lessons Learned