ICEAA Archives

Search past ICEAA Workshop Proceedings in the table below and click the title to access the downloadable files.


2007-2024 Workshop Proceedings are available online. For 2006 and earlier, please email us.

Title | Author(s) | Summary | Year | Track
Data-Driven Lifecycle Analysis to Optimize Cost, Risk, and Sustainability | George Bayer | Many government infrastructure investments adhere to a standard lifecycle to estimate program cost, plan replacement timing, and compare business cases to one another in a cost-benefit analysis. What if those lifecycle replacement timelines are inconsistent with system sustainability and are not cost-effective? Some infrastructure systems which are replaced according to an end-of-life schedule can be sustained more cost-effectively for longer periods of time via preventative maintenance. Our team examined multiple infrastructure program replacement timelines and analyzed operational effectiveness, cost/risk trade-offs, system redundancy, and sustainability, and we recommended lifecycle adjustments based on those considerations. We reduced overall program cost by extending replacement timelines, eliminating system redundancy without compromising sustainability, and reprioritizing maintenance portfolios on critical backlogs. We document a comprehensive process to customize program lifecycles to optimize cost, risk, and sustainability. | 2024 | Analytical Methods
Triage the Sub-Projects: Calculating and Applying Portfolio Contingency | Stephen Koellner | Risk-adjusted cost estimates are needed to understand the potential range of actual costs through execution. Cost risk analysis produces uncertainty distributions which can be used to calculate an expected cost as well as contingency, which can be thought of as the difference between expected cost and a higher confidence level chosen for planning purposes. In a portfolio of projects, allocating uncertainty at the portfolio level will result in a different risk-adjusted cost than applying the same allocation at the project level, and so it is unclear whether a portfolio should allocate and manage risk-informed contingency at the portfolio or project level. This topic will explore methods for calculating portfolio contingency, using a tangible example to demonstrate. | 2024 | Analytical Methods
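The portfolio-versus-project allocation question in the abstract above can be made concrete with a small Monte Carlo sketch. The project count, means, and lognormal sigmas below are illustrative assumptions, not figures from the paper; contingency is taken as the P80 cost minus the expected cost at each level.

```python
import math
import random
import statistics

random.seed(42)

# Hypothetical portfolio of three projects with lognormal cost uncertainty;
# the means ($M) and log-space sigmas are illustrative only.
projects = [(10.0, 0.25), (25.0, 0.35), (8.0, 0.45)]

N = 20_000
draws = [[] for _ in projects]
portfolio_totals = []
for _ in range(N):
    total = 0.0
    for i, (mean, sigma) in enumerate(projects):
        mu = math.log(mean) - sigma ** 2 / 2  # keeps the draw's mean at `mean`
        cost = random.lognormvariate(mu, sigma)
        draws[i].append(cost)
        total += cost
    portfolio_totals.append(total)

def percentile(xs, p):
    xs = sorted(xs)
    return xs[int(p * (len(xs) - 1))]

# Contingency = confidence-level cost (here P80) minus expected cost.
portfolio_contingency = percentile(portfolio_totals, 0.80) - statistics.mean(portfolio_totals)
project_level = sum(percentile(d, 0.80) - statistics.mean(d) for d in draws)

print(f"portfolio-level contingency: {portfolio_contingency:.2f}")
print(f"sum of project-level contingencies: {project_level:.2f}")
```

With independent projects, the sum of project-level contingencies exceeds the portfolio-level figure, because extreme outcomes rarely coincide — which is exactly why the allocation level matters.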
Things Forgotten Since College - Foundational Statistics | Jordan Harlacher | Statistical analysis is one of the foundations of cost estimating, but fundamentals are easy to overlook. This presentation will help ensure that is not the case for your next estimate as we will discuss how the data collection and organization processes can form the basis for your estimate. Once the relevant data has been collected and organized, the real fun begins, as the central tendencies and variability of the data can now be examined. The central tendencies and variability can be used to determine the most applicable distribution and assess the probability of different events occurring. We will examine the best ways to visualize different data sets, using charts and graphs to convey the information clearly to stakeholders, as visualizing the data can help inform relationships between variables. Finally, we will touch on key statistics to look for in your regression analysis to ensure a meaningful relationship is defined. | 2024 | Analytical Methods
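The central-tendency check described above takes only a few lines with the standard library. The labor-hour observations here are hypothetical; the point is that a large mean-median gap flags skew or outliers worth investigating before fitting a distribution.

```python
import statistics

# Illustrative labor-hour observations (hypothetical data, one obvious outlier).
hours = [120, 135, 150, 110, 500, 140, 125, 130]

mean = statistics.mean(hours)
median = statistics.median(hours)
stdev = statistics.stdev(hours)

# Mean pulled well above the median signals right skew / an outlier.
print(f"mean={mean:.1f}, median={median:.1f}, stdev={stdev:.1f}")
```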
Stretching Purchasing Power through Improved Escalation Methods | Amanda Schwark | Escalation methods ensure cost estimates adapt to economic changes and facilitate accuracy and reliability. The NNSA chartered the Programmatic Recapitalization Working Group (PRWG) to track mission-critical equipment directly supporting weapons activities across the NSE. The PRWG maintains a comprehensive database of equipment above the NNSA capital acquisition threshold of $500,000. The previous escalation methodology for equipment purchase price was limited to using a single equipment inflation index. Additional fidelity in price projections can be achieved by leveraging empirical price data and published indices to derive escalation rates specific to various equipment categories. This paper explores our approach to improving upon the previous escalation methodology to better inform planning and programming decisions. This approach can be leveraged when one broad escalation index is used to predict costs for many significantly differing data elements. | 2024 | Analytical Methods
Spacecraft Design to a Cost Target: From CAIV to Cosmos | Ryan Sukley | Perfect performance of every system is critical for space missions. Identifying capable designs is a challenging process, and one that often comes at the expense of exceeding cost targets. The Cost as an Independent Variable (CAIV) approach helps mitigate this issue by treating cost as a primary consideration in the design or procurement of systems. Establishing a fixed cost target sets a ceiling for the cost versus performance trade-off and, in the case of NASA's in-house spacecraft, enables more cost-conscious decision making. This paper examines the application of CAIV to identify upper bounds for parameters (mass, power, quantity, etc.) early in the process of designing a spacecraft that satisfies mission requirements. It describes the process of developing, maintaining, and explaining the limitations of this capability, and addresses potential applications of the approach to other commodities. | 2024 | Analytical Methods
Early-Stage Cost Growth CER Development | Gabriel Sandler | Capital acquisition projects at the National Nuclear Security Administration (NNSA) have experienced significant early-stage cost estimate growth, driven in part by early optimism and missed scope. To account for these potential scope changes, NNSA's Office of Programming, Analysis, and Evaluation (PA&E) developed a cost estimating relationship (CER) for construction projects which relates the actual total project cost (TPC) to its early-stage scope estimate. This methodology differs from usual CERs which model actual cost as a function of actual scope, but reflects the scope uncertainty NNSA projects have at early stages. Three cost drivers (gross square footage, hazard category, and equipment complexity) were selected as the variables to solve for the TPC. The results of the CER were compared to another PA&E CER built with actual scope and actual costs so that early-stage cost estimate growth at the NNSA for various types of capital acquisition projects could be quantified. | 2024 | Analytical Methods
Market Dimensional Expansion, Collapse, Costs, and Viability | Douglas K. Howarth | Most government programs set out with cost caps and minimum force requirements. Commercial projects usually begin with a budget, sales targets, and specifications. All too often, in both cases, producers and customers give little thought to the changing market structures they face. When it comes to Demand, markets self-organize to form up to four boundaries each, including 1) Upper (price-limited), 2) Outer (saturation-limited), 3) Inner (efficiency-limited), and 4) Lower (margin-limited) Demand Frontiers. When new market segments appear as different product forms with enhanced functionality over existing options, as the new markets grow, the product groupings they replace may contract across one or more Demand Frontiers. This paper examines preparing for these inevitable eventualities in an N-dimensional framework. | 2024 | Analytical Methods
Comparison of UMP in the Great Recession and the Covid-19 Recession | Nathan Gallacher | This piece reviews the Unconventional Monetary Policy (UMP) used in both the Great Recession (2007-09) and the COVID-19 Recession, then compares the two recessions to show how unconventional monetary policy changed, the differences in tools used by the Bank of England, and the size of the tools put in place. Notably, tools such as quantitative easing see use in both recessions, suggesting similarities in the aims of the Bank of England during both. The main results show a significant increase in the use of unconventional monetary policy from the Great Recession to the COVID-19 Recession. At the same time, inflation outcomes were worse during the COVID-19 Recession. This suggests that the greater reaction by the BoE in the use of UMP during the COVID-19 Recession may not have been as effective in controlling inflation compared to the Great Recession. | 2024 | Analytical Methods
Explosive Analysis: Using Data to Hold Warfare Centers Accountable | Ryan Webster | The Joint Service Explosive Ordnance Procedure Publications program creates and maintains critical documents for the safe handling of ordnance. This effort is managed by Naval Warfare Centers. Historically, senior leadership has funded these efforts without the ability to evaluate the reasonableness of annual funding requests. Augur has recently obtained publications system data, enabling valuable analysis of historical efforts. This data is being leveraged to develop a planning calculator capable of estimating ranges of labor hours based on ordnance type, country of origin, and other complexity drivers derived through regression analysis and other visualization techniques. This tool and the accompanying insights will enable senior leadership to negotiate with warfare centers and more easily measure performance. | 2024 | Data Science & Machine Learning
Maximizing Analysis of Minimalized Datasets | Taylor Fountain | Many techniques exist to determine parametric relationships within large datasets. While cost estimation relies heavily on identifying such relationships, a data-scarce environment, driven by factors such as vendor proprietary restrictions, security concerns, and the uncertainty of emergent technologies, is a common barrier in implementing these techniques. This topic will evaluate common methods to analyze minimalized datasets for developing defendable cost estimates, such as complexity factors and 3-point distribution fitting, and demonstrate the statistical impacts of their underlying assumptions. | 2024 | Data Science & Machine Learning
Labor Mapping in Parametric Estimates | David Ferland | Contractors and Original Equipment Manufacturers (OEMs) alike often struggle to apply specific resources or labor categories to their parametric estimates. Many parametric modeling techniques produce hours by generic resources that still need to be translated into labor resources that have rates and other attributes before they can be useful for analysis. I will outline a tool development framework that fills this gap and allows the cost estimates to stay in sync with downstream tools like ProPricer that may compile the final estimate. This case study uses TruePlanning® as an input to the pipeline but can be applicable to most parametric sources. In cases where Basis-of-Estimates (BOEs; as opposed to Realistic Cost Estimates or RCEs) using proposed resource hours are still being required to justify parametric estimates, the traceability and justification of these pipelines is also an important consideration. | 2024 | Data Science & Machine Learning
Data Cleaning in Python for Beginners | Alexis Somers | As cost estimators, we collect large amounts of data from many sources, and it's often messy. Cleaning and organizing the data often requires time-consuming manual effort before proper analysis can begin. Using Python to clean and manipulate data is one of the easiest ways to save time and maximize efficiency when working on cost or other data analyses. As a free, beginner-friendly, and versatile tool, Python is an excellent choice for processing and analyzing data. This session will cover how to get started using Python to create simple scripts that produce clean, organized data. We will use the pandas and NumPy libraries to clean datasets by correcting errors, reformatting data, handling missing values, adjusting for outliers, and more. The ability to create simple Python scripts can improve the quality of your cost estimates and other deliverables by improving accuracy and efficiency and saving time. | 2024 | Data Science & Machine Learning
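The cleaning steps the abstract lists (correcting errors, reformatting, missing values, outliers) can be sketched with pandas in a few lines. The column names and values below are an invented messy dataset, not material from the session.

```python
import numpy as np
import pandas as pd

# Hypothetical messy cost-data pull (columns and values are illustrative).
df = pd.DataFrame({
    "program": ["Alpha", "alpha ", "Beta", "Beta", None],
    "fy": ["2021", "2021", "2022", "2022", "2023"],
    "cost_k": ["1,200", "1,350", "980", None, "15000"],
})

# Standardize text fields: trim whitespace, normalize case, drop blank rows.
df["program"] = df["program"].str.strip().str.title()
df = df.dropna(subset=["program"])

# Reformat numeric columns that arrived as strings.
df["cost_k"] = pd.to_numeric(df["cost_k"].str.replace(",", ""), errors="coerce")
df["fy"] = df["fy"].astype(int)

# Handle missing values, then flag outliers (> 3 standard deviations here).
df["cost_k"] = df["cost_k"].fillna(df["cost_k"].median())
z = (df["cost_k"] - df["cost_k"].mean()) / df["cost_k"].std()
df["outlier"] = z.abs() > 3

print(df)
```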
Going Beyond Count-Based Methodologies with Semantic Vector Embeddings | Trevor Lax | Machine Learning (ML) is a topic of persistent interest and a frequent buzzword because of the astounding capabilities it has shown across disparate fields. However, the complexity of ML combined with the overwhelming number of options can lead to decision fatigue and reduced understanding in new users. While much attention is duly focused on the data and machine, occasionally the basic components of ML, such as input data type, are not properly interrogated. Indeed, a frequently used Natural Language Processing method, Term Frequency - Inverse Document Frequency (TF-IDF), simply uses counts, which cannot encode syntactic or semantic information. An alternative to TF-IDF, Word2Vec, creates vector embeddings of the words in a corpus, instead of relying on sentence-level counts, and attempts to encode semantic information. Word2Vec has its own limitations, such as the need for a large corpus; however, it can allow for better performance and greatly improved flexibility. | 2024 | Data Science & Machine Learning
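The abstract's central claim — that TF-IDF encodes only counts, not order or meaning — is easy to demonstrate with a minimal from-scratch TF-IDF (a simplified sketch, not the presenter's implementation): two documents with the same words in reversed order produce identical vectors.

```python
import math
from collections import Counter

def tfidf(docs):
    """Plain count-based TF-IDF: term frequency times log inverse document
    frequency, over a shared sorted vocabulary. No syntax, no semantics."""
    vocab = sorted({w for d in docs for w in d.split()})
    doc_freq = Counter(w for d in docs for w in set(d.split()))
    n = len(docs)
    vectors = []
    for d in docs:
        tf = Counter(d.split())
        vectors.append([tf[w] * math.log(n / doc_freq[w]) for w in vocab])
    return vocab, vectors

docs = ["the contractor paid the government",
        "the government paid the contractor",
        "launch vehicle integration complete"]
vocab, vecs = tfidf(docs)

# Word order is reversed between docs 0 and 1, yet their vectors are identical:
print(vecs[0] == vecs[1])  # True
```

Word2Vec, by contrast, learns a dense vector per word from its contexts, so "contractor" and "vendor" can land near each other even if they never co-occur.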
Automation and Process Improvement in Cost Estimating | Anil Divvela | | 2024 | Data Science & Machine Learning
AI and Machine Learning/Data Science Tools for Cost Analysis | Daniel Harper | AI and Machine Learning/Data Science tools such as ChatGPT have taken on an expanded presence in cost analysis. For example, NLP is used to automate functional software sizing in commercial models. Large Language Models (LLMs) may even have applications for cost and acquisition professionals. We will present an overview of modern usages of data science, including Machine Learning, AI, and data visualization. We will also provide several use cases for applying these tools in cost estimation. | 2024 | Data Science & Machine Learning
Costing Web App Development for Operations Research | Kyle Ferris | Commercial-off-the-shelf (COTS) web application development platforms empower analysts to leverage low-code environments to build comprehensive business tools. Therefore, understanding the lifecycle cost requirements to design, develop, deploy, and maintain low-code web applications as both analytical and decision support tools for stakeholders is of interest to the cost community. We define web application lifecycle requirements as analogous to an overarching Data Operations Stack. The Data Operations Stack is a conceptual framework that describes data operations as a set of hierarchical requirements, from base-level IT infrastructure and tools to high-level business products. With this framework in mind, we describe web application lifecycle requirements through successive levels of the Data Operations Stack, elucidating the required personnel, tools, and capabilities integrated into each level. Finally, we discuss how an understanding of interconnected dependencies across the Data Operations Stack can be used to develop defensible cost estimates and manage resources for web application lifecycle requirements. | 2024 | Data Science & Machine Learning
From a Man-Month to an AI-Minute, Myth or Reality? | Colin Hammond | In this session I will share some of our discoveries from using AI over the last five years that can help software cost estimators, and our thoughts on how AI will change software development costs in the coming years. In 1975, Fred Brooks published The Mythical Man-Month, a book of observations on software engineering, many of them counter-intuitive; we pay homage to his title in this talk as we share some observations and quantifications of how AI is helping to improve early software estimation. I will also share our predictions on the areas where AI will help accelerate software development and its impact on software costs over the next few years. | 2024 | Data Science & Machine Learning
Implications of Generative AI (Artificial Intelligence) in Software Engineering | Arlene F. Minkiewicz | Generative AI is positioned to revolutionize software development, with potentially far-reaching implications for productivity. Generative AI applications leverage Large Language Models to understand language, imagery, and code, then use what they have learned to generate content: answering questions, organizing multimodal information, and writing text and code snippets. A 2023 McKinsey report notes that the software development landscape is changing quickly, as Generative AI applications such as ChatGPT and GitHub Copilot have the potential to enable software engineers to complete development tasks with as much as 2x the productivity of traditional development practices. Activities such as inception and planning, system design, coding, testing, and maintenance can all be aided through applications of Generative AI. This paper will include an introduction to Generative AI in the software engineering context, followed by a discussion of productivity impacts and guidance for incorporating them into software estimates. | 2024 | Data Science & Machine Learning
Distribution Free Uncertainty for CERs | William King | For this presentation we intend to introduce and demonstrate the application of conformal prediction as a tool to specify prediction intervals for any machine learning algorithm. Conformal prediction intervals offer rigorous statistical coverage guarantees without distributional assumptions and only require the exchangeability of data (a weaker assumption than independence). Moreover, generating these prediction intervals is an easy consequence of retaining the sub-models trained during k-fold cross-validation. Specifically, we intend to summarize the "CV+ for K-fold cross-validation" method (and its locally weighted variant) from Predictive Inference with the Jackknife+ (Barber, Candes, Ramdas, Tibshirani, 2021, The Annals of Statistics), and show how conformal prediction enables distribution free uncertainty for CERs. Additionally, we plan to discuss how this technique can be applied to direct human-in-the-loop intervention when applying machine learning models. | 2024 | Data Science & Machine Learning
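The presentation covers CV+; the sketch below uses the simpler split-conformal variant to show the core mechanism with synthetic data: a held-out calibration set's residual quantile becomes a distribution-free interval half-width around any point prediction. The linear "CER" and data-generating process are invented for illustration.

```python
import math
import random

random.seed(0)

# Hypothetical data: cost = 2 * weight + noise (a stand-in for any fitted CER).
data = [(w := random.uniform(1, 10), 2 * w + random.gauss(0, 1.0))
        for _ in range(400)]
train, calib = data[:200], data[200:]

# Fit a simple least-squares line on the training half.
n = len(train)
mx = sum(x for x, _ in train) / n
my = sum(y for _, y in train) / n
slope = sum((x - mx) * (y - my) for x, y in train) / sum((x - mx) ** 2 for x, _ in train)
intercept = my - slope * mx
predict = lambda x: intercept + slope * x

# Split conformal: the (1 - alpha) quantile of calibration residuals gives a
# distribution-free half-width with a finite-sample coverage guarantee.
alpha = 0.2
residuals = sorted(abs(y - predict(x)) for x, y in calib)
k = math.ceil((1 - alpha) * (len(calib) + 1)) - 1
q = residuals[min(k, len(residuals) - 1)]

lo, hi = predict(5.0) - q, predict(5.0) + q
print(f"80% prediction interval at x=5: [{lo:.2f}, {hi:.2f}]")
```

CV+ tightens this by reusing every fold's held-out residuals instead of sacrificing half the data to calibration.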
Industry Leaders' Insights: Enhance Efficiency and Simplify Your Work Using AI | Karen Richey Mislick | The modern workplace is increasingly influenced by leaders who recognize the transformative power of data analytics and AI. This presentation delves into the practical experiences and insights gleaned from industry frontrunners effectively utilizing these technologies. These leaders have not only achieved significant operational efficiencies but have also mastered the art of simplification in complex business processes. Their lessons underline the importance of strategic integration, the value of data-driven decision-making, and the transformative potential of AI-driven automation. Attendees will gain a comprehensive understanding of how top enterprises are reducing costs, streamlining operations, and fostering innovation. Drawing from real-world case studies, this presentation aims to encourage cost analysts to tap into the immense potential of data analytics and AI, turning insights into actionable strategies for enhanced work efficiency. | 2024 | Data Science & Machine Learning
Generative AI for Government | Conner Lawston | ChatGPT has been making massive waves across the world in the last year! This presentation gives an introduction to several 'Generative AI' models and how they can create new images, code, data, and text, seemingly out of thin air. We will look at the process of how to build these models, including their training dataset sizes and costs. Examples will be shown of how to use ChatGPT to generate Python code, as well as R and Power BI. After the general overview, specific examples of applications to Government will be shown (including acqbot, an AI tool for generating proposals). There will also be a demo of the 'GURU' bot, which was trained on the Federal Acquisition Regulation (FAR) PDF and can answer questions about PPBE, EVM, and Acquisition. We will summarize the pros, cons, and potential risks of Generative AI, as well as the future outlook. | 2024 | Data Science & Machine Learning
The Cost-Risk Uncertainty Determination (CRED) Model – A New Approach | Cheryl L. Jones | The objective of this model is to improve the credibility of and trust in a cost estimate by: 1) identifying, characterizing, and accounting for different cost performance factors that may be sources of risk/uncertainty and can materially impact a software sustainment and maintenance cost estimate; 2) making visible the "knowledge gap" (if any) between "what should be known" and "what is known" about the system under assessment, a gap that serves as an input to assess a range of uncertainty associated with the estimate; and 3) fully documenting the key program issues and related performance factors that may influence the cost estimate, and why. While this presentation focuses on the software domain, the approach is easily adaptable to other domains. | 2024 | Management, EVM & Risk
Schedule Risk at Early Acquisition | Gabriella Magasic | It can be difficult to construct a realistic schedule early in the acquisition lifecycle due to the limited certainty of requirements, design decisions, and other key elements of program planning. Understanding risk and uncertainty in a schedule is essential, and the GAO Scheduling Guide includes "Conducting a Schedule Risk Analysis" as one of the 10 Best Practices. A Schedule Risk Analysis (SRA) can provide quantitative insight into potential areas of delay along with associated cost impacts. However, a well-formed SRA requires clear input and structured analysis of risk events and uncertainty. In this presentation, we will discuss how to address schedule risk in low maturity projects by investigating different risk modeling techniques, reviewing existing guidance on schedule risk, and analyzing how uncertainty analysis must be interpreted and applied early in the project lifecycle. | 2024 | Management, EVM & Risk
Cost Estimation for Project Control | Ed Spriggs | Project control in software development is a critical responsibility of program managers and contracting officers. And although the job is a difficult one for most analysts, the inability to measure and control what is being created and tested can result in loss of stakeholder confidence and, in the worst case, a cancelled project/program. What got us here? You guessed it - agile development. The adoption of agile means less defined up-front scope and little to no requirements documentation. While that flexibility allows for more development freedom, it creates more variability in the features and functionality of the delivered product. This paper will describe the best new and existing practices for forecasting capabilities (features) that can be delivered within a certain timeframe given the fixed parameters of cost, schedule, and development team size. We will explore innovative techniques to measure software features, even in the absence of requirements, using function points and story points among others. | 2024 | Management, EVM & Risk
Advancing EVM with a Modernized Framework | Aaron Everly | DoD's FY24 procurement budget is the largest in history. The cornerstone of this budget is the procurement of complex, technologically advanced systems. DoD programs require new technologies to meet end-user requirements; however, the challenges inherent in new technology often translate to significant cost growth. PMs utilize EVM analysis to make informed decisions and mitigate contract cost growth. The IPMDAR exemplifies DoD's recognition of the need for meaningful data by requiring a modernized data schema (machine-readable format providing near real-time cost performance). Likewise, Technomics implements a modern approach to EVM using data analytics software and BI tools applied through a framework that incorporates a comprehensive view of EVM. This paper describes Technomics' EVM Framework (EVMS Surveillance, Contract Startup, Data Aggregation, EV Analysis, and Program Management), which implements modern tools to not only reduce monthly reporting tasks but also perform powerful EV analysis that enables programmatic decisions. | 2024 | Management, EVM & Risk
EVM Reviews – Surveillance Reviews vs. IBRs | Sam Kitchin | Successful Earned Value Management (EVM) implementation requires an effective Earned Value Management System (EVMS) and a well-planned performance measurement baseline. Meaningful insight into project performance can only be achieved with this combination of a compliant system with the active planning and management of project execution. A critical method to evaluate adherence to EVM best practices is to conduct reviews. Compliance reviews and surveillance reviews are used to evaluate the sufficiency of the EVMS, while integrated baseline reviews are used to assess the reasonableness of a project baseline. This presentation will compare and contrast these two types of review, demonstrating how and why they differ. Key terminology, stakeholders, artifacts, timeline, and intended results will be discussed. Real life examples may be used. | 2024 | Management, EVM & Risk
Advanced EVM Analysis using Time Series Forecasts | Anna B. Peters | The recent digitization of contractor EVM data affords cost analysts a newfound ability to execute robust statistical and data science techniques that better predict total project cost and schedule realism. Time series analysis, a well-established method in private-sector finance, is one such method. Autoregressive integrated moving average (ARIMA) models may capture the persistence and patterns in EVM data, as measured by CPI, SPI, and schedule execution metrics (SEMs). As a second option, macroeconomic regression models can measure the relationship between contract performance and external considerations, like unemployment and inflation, over time. Both techniques, moreover, may forecast future changes in EVM variables of interest, like the IEAC. This presentation will discuss how these types of time series models and forecasts are employed on real acquisition programs and their associated IPMDAR data using Python-based tools to raise program analysts' alertness to emergent acquisition risks and opportunities. | 2024 | Management, EVM & Risk
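In practice one would fit ARIMA with a library such as statsmodels; the stdlib sketch below shows only the simplest special case, an AR(1) fit by least squares on a notional (invented) monthly CPI series, to illustrate how persistence in past performance drives a forecast.

```python
# Notional monthly CPI (cost performance index) values; purely illustrative.
cpi = [1.02, 1.01, 0.99, 0.98, 0.97, 0.97, 0.96, 0.96, 0.95, 0.95, 0.94, 0.94]

# Fit AR(1) by least squares on (lagged value, value) pairs:
#   cpi_t = a + b * cpi_{t-1}
pairs = list(zip(cpi[:-1], cpi[1:]))
n = len(pairs)
mx = sum(x for x, _ in pairs) / n
my = sum(y for _, y in pairs) / n
b = sum((x - mx) * (y - my) for x, y in pairs) / sum((x - mx) ** 2 for x, _ in pairs)
a = my - b * mx

# Forecast the next three months by iterating the fitted recurrence.
forecast, last = [], cpi[-1]
for _ in range(3):
    last = a + b * last
    forecast.append(round(last, 4))
print(forecast)
```

A full ARIMA adds differencing and moving-average terms, and the macroeconomic variant would add regressors like unemployment or inflation to the right-hand side.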
Deriving Total Project Costs from WBS Elements' Probability Distributions | Rainald Kasprik | Studies on possible cost variances in major acquisition projects focus on total project costs in order to arrive at plausible project budgets with a confidence level of 80%. Different lognormal probability distributions have been worked out representing different states of uncertainty. However, these models cannot be applied when using risk management software to derive the total project costs based on cost probability distributions for WBS elements. Due to limited processing capacity, risk management software demands a division of the underlying probability distributions into intervals. A simple discretization of the models developed to date is not possible, as these models contain unrealistic cost growth factors. Based on simulation studies, three lognormal probability distributions are presented that meet these challenges. Finally, some practical hints are given on the minimum number of intervals that still represents the curvature of a probability distribution and on how to interpret the joint CDF's not-defined areas. | 2024 | Management, EVM & Risk
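The interval division the abstract describes can be sketched with the standard library: split a WBS element's lognormal cost distribution into equal-probability bins via the inverse normal CDF. The median cost and sigma below are hypothetical, and this is a generic discretization, not the paper's specific distributions.

```python
import math
import statistics

def discretize_lognormal(mu, sigma, n_bins):
    """Split a lognormal cost distribution into n equal-probability intervals,
    returning (lower, upper, probability) triples on the cost scale."""
    norm = statistics.NormalDist(mu, sigma)
    edges = [0.0]
    edges += [math.exp(norm.inv_cdf(i / n_bins)) for i in range(1, n_bins)]
    edges.append(math.inf)
    p = 1.0 / n_bins
    return [(edges[i], edges[i + 1], p) for i in range(n_bins)]

# Hypothetical WBS element: median cost $10M, sigma 0.4 on the log scale.
bins = discretize_lognormal(math.log(10.0), 0.4, 10)
for lo, hi, p in bins[:3]:
    print(f"[{lo:8.3f}, {hi:8.3f}) p={p:.2f}")
```

The paper's point about the minimum number of intervals corresponds to choosing `n_bins` large enough that the piecewise bins still trace the distribution's curvature.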
Cascading Effects - Performance Impacts of Fragile Tasks | Tommie (Troy) Miller | The growing popularity of Joint Cost & Schedule Analysis has highlighted the need for quality Schedule Risk Assessments (SRA). Modeling schedule risk and uncertainty requires an understanding of schedule networks. Network Analytics (NA) has been furthered in recent years due to research in fields such as social networks, IT networks, and transportation networks. Key aspects of these advancements can be used in SRAs to improve our understanding of schedule risk and mature our modeling techniques. For example, epidemiologists study the propagation of diseases through a community. The techniques used to model this phenomenon can be applied to SRAs to model the propagation of task slips through schedules. This presentation integrates classical concerns in schedule analytics, principally Merge Bias, with NA processes, such as node centrality measures and edge properties, to uniquely identify fragile tasks and illustrate how delays in these tasks cascade through a schedule and ultimately affect program execution. | 2024 | Modeling
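One of the simplest centrality measures the abstract alludes to is in-degree: in a schedule network, tasks with many predecessors are merge points where merge bias concentrates and any predecessor slip cascades downstream. The toy network below is invented for illustration; it is not the presenter's model.

```python
from collections import defaultdict

# Toy schedule network: edges run predecessor -> successor (illustrative).
edges = [
    ("design", "fab"), ("design", "software"),
    ("fab", "integration"), ("software", "integration"),
    ("long-lead parts", "integration"),
    ("integration", "test"), ("test", "delivery"),
]

in_degree = defaultdict(int)
for pred, succ in edges:
    in_degree[succ] += 1

# High fan-in tasks are merge points; flag them as fragile, highest fan-in first.
fragile = sorted((t for t, d in in_degree.items() if d >= 2),
                 key=lambda t: -in_degree[t])
print(fragile)  # → ['integration']
```

Richer analyses would weight this by path float and slip probability, but even raw fan-in quickly surfaces where merge bias will bite.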
Data-Driven Constellation Architecture Design Using Integrated Models | W. Allen Wautlet | The modern space mission landscape requires consideration of numerous trade variables to deliver optimal mission performance at low cost. Academic methods exist to address such challenges; however, practical deployment of these methods to constellation mission design remains uncommon. This paper presents a practical space mission constellation architecture approach that employs proven statistical, data science, and machine learning techniques on the products of an integrated cost and engineering modeling framework. When deployed at the early stages of constellation development, this integrated modeling framework and analysis approach provides stakeholders insight into key design parameters that drive mission performance and cost sensitivity. Furthermore, it enables the uncovering of promising design regions in large trade spaces that can be further examined and refined by technical subject matter experts. This approach leads to better decision making earlier in the acquisition timeline and increases the efficiency of design cycles. | 2024 | Modeling
Mission Operations Cost Estimation Tool (MOCET) 2024 Status | Marc Hayhurst | The Mission Operations Cost Estimation Tool (MOCET) is a model developed by The Aerospace Corporation in partnership with NASA's Science Office for Mission Assessments (SOMA). MOCET provides the capability to generate cost estimates for the operational, or Phase E, portion of full NASA space science missions. MOCET is a widely accepted model in the NASA community, used in full mission Announcement of Opportunity competitions since 2015. MOCET received the NASA Cost and Schedule Team award in 2017 and an honorable mention in the 2021 NASA Software of the Year competition. The cost estimating relationships and documentation have been implemented as a standalone Excel tool that is available within NASA and publicly through software.nasa.gov. Extended mission and Level 2 work breakdown structure costing capabilities continue to be developed, and a status will be presented. | 2024 | Modeling
A CASE for Estimate Analytics at the Enterprise Level | Josh Angeo | Are our estimates improving over time? What did this cost 2 years ago? When was the last time we reviewed this estimate? These questions, amongst many others, are why SSC FMC developed the Cost Analytics for SSC Estimates (CASE) tool. CASE includes over 175 cost estimates and 60 programs, and goes back as far as 2017. The tool creates comprehensive dashboards capable of analyzing programs individually and in aggregate. CASE utilizes various data sources and performs extensive data pre-processing to ready the data for Power BI. Data pre-processing steps utilize Python, DAX, and Power Query. Estimate data comes from a combination of POST reports, PDFs, and spreadsheets. Custom metadata tables were developed to enable parsing and other functions. Lastly, data sources comprising program actuals have recently been integrated. All of this results in a newfound capability to evaluate estimates using analytics. | 2024 | Modeling
Modeling Electronic/IT System Deployment Projects | F. Gurney Thompson III | This presentation will discuss the development and application of cost models for electronic and IT system deployment projects. The deployment projects include various technical studies and preparation activities, site survey visits, and comprehensive installation efforts across many sites. The models consider size drivers such as the amount of hardware and software systems to be installed, number of sites, scope of activities, and number of different system configurations. Project complexity can be adjusted for many system/technology intricacies and site conditions. The models have been applied successfully, with validation against actuals, in estimating deployment costs for communication network deployment projects such as data centers, air traffic control towers, and military vehicle/ship/aircraft communication systems. Additionally, these models have been applied to weapon system and train signaling equipment deployments, with model validation relying on expert judgment. This presentation outlines the model's development and structure. | 2024 | Modeling
Recipe: Homemade Pizza (or Facility Estimate)Kristen MarquetteHave you ever wanted to "wow" your guests with a homemade pizza, but didn't know where to start? This is how we felt when beginning our facilities estimates. This presentation will break down both recipes step by step, leaving everyone satisfied and writing rave reviews. Just as you need delicious dough, sauce, and toppings for great pizza, you need detailed structural, material, and recurring scope requirements for a great facilities estimate. We will take you through our experience with data collection spanning multiple facilities and serve up comprehensive methodologies with high fidelity. If you don't have time to create a homemade pizza or perform your own detailed facilities analysis, you can leverage the tools and methodologies provided (as to-go slices) to build your own facilities estimate based on your specific program requirements.2024Modeling
Well, That Escalated Quickly – A Novel Approach to Forecasting EscalationSean WellsEscalation rates are an important part of estimates and as such the provenance and derivation of indices should be regularly scrutinized, yet are rarely contemplated. This paper will compare a commonly used black-box escalation resource, IHS Global Insight, to a traceable, simplified forecasting method to determine if a purely mathematical model delivers an improved level of forecasting accuracy. Our model relies on a curated set of Bureau of Labor Statistics (BLS) indices to develop a moving average forecast. With access to over 15 years of IHS forecasts dating back to 2006, spanning 800+ indices, this study has the unique opportunity to quantify the accuracy of IHS and moving average forecasts against historical BLS indices. Our paper will establish and explore various measures of forecast accuracy for use in creating defensible estimates. The goal is to provide a quick, transparent, and flexible way to develop tailored escalation projections without sacrificing accuracy.2024Modeling
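As a hypothetical illustration of the moving-average style of escalation forecasting described in the abstract above (the window length and index values here are invented, not drawn from the paper or from BLS data):

```python
def escalation_forecast(index, window=3, horizon=5):
    """Project an index forward using a trailing moving average of
    year-over-year escalation rates."""
    rates = [index[i] / index[i - 1] - 1 for i in range(1, len(index))]
    projected = list(index)
    for _ in range(horizon):
        avg_rate = sum(rates[-window:]) / min(window, len(rates))
        projected.append(projected[-1] * (1 + avg_rate))
        rates.append(avg_rate)  # feed the forecast back into the window
    return projected

# Notional index history (placeholder values)
history = [100.0, 102.0, 104.5, 107.0, 110.2]
forecast = escalation_forecast(history, window=3, horizon=3)
```

The appeal of a formulation like this, per the abstract, is traceability: every projected value can be reproduced by hand from the published index series.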
Comparative Analysis of NASA Cost Estimating MethodsCamille HollyNASA policy and customer expectations dictate use of various cost estimating tools depending on milestone and program maturity, regardless of level of effort or accuracy of results. This paper presents a case study of the tradeoffs of modeling the cost of an unmanned space mission using different NASA-approved parametric tools. The comparison addresses subsystem and component-level cost estimates, providing invaluable insight into the granularity of cost modeling for complex space missions and differences in results associated with more or less granular estimates. The study offers perspective on the challenges and opportunities associated with parametric cost modeling methodologies due to the varying levels of input detail, and of effort, needed to complete an estimate. It also aims to provide practical insights on the number and types of subjective decisions made when modeling costs using different approaches, and the impacts that these choices have on cost results. 2024Modeling
The Nuclear Option: Avoiding Critical Delays with Advanced Constraints AnalysisHannah Hoag LeeNNSA construction projects are often subject to funding constraints. The ripple effect of funding shortfalls can be severe; projects are forced into suboptimal execution profiles that produce costly schedule slips with drastic mission implications. This experience is not unique to NNSA construction projects. Funding constraints occur in most government sectors, negatively impacting many types of projects' progression, schedule, and mission. However, since inadequate funding is often unavoidable, it is imperative to use a data-driven methodology to predict schedule deviations and calculate ideal cost phasing to mitigate additional or unanticipated implications on project timeline. This paper demonstrates how a constrained phasing model uses historic project cost and schedule data to estimate a new project timeline based on a constrained funding profile. It also reveals how the model re-phases costs for the remainder of the project duration to generate a viable execution plan.2024Modeling
Costing a Ballistic ScheduleRob CarlosJoin us to explore a timely solution addressing recurring concerns in the DoD involving cost overruns and schedule delays resulting from program practices and schedule dynamics. We'll address the power of Integrated Cost & Schedule Risk Analysis (ICSRA) & Joint Confidence Level (JCL) assessment from a DoD program office perspective, emphasizing its practicality. Such outputs yield more reasonable and quantifiable estimates by incorporating cost & schedule risk and uncertainty. We'll present a case study involving a DoD ACAT IB program, discussing the lessons learned during ICSRA implementation and JCL attainment. Our presentation illustrates the impact of ICSRA and JCL, facilitating improved forecasting, early risk identification, trade space analysis, and informed decision-making. The primary objective is to provide real-world insight based on lessons learned, quantitative analysis, and creative problem solving on the efficacy, utility, and power of ICSRA and JCL.2024Modeling
Flavors of Commonality: Learning in a Multiple Variant EnvironmentBrent M. JohnstoneCommonality – the reuse of parts, designs and tools across multiple aircraft models -- is a popular strategy to reduce program costs in commercial and military applications. But its use poses unique challenges to learning curve practitioners. This paper examines five approaches to estimating multiple variant programs using different learning curve techniques. A notional dataset is created, and the accuracy of each method is measured to highlight the advantages and disadvantages of each. This presentation should be of interest to anyone doing learning curve analysis in their cost estimates.2024Modeling
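For readers unfamiliar with learning-curve mechanics, here is a minimal sketch of the standard unit-curve formula plus one of several possible commonality treatments (pricing a variant's first unit further down a shared curve); the paper's five actual methods are not reproduced here, and all numbers are notional:

```python
import math

def unit_cost(t1, n, slope):
    """Crawford unit learning curve: cost of unit n, given first-unit cost
    t1 and a learning slope (0.85 means an 85% curve)."""
    b = math.log(slope, 2)  # curve exponent; negative for slopes below 1.0
    return t1 * n ** b

# One illustrative commonality treatment: price variant B's first unit as
# if it were the next unit on variant A's shared curve.
a_units = 50                                    # notional units of variant A built
b_first = unit_cost(1000.0, a_units + 1, 0.85)  # B's first unit priced as unit 51
```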
Installation Cost AnalysisEric WhiteNavy IT program managers have been frustrated in recent years by increasing system installation costs. Large amounts of siloed but related installation cost data have previously proven difficult to analyze and mine for core problem areas. This paper describes an innovative new solution to this problem, utilizing data visualization tools to combine related data sources and illustrate the trends and relationships in visuals that make them easy for program managers to consume and act upon. By dynamically expanding cost data, this visualization dashboard can express cost across time, product types, location, and more, while also offering the ability to quickly drill into the inherent cost makeups. Not only can this tool quickly identify significant variances, it also offers an explanation of those cost variances. Once the historical cost data is understood, it is then used in a cost model that accounts for the time value of money across future years.2024Modeling
Scrutinizing an Organization's Project Planning PerformanceSergey KozinExplore the intriguing world of Project Planning within a typical sustainment organization, spanning nearly a decade's worth of estimation and execution data for dozens of special projects as PMs, Engineers, and Estimators desperately fight to defend their budgets and keep the system operating. Did we prioritize having thoroughly developed requirements definitions or wait until the 11th hour to establish them? Were schedule and scope realistic, or heavily reliant on optimism as a primary methodology? Did we find ourselves broken and send up a signal flare or accept the shackles of a constrained budget? It is accepted that no plan or estimate is perfect, but rarely do we scrutinize and quantify the errors of our ways to encourage improvements within the process. Join this thought-provoking expedition, as we use metrics to judge the performance of planning practices, seeking insights and wisdom for the projects that lie ahead.2024Processes & Best Practices
Mission Class in Unmanned Space EstimatingJohn SwarenThe cost engineering community needs consistent guidelines in addressing mission assurance processes for a given space vehicle mission risk class (A, B, C, or D) based on programmatic constraints and mission needs. This presentation reviews current considerations and research. Current best practice recommendations typically relate Mission Class to an operational environment specification that conveys quality information based upon requirements. Specific end item maintenance accessibility, reliability, structuring, testing and documentation requirements are typically driven by mission operating environment. More user-definition is needed for factoring in parts quality, test-sampling, orbit-ranges and mission duration. Mission Class estimating needs to tailor component-level Part Quality as well as affect "informed" higher-level assembly and system charges. Modeling operational environment should reflect specification flow-down, validation and documentation and modification/ integration of subcontracted material items.2024Processes & Best Practices
A Look Back: Ten Years of Cost Estimate AuditsBen WilderSince the Government Accountability Office (GAO) Cost Guide was released as an exposure draft in 2007, GAO has used it as criteria to review and assess agencies' cost estimates. This presentation will look at a 10-year period (FY13-FY23) to see (1) if there are any consistent gaps in agency performance across the four characteristics of a reliable cost estimate and (2) if there has been any improvement in scores over the course of the 10-year period.2024Processes & Best Practices
GAO Agile Assessment Guide: Best Practices in an Agile EnvironmentJennifer LeottaIn 2020, the Government Accountability Office (GAO) released the Agile Assessment Guide as an exposure draft. After an open comment period, vetting comments, updating text, and applying best practices during audits, we have recently issued the updated Agile Guide. This presentation will provide an overview of the best practices discussed in the Guide and then take a deeper dive into Chapter 7, which covers program controls such as using a WBS in an Agile environment, and share what we have found in recent audits of programs applying cost estimating, scheduling, and EVM best practices in an Agile environment.2024Processes & Best Practices
A Series of Unfortunate SlidesShannon CardozaEmbark on a journey through the realm of impactful presentations, where we unravel the secrets to captivating briefings. Picture this: a vivid showcase of real-life blunders that often muddy the waters of comprehension and engagement—slides lacking labels, drowning in excessive words, or confounded by chaotic transitions. Join us as we delve into the essence of strategic naming and purposeful design to craft presentations that captivate and inform. Witness the transformation with us as we reveal the magic of visuals and charts, drawn from successful briefings to Cost Chiefs, PEOs, and beyond. You'll discover how to master the art of avoiding tricky questions by leveraging compelling visuals and enhancing your soundtrack for seamless narrative flow. Moreover, we'll shed light on how these skills not only save valuable time and resources but become a cornerstone for professional growth — empowering you to conquer larger audiences with clarity and confidence.2024Processes & Best Practices
The ABC's of Contract Proposal Pricing Evaluation & Cost AnalysisChristopher SvehlakAlmost every nook and cranny of the Government relies on contracts for services, support and, well, "stuff." As a cost estimator (especially a certified one), you may not know that you probably already have the requisite base of knowledge, skills, abilities and Excel-spreadsheet-jockey talent to learn and do pricing evaluation and cost analysis of contract proposals. This presentation offers you the "what-for," the "why," and the "how-to-perform" to potentially add this tool to your arsenal. It will distinguish between price evaluation/analysis and cost analysis, their purposes, when each is needed, and explain cost realism and reasonableness. Then comes the nitty-gritty -- how to perform a pricing evaluation and cost analysis on contract proposals. The goal: you leave with a better understanding & appreciation of the process ... and perhaps even consider offering your services to the contracting department.2024Processes & Best Practices
The Complex Database Design Tool BeltJamie BouwensThe process of designing a Dimensional Database (DDB) for complex and evolving data types can be difficult for those who have never made one before. A case study is used to demonstrate how to turn an unsustainable method of data management into a DDB using two Six Sigma methodologies: Define-Measure-Analyze-Improve-Control (DMAIC) and Define-Measure-Analyze-Design-Verify (DMADV). DMADV is a preventative method used to create a process from scratch, while DMAIC is a reactive method used to improve an existing process. We walk through answering questions such as: What is a fact in a varied, complex, and evolving data set? How do you visualize these fact tables and dimensions? How do you track time-phased data? We illustrate that these techniques are cost-saving because they reduce rework and, most importantly, enable individuals without extensive prior experience to successfully implement an operable DDB.2024Processes & Best Practices
Context-Responsive Cost Evaluation: Dynamic Approach to Cost Estimate ReviewsBrittany HolmesAnnually, MDA's Cost Estimating Directorate self-audits program cost estimates and provides each program a score ranging from 0.0 to 4.0 based on questions extracted from GAO's Cost Estimating and Assessment Guide. The questionnaire does not consider details about the program being estimated, so it assumes that an immature program can achieve the same score as an established program. This project suggests a new approach to assess estimates that takes into account the life cycle and known details of the program. This presentation compares the two scoring mechanisms, details the variables used to define optimal scores, and defines how to implement the new scoring mechanism.2024Processes & Best Practices
Space Fence: A Cost Analysis Success StoryRick GarciaIn 2007, the AFCAA was tasked with providing a Non-Advocate Cost Assessment (NACA) for the Space Fence program, among others. Drawing on the author's experience as a central cost estimator on that team, this analysis will describe the research and actions taken by the team to develop cost and schedule estimates that represented an unbiased view of the program's most likely cost under normal conditions. This analysis will also describe the partners used for the critical independent technical assessment, as well as the successful interactions with the industry-leading companies that were competing for the contract. Lastly, this analysis will describe how our team developed a Ground Radar specific expenditure phasing model based explicitly on historical ground radar programs.2024Processes & Best Practices
Convincing Leaders of the Value of COTS Tools for Quick AssessmentsKaren MourikasMultiple COTS tools and industry databases exist in our profession. But many organizations prefer to develop their own tools based on their own historical data, which then better represents their own environment. However, often the effort to develop these tools can be time-consuming. What happens when decision makers need answers immediately and there isn't enough time to collect and analyze their own data? One approach employs COTS tools and their underlying industry data. But cost analysts often need to convince decision makers of the validity of using COTS tools. This presentation describes several use cases in which program decision makers needed information right away, issues facing the decision makers, how the cost analysis team convinced program leaders of the validity of using COTS tools, including their pros & cons, as well as surprising insights that emerged, ultimately enabling decision makers to determine feasible paths forward.2024Soft Skills & Trending Topics
Priceless Culture: Crafting a Culture for the Future of WorkCassidy ShevlinPriceless Culture: Crafting a Culture for the Future of Work delves into the intricate web of elements that constitute a thriving organizational culture. At its foundation lies effective leadership, setting the tone for a space where core values are not just stated but lived out daily. A unified purpose drives every team member, fostering genuine accountability across all levels. Essential to this mosaic is effective communication, ensuring that everyone is not only heard but also understood. Furthermore, the culture is enriched when leadership embraces vulnerability, showing authenticity and encouraging openness. Intertwined with all these is the spirit of gratitude, acknowledging every contribution, big or small. In an era where workplaces are rapidly evolving, crafting such a priceless culture is not merely beneficial—it's imperative for the future of work.2024Soft Skills & Trending Topics
Equity and Environmental Justice in Early-Stage NNSA PlanningHaley HarrisonRecent executive orders (EO13985, EO13990, EO14008) directed federal agencies to prioritize environmental justice and reduce systemic barriers affecting minority and underserved groups. As an organization specializing in decision support for early-stage planning, the Office of Programming, Analysis, and Evaluation has developed a framework for incorporating quantifiable factors as proxies for equity and environmental justice-related factors in analyses of alternatives and early-stage planning studies. This framework will be used to inform decision-makers about potential project impacts from an equity and environmental justice-focused lens. An equity and environmental justice-informed approach to planning within the NNSA can minimize the incidence of negative environmental and health outcomes, maximize the number of opportunities available to historically marginalized groups, and contribute to greater trust of the NNSA mission within minoritized communities, improving equity.2024Soft Skills & Trending Topics
Advancing the Art of Cyber Cost EstimatingAustin MacDougallThe growth in quantity and intensity of cybersecurity threats has led to new cyber best practices, such as Zero Trust and Secure by Design. These practices present challenges when developing cost estimates for the development and maintenance of information systems. This paper examines how these topics and other new cyber trends influence costs. It evaluates the cost implications in both the design (incorporating cyber requirements into new system development) and sustainment (cyber support for existing systems) phases. This research also examines existing cyber frameworks and relates them to a cost element structure to drive data collection and methodology development. Finally, this paper translates cyber cost estimating lessons learned into recommended content improvements to the technical baseline documentation upon which cost estimators rely. Standard treatment of cyber in technical baselines should facilitate much needed consistency in the composition of cyber cost estimates.2024Soft Skills & Trending Topics
Mind the Gap: Bridging the Generational DivideJennifer AguirreDo you ever feel you're speaking a different language than your peers? Ever struggle relating to your IPT as they talk about recent college experiences or upcoming retirement plans? Join us as we explore various ways each generation sees the world, whether through their own eyes or through a high-res smartphone camera. Let's bridge that gap to reap the full benefits of working in a multi-generational environment enabling effective connections between cost and IPT members. With each wave of people comes new ideas, perspectives, communication styles, and workplace preferences. This diversity can be challenging to navigate and when not properly managed can cause miscommunication, feelings of exclusion, disconnected goals, and failed tech baselines. When harnessed properly, it can be the superpower enabling success within team cohesion, gathering cost inputs, and delivering estimate packages. Whatever stage of life you're in, come with us on a journey of self-discovery in the workplace!2024Soft Skills & Trending Topics
ChatGPT: Friend or Foe - Meet Your New EN SMEPatrick CaseyChatGPT: Friend or Foe is an insightful exploration into the capabilities and nuances of ChatGPT. Delving deep into the genesis of this AI model, the presentation tracks its evolution from inception by OpenAI to its fourth iteration. Patrick Casey, a Senior Cost Analyst at Quantech Services, candidly shares his experiences with the tool, highlighting its transformative power in various use cases for cost analysts, ranging from WBS considerations to innovative recipe creations. While celebrating its prowess, the presentation does not shy away from addressing its limitations and security concerns, urging a cautious approach. As a grand finale, attendees are treated to an entirely AI-generated TV commercial. This engaging journey demystifies ChatGPT, offering both appreciation and critical insight into this modern marvel. Beyond mere technology, the presentation invites audiences to consider the impact of AI in our lives, challenging us to harness its potential responsibly.2024Soft Skills & Trending Topics
Economics of Digital Twins in Aerospace and DefensePatrick K. MaloneDefense and aerospace systems engineering is transforming from a document-based to a digital model framework, leveraging low-cost multidisciplinary modeling, analysis, and simulation tools. With these methods, engineers can specify, analyze, design, and verify systems. Digital Twins (DTs) enable this approach: they are digital or virtual replications of physical products and processes, allowing increased speed to market and performance evaluation at reduced cost. Evaluating return on investment is not straightforward when estimating the cost to develop digital twins. This paper looks at the development of DT architectures, capabilities, and resulting life cycle cost estimates. Factors impacting DT development costs are model fidelity, design features, analytical tools, integration difficulty, scalability, and programming languages. Concepts are grouped to provide practitioners with tools and methods to apply digital twin concepts to recommended solutions that maintain positive ROIs and identify cost drivers.2024Soft Skills & Trending Topics
From "Plan and Pray" to "Sense and Respond": War Gaming Defense AcquisitionAlex Wekluk"The most dangerous phrase in the language is, 'We've always done it this way.'" - Rear Admiral Grace Hopper. The need for flexible and rapid solutions in the face of emerging threats warrants a radical reset in defense acquisition. NATO's canonical post-World War II plan-acquire-pray acquisition processes lack the agility to meet a generational change in what military historian John Keegan calls the face of battle. A new paradigm is urgently needed to meet the exigencies of modern warfare with the adaptability of the best business firms: innovating and reacting at the speed of competition. This paper provides an innovative risk-driven framework for an Acquisition War Game that laser-focuses on key metrics such as scalability, logistical footprint, time-to-contract, and fungibility – to support today's battles and near-peer competition with our enemies. This new Acquisition War Game strategy senses and responds rather than plans and prays, meeting reality head-on in an ever-changing battlespace.2024Strategy
FP&A: Can We Disrupt Traditional Government Cost Estimating?Christopher MetzThere is tremendous value potential in the cost estimates built today across Government under the guidance of GAO's best practices, but perhaps with varying realization. "Cost Teams" and "Cost Estimators" are sometimes viewed as simple calculators of FTEs times labor rate in the minds of those who do not understand where a cost estimate goes and how its value can increase the chance of mission success. At our relatively new DoD Agency, we set out to find the industry equivalent to "Cost Estimating" and found "Financial Planning & Analysis (FP&A)." As we stand up this competency, we have been gathering and integrating best practices from industry's "FP&A" and Government's "Cost Estimating," along with novel ideas and contracting vehicles, to disrupt the cost estimating field to better operationalize our cost estimates, steward taxpayers' dollars, and meet the mission.2024Strategy
Portfolio Analysis Made Effective and SimpleBrandon SchwarkEffective portfolio analysis strategies rely on robust recognition of resource constraints, competing priorities, interdependencies, and executability. They transform complexity into simplicity. Our strategy details a flexible, efficient, and analytically rigorous evaluative framework that integrates complex sets of interconnected analyses to assist leadership with data-driven resource allocation. The framework offers solutions in data cleaning, optimization algorithms, and visualization tools that enable stakeholders to effectively navigate complicated portfolio landscapes. Applicability of the framework is demonstrated through a use case that details a facility construction portfolio expected to grow aggressively in the coming years. This paper addresses the complex and often conflicting portfolio objectives mentioned above and outlines their corresponding solutions.2024Strategy
Parametric Construction Estimating: Consistency through TurbulenceCortney CollinsNot all estimates are created equal, but all are necessary. How is construction estimating different from DoD estimates? They both predict costs based on agreed-upon requirements; they both use historical information to develop parametrics; and they both exist as living documents, updated as new information becomes available, and delivered to the customer to assist with budgeting and purchasing. So – how are they different? This paper will highlight some of the major disparities – from how inflation is handled, to validity of pricing, to how the current economy factors into the estimate. We will also explore how materials could be affected by pandemics and natural disasters (hurricanes, earthquakes, etc.). All of these events can impact the prices of lumber and steel – which in turn, can have estimators scrambling to update the models.2024Strategy
Leveraging Cost Estimating Techniques in Price to Win AnalysisDarren KreitlerLeveraging cost estimating techniques is pivotal in "Price to Win" (PTW) analysis for competitive bidding. This session delves into various techniques, from analogy-based to parametric and bottom-up estimating. By integrating these methods with PTW analysis, organizations can strategize optimally, balancing profitability with competitive pricing. Real-world applications underscore the benefits of this synergy, emphasizing the role of accurate cost prediction in securing contracts and ensuring sustainability in today's dynamic markets.2024Strategy
Cloud Estimating in the 21st Century – Okay, well in 2023!Chris PriceCloud deployments represent a fast-paced technology, and producing quality cost estimates for them is challenging. In the current state, cost estimates must be able to address Kubernetes orchestrators, containers, IaaS, and PaaS. Cybersecurity is also key to cloud deployments, and modern development processes include working in a DevSecOps environment using Agile software development approaches. This presentation will discuss all these challenges and describe ways to perform quality cost estimates for cloud deployments.2024Software
Simplified Software Estimation (SiSE) – Updated on Advancements and TrendsCurtis ChaseIn 2019, representatives from the DHS Cost Analysis Division (CAD) presented early research findings for their Simplified Software Estimating (SiSE) approach at the ICEAA annual professional development workshop. Since then, further advancements have ensued, facilitated by the IFPUG Simple Function Point (SFP) Method, revisions and expansion of the DHS CAD verb keyword lexicon, the addition of requirements risk and uncertainty considerations, and a full guidebook supporting the method. The addition of uncertainty reduces the risk associated with requirements and verb keyword ambiguities. It also gives the estimator the flexibility to create min/max/most likely estimates for requirements that are simply vague at this early requirements stage. As such, the sizing results take into account uncertainties related to different document writers, styles, and verb interpretations. This presentation outlines some of the key findings and ongoing research, and (re-)introduces the SiSE approach, offering a more streamlined and accessible process.2024Software
Agile Software Development Cost EstimatingJim GoldenThis presentation will discuss agile software development cost estimating in the multi-year planning cycle. Agile software development programs focus on near term workload and activities with only limited future planning cycles identified. Future cycles are only activated as their start date nears. Any model-based cost estimate or predictive analysis for the budget needs to be flexible, responsive, and adaptive to the daily dynamics of Agile software development program planning and execution. As a cost estimator, integrating with the IPT for a particular program or project has always been a critical factor in understanding requirements, gathering data, and producing a quality estimate. With agile processes being adopted more frequently across software development organizations, cost estimators and program offices are challenged even further to work closely with developers to continuously update cost estimates. Agile sprint results reveal progress of development, and subsequently could affect the cost estimate and budget requests.2024Software
How to Choose a Database Storage ModelTristan JuddTo design and implement a database solution, teams must conceptually understand how data is formatted in storage. We compare traditional ways of storing data in Excel or CSV formats with that of a scalable SQL format. Within a SQL database, data is typically stored in either a relational or dimensional format and we will explain these formats for novices with examples. Relational may be easier to implement but is less powerful than a dimensional format. We take you through the process of analyzing the types of data used in a team, and how that would be reflected in a dimensional format. The ability to query efficiently, linkage to popular business intelligence techniques, and scalable structure make dimensional databases the preferred option for structured data storage.2024Software
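As a minimal sketch of the dimensional (star-schema) pattern the presentation contrasts with flat Excel/CSV storage, using SQLite with invented table and column names:

```python
import sqlite3

# Invented star-schema tables: a fact table of costs keyed to a date dimension
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_date (
        date_key    INTEGER PRIMARY KEY,
        fiscal_year INTEGER
    );
    CREATE TABLE fact_cost (
        date_key    INTEGER REFERENCES dim_date(date_key),
        wbs_element TEXT,
        cost        REAL
    );
""")
con.executemany("INSERT INTO dim_date VALUES (?, ?)",
                [(20230101, 2023), (20240101, 2024)])
con.executemany("INSERT INTO fact_cost VALUES (?, ?, ?)",
                [(20230101, "1.1", 120.0), (20240101, "1.1", 150.0)])

# A dimension join turns time-phased rollups into a simple GROUP BY
rows = con.execute("""
    SELECT d.fiscal_year, SUM(f.cost)
    FROM fact_cost f JOIN dim_date d USING (date_key)
    GROUP BY d.fiscal_year ORDER BY d.fiscal_year
""").fetchall()
```

The efficient-query and BI-linkage advantages the abstract cites come from exactly this shape: facts stay narrow, and every analytical slice (year, WBS, site) is a join to a dimension rather than a new spreadsheet column.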
Measuring Software Development Efficiency in an Agile EnvironmentBenjamin TruskinAgile software development practices, while designed to deliver value sooner and accommodate changing requirements, are not intended to mitigate cost growth. Nevertheless, Program Managers must navigate this paradigm and control risk while ensuring stakeholder requirements are fully met. Traditional metrics used to measure growth (e.g., SLOC counts, productivity factors, requirements sell-off) are likely unavailable in Agile projects and while recent DoD policy recognizes the need for metrics, agile metrics are not standardized and using them for independent estimation is uncommon. This paper discusses real-world experience balancing leadership's goals for independent analysis with the realities of an Agile environment. It will show the value of utilizing program-specific metrics and calculating useful measures such as Change Traffic and Feature (in)efficiency for producing defensible estimates, enabling better program outcomes, and providing insights for others to use themselves.2024Software
A Software Sizing Toolkit – Choosing the Right Approach(es) for Your PCarol DekkersYou've probably heard of source lines of code (SLOC) and function points as choices for software size, but what about RICEFW, t-shirt sizing, story points, and Simple Function Points? Like the old adage "If all you have is a hammer, everything looks like a nail," the most appropriate software sizing approach for your cost estimate may include multiple sizing methods. This presentation outlines the various units of measure available and how and when each approach is most suitable. It's a primer for cost estimators who are new to software-intensive systems and need to understand the options available when estimating software projects.2024Software
Unlocking Untapped Software Metrics Potential with Jira's RESTful APIBlaze SmallwoodMany software projects manage their efforts in Application Lifecycle Management (ALM) tools, like Jira, and these tools can capture a rich set of data, which can be a treasure trove for a cost or project analyst. However, many analysts limit themselves by simply exporting flat lists of records from the tool's User Interface (UI), which ignores valuable data captured in the system that can further enhance various analyses. This paper will focus on Jira and explain how an analyst can access several interesting additional data sets from its RESTful Application Programming Interface (API) with appropriately structured Uniform Resource Identifiers (URI). This paper will also cover how an analyst can use Java or Python programming to parse the JSON data returned from the API and produce simple but powerful data formats that can inform metrics dashboards or cost analyses.2024Software
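The JSON-parsing step this abstract describes can be sketched in Python. The payload below mimics the shape of a Jira REST search response (`issues` list with `key` and `fields`); `customfield_10016` is a hypothetical story-point field, since custom field IDs vary by Jira instance, and a real analysis would fetch the JSON from the API rather than a string:

```python
import json

# Sample payload shaped like a Jira REST search response (illustrative values).
payload = json.loads("""
{
  "issues": [
    {"key": "PROJ-1", "fields": {"issuetype": {"name": "Story"}, "customfield_10016": 5}},
    {"key": "PROJ-2", "fields": {"issuetype": {"name": "Bug"},   "customfield_10016": 3}}
  ]
}
""")

# Flatten the nested JSON into simple rows suitable for a metrics dashboard.
rows = [(issue["key"],
         issue["fields"]["issuetype"]["name"],
         issue["fields"]["customfield_10016"])
        for issue in payload["issues"]]
print(rows)  # [('PROJ-1', 'Story', 5), ('PROJ-2', 'Bug', 3)]
```

Flattening nested API responses into tabular rows like this is the "simple but powerful data formats" step the abstract alludes to.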
Risky Business: Navigating the World of Software ProductivityDave BrownSize and productivity are commonly cited as the two major software development cost drivers. Logic dictates that the two are related and inversely correlated. But what is the probabilistic range of uncertainty for productivity, given a software size? What is meant by “an 80% confidence level for productivity”? Cost analysts often quantify uncertainty with an S-Curve; why can't this be done for productivity directly? We use International Software Benchmarking Standards Group (ISBSG) data to estimate the distribution of productivity directly and provide closed-form formulas for the fitted distribution(s). We find that productivity (and, with certain assumptions, cost) can be estimated with an S-Curve directly, using built-in Excel formulas, with no need for Monte Carlo simulation. This result has significant implications for almost any software development cost estimate, and is particularly relevant to agile development efforts where time-boxed effort is generally fixed. 2024Software
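One way the paper's closed-form idea could look, assuming a lognormal productivity distribution (the actual fitted distribution and ISBSG parameters are the paper's results and are not reproduced here), is a percentile function built from Python's `statistics.NormalDist` — the stdlib analogue of the built-in Excel formulas the abstract mentions:

```python
from math import exp, log
from statistics import NormalDist, mean, stdev

# Hypothetical productivity observations (e.g., hours per function point).
obs = [8.0, 10.0, 12.0, 15.0, 20.0, 25.0]

# Fit a lognormal by matching the mean/sd of log(productivity).
logs = [log(x) for x in obs]
mu, sigma = mean(logs), stdev(logs)

def productivity_at(p):
    """Closed-form S-curve value: the p-th percentile of the fitted lognormal."""
    return exp(NormalDist(mu, sigma).inv_cdf(p))

p50 = productivity_at(0.50)  # median productivity
p80 = productivity_at(0.80)  # "80% confidence level for productivity"
```

No Monte Carlo simulation is needed: the whole S-curve is available directly from the inverse CDF.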
Sizing Agile Software Development ProgramsBob HuntSize is a critical element in software cost estimation. As Agile has become more prevalent, the use of lines of code as a software size metric for software estimation has become less accepted. This presentation will discuss and compare sizing alternatives, including "tee shirt" sizing and functional size alternatives, on large Federal Programs. The presentation will provide some emerging metrics for assessing size. Since many automated models convert functional size to physical size, the presentation will address techniques to accomplish "backfiring". The presentation will also address the use of Natural Language Processing and models such as Cadence and ScopeMaster. Finally, the presentation will discuss models such as COCOMO III that directly convert functional size to hours.2024Software
Why Care About CEBoK-S if we Don’t Build Software?Carol DekkersGiven the increase in software-intensive programs today, it should come as no surprise to experienced cost estimators that even minor software development can render a program overbudget and behind schedule. This presentation outlines the key differences in cost estimating approaches from traditional industries (hardware, facilities, systems) versus software development, and why CEBoK-S knowledge is critical for today’s cost estimators. Given that close to 60% of software projects are deemed failures (overbudget and/or late), with little improvement despite modern technologies, understanding the basics of software cost estimating can provide a competitive advantage for anyone involved in estimating programs for which software development is a component.2024Software
Design to Cost: What it is and Why You Should CareKaren Mourikas, Henry Apgar, Lisa ColabellaDesign to Cost (DTC) has been around since the 1970s, falling in and out of favor. But what exactly is DTC? And will it help control program or product costs? Googling it yields various definitions, sometimes contradictory. Examples range from large defense programs to mass-produced commercial components. Implementation differs depending upon company, life-cycle phase, and objectives. This presentation is the culmination of an effort by the SoCal ICEAA Chapter to educate our community on DTC.2023Analytical Methods
Accuracy, Precision and Uncertainty in Cost Estimates Timothy P. AndersonCost proposals for large government contracts are necessarily very detailed and typically highly precise, expressed down to the nearest dollar. This presentation explores the differences between precision and accuracy in cost proposals in the context of uncertainty and cost growth. Moreover, the presentation is interactive, with thought-provoking questions peppered throughout, providing numerous opportunities for audience participation for improved knowledge and retention.2023Analytical Methods
Innovative Risk-Driven Contract Pricing StrategyBrian Flynn, Robert Nehring, Peter BraxtonThis paper presents a framework or scoring matrix that quantifies contract risk using a numerical evaluation of the many factors that make or break a program, such as: experience of the contractor, stretch in technology, solidity of requirements, and degree of competitive procurement. The framework, in turn, is made operational by leveraging benchmarks from over 60 U.S. contracts, enabling data-driven selection of contract type, incentives, and share lines for use in evaluating future contract prices.2023Analytical Methods
One Number to Correlate Them All - Efficient Implementation of CorrelationAnh (Anne) Harris, Karen McRitchie, Christian SmartCorrelation is a key consideration in cost and schedule risk analysis, as its exclusion causes significant underestimation of uncertainty. In the absence of functional correlation, values can be assigned by considering every pair of WBS elements. However, this can be time-consuming for a detailed estimate. In this presentation, we discuss an alternative method that uses a single value, which offers significant time savings, and discuss its implementation in the SEER model suite.2023Analytical Methods
Incorporating Risk into Analysis of Alternatives Results: A Novel MethodologyBrittany Clayton, Lauren MayerWhile Analysis of Alternatives (AoAs) commonly calculate the expected cost and effectiveness of alternatives, risks to cost overruns and degraded mission effectiveness are analyzed separately. RAND developed and employed a novel methodology for incorporating these risks into cost and effectiveness results. We collaborated with each AoA working group to identify, quantify and map risks to existing cost and effectiveness model variables, allowing for results to be displayed as a comparison of the alternatives’ risk-adjusted cost-effectiveness.2023Analytical Methods
Using Bayes' Theorem to Develop CERs – Extending the Gaussian ModelChristian Smart, David JoBayes' Theorem is a mathematical method to combine prior experience with new data and is extremely important in leveraging limited information, which is often the case in cost estimating. This paper is an extension of a previous ICEAA paper that dealt with the application of Bayes' Theorem to cost estimating. In this update we show how the Bayesian approach for linear models can be extended for different, more realistic assumptions.2023Analytical Methods
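As a hedged illustration of the Gaussian conjugate update this abstract builds on, the sketch below combines a prior estimate of a CER parameter with new data; the numbers and the known-variance assumption are invented for illustration, not taken from the paper:

```python
# Conjugate Normal-Normal Bayesian update with known data variance,
# in the spirit of blending a prior CER slope with new program actuals.
def posterior(mu0, var0, xbar, var_data, n):
    """Return posterior mean and variance for the Normal mean parameter."""
    prec = 1.0 / var0 + n / var_data                      # posterior precision
    mu_post = (mu0 / var0 + n * xbar / var_data) / prec   # precision-weighted mean
    return mu_post, 1.0 / prec

# Hypothetical numbers: prior slope 2.0 (variance 0.25); four new data
# points averaging 3.0, each with variance 1.0.
mu_post, var_post = posterior(2.0, 0.25, 3.0, 1.0, 4)
```

The posterior mean (2.5) lands between the prior (2.0) and the data mean (3.0), weighted by their precisions, and the posterior variance (0.125) is smaller than either source alone — the "leveraging limited information" behavior the abstract highlights.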
Alternative Risk Measures for Determining Program Reserves Louis FussellNASA project managers hold cost and schedule reserves equal to a 50% joint confidence level and the managing directorates hold reserves to a 70% confidence level. These joint confidence levels are quantile risk measures. This paper discusses the drawbacks of quantile risk measures and proposes the use of super quantiles for determining project reserves. Several projects are analyzed and a comparison of risk measures is presented.2023Analytical Methods
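A minimal sketch of the quantile-versus-superquantile contrast, using a notional sample of cost outcomes (the empirical estimators below are one simple choice, not the paper's exact formulation):

```python
# A superquantile (a.k.a. CVaR / expected shortfall) averages all outcomes at
# or above the chosen quantile, so it reflects the shape of the tail rather
# than a single cut point.
def quantile(xs, p):
    xs = sorted(xs)
    idx = min(int(p * len(xs)), len(xs) - 1)
    return xs[idx]

def superquantile(xs, p):
    q = quantile(xs, p)
    tail = [x for x in xs if x >= q]
    return sum(tail) / len(tail)

costs = [100, 105, 110, 120, 150, 300]  # notional outcomes with a heavy tail
q70 = quantile(costs, 0.7)       # 150: ignores the 300 outcome entirely
sq70 = superquantile(costs, 0.7) # 225.0: the 300 outcome pulls reserves up
```

This is the drawback of quantile risk measures the paper discusses: two projects with identical 70th percentiles but very different tails would get identical quantile-based reserves, while their superquantiles would differ.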
The Space Between Us: A Novel Collaborative Spacecraft Estimating FrameworkAlex Wekluk, Benjamin TruskinGovernment space acquisitions are renowned for their complexity and can suffer from notable cost and schedule overruns. The space cost community is composed of disparate organizations employing different cost estimating methods, even when the same industry partners are building the spacecraft. This paper leverages a Technomics internally-developed space estimating framework, industry-released Cost Estimating Relationships from multiple agencies, and NRO CAAG external collaboration experience to examine and contrast previously incompatible methods, and improve synchronization.2023Analytical Methods
Measuring Schedule UncertaintyTommie (Troy) MillerThe growing popularity of Joint Cost & Schedule Analysis has highlighted the need for quality SRAs. SRAs require that uncertainty distributions be assigned to all schedule activities. The most popular technique for making this assignment employs a binning structure that assigns each task to a particular bin. Most binning structures have little theoretical support. This paper attempts to improve on this technique by discussing how uncertainty determination and allocation is conducted within the Sentinel program.2023Analytical Methods
Spread Thin: Managing Coefficient of Variation in Monte-Carlo Based Cost Models Stephen KoellnerCoefficient of Variation (CV) can be utilized to determine whether sufficient uncertainty is captured in Monte-Carlo based estimates. This topic explores common barriers to capturing program level risk using the interpretation of a WBS as a linear combination of distributions. A WBS CV equation is provided to model perturbations of a baseline case and then randomized WBSs are generated to analyze CV at scale. Estimators can apply these insights to improve program estimate risk calculations.2023Analytical Methods
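The "WBS as a linear combination of distributions" view can be sketched as follows; the uniform pairwise correlation `rho` is one simple modeling choice for illustration, not necessarily the paper's WBS CV equation:

```python
from math import sqrt

# CV of a WBS total modeled as a sum of element distributions, with a single
# correlation rho applied to every off-diagonal pair of elements.
def wbs_cv(means, sigmas, rho):
    total_mean = sum(means)
    var = sum(s * s for s in sigmas)
    var += rho * sum(si * sj
                     for i, si in enumerate(sigmas)
                     for j, sj in enumerate(sigmas) if i != j)
    return sqrt(var) / total_mean

# Ten identical elements, each with CV = 0.30.
means  = [10.0] * 10
sigmas = [3.0] * 10
cv_indep = wbs_cv(means, sigmas, 0.0)  # ~0.095: independence dilutes risk
cv_corr  = wbs_cv(means, sigmas, 0.3)  # ~0.183: correlation restores spread
```

Even though every element has a 30% CV, the independent total's CV falls near 9.5% — a common symptom of the under-captured program-level uncertainty the abstract describes, and one reason correlation assumptions matter.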
Follow the Money in Government EstimatesGeoffrey P. BoalsStruggle to find the correct source for actuals? Contracts for RDTE items are only bundles with no breakouts for the data you need? In this session we will show estimators how to use government contracts, Military Interdepartmental Purchase Requests (MIPRs), and other sources to find the relevant data for a strong basis of estimate. The session will detail a proposed data structure to store relevant data from the separate data sources to help power your estimates.2023Data Science
"Cracking the Code": Demystifying Programming Languages for Data AnalyticsKyle Ferris, John MaddreyIn recent years, buzzwords such as "Machine Learning" and "Artificial Intelligence" have made their way into the cost community’s vocabulary. While data science concepts such as these certainly advance the cost estimating industry, their significance is lost when analysts find themselves unfamiliar with fundamental programming principles. This presentation aims to demystify and compare commonly used programming languages such as R and Python, with the purpose of helping analysts who wish to develop coding expertise but don’t know where to begin.2023Data Science
Level up your Cost Analysis with Data ScienceSarah GreenHow can developments in the field of data science help us transform many of our analytics capabilities in the Cost Community? AFCAA is leveraging the AF's cloud-based platform VAULT and state-of-the-art toolsets in order to increase process automation, data curation, and predictive modeling, and therefore significantly increase both the quality and efficiency of our data analytics. This presentation will include a live demonstration of custom cost applications, tools, and dashboards to help you in your everyday estimating – come see how the cost community is being revolutionized!2023Data Science
Web Applications for Cost Analysis Use: R Shiny vs Python Plotly DashEric J. Hagee, Christine K. ClayWeb applications are a new tool for the cost community. Here, we compare and contrast two common web application tools, R Shiny and Python Dash. We develop a dashboard in each tool to display various USA government inflation measures and compare/contrast the use of either tool. Our findings are summarized, future thought points identified, and relevant background on the tools, inflation measures, use cases, and tool implementation provided.2023Data Science
Using Machine Learning to Improve Cost Data NormalizationBryan MariscalCost data normalization is time consuming, but remains critical to building a defensible cost model. SSC's Unmanned Space Vehicle Cost Model (USCM) database contains over 120 normalized datasets that span nearly a 50-year history. Developing machine learning models from the older data that generalize well to newer data would be difficult due to data obsolescence. However, we have identified an example use case with promising initial results that will be examined in detail.2023Data Science
Automating the Data Preparation Process using R ProgrammingZachary WestOne of the most challenging aspects of the cost estimating process is the collection and consolidation of data prior to carrying out any analysis. Initial data collection and preparation steps are often manual, messy, and time consuming. I plan to present an automated solution using R Programming that efficiently collects and merges structured data reports, organizes and prepares the relevant data, and exports the results to support cost estimating efforts.2023Data Science
Interesting Results from EVAMOSC or "Wow There is a Lot of O&S Data"Daniel GermonyEVAMOSC is a new OSD-CAPE-developed database which ingests and normalizes Operations & Sustainment (O&S) data on over a thousand weapon systems for the DoD. In this presentation, the EVAMOSC team will review interesting results it has found related to O&S data using the hundreds of millions of datapoints it has ingested and normalized. Pareto rule? Try hyper-Pareto rule! How did COVID impact vehicle maintenance? Not how you would have expected. Are field units experiencing inflation/escalation? All this and more.2023IT & Cloud Computing
Applying Simple Function Point Analysis to an 804 Rapid Acquisition Program CostBob Hunt, Rainey Southworth, Chad LucasWhen software is developed under the Middle Tier of Acquisition (Section 804) of the Adaptive Acquisition Framework (AAF), these programs tend to be genuinely agile software development programs. These programs focus on delivering capability in a period of 2-5 years, with rapid prototypes and rapid fielding using proven technology. Sizing a software development program remains a major challenge. This paper outlines the advantages and issues associated with using Simple Function Point analysis to size and estimate the cost of a Major Rapid Acquisition Software Program.2023IT & Cloud Computing
Cloud-Based Machine Learning in the DoD EnvironmentConner Lawston, Bryan EckleA follow-up to the 2022 presentation "Supercharging Machine Learning with Cloud Computing." This project has made huge strides in the past year, having made over 5,000 predictions since February, all 100% autonomously and within the DoD ADVANA environment. There have been numerous improvements (like increasing ML accuracy via PCA and speeding up calculation time with Spark) and lessons learned as we expanded into production for the Joint Chiefs, MDA, Navy, Marine Corps, and more!2023IT & Cloud Computing
Addressing CI/CD with Automated Testing in Your Software EstimateArlene F. MinkiewiczAgile development and DevOps/DevSecOps practices are becoming increasingly common in software development projects in commercial, government and government contractor organizations. Three practices arising are Continuous Integration, Continuous Delivery and Continuous Deployment. Automated testing is clearly an investment, but one that promises to lead to productivity and quality improvements. This paper presents the general concept of automated testing focused on practices highlighted above. The cost impacts of employing automated testing practices will be demonstrated and discussed.2023IT & Cloud Computing
Comparing the Cost of Cloud Services: AWS vs AzureGunnar Nichols, Patrick Casey, Sergey KozinCloud Services continue to play a growing role in how the internet is used to provide capabilities to users. Amazon Web Services and Azure are the two biggest Cloud Service Providers on the market today. This research investigates cost differences for cloud services comprised of comparable virtual machines for workloads with defined computational requirements. This analysis is utilized to characterize scenarios where cost savings should be considered when supporting a Cloud Service Provider decision.2023IT & Cloud Computing
Shining Rays of Light & Savings on Cloud Portfolios: An Important Advance!Kenneth Rhodes, Alex Wekluk, RJ KrempaskyExisting cloud cost estimating and pricing tools have at least two significant drawbacks: 1) they require many assumptions, and therefore a detailed understanding of current/future architectures, and 2) they do not provide the insights required to manage costs. Technomics developed a groundbreaking alternative that will revolutionize how cloud costs are estimated and managed. This paper describes our parametric-based toolset, which enables cloud lifecycle cost estimation based on known programmatic design requirements, along with cost reduction and efficiency analysis.2023IT & Cloud Computing
Budget Execution and Margin Simulation (BEAMS)Erik BurgessODNI works with organizations in the Intelligence Community to submit multibillion-dollar budget requests to Congress, based on cost estimates that are not exact. For an agency portfolio, this study quantifies the impact of budgeting to higher or lower confidence levels. By modeling the interactions among programs and the mechanics of the annual budget closure process, we show that estimates somewhat higher than the 50th percentile are the "sweet spot" between minimizing growth and maximizing missions delivered.2023Modeling & Case Studies
Foundational Cost ModelsPaul Broaddus Franklin Jr.This study will unveil more than forty-five previously unpublished simple and learning-augmented cost estimating relationships for common construction materials and equipment. These relationships and other data allow us to investigate the hypothesis that the error distribution of conventional building components follows a skew exponential power distribution rather than the lognormal one usually applied to defense estimates. We also examine whether backcasting to the intercept can isolate the approximate labor cost underlying composite expense data. This paper shares those relationships in the hope of expediting cost estimating from the ground up.2023Modeling & Case Studies
CSI EU (Cost Scene Investigation – European Union)Doug HowarthIn 2000, several Airbus units from European Union member countries launched the Airbus A380. The program was projected to sell 1250 units with a development cost of €9.5 billion but shut down after selling 251 copies, as costs more than doubled. This paper examines its causes and how the entire debacle might have been prevented, as all the information needed to assess its viability existed before it started.2023Modeling & Case Studies
Trouble With the Curve: Engineering Changes and Manufacturing Learning CurvesBrent M. JohnstoneEngineering changes pose a dilemma for estimators: If learning curves assume cost improvement due to repetitive build, what happens when that repetition is interrupted by a change of task? Design changes are common occurrences, but rarely addressed in learning curve literature. This paper addresses how to analyze an engineering change by breaking it into its pieces and outlines techniques to calculate the reversionary impact on the learning curve to derive the estimated cost of change.2023Modeling & Case Studies
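The learning-curve mechanics behind this abstract can be sketched as follows. The "setback to an earlier unit" device below is a common textbook simplification of lost learning and only a stand-in for the paper's technique; the slope and unit numbers are invented:

```python
from math import log

# Unit learning curve: T(x) = T1 * x**b, where b = log(slope) / log(2),
# so doubling the quantity multiplies unit cost by the slope.
def unit_cost(t1, slope, x):
    b = log(slope) / log(2)   # e.g., 85% slope -> b ~ -0.234
    return t1 * x ** b

t1, slope = 1000.0, 0.85

# Undisturbed cost of unit 50.
baseline = unit_cost(t1, slope, 50)

# Model an engineering change at unit 50 as partial loss of learning: the
# affected work reverts to an earlier point on the curve (here, unit 10).
reverted = unit_cost(t1, slope, 10)
change_penalty = reverted - baseline   # extra cost attributable to the change
```

Because `b` is negative, reverting to a lower unit number raises unit cost, which is the reversionary impact the paper sets out to quantify piece by piece.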
Developing a Schedule Model From a Cost Modeler's PerspectiveBenjamin Kwok, Daniel NewkirkCost estimators need to utilize and develop estimating methods for project components beyond cost. Schedule is one of those components. The team that brought you USCM is now applying their model building experience to the world of schedules. This journey has yielded some exciting new methods, products, and processes. It has also brought challenges as we could not address schedule exactly as we would with cost.2023Modeling & Case Studies
All in the Hierarchy: Meta-Estimators to Standardize Work Breakdown StructuresMaura Anne Lapoff, Patrick SheltonThe National Nuclear Security Administration (NNSA) has demonstrated how to use Machine Learning and Natural Language Processing to map disparate cost data to a standard, high-level Work Breakdown Structure (WBS). However, mapping data to deeper WBS levels becomes increasingly complex due to the hierarchical relationship between levels, rendering common machine learning models inadequate. Here we demonstrate how to implement a Hierarchical Classification Machine Learning scheme to map multi-level, hierarchical cost data to a common WBS.2023Modeling & Case Studies
The Cost of BureaucracyZachary Matheson, Jenna Vandervort, Daniel MulrowThe National Nuclear Security Administration (NNSA) has separate management methods for capital construction projects, with enhanced requirements for larger projects. Naively, one might expect enhanced requirements to increase program management costs and possibly impact schedule. However, as the enhanced management requirements are tied to a dollar threshold, comparing costs is not straightforward. In this paper, we share results of a regression analysis that developed separate cost and schedule regressions for each management method and a comparison of the costs by management type.2023Modeling & Case Studies
Unprecedented – Accurate Estimating in the Hypersonics EraEric Sick, Christian SmartMany cutting-edge programs and emergent domains, such as hypersonics, have few direct historical precedents. This presents challenges in estimating the cost of such programs at the system level. A key insight is that many of the components on these systems have been implemented on numerous other historical programs, so one way to successfully tackle this challenge is by estimating at a component-level Work Breakdown Structure. This paper provides comparative examples of accurate estimates for aerospace advanced technology programs, including commercial space and government hypersonics programs.2023Modeling & Case Studies
Systems Engineering Cost Estimation Aligned with MIL-STD-881FF. Gurney Thompson III, Vivian TangThis presentation reviews advancements in estimating top-level Systems Engineering efforts. We will propose a new model structure for alignment with MIL-STD-881F definitions, and improved scope control and output usefulness. We will discuss the drivers of effort distribution within systems engineering activities and resources, including system criticality level, safety risks, logistics scenarios, human factors considerations, technology readiness level and more. This presentation also discusses ongoing data analysis efforts to support model validation and prototype demonstration.2023Modeling & Case Studies
Cost Estimating: The Estimator's Guide to a Parametric UniverseTaryn Anne ReillyImagining tomorrow's cost estimating using parametric methods. Experience the advantages of parametric estimating throughout the opportunity and program lifecycle. Learn ways to significantly improve your estimating efficiency, accuracy and traceability through historical data calibration. Develop cost early so it influences the design process. Understand the importance of a standard WBS and curating completed program data to build calibrated models. Augment information in the Cost Community through data sharing, improving publicly available sources, methods, models.2023Modeling & Case Studies
Shortfall Analysis – Creative Approaches to Problem QuantificationGeorge Bayer, Brian CarrollGovernment acquisitions start with problem identification as a means of solving public sector problems or "shortfalls" and inefficiencies with technical hardware and software solutions. To justify government business cases, analysts are tasked to (1) define the problem, (2) identify impacted stakeholders, and (3) creatively quantify and monetize impacts to operations and services. Using a proven approach, the team demonstrates how to effectively perform shortfall analyses and monetize the largest underlying value to justify business cases.2023Processes & Best Practices
You've Invested, Now Sustain it: Insights into the Sustainment Review ProcessAdam Kidwell, John Liss, Alex BonichCongress, in NDAA 2017, established Sustainment Reviews to assess the planning and execution of Major Defense Acquisition Program (MDAP) Product Support strategies. Readiness is a critical aspect of Defense Operations, and rising supply chain costs highlight the fact that effective planning, budgeting, and execution of a program’s Sustainment Strategy is crucial to its long-term success. This paper provides an objective, experience-based perspective of Sustainment Reviews and best practices to ensure successful outcomes.2023Processes & Best Practices
Educating Future Cost Engineers in the Space Station Design Workshop (SSDW)Fabian Eilingsfeld, Nicolaus MillinThe space sector seeks young cost engineers who can work in a concurrent engineering environment. One means for scouting young talent is the annual Space Station Design Workshop (SSDW) at the University of Stuttgart, Germany. Two competing teams (size: 20) each must design a space station within one week. SSDW’s concurrent design facility (CDF) format offers students a unique opportunity to learn in a hands-on, international, and interdisciplinary environment. The use of cost estimating tools for teaching at SSDW is described. Lessons learned from the 2022 post-workshop survey are presented.2023Processes & Best Practices
From Software CONOPS to ROM Estimate in Six Easy Steps Carol Dekkers, Dan FrenchCreating software estimates using a Concept of Operations (ConOPS) document is a challenge for software estimators due to the low fidelity of the requirements. This presentation will illustrate how to create a defensible ROM (Rough Order of Magnitude) Estimate using the new IFPUG software functional size measure. Carol and Dan will also explain how to use this size metric to develop a ROM Cost and Schedule by leveraging various software estimating techniques.2023Processes & Best Practices
Interviewing Subject Matter ExpertsMelissa StoneAs estimators, we often must obtain technical information from Subject Matter Experts (SMEs) in order to provide thorough and accurate analysis. But sometimes getting the information you need is easier said than done. This presentation discusses the steps for interviewing SMEs in order to make the most of their time and your time. It explores the benefits of building a cooperative relationship of mutual respect and the "dos" and "don'ts" of SME consultations.2023Processes & Best Practices
NATO Transformation Command: Launching a Cost Estimating CapabilityCandace Mahala, Ryan Feeks, Michael ThompsonMore than 70 years after NATO's founding, the Transformation Command recognized a cost estimating gap in its capabilities. It launched the development of a new cost estimating capability for the pre-concept and concept stages, remaining solution-agnostic at the programmatic level. This is the story of how to grow and cultivate cost estimating from program inception, establishing the fundamentals, integrating with other NATO organizations' cost communities and developing a closed-loop cost track for robust data.2023Processes & Best Practices
Managing Schedule Risk Expectations During Program ExecutionPatrick MaloneProgram managers use Integrated Master Schedules to complete projects. An IMS is a time-phased, logical view of the work and activity network. Organizations identify risks using rule-based risk taxonomies, or by selecting risk exposures and impacts from tables that grade them low to high. These assessments are carried forward but not always implemented in the schedule. We investigate schedule risk methods and provide an approach to more accurate forecasting, resulting in a process that can be implemented across programs.2023Processes & Best Practices
Take the Blue Pill!Javier Provencio, Chris Hutchings, Cassidy ShevlinDigital ecosystems are distributed technical systems that are scalable, sustainable, and often self-organizing; much like ecosystems found in the natural world. We will offer insight into what a digital ecosystem encompasses, along with relatable examples, and then explicate the virtues and value for our community in areas such as on-demand design to cost, MBSE, in-service support, and simulation. Finally, we will examine emergent relevant and significant use cases.2023Processes & Best Practices
Accidental Proof of Calculus for Cost AnalystsJack Snyder, Brittany SmithEver get derailed chasing your tail checking calculations, only to discover the cross check initiating the train wreck was flawed to begin with? Join us on an unflattering journey that began with a quick integral cross check and snowballed into the perfect proof for why calculus is a requirement for cost analysts. MATLAB, calculator, or by hand were the academic tools of the calculus trade; we'll also share how we use Excel for integral calculus.2023Processes & Best Practices
Methodology for Assessing Reasonableness of Large Scientific Facilities' CostsRaymond D. Woods, Valerie RockwellAerospace analysts developed a methodology to address the questions and challenges that arise when looking at reasonableness of large scientific facilities' costs. This methodology is an iterative process of developing tailored reasonableness ranges and systematically addressing lower WBS levels to refine estimates. Aerospace will share their experience in using this method, when it is applicable, the benefits of implementing this, and associated products - including a specialized ‘Scorecard' - to provide insights to program management.2023Processes & Best Practices
NLP: A New Approach to SFP EstimationDavid H. BrownThe need for data-driven methods for predicting software size has never been greater. This is especially true for agencies that rely on SME opinion or T-shirt based sizing as the primary method for estimating the size and cost of agile-developed software. This paper offers an alternative method — use of Natural Language Processing to automate the estimation of Simple Function Point counts – that offers two important advantages: 1) consistency across multiple counts and 2) speed.2023Software & Agile
The Symbiotic Nature of Requirements Quality and Software Cost EstimationColin HammondIt is widely recognised that high quality software requirements lead to reliable functional size estimates; what is less apparent is the potential contribution of functional sizing to help assess and improve the quality of requirements. This paper looks at the observations from assessments of software user stories using our NLP (Natural Language Processing) requirements analysis tool, and how this symbiotic relationship is revealed and enhanced with automated analysis tooling.2023Software & Agile
A Novel Approach for Early Phase Agile Software Estimating and SizingWilson Rosa, Sara JardineThis study investigates how well three high-level size measures (Capability Gaps, Capabilities, and Epics) accurately predict total agile development effort and schedule at the earliest acquisition phase. These measures were obtained from early program documents including Mission Needs Statement, Concept of Operations, and Release Roadmap. Analysis of Variance and Goodness-of-Fit were used to introduce six new estimation models based on data from 20 agile projects implemented between 2014 and 2022 in the DHS and DoD.2023Software & Agile
Scheduling Agile-Fall - Best PracticesSam Kitchin, Ryan WebsterA high-quality schedule can improve a program’s ability to clarify requirements, predict cost, and communicate effectively by creating a common denominator that links complex efforts across a project. Many Agile programs operate using a hybrid waterfall approach making a properly built schedule critical for understanding the relationship between agile development and government requirements. This presentation will demonstrate best practices for schedule construction in an Agile environment and its implications on cost, delivery, and performance.2023Software & Agile
Tales from the Trenches – Challenges of Data Analytics in a DevSecOps ProgramAlex SmithThe large databases of information captured in software planning tools within DevSecOps environments contain a wealth of information. However, this data is user-input and not created with data analysis in mind, which results in several challenges when attempting to analyze it. This presentation walks through the real-world process of collecting, cleaning, reviewing, and analyzing over 20,000 observations of agile software data to facilitate DevSecOps planning and road-mapping activities for a program entering the Software Acquisition Pathway.2023Software & Agile
How Efficient are Your Efficiency Levers?Eric van der VlietFor application service projects, an important aspect is making services more efficient by means of efficiency levers like juniorization, test automation, robotics, etc. The challenge is to combine these efficiency improvements and determine the effort and cost impact during the maintenance period. This presentation presents lessons learned with respect to applying efficiency levers in large application service projects and shows an estimation tool that supports a combination of multiple efficiency levers with different impacts and durations and calculates the monthly impact.2023Software & Agile
Ticking Time Bomb of Telework: Communication and Time ManagementJennifer Aguirre, Shannon Cardoza, Annie BagayToday’s evolving work environment has demonstrated the importance of emphasizing soft skills in the workplace. Employers must seek candidates with high interpersonal abilities in conjunction with technical expertise. Hard skills often dominate in job-related training, while critical foundations like the ability to communicate and manage workload effectively are often overlooked. This presentation will focus on expanding a person’s marketability by discussing techniques to open these communication pipelines and increase productivity despite an ever-growing workload.2023Technical Innovations
Data Science's Expanding Role in Cost AnalysisKevin McKeel, Daniel Harper, Greg FormanData Science has taken on an expanded presence in Cost Analysis. For example, NLP is used to automate functional software sizing in commercial models. Data Science methods and tools such as NLP can also be used to extract data to identify spending trends and agency funding. We will present an overview of modern usages of data science, to include Machine Learning, AI, and data visualization. We will also survey cost and budget data sources which are publicly available and ripe for analysis using data science methods.2023Technical Innovations
Software Estimation Using Functional Size Derived From User StoriesKaren McRitchie, Esteban Sanchez, Alton NgNatural language processing and ISO standard functional size metrics offer an opportunity for an improved approach to sizing and estimation directly from user stories. This presentation will offer an overview of how functional size generated by ScopeMaster can be used directly for estimation in SEER-SEM. This approach uses either COSMIC or Simple Function Points and considers the uncertainty around sized, ambiguous, and potentially missing requirements.2023Technical Innovations
Software Maintenance and Software Obsolescence: A Tale of Two Cost DriversSanathanan RajagopalSoftware obsolescence is one of the key issues in modern warfare technologies, especially in cyberspace, where vulnerabilities of obsolete software are exploited. Understanding these vulnerabilities and predicting and managing them cost-effectively are now key priorities of any major defence contract. In this original research paper the author looks at the relationship between software maintenance and its key cost drivers and software obsolescence, and further explores appropriate mitigation strategies and resolution approaches for managing software obsolescence.2023Technical Innovations
Wouldn't It Be Nice to Use All of Your Data Points: An Introduction to NICESteven ReadThe difficulty of having non-homogeneous data when fitting a CER can be addressed using data imputation. A Non-linear Iterative Constrained Estimator (NICE) is an approach to both data imputation and CER fitting that considers the whole model and uses the WBS relationships to recalibrate systems of equations for a better fit. This presentation will explain what NICE is, how it works, and begin to explore what problems this method can solve.2023Technical Innovations
VEGA: Shining Light on the Battles of Data Collection and ManagementAbby Schendt, Omar Akbik, Raymond VeraEvery cost organization battles both the collection and management of cost, schedule, technical, and programmatic data. This paper describes how the Office of Cost Estimating and Program Evaluation (CEPE) at NNSA has built and matured this capability since the office’s inception in 2015. This paper will present seven years of lessons-learned that will pay dividends for any cost organization looking to optimize how it collects and manages mission critical data.2023Technical Innovations
Hilbert's Problems III: The Post-Pandemic Cost CommunityPeter BraxtonUpended by the pandemic, the cost community stands at an inflection point. Revolutions in telework and data science open doors to collaboration and innovation. After a decade, we revisit the seminal Hilbert’s Problems paper and reassess Professional Identity, Cost Estimating Techniques, Cost Estimating Implementation, Cost and Schedule Risk Analysis, and Integration with Other Disciplines. We summarize recent progress and issue challenges for future research. We recommend taking advantage of trends in Digital Engineering and Agile and providing meaningful dashboards instead of static briefs.2023Trending Topics
Placing a Value on Employee Well-BeingKevin Buck, Asma AbuzaakoukMITRE applies Social Return on Investment (SROI) principles to inform government workplace Diversity, Equity, Inclusion, and Accessibility (DEIA) decisions. An Investment Value Management Framework (IVMF) compares DEIA solutions and identifies optimal solutions. The framework includes process guidelines and an SROI model to estimate cost, benefit, uncertainty, and risk. The model translates metrics to benefits, and an early warning system manages lifecycle DEIA investment performance, comparing projected to actual SROI.2023Trending Topics
Standardising and Governing Cost Estimation and More in the Australian DefenceKimberley RowlandsIn August 2020 the Australian Defence Department established a central governing branch to standardise costing practices and professionals within Defence. This presentation will walk you through the challenges we faced, our approach to overcoming them, and the resulting successes. It will outline what we're focussing on now: understanding the environment of the cost professionals we work with and how we might leverage each other's expertise. As the branch head, I am interested in maturing the profession of cost estimation in the Australian Defence Department.2023Trending Topics
Modeling Battery Manufacturing ComplexitySara McNealFossil fuels are the primary energy source used to drive our economies, but with limited supplies, reliance on batteries has increased. Batteries are energy storage devices in which chemical reactions produce voltage differences between positive and negative electrodes; these differences drive the current flow that powers products. In this study, the emphasis is on researching cost and specification data to identify variables that have strong relationships with battery manufacturing complexity.2023Trending Topics
Sustainment Contracts Cost Growth AnalysisCandice SchultheisContract type varies based on a program’s position in the acquisition lifecycle, the type of items/skills procured, and associated risks. Fixed-price contracts are often cited as having "locked in" program costs or produced savings at initial award, but these claims may not reflect the actual contract cost performance that occurs over time. This analysis quantified sustainment contract cost changes over time and measured cumulative growth. Cost growth comparisons between fixed-price and cost-type contract groups are presented. Root causes are identified, and work-level characteristics are analyzed.2023Trending Topics
Cost Engineering a MVDC Power & Energy Design for Navy Surface CombatantsRichard Shea, Henry Jones, Ann Hawpe, Victor SorrentinoSince maritime MVDC systems do not exist, a statistically significant cost estimating relationship is not easily determined. The components required for an MVDC system do exist at some scale and technical level of maturity. However, there is nothing in production meeting all the same requirements of a Navy surface combatant ship installation. Therefore, this paper follows an approach to compare the theoretical costs of an MVDC to an MVAC architecture in a US Navy ship.2023Trending Topics
The Economics of Rocket ReusabilityRyan TimmReusability of space launch vehicles promises some obvious recurring cost savings, but we delve deeper to enumerate and evaluate the effects of reusability on non-recurring costs of plant and equipment and on rate and learning impacts. We model and explore the interplay of refurbishment, launch cadence, and the maximum number of launch vehicle reuses. Finally, we compare hypothetical costs for the launch of expendable, partially reusable, and fully reusable space launch systems.2023Trending Topics
Using TRLs to Predict the Future of Nuclear WeaponsAbby Schendt, M. Michael Metcalf, Raymond VeraTechnology Readiness Levels (TRL) are used to measure and assess technology maturity. This paper presents innovative research demonstrating how historical TRL data can be used to perform credible, data-driven schedule analysis for programs early in development. Originally designed to help schedule analysis for the nuclear weapons stockpile, the resultant methodology combines historical milestone data and statistical methods to generate a Monte Carlo simulation of a risk-adjusted schedule for complex programs.2022Analytical Methods
Software Licenses: A Bill You Can't Pay?Cheryl L. Jones, Brad ClarkFor the past 20 years the DoD has emphasized the purchase and use of commercial software licenses as an approach to cut overall software development costs. Has this approach achieved its desired effect? How can one tell? This presentation will show the challenges and results of analyzing over 4,500 license records collected for the Army Software Maintenance Initiative. Implications to DoD's future maintenance costs as well as implementing its DevSecOps objectives will be discussed.2022Analytical Methods
8D Cost Trades With EntanglementDoug HowarthMany modern goods have elements that have substantial costs or create hazards if unavailable. If a component begins to add more cost than value to a final product or becomes hard to get, the project's viability may be at risk. Here, we study the connections between a subcomponent and an ultimate product across interrelated market pairs. We look at their seven primary dimensions to visualize market mechanisms and add time as an eighth dimension.2022Analytical Methods
Sustainment Analysis Methodology for Cost Models and Business CasesGeorge Bayer, Austin Lutterbach Government infrastructure systems often require detailed sustainability analyses - historical parts failures analysis, parts' procurement, end-of-life, and economic analyses - to (1) forecast the optimum date for an acquisition to replace aging infrastructure with a New Investment, (2) conduct a cost-benefit analysis to justify further F&E investment, and (3) provide a standard sustainment cost estimate (corrective and preventative maintenance) for acquisitions. Using parametrics, research, and software algorithms, the team demonstrates an optimum approach for sustainment analysis.2022Analytical Methods
Implementing Quantile Regression for Capital Acquisition Project EstimatingZachary Matheson, Jeff Beck, Gregory Stamp, Dr. Charles LoeliusThe National Nuclear Security Administration (NNSA) is investigating the use of quantile regression on NNSA capital acquisition data to determine the cost drivers of project costs at values above the median, instead of via Log-Ordinary Least Squares. Utilizing this methodology will provide a better understanding and quantification of the uncertainty inherent in the NNSA's capital project budgets, which are required to be budgeted at between the 70% and 90% confidence levels.2022Analytical Methods
The Progression of RegressionsJennifer Aguirre, Kyle DavisDo you lose sleep at night over not being able to find the best regression for your data? Well fear no more, for this brief will alleviate those concerns. We review a case study creating a CER with a limited data set exploring multiple forms of regression (Linear, Nonlinear, Cubic, Polynomial), segmenting data into applicable portions while comparing predicted vs. actual results, and using predictive statistics and guidelines to assess best fit.2022Analytical Methods
Predicting and Minimizing Industrial Base Risks to Emerging TechnologiesKevin RayThe global pandemic has highlighted the fragility of the U.S. industrial base relied on to develop the next emergent technologies. Programs are routinely impacted by industrial base issues causing schedule delays and cost growth without the analytical means to predict and prepare for them. An analytical framework was developed that leverages cost estimating methodologies to predict these industrial base issues, quantify their impact, establish a new risk adjusted baseline, and reduce risk through targeted investments.2022Analytical Methods
Developing Conceptual Cost Estimates for Good IdeasAdrian S Mitchell, John MillhouseThis paper outlines the use of a super learner method for generating conceptual estimates of Australian Department of Defence facilities and infrastructure products. Many organisations require quick turnaround cost estimates for good ideas looking for budget funding. Developing estimates in the absence of firm requirements and a full understanding of proposal scope is challenging. A transformative approach to cost and duration estimation is needed to solve this problem.2022Data Science
Supercharging Machine Learning with Cloud ComputingConner Lawston, Bryan Eckle, Ketan GanatraModern cloud computing environments can speed up machine learning models by up to 100X! In this case study with the DoD Joint Artificial Intelligence Center (JAIC), see how AWS and Databricks can be utilized to improve existing machine learning (ML) models for detecting fraud and correcting unmatched transactions. When combined with robotic process automation (RPA), this effort saved 100,000 manual hours in FY20 and identified potential fraud with over 90% accuracy.2022Data Science
Statistical Techniques to Improve Software Effort Estimation Data Quality for Cost EngineersTomeka S. WilliamsSince the topic of improving data quality has not been addressed for the U.S. defense cost estimating discipline beyond changes in public policy, the goal of the study was to close this gap and provide empirical evidence that supports expanding options to improve software cost estimation data matrices for U.S. defense cost estimators. The purpose of this quantitative study was to test and measure the level of predictive accuracy of missing data theory techniques referenced as traditional approaches in the literature, and to compare each theory's results to a "ground truth" data set.2022Data Science
Integrating Data Science & Cost AnalysisSarah GreenData is more abundant than ever before. How can we as cost analysts leverage the full power of the data science world to get the most out of our data and utilize things like automation, machine learning, interactive dashboards, and more? The author will demonstrate the innovative ways that AFCAA is utilizing a cloud-based platform and various data science tools to transform our entire process, from data ingestion/storage to end products and even end-to-end cost modeling.2022Data Science
Managing Data Science: A Stacked Approach to Integrating Advanced Data AnalyticsEric J. Hagee, John MaddreyRecent initiatives in applying data analytics to cost estimating have propelled the industry to evaluate how data science can be integrated into the estimating process. In this presentation we will dispel the operational hurdle of incorporating data science by introducing a stacked approach to data science integration, defining the levels of the data science stack, answering "What does success in each level of the stack mean?", and organizing the work force within the stack.2022Data Science
Cost Estimating Maturity and a Vision for the FuturePatrick Malone, Henry Apgar, William KingThe science of cost estimating has matured. From the Egyptians to modern methods, advances in cost analysis have been progressive, and recent methods further the science. This paper reflects on that rich heritage and looks to the future of cost estimating and analysis using emerging and maturing techniques. We look at how postmodern tools and methods like Natural Language Processing enhance our ability to accurately forecast costs of more complex systems. Lastly, we discuss future research and trends.2022Data Science
Utilizing Artificial Intelligence to Evaluate RequirementsAmar Zabarah, Kevin McKeelPoor requirements have been a problem across the commercial and federal arenas. Business Analysts face the challenge of parsing through hundreds or thousands of requirements to define and understand system needs. This is a grueling process, and despite the most meticulous analysis, mistakes are still an inevitability. AI can serve analysts in understanding and parsing requirements, dramatically reducing the time and effort necessary to accurately evaluate them. This paper provides an overview of select AI capabilities and shows their applications to requirements analysis.2022Machine Learning/NLP
Estimation of Application Maintenance by Means of Machine LearningEric van der VlietThis presentation is about an estimation model for application maintenance. The model calculates the required effort and costs for maintaining software applications based on incidents with different service levels and priorities. The calculation is based on historical data collected from different business domains. To manage the large amount of data, application characteristics, and domains, advanced data analytics and machine learning techniques are required. The development approach as well as the prototype results will be shown in this presentation.2022Machine Learning/NLP
Linear Regression: How to Make What's Old New AgainKimberly Roye, Sara Jardine, Dr. Christian SmartWith the booming popularity of machine learning techniques, data scientists may have you believe there is no place for linear regression. Though many cost estimating applications are nonlinear, when there are linear relationships among features, ordinary least squares regression often performs better than the most powerful machine learning techniques. Scenarios for which linear regression should be chosen over more complicated algorithms are presented, as well as techniques such as regularization, gradient descent, and Bayesian methods.2022Machine Learning/NLP
Adopting a Data Science ParadigmKyle Ferris, Zoe KeitaThe availability of large unstructured datasets, accelerated via autonomous data collection and warehousing, makes the use of advanced analytical skillsets a necessity. The incorporation of data science methodologies can yield meaningful improvements to cost estimating practices through effective data governance, streamlined data collection/normalization, and increased opportunities for exploratory analysis. This presentation will evaluate the software and skillsets required for advanced modeling and analysis, as well as planning considerations for data science training development.2022Machine Learning/NLP
Advanced Natural Language Processing for Work Breakdown StructuresMaura Lapoff, Aaron VermeerschThe National Nuclear Security Administration (NNSA) collects Work Breakdown Structure (WBS) data for Capital Asset Projects. Implementing the structure across projects grows exponentially in complexity due to varying scope, contextual changes, and vendor requirements. This paper will demonstrate how Natural Language Processing can automate the process of identifying and classifying WBS elements used for Capital Asset Projects. NLP can improve the Cost Analyst's workflow by reducing time-consuming aspects of their work.2022Machine Learning/NLP
Strategic Investment Planning using Machine Learning TechniquesWendy E. RobelloEarly identification of anticipated customer needs is a major component to winning new business in competitive environments. This allows bidders to make strategic investment decisions that will improve core competencies to meet anticipated customer needs. Current approaches to predict customer needs are subjective, costly, and time-consuming. This research will provide bidders an economical and objective approach using supervised machine learning to predict customer needs deduced from prioritized customer selection criteria to strengthen their proposed solution.2022Machine Learning/NLP
A Deep Look into Optimistic Bias based on a NASA's STEM 4th Grade ActivitySteve SterkBuilding on a 4th grade NASA STEM activity found on NASA JPL's web site, this paper is a direct result of independent research conducted by Steve A. Sterk of the Program Planning and Control (PPC) Branch at NASA Armstrong Flight Research Center. It is believed that optimistic cognitive bias is the number one problem in developing cost and schedule estimates, which prompts a dig into human behavior and the heuristics people use to complete the task at hand. This presentation will include how people think and the start of an optimistic cognitive bias database for machine learning.2022Machine Learning/NLP
Delusions of Success: Overcoming Optimism Bias in Schedule ForecastingJeffrey Voth, Maxwell Moseley, Ann HawpeSchedule performance of major defense acquisition programs remains a challenge, leading to an average capability delay of more than two years and substantial cost growth, according to the latest U.S. Government Accountability Office assessment. Stakeholders must de-risk schedule estimates and improve performance through data-driven approaches based on realized prior program histories to prevent optimism bias. The authors evaluate the merits of taking an ‘outside view' to mitigate risk through reference class forecasting, utilizing empirical distributional information from 116 programs across six commodity classes to develop more realistic and reliable front-end schedule estimates.2022Management & Risk
Real-World Data Transformation Challenges in the DoD Supply ChainMichael AhearnMergers and acquisitions have driven tremendous growth of the top one hundred government contractors. This growth has also embedded land mines in the historical data of these "compete-mates." The presenter will discuss real-world lessons from the extraction and transformation of programmatic, cost, schedule, and effort data in support of processes from Should Cost to Basis of Estimate for bid and proposal pursuit.2022Management & Risk
Cost and Schedule Risk Analysis of MegaprojectsMichael Trumper, Lev VirineCost and schedule risk analysis of megaprojects can be complex, time consuming, and may not give meaningful results. The presentation demonstrates practical and easy-to-implement steps for performing risk analysis of projects of any size, including schedule diagnostics and consolidation. A large megaproject schedule can be separated into subprojects, which can be analyzed individually. The analysis can be performed for any milestone and multiple times during the course of the project.2022Management & Risk
Uncertainty of Expert Judgment in Agile Software SizingPeter Braxton, Ken Rhodes, Alex Wekluk, David BrownAgile software estimating and planning often rely on expert judgment to assess the size of the development effort at various levels of granularity and stages of maturity. Previous research by the author quantified the inherent risk and uncertainty of the self-similar scales (e.g., T-shirt sizing) commonly used in these assessments. This paper expands those a priori mathematical results and empirically tests the accuracy of experts in applying those scales. It elucidates the ideal ratio to align with the desired confidence interval, and recommends feedback mechanisms to improve consistency.2022Management & Risk
Alternative EAC Methodologies – Calculating EACs without Standard EVM DataRichard Lee, Emily Goldhammer, Kate MalcolmDeveloping an Estimate at Complete (EAC) typically involves establishing a performance-based estimate founded in the best practices of Earned Value Management (EVM) Gold Card analysis. But what if you are missing EVM data or do not have a performance baseline? This presentation explores alternative approaches to developing an EAC by leveraging burn rate profiles. Estimates are then developed by applying regression analysis and/or linear interpolation techniques to the historical burn rates observed.2022Management & Risk
Decision Support and Operation Design in Mission Critical ApplicationsAmin RahimianThis talk covers key concepts of decision making under uncertainty. Building on applications in soldier-robot teaming, we explain how to balance exploration and exploitation tradeoffs using multi-armed bandits and reinforcement learning. The second half of the talk is concerned with challenges of operation design in costly and limited-information environments. Using examples from adversarial information operations, we explain how to balance costs and benefits of data collection and how to mitigate risks to data acquisition resources.2022Management & Risk
Second Source Manufacturing: Lessons from the Second World WarBrent M. JohnstoneManufacturing defense systems at different sites is increasingly common due to foreign coproduction and international cooperative ventures. These situations challenge estimators, posing questions about the transfer of learning and relative efficiency of multiple production sites. This paper examines cost history from World War II, when U.S. bomber production lines were shared across multiple companies. The conclusions are tested against modern experience, and guidance is provided to estimators seeking help.2022Modeling & Analysis
Model-based Cost Engineering launches with Space Missions EstimatingVincent Delisle, Shawn Hayes, Mark JacobsNASA continually strives to improve cost estimation for the highly advanced technology flown on planetary as well as earth orbiting space missions. Over the years it has been proven that parametric cost models are a desired way to obtain accurate estimates. Still, there is room for improvement. This paper will discuss two of the latest and best methods for obtaining accurate cost estimates using best-of-breed model-based cost engineering techniques.2022Modeling & Analysis
NRO CAAG Parametric Model for Spacecraft-to-Launch-Vehicle IntegrationDaniel BarkmeyerNew, innovative launch service providers and spacecraft designs have broadened the tradespace of launch service options for government customers, placing new importance on launch cost, an often understudied component of space programs' enterprise-level cost. This paper demonstrates a parametric approach to estimating the most mission-dependent component of launch cost, spacecraft-to-launch-vehicle integration engineering. It addresses primary and secondary cost drivers and typical time phasing, and demonstrates the model's performance compared to NRO and NASA historical costs.2022Modeling & Analysis
Integrating Cost into Model-based Systems Engineering EnvironmentsDaniel Kennedy, Karen MourikasModel-based Systems Engineering (MBSE) incorporates digital models to represent system-level physical attributes and operational behavior throughout the system life-cycle. To date, many MBSE efforts have focused on technical requirements with little emphasis on cost. Integrating cost models and MBSE enables rapid exploration of design trades and associated cost impacts, traceability between requirements and cost targets, and earlier identification of affordability issues. We present results from several ongoing efforts as a continuation of our 2019 presentation.2022Modeling & Analysis
NASA Instrument Cost Model (NICM 9) Introductory TrainingJoseph MrozinskiThe NICM 9 Tool suite allows for cost and schedule estimation for space flight instruments. This introductory training session will examine a case study utilizing both the System and Subsystem tools, moving from inputs and assumptions to the cost and schedule estimates and interpretations. An overview of the NICM model development methodology will be given. Instructions to obtain follow-on training and to download NICM 9 will conclude the talk.2022Modeling & Analysis
Cost and Throughput Analysis for the NASA Ames Arc Jet Modernization ProgramJennifer Scheel, Christian SmartThe facilities of NASA's Arc Jet Complex are used to simulate the aerothermodynamic heating that a spacecraft endures throughout hypersonic atmospheric entry, and to test candidate Thermal Protection System (TPS) materials and systems. Ms. Scheel and Dr. Smart analyze the cost and throughput impact of several different modernization options. Estimating the cost requires independent research into highly specialized subsystems and customizing inflation application. Throughput analysis of the number of possible test runs per year is conducted via a probabilistic simulation.2022Modeling & Analysis
Is Your Organization Ready for Model-Based Cost Engineering?Michael AhearnModel-based design continues to yield unprecedented speeds in technology solution development. Integrated and data-driven Model-Based Cost Engineering ("MBCE") promises to estimate concurrently at the speed of design. Taking advantage of MBCE requires vision and a level of organizational maturity. The presenters will discuss critical attributes regarding people, processes, data and technology that impact the success of model-based costing.2022Modeling & Analysis
Mission Operations Cost Estimation Tool (MOCET) 2022 ResearchMarc Hayhurst, Brian Wood, Cindy Daniels, Lissa JordinThe Mission Operations Cost Estimation Tool (MOCET) is a model developed by The Aerospace Corporation in partnership with NASA's Science Office for Mission Assessments (SOMA). MOCET provides the capability to generate cost estimates for the operational, or Phase E, portion of full NASA space science missions. Research topics being studied in 2022 will be presented, including: Level 2 Work Breakdown Structure (WBS) cost modeling, and extended mission cost modeling.2022Modeling & Analysis
DICEROLLER: Estimating D&D Costs for the NNSAZachary Matheson, Charles LoeliusThe National Nuclear Security Administration Office of Programming, Analysis, and Evaluation has developed a model for estimating the cost of Decontamination and Disposition (D&D) activities. This effort involved collecting and normalizing cost data from past D&D projects, and then generating a parametric cost estimating relationship to predict future D&D project costs. The resulting model, named DICEROLLER, will be used to make lifecycle cost estimates and one-for-one replacement cost estimates of capital acquisition projects.2022Modeling & Analysis
Applying System Readiness Levels to Cost Estimates - A Case StudyPatrick MaloneEstimating cost, schedules and expected technical performance of large complex systems pre-development is difficult. When programs are executed, they are plagued with cost growth and schedule delays due to less than required maturity of some elements. We investigate the James Webb Space Telescope program actual cost and schedule history using System Readiness Level (SRL) methods to reveal trouble areas early. The resulting approach will support future estimating accuracy through higher fidelity information for early decision-making.2022Modeling & Analysis
Cracking Open the 'Black Box' of Product Technical Support ContractsAlexander Bonich, Patrick McCarthy, Rhys Bergeron Product Technical Support contracts represent a significant annual cost to the DoD. While these contracts serve as useful "catch-alls" for various programmatic requirements, estimating these costs for future systems can be challenging. This paper analyzes historical technical support costs for various Army ground vehicle systems and demonstrates how combining contracts, work directives, and cost data sources along with categorizations and tags for "Service Categories" provides a comprehensive understanding and management of technical support cost requirements.2022Processes & Best Practices
Visual Exploration of Data – The Missing Element in CER DevelopmentBenjamin KwokWithin cost estimating training literature, it is common for the discussion around Cost Estimating Relationship (CER) development to focus primarily on its statistical parameters (e.g. correlation, equation form, etc.). An underemphasized component of CER development is the need to first visualize and explore your data. This presentation will show how integrating these processes into CER development leads to faster and better results.2022Processes & Best Practices
Data Management for Cost Engineering ProjectsCara CuiuleAs cost estimation data becomes more plentiful, data management becomes essential for organizing and preparing data for analysis. Even if a formal data management system is unnecessary for a project, concepts from database management can still be incorporated to ensure higher data quality. This paper will contain an overview of relational and non-relational models, along with a case study of a database that contains structured, unstructured, and semi-structured data.2022Processes & Best Practices
Everything You Always Wanted to Know About Affordability...But Were Afraid to AskKaren Mourikas, Denise Nelson The term "Affordability" means different things to different people, depending upon one's employer, organization, function, background, etc. The term has also morphed over the years - expanded or narrowed - based on one's viewpoint. During ICEAA's OEM-COG discussion on Affordability, diverse opinions were revealed. We will assemble various interpretations & implementations of Affordability from multiple perspectives, provide historical background, and explore how our community can clarify, standardize, and promote the concept of Affordability.2022Processes & Best Practices
Predictive Thresholds for Schedule Execution MetricsMichelle JonesWe called it the search for a unicorn. After an intense week of schedule analysis, checking the numbers, updating visual basic code, then presenting objective evidence that the contractor is not achieving the baseline plan, we got the action: provide the metrics of a good program. We tried, but after reviewing countless schedules, we realized that a "good" program is as rare as a unicorn, because every program deviates from the baseline plan. Although we never found a unicorn ("good program" with good metrics), we developed data driven thresholds predictive of significant milestone slip.2022Processes & Best Practices
The BS in BoEs – Oh the Games that are PlayedSandy BurneyA good Basis of Estimate (BoE) uses a historical comparison and a complexity factor. Neither choice is statistically based, meaning a bias is introduced into most BoEs. This paper explores the bias in BoEs through the framework of Game Theory. A BoE goes through a creation, internal review, internal approval, and external review process, where each player in this process will likely have a different payoff matrix. Each player in this process will want to insert their bias into the BoE, which this paper terms "Bias Selectivity" (BS).2022Processes & Best Practices
No Estimation Without Escalation - The Inflation RevolutionShannon Cardoza, Edward SmithI downloaded the latest USG Inflation Indices, so my work here is done, right? OSD guidance encourages the analyst to explore program-specific escalation to maximize estimate accuracy rather than defaulting to existing published indices. This briefing will provide a refresher on common appropriation inflation rates, their outlay profiles, and how to interpret them. It will then explore how to improve estimates by accounting for escalation, commodity-specific demand, and project-appropriate expenditure profiles, increasing estimate accuracy.2022Processes & Best Practices
Transforming Cost Estimation ServicesKevin McKeelIn the last 10-15 years the government contracting industry has quickened the pace of technology delivery. This change has forced cost estimators, as well as acquisition professionals of all kinds, to change their own processes to keep pace. Cost estimation shops can benefit by changing their services to match customer needs and delivery tempo, adding software product instruction to analyst training programs, and extending the scope of "cost" to include budget formulation. This presentation touches on transforming cost organization infrastructure and repeatable processes.2022Processes & Best Practices
What Does Agile Software Development Need: Predictable Costs or Predictable Outcomes?Christina KosmakosDespite ongoing laudable attempts to advance the state-of-the-art in agile software estimating processes and techniques, the software development and cost analysis communities are far from 'cracking the nut'. There is a dire need for a cost estimating approach that accurately predicts agile project costs with the correct granularity to enable budget and execution planning. This paper will detail the exploration of three approaches that utilize extrapolation from actuals to implement a solution.2022Software & Agile
Simplifying Software Sizing with Simple Function PointsDaniel B. French, Carol DekkersSimple Function Points (SFP), modelled on the IFPUG Function Point Analysis method, was introduced with the goal of simplifying the method while reducing the cost, time, and difficulty of estimating software size without sacrificing accuracy. We'll introduce the IFPUG version of SFP and highlight the challenges/opportunities when using it to size software. We'll explore when to use SFP and key differences between SFP and IFPUG FP while providing guidance on using FP measures in software cost estimates. 2022Software & Agile
Agile Team Performance Measurement as a Basis for Accurate Cost EstimationH.S. van HeeringenAgile team performance metrics Productivity, Cost Efficiency, Delivery Speed, Sprint Quality, and Product quality can be measured in an objective, repeatable and verifiable way, compared to each other, and benchmarked against industry data. The measurement data is used to recalibrate long-term effort, duration, and cost estimates based on the actual productivity delivered, resulting in increased predictability. I'll show a recent study of 4 teams of one organization, each in a different European country.2022Software & Agile
Dynamic Software Effort Estimation: How SWEET It Is!William Gellatly, David Brown, Lindsey JonesSoftware estimation is a complex and diverse field that must accommodate a variety of technical inputs across the life cycle. Our team developed an Excel-based effort estimation model designed for both flexibility and transparency. It dynamically builds an effort estimating relationship based on analyst-selected data fields and source data. Our paper explores how we used data from ISBSG, statistical analysis, and practical experience to drive the development of this "clear-box" model.2022Software & Agile
Agile Product Roadmap Estimating and Progress TrackingBlaze Smallwood, Ryan BlackburnAgile project managers often struggle developing and maintaining a full project plan, given the churn in requirements and priorities typical of an agile environment. PMs often use a Product Roadmap to communicate high-level work priorities and phasing, but these often lack the quantitative information needed to enable tracking progress against overall project goals. This paper will offer an innovative solution to this problem, which has recently gained traction on an actual government agile project.2022Software & Agile
Let's Go Agile: Data-Driven Agile Software CERs Derived from DHS ProjectsWilson Rosa, Sara Jardine, Kimberly Roye, Kyle EatonThis paper presents a set of effort estimating models for agile software projects using backlog data from JIRA, monthly contract reports, and requirements documents. The first set of models predict effort using either Unadjusted Function Points or Simple Function Points. The second set predicts effort using either functional requirements, stories, issues, or story points. The regression models are based on data collected from 15 agile projects implemented within DHS and DoD from 2016 to 2021.2022Software & Agile
Software Phasing and Schedule Growth AnalysisDaniel LongDoD software research lacks an understanding of development phasing, effort allocation, and schedule. We evaluated conventional rules of thumb for effort allocation against projects in the SRDR database. We compared how effort allocation varies between projects with high or low schedule growth. Results showed that increasing effort in early phases consistently decreases the total schedule growth, and led to improved allocation guidelines. These findings were significant across multiple categories such as Service and project size.2022Software & Agile
Minding Your P's and Q's as Prices RiseAlan Karickhoff, Brian Flynn, M. Michael Metcalf, Omar AkbikChoices and challenges abound in combining individual prices (P's) and quantities (Q's) of labor and material into one single measure of overall escalation for a project. Popular constructs include the Laspeyres, Paasche, and Fisher. This research illuminates the issue of which index to use when in the relentless fight against Money Illusion - the tendency to think in nominal rather than real terms. Methods are offered for forecasting inflation probability distributions up to three decades out.2022Trending Topics
Impacts of Digital Engineering to the Cost EstimateBrittany ClaytonDefense programs across the services are implementing digital engineering throughout the acquisition lifecycle to develop, manufacture, and sustain their platforms. How should cost estimators account for these new practices? This presentation will focus on the impact of implementing DE to the cost estimate. Based on literature reviews, discussions with SMEs, and review of policy, we will propose investments and cost uncertainties for the estimator to consider throughout the lifecycle of a platform.2022Trending Topics
CE^2 : Communication and Empowerment for Cost EstimatorsChristina SnyderToo often, cost estimator training focuses solely on technical abilities, largely ignoring the "soft skills". The behaviors of being a good communicator and empowering the team were shown in a 2020 ICEAA community survey to be important to cost leadership efficacy. Knowing that technical skills alone will only take estimators so far, this presentation leverages communication and empowerment training to demonstrate how all cost estimators can use soft skills to exponentially impact their analyses.2022Trending Topics
What is the OEM COG?Chuck KurtzCost and Affordability communities employed by Original Equipment Manufacturers (OEM) face challenges that differ from those in government or other private sector organizations. To identify and address these challenges, ICEAA formed the OEM Cooperative Opportunities Group (OEM COG) to discuss, collaborate and share best practices on topics such as Affordability, Digital Engineering, Risk and Professional Development - from an OEM perspective. Join us for an interactive session as we share lessons learned with one another.2022Trending Topics
Solving the Climate Crisis with Satellites, Fighter Aircraft & Nuclear ReactorsDale ShermonThis paper examines the challenges of factory built, low cost, mass produced nuclear reactors, not by focusing on the technology, but the acquisition process. It will consider lessons learnt from satellite and fighter aircraft programmes which lead to the formation of a consortium of nations that could engage in nuclear power generation. Can we solve the problem of becoming zero emitters of greenhouse gasses while electricity demand is increasing globally?2022Trending Topics
DoD Cost Estimating Guide v2Molly Mertz, Erin E. VeltmanThe DoD Cost Estimating Guide has something to offer every estimator! This session will provide an overview of the guide, with special focus on version 2. New material includes updated statute/policy references, expanded discussion of Middle Tier of Acquisition estimating, a recommended reading list, and a process-focused case study. Follow analyst Ava's progress as she navigates the challenges of a Milestone C production estimate for a fictional helicopter program.2022Trending Topics
Estimating Costs of Climate Change Impacts to Public Infrastructure (CIPI)Edward Crummey, Nicolas RhodesWhat is the long-term budgetary cost that certain climate change hazards could impose on public infrastructure in Ontario, through accelerated deterioration, increased operating expenses or compromised service levels? CIPI combines climate data with mapped infrastructure data, and applies a derived climate "elasticity" to cost the impact of specific climate hazards. While quantifying the budgetary impacts of climate change on public infrastructure is relatively new, the methodology could support capital funding decisions and adaptation actions by governments now and in the future.2022Trending Topics
Superheavy Launch: The Coming Paradigm Shift for the Space IndustryRyan TimmA new generation of superheavy launch vehicles will provide mass and volume capabilities not available in the last 50 years. This paper describes how superheavy launch will transform space vehicle (SV) design and the space industry. No longer mass-constrained, SVs will be built with different materials and processes, and will include capabilities previously inconceivable. SV developers will adapt their facilities, logistics, and support equipment. Cost estimators may want to revisit parametric CERs dominated by weight.2022Trending Topics
Up is Up; Why Expert Knowledge Should Be Favored in a Technologically Driven WorldArlene F. Minkiewicz, William GbeleeThere have been strong advancements in technology and other tools in the data science community such as big data explosion, machine learning, and neural-network algorithms. Many organizations are losing their appreciation for subject matter expert opinion; instead, they are focusing on technological advancements in tools. We will discuss the importance of subject matter experts, the key role that they play in stitching together data and provide some cautionary tales of leaning too much on technology.2022Trending Topics
Cost Estimate Kick-StartersDaniel HarperI'd like to present several "Cost Estimate Kick-Starters" I created to "kick-start" your cost estimate, including: LaRRGE [Labor Rate Reference Guide for Estimators] Labor Rates resources including newly added Cyber Labor Rates; GRIPS [GSA Robust Infrastructure Pricing Solutions] The GSA Enterprise Infrastructure Solutions Pricer, or "Turbotax" for Enterprise Infrastructure Solutions; SLiCE [Software License Cost Estimator] lookup tool containing over 4,000 prices (and growing) for software licenses, training, etc.2022Trending Topics
Volunteering: Maximizing the ROI of your ICEAA MembershipChristina N. Snyder, Megan JonesJoining ICEAA is as easy as clicking a few buttons, but to truly benefit from the value of your membership, you need to get involved. Volunteering is the first step to making your membership work harder for you. This session shows how to maximize the return on investment of ICEAA membership through volunteer opportunities with commitments ranging from short-term to long-term at the local, national, and international levels.2022Trending Topics
Investigating Causal Effects of Software and Systems Engineering EffortJames P. AlstadCausal discovery can help identify factors that drive project costs. We applied causal discovery to software and systems engineering cost estimation model calibration datasets to determine the causes of effort. Due to few variables resulting from the standard causal discovery algorithms, we came up with a technique called Weak Signal Analysis to better tease out weaker causes. Finally, we estimate the numerical impact of each such causal relationship, resulting in cost estimating equations.2021Analytical Methods
Assessing Regression Methods via Monte Carlo SimulationsMichael SchiavoniThis presentation provides new insights while building on past research into multiplicative-error model regression methods. Prior evaluations have primarily been limited to mathematical arguments or comparisons of sample statistics using a few datasets. Conversely, this effort executes Monte Carlo simulations with thousands of iterations, thus enabling the direct estimation of measures such as true population bias. Furthermore, the capabilities/limitations of each method are explored by simulating different model forms, sample sizes, error distributions, and variances.2021Analytical Methods
The Missing Link: An Evolution of Portfolio Natural SelectionCatherine Dodsworth"It is not the strongest of the species that survives, nor the most intelligent that survives. It is the one that is the most adaptable to change." (Charles Darwin) This research presents a conceptual framework and methodology for solving the missing link between learning curve estimates, prediction intervals, and S-curves and generating analytically-based affordable cost constraints to naturally select trade space in a portfolio. This holistic approach expands the evolution of the S-curve.2021Analytical Methods
Reducing Lifecycle Cost Through Aircraft ModernizationJeremy GoucherThe Department of the Navy is planning to upgrade one of the oldest fighter jets in the inventory. To support a milestone decision, an independent Government cost estimate was developed from an engineering build up including over 1,500 individual material procurement line items. Additionally, dozens of past programs were analyzed to provide parametric inputs to the cost model. The detailed data, methodology, and risk adjusted phased results will be presented. Additionally, certain metrics will be compared to past upgrade programs to determine if there are commonalities across platforms.2021Analytical Methods
Adding Cost Credibility with SRL and MBSE Advanced ToolsPatrick MaloneWhen space based system development programs have spent less than 15% of estimated costs, traditionally almost 80% of design costs are committed! Furthermore, programs with less than the recommended GAO knowledge at program start will likely have higher risk and unexpected cost/schedule growth. Using advanced tools like System Readiness Level (SRL) and Model Based System Engineering (MBSE) methods we show how to identify vulnerabilities early in the development life cycle to mitigate these risks.2021Analytical Methods
SatSim: Estimating Satellite Costs via SimulationDaniel NewkirkCost Estimating Relationships (CERs) are the standard for traditional cost estimators in estimating satellite designs. As new satellites and architectures are proposed, traditional CERs become less defensible as the design uniqueness increases. In order to better estimate future systems, the Space and Missile Systems Center (SMC) cost research team developed a simulation-based approach to estimate satellite costs with equivalent or better accuracy than traditional CERs.2021Analytical Methods
Foundation of Structured Architecture, System & Cost ModelingDanny PolidiModern software packages can perform complex physics-based simulations but typically do not consider cost as an input variable, while other packages specialize in determining cost. This paper begins the development of a structured System Engineering approach to system design and defines a standardized modular diagram for a RADAR system applied to military applications in the aerospace industry. It will be demonstrated that a system can be defined in standard terms, modeled with modular blocks, and costed by those same modular blocks.2021Analytical Methods
Beyond the Matrix: The Quantitative Cost and Schedule Risk ImperativeChristian B. SmartThe first step in risk management is gaining an appreciation for risk and uncertainty. However, despite a long history of cost and schedule growth in all types of projects, we tend to focus on averages rather than recognize extremes. Project management is blind to risk. The fixation on averages leads to an underestimation of risk. The use of qualitative methods, including risk matrices, is prevalent, but should be analyzed with rigorous quantitative methods.2021Analytical Methods
Improved CERs to Estimate Commercial IaaS Costs for Federal IT SystemsWilliam GbeleePrevious research established vendor agnostic estimating methods for the following Infrastructure as a Service (IaaS) costs: virtual machine and storage data. New data has led to price updates to the dataset resulting in over 27,000+ world-wide data points. This includes publicly available government data. The statistical analytic approach provides validated vendor agnostic models. The analysis was conducted and verified by randomly separating the samples of data into training and test sets.2021Cyber Security & Cloud Computing
Planning a Cloud Migration Effort: Cost Estimating ConsiderationsKyle Connor FerrisCloud First vs. Cloud Smart? Private Cloud vs. Public Cloud? Multi-Cloud vs. Hybrid Cloud? With the growing prevalence of cloud solutions and strategies across the Federal Government, developing a defensible cost estimate for the migration of legacy data to cloud environments becomes increasingly complex. Responding to these challenges, this presentation will demystify the fundamentals of cloud computing, and establish important cost estimating considerations when defining the purpose and scope of a cloud migration effort.2021Cyber Security & Cloud Computing
Cybersecurity Cost Issues Facing Today's Cost AnalystBob HuntThere are three general approaches to cost analysis: Bottom-up (engineering build up), Economic/Cost Benefit Analysis, and Top-down (parametric). Today cyber cost analysis is principally done by Bottom-up (engineering build up) and/or Economic/Cost Benefit Analysis. This paper will address the advantages and disadvantages of these two approaches. In addition, this paper will propose a new parametric approach to cyber security cost analysis based on cyber "actors" and business sector.2021Cyber Security & Cloud Computing
Estimating Cloud Infrastructure: Requirements, Methodologies, and UncertaintyOlivia LindseyThe DoD is fully committed to the cloud, but where do we start? We'll explore multiple existing models available for estimating cloud infrastructure including those created by NRO CAAG and PRICE Systems, and data sheets available directly through Amazon. We'll discuss technical baseline depth needed and uncertainty considerations for each scenario while exploring economies of scale and pricing differences between security classifications for Government systems and providing ready-to-use factors and reference points for multiple cloud scenarios.2021Cyber Security & Cloud Computing
Comprehending Chaos: Leveraging Text to Improve AnalysisOmar AkbikData has become the world's most valuable industrial commodity. The defense industry is no stranger to this reality. The quantitative application of unconventional data sources, such as text, can fill gaps that exist in traditional analysis, and reduce reliance on opinion based inputs that often require significant uncertainty adjustments. This paper will explore natural language processing (NLP) for classification and clustering in cost analysis and how the authors have applied it in practice.2021Machine Learning & Data Science
The Algorithm of 10,000 VariablesBryan K. AndersonMachine learning opens the door for software cost and schedule estimation for previously unviable scenarios in traditional frameworks. A reliable software estimation requires several intricate factors and often the data is unattainable or incomplete. By training several machine learning algorithms on actual data from an issue tracking system this case study achieved a more reliable model. Factors like issue description, individuals assigned, and issue events are shown to be important features in modeling development work.2021Machine Learning & Data Science
Innovative Techniques for Analyzing Incomplete Data to Improve Cost EstimatesGeorge O. Bayer Jr.Cost estimators are often presented with incomplete data sets from which they must develop business case solutions. Understanding, interpreting, and improving the data integrity are critical factors for cost estimate accuracy. In this use case, the cost estimators analyze and interpret incomplete and subjective data sets, forecast spares depletion, and estimate obsolescence. Using innovative data mining and text analysis techniques, the estimators demonstrate how improved data can result in better cost estimates and business cases.2021Machine Learning & Data Science
Applying Natural Language Processing Techniques to Categorize Cost Estimation DataCara CuiuleIn cost estimation, datasets used for analysis can include text data that needs to be normalized to common terminologies, such as component categorizations. Natural language processing (NLP) and machine learning (ML) could potentially be used to solve this problem. This paper will define terminology and common techniques for both NLP and ML. It will also include a case study using a hardware dataset and NLP techniques using Python.2021Machine Learning & Data Science
Machine Learning and Parametrics for Software Joint Confidence Level AnalysisSara JardineJoint Confidence Level analysis has proven to be very successful for NASA. It is typically conducted using bottom-up resource-loaded schedules. However, the use of high-level parametrics and machine learning has been successfully used by one of the authors. This approach has some advantages over the more detailed method. In this paper, we discuss the approach and provide an example of the application of machine learning and parametric analysis to software programs.2021Machine Learning & Data Science
Data With A Purpose: Technical Data InitiativeJeff McDowellTechnical characteristics are known to influence cost and as such are integral to cost estimating methods. Cost-driver data is an enduring cost community need yet is often an afterthought to cost data collection. This presentation is an overview of CADE's Technical Data Report (TDR) to systemically capture this much-needed information as part of the CSDR process. A sample Power BI case will also illustrate the powerful analysis enabled by integrating TDR data with FlexFile cost data.2021Machine Learning & Data Science
USCM11: an Evolution of Techniques Used to Build Cost ModelsBenjamin KwokThe development of the 11th version of USCM incorporates many innovative processes that pave the way for how future cost estimating models could be developed. This presentation will showcase analysis, processes, and techniques such as automating the generation of CERs, using machine learning techniques to classify CERs as either primary or secondary methods, using an analytics approach to evaluate your model, and more.2021Machine Learning & Data Science
Dealing with Missing Data: The Art and Science of ImputationKimberly RoyeMissing data is a common occurrence when collecting and analyzing data. Even when a data set includes many data points, many variables of interest will have omitted values. The most common way to deal with this situation is to exclude the data points from analysis. However, this is not ideal. We discuss a better way to deal with this issue, which is the use of imputation, a statistically rigorous method for filling in the holes.2021Machine Learning & Data Science
Continuous Enhancements: An Alternative to MaintenanceDaniel BowersWith a rapidly evolving threat and the enemy's ability to adapt and counter our weapons and technology, gone are the days of designing, producing, and maintaining a system for 20-30 years. Modern systems need the ability to adapt and update rapidly to include the latest technology. This new paradigm changes the traditional Operations & Support methodologies and estimating techniques employed by the cost analyst. We will look at the impacts to maintenance costs and the traditional funding profile.2021Operations & Support Analysis
Using Analytics to Aid in Performance Based Logistics Decision Making ProcessesPaul BrownAs OEMs push towards high dollar value, long-term Performance Based Logistics (PBL) contracts, decisions surrounding these types of contracts should be driven by an analytical process. PBL contracts can be worth billions of dollars and are sold as ways to improve system performance while reducing costs. Modeling and simulation can be used to evaluate the performance and cost impacts of the PBL-suggested improvements and assist decision makers in selecting an appropriate way forward.2021Operations & Support Analysis
Army Software Sustainment: Righting Our Cost Estimating AssumptionsCheryl L. JonesA challenge in estimating software Total Ownership Costs at the beginning of a software acquisition is estimating the cost of software after development. Rules-of-thumb are often used to estimate the annual change in the software during the sustainment lifecycle. This session will present data analysis on software sustainment profiles that show a rise and fall in the number of changes over long periods of time contrary to the practice of using a constant annual change.2021Operations & Support Analysis
Practical Estimating of Hardware Lifecycle Maintenance and Obsolescence Mitigation StrategiesF. Gurney Thompson IIIThere are many strategies that can be employed to maintain a hardware system, and many phenomena that have been observed in systems with long lifespans which can greatly affect cost and reliability. This paper will present our research approach and will examine case studies to compare various maintenance strategies, approaches to mitigating technology obsolescence issues, modeling of non-constant failure rates, effects of mid-life upgrades, and more.2021Operations & Support Analysis
Business Transformation of Life Cycle Cost Estimating at PEO STRI / ProcessesJames A. GoldenThe U.S. Army's PEO STRI used the tenets of Better Buying Power and Continuous Process Improvement (CPI) to improve life cycle cost estimating, focusing on the processes and applications that produce credible data-driven, comprehensive, and reliable cost estimates. Implementation resulted in earlier and faster production of estimates; increased confidence in program estimates; and improved organizational ability to defend estimates. This presentation describes the challenges, processes, and internal controls associated with success of PEO STRI initiatives.2021Processes & Best Practices
Addressing Challenges in Costing Unique Large Scientific FacilitiesMarc HayhurstThe costing of large scientific facilities poses many challenges to cost estimators as these facilities are typically uniquely designed for specific types of research. However, when the facilities and hardware are examined at a lower level, there may be similar components which can be consulted to evaluate a new proposed system. Aerospace will share their experience in costing unique facilities including the importance of historical data, cost scaling, sensitivity analysis, and other evaluation techniques.2021Processes & Best Practices
Effective Affordability Engineering Teams Top 5 Things You Need to KnowZach JasnoffMuch is written on Affordability Analysis, but what does it take to implement an effective Affordability Engineering team within a DOD contractor? This presentation explores challenges / pitfalls facing establishing an effective team and the top five things needed to overcome them. Also discussed are the methodologies, technologies and best practices required to fully explore the cost / performance trade space. A Model Based Cost Engineering case study demonstrating Affordability Analysis principles will also be presented.2021Processes & Best Practices
Cybersecurity Cost Estimating Factors for Business IT SystemsRichard MabePRICE® Systems and MITRE analysts developed an initial set of cybersecurity cost estimating factors for the 2020 ICEAA Workshop. Peer feedback on the factors identified several issues requiring additional research to resolve. This paper presents the follow-on data updates and analysis to address these issues and to update the factors accordingly. The results can be applied to estimate costs for cybersecurity-specific activities supporting development and operations of IT systems for the federal government.2021Processes & Best Practices
Where To Miss, What To Give UpDoug K HowarthMany projects miss some or all of their targets and specifications. Unrealistic requirements force designs with skyrocketing costs. To produce a successful product, one must know which conditions to keep and which to drop. Producers should discover what the market wants, doesn't have, and can afford. In-depth market analysis reveals configurations more likely to fall within cost limits while satisfying customer requirements.2021Processes & Best Practices
The Art of the InterviewJoe BauerCost estimation is often called an art and a science. From an art perspective, the estimator must be able to effectively communicate with people to elicit information and deliver actionable intelligence. In this presentation, we will discuss interview techniques and other 'soft skills' that enable the estimator to effectively obtain useful information, build rapport and collaborate with a number of different audiences, to include subject matter experts, engineers, program managers, customers, and stakeholders.2021Processes & Best Practices
Using Integrated Program Management (IPM) Principles EffectivelyMike ThompsonIntegrated Program Management (IPM) is not just managing the technical aspects of a program and looking to achieve the program goals. Beyond the technical piece, there is an entire team effort, including Cost Estimating and Analysis, Scheduling, and Earned Value Management, which work together to present a total picture of the program based on separate data. In the past there has been a tendency to 'silo' these disciplines and not use them in an integrated way. The purpose of this paper is to illustrate how to use IPM principles effectively.2021Processes & Best Practices
Forecasting of Agile deliveriesEric van der VlietAgile is the most common delivery approach for software projects. Single teams are able to control their own delivery; the delivery of multiple teams or multiple release trains is more challenging. How can management keep control in such a situation? How can management determine what value has been delivered? This presentation is about a Power BI-based solution that consolidates the information from different teams and release trains into trends for management to make decisions with respect to performance, quality and value.2021Processes & Best Practices
Does Cost Team Leadership MatterChristina N. SnyderAn anonymous survey of 150+ cost analysts unanimously reported that a cost-team lead's effectiveness ultimately impacts the team's products. However, there has been minimal guidance as to what defines good leadership. Using the ten behaviors identified by Google's Project Oxygen, this paper seeks to understand what skills are necessary for successful cost leadership. The findings lead to a simple conclusion that mirrors that of Project Oxygen: improving our soft skills will improve cost leader efficacy.2021Soft Skills & Communication
Faster! Better! Cheaper! Improving Counting Productivity and DeliverySheila D. DennisDo you want to get the best ROI from your function point team? From a business perspective, there are five core goals for any program: 1-Effectively manage workflow; 2-Proactively manage end user expectations; 3-Accurately plan, budget and forecast deliveries; 4-Accurately estimate deliverables; and 5-Show value to the organization. A function point team should also strive for these goals. Come learn best practices, tips and techniques for building a valuable counting support unit for your organization.2021Soft Skills & Communication
Estimator development and process automationSteven GlogozaAs technology, software, and tools continue to automate, it becomes increasingly challenging to develop personnel. The design and modeling of automated processes may leverage artificial intelligence, but the cost estimator will have to continue to navigate a murky gray environment that requires critical thinking, intellectual curiosity, and analytical aptitude. As we automate estimating activities and continue to develop future estimators, how do we design to capture process velocity without sacrificing insight and credibility?2021Soft Skills & Communication
Finding the Story in Your DataKaren Richey MislickThis presentation will cover data visualization and how to find the story within your data. It discusses how people process information and offers tips for creating effective graphics using data visualization principles and techniques to inform decision-making. Different approaches to visualizing data will be discussed including decluttering your graphics, choosing informative visuals, focusing the audience's attention using pre-attentive attributes, thinking like a designer, and implementing effective storytelling techniques.2021Soft Skills & Communication
Icebergs or Shifting Sands: What's the Key to Software Estimation?Carol DekkersICEAA is a world leader when it comes to estimating tangible programs. With CEBoK, successful estimation of large and small scale programs involving hardware, satellites, buildings, equipment and networks is not only possible, it's become a science. Now, what happens when you bring software-intensive systems into the mix? It's said that estimating software development is like estimating an iceberg, and with the new agile development, maybe even like shifting desert sands. What do you think? Can we create successful software estimates with so much uncertainty? Let's explore the topic.2021Software & Agile
Build It and They Will Come: Keys to Developing a Successful Software Metrics ProgramDaniel B. FrenchDespite the pervasiveness of software development in all areas of industry today, few organizations develop and implement effective software measurement programs to manage their software development projects and leverage the information necessary to make business decisions. This presentation discusses why this is the case, addresses myths and misconceptions regarding software metrics programs, and explains how to develop an effective program and which software metrics provide the greatest value.2021Software & Agile
Get to the Point. What's the Deal with Different Function Points Methodologies?Anandi HiraSince the development of IFPUG Function Points (FPs), many variants have emerged to simplify software sizing and improve cost estimation. Which of these sizing techniques should one use? To understand differences among methods, we compare IFPUG FPs, COSMIC FPs, and Simple FPs for effort estimation on University of Southern California (USC)'s dataset of Unified Code Count (UCC) enhancements. Additionally, we investigate automated tools to understand the correlation between Objective FPs and the other FPs methods.2021Software & Agile
Air Force: Standing Up Agile Under A WaterfallTrevor MichelsonIn November 2018 the 'Agile Development Manifesto' was released to parts of the Air Force to provide summary guidance on rapidly delivering software products under Agile Development. Multiple cost teams have worked closely across the divisions over the past few years to improve how divisions incorporate Agile practices and useful metrics for software development through the release increments. This presentation will provide a journey through the usefulness of our current Agile metrics, how they will be improved and how we have made use of them throughout the DoD.2021Software & Agile
Cyber Mission Platform (CMP) Program: Analyzing the Full Suite of Agile MetricsAlex SmithGuidance, policies and task force recommendations related to agile software development best practices for DoD continue to surface on a daily basis. The DSB, DIB, GAO and OUSD A&S have all weighed in during the last two years. The OUSD A&S 'Agile Metrics Guide' provides a concise discussion of metrics that cost analysts and PMs supporting agile programs should consider. This presentation will discuss the AF Cyber Mission Platform program's agile transformation, approach to estimation, and provide techniques and visualizations for analyzing the key metrics discussed in the OUSD A&S guide.2021Software & Agile
Lessons Learned from Software Maintenance DatasetsArlene MinkiewiczEffort applied to the maintenance of software applications is thought to account for 65% to 85% of the total ownership cost of the application. Yet there are significantly fewer studies focused on maintenance as compared to development, especially based on public domain datasets. This paper presents results from a study focused on maintenance effort data from the ISBSG's Maintenance and Support (ISBSG M&S) database along with data from other sources.2021Software & Agile
Secure Software Development Levels and CostsElaine VensonWhile the growing field of Software Security provides solutions to address security vulnerabilities, financial issues are still a barrier to their introduction in projects. Evaluating the cost-effectiveness of security practices requires understanding the impact of increasing degrees of security on the development effort. This session will present a rating scale for establishing levels of secure software development, along with effort estimates provided by experts, and the steps envisioned to propose a security cost model.2021Software & Agile
Grandma's Secret Hotdish Recipe for SW Planning: SAFe and AnalyticsLeah WalkerIn today's environment, programs want to develop and deliver quickly, but struggle to manage to cost and schedule baselines. Changes in scope, staffing and process require non-standard analysis of data amidst constant change. The authors employed innovative approaches, leveraging function point analysis and machine learning tools, to generate estimates and metrics to drive business outcomes. This business case explores the impacts of program ownership transition, conversion of software development methodology, and modified scope and requirements management.2021Software & Agile
How Green Was My Labor: The Cost Impacts of Manufacturing Personnel ChangesBrent M. JohnstoneEstimators are frequently confronted with manpower increases or decreases and asked to calculate shop performance impacts. However, existing learning curve literature offers little guidance on how to do so. This paper identifies issues associated with both new hires (so-called 'green' labor) and workforce reductions and offers an analytical format. Based on a study of a large workforce expansion on a mature aircraft program, a model to analyze future manpower changes is presented as well as several example cases.2021Technical Management
A Perspective on Cost and Schedule Impacts of Non-Technical Project ManagementTerry JosserandProject and program management have distinct definitions, but the interplay between technical and non-technical roles within these fields has limited research. This observational study evaluates non-technical project management and its impact on the cost and schedule of major nuclear weapon modernization programs by utilizing historic data to comprehend these emerging phenomena. The study results yield multiple observations and recommendations that will assist senior leadership in their review and decision-making regarding future acquisitions.2021Technical Management
Lessons Learned from Implementing Global Project ControlDale ShermonThis paper examines the challenges for organisations to initiate and maintain excellence in Project Controls working with the multi-disciplinary fields of scheduling, cost estimating, risk management, EVM, reporting and monitoring. With more than fifty global stakeholders and 150+ deliverables, we will explore the highs and lows of a Project Controls change programme and the lessons learned.2021Technical Management
Cost and Data Issues facing Today's Cybersecurity AnalystsBob HuntCybersecurity threats and countermeasures are evolving at a rapid pace. Analysts need a solution for assessing a non-static situation. Many organizations are simply costing a proposed plan without a strategic or sustainment plan. Current studies imply that it is more costly to defend against a cybersecurity attack than to execute the attack. Studies also indicate that the typical security breach is not detected until about 200 days after the breach. Analysts need to understand the scope of cybersecurity. Is it physical, computer, hardware, people, policy, or all of the above?20202020 SCAF/ICEAA Virtual Conference
The Art of JudgementAndy NolanMany of us rely on "expert judgement" when estimating, but the evidence is that it can be unreliable. Based on a study of 3760 guesses, we noticed that 70% of people tend to underestimate. A new study in 2020 of 7400 guesses, showed that 69% of people tend to quote a narrow min-max range when guessing a 3-point estimate. When using judgement, most of us are too low and too precise, we are precisely wrong!20202020 SCAF/ICEAA Virtual Conference
The Quantitative Risk Management ImperativeChristian B. SmartRisk is an important consideration for all projects. However, as a society we are risk blind. When projects plan, they tend to look only at the best-case scenario. Risk is not just a nice to have, it is a critical ingredient in project success, and it needs to be analyzed quantitatively. The use of point estimates, averages, and qualitative methods all underestimate risk. The use of quantitative methods for risk analysis is well established. We discuss how to successfully implement them in the analysis of project risk.20202020 SCAF/ICEAA Virtual Conference
The Future of IT & Software EstimatingCarol DekkersWouldn't it be fun to have a crystal ball to predict the future of IT and software estimating? In this presentation, Carol Dekkers takes us on a roller coaster ride of prognostications based on where we've been, what's going well today and what trends are on the horizon. Let's have some fun connecting the dots between the world today and where we might end up in the future with ICEAA and IT and software estimating.20202020 SCAF/ICEAA Virtual Conference
An Ontology-based Cost Modelling Approach for High-Value ManufacturingMaryam FarsiIn high-value manufacturing, the inadequacy of historical service data and the high level of uncertainty around service cost in complex assets make the identification of cost reduction opportunities challenging. These influencing factors also complicate the assessment of the impact of cost drivers on the total lifecycle cost. Addressing these challenges, this workshop presents an ontology-based cost model architecture with a simple data structure and minimal data requirements. Additionally, the impact of cost drivers on the total cost is evaluated, and the workshop aims to examine the validity and applicability of the ontology and the cost model across different sectors and asset types.20202020 SCAF/ICEAA Virtual Conference
Empirical Effort and Schedule Models for Agile Development in the US DoDWilson RosaIn the Department of Defense (DoD), mainstream agile estimation metrics (story points and user stories) are not available before contract award. This paper presents an effective approach for estimating Agile software effort and schedule using a size measure available before contract award. The analysis explores the effects of initial software requirements (functions and external interfaces), staffing, and domain on effort and schedule. The dataset contained 36 DoD agile software projects implemented from 2008 to 2019.20202020 SCAF/ICEAA Virtual Conference
Splitting Water: A Cost-Benefit Analysis on the installation of an Electrolyser at a HospitalSimon PorterFollowing the governmental commitment of achieving net carbon zero by 2050, interest in the development of the hydrogen economy has increased. Hydrogen can be produced using electricity to split distilled water into hydrogen and oxygen gases, both of which can then be used for other purposes. This study analyses the costs and anticipated benefits of installing an on-site electrolyser at a hospital. This would provide hydrogen to run a Combined Heat & Power (CHP) plant to supply electricity and heat, along with oxygen for medical usage. A number of different options for meeting the brief are discussed, along with anticipation of future trends.20202020 SCAF/ICEAA Virtual Conference
How to Make Your Point Estimate Look Like a Cost-Risk AnalysisCara Cuiule, Amanda Ferraro, Daniel Harper, Richard MabeEstablishing federal budgets for cloud infrastructure costs prior to selecting a cloud provider requires vendor agnostic cost estimating methods. These methods need to reflect the correlation between rates for a variety of infrastructure instances across all viable cloud service providers. This paper describes research and validation leading to CERs based on over 28,000 virtual machine and storage instances. The predictive analytic approaches presented in this paper can provide valid and verifiable vendor agnostic estimates.2020Distance Learning Series
Storytelling for Cost EstimatorsChristina SnyderAs estimators, we advocate the importance of good data; but without context, estimates and analyses are just numbers. To give power to our work, we need to effectively pair good estimating with good communication. There is no existing best practice guidance for estimators on how to create a compelling narrative to accompany analysis. By leveraging a storytelling structure, we can inspire action, communicate our findings in a way that resonates, and ultimately become more effective.2020Distance Learning Series
The Fact That Your Project is Agile is Not (Necessarily) a Cost DriverArlene MinkiewiczAll true agile projects follow the same philosophy, but they do not all apply the same set of practices, tools or processes. Agile projects are value driven and thus subject to change. There are, however, business and contractual requirements for up-front estimates, creating a conundrum. This paper discusses a methodology and rules of thumb for estimating agile projects, based on analysis of publicly available datasets, that provides value to stakeholders and aligns with Earned Value Management requirements.2020Distance Learning Series
Are you Smarter than an Algorithm?Andy PrinceCost analysts rely on mathematical algorithms, experience, and subjective assessments to develop cost estimates. However, these analysts often disagree over what is more important: statistically derived algorithms; or experience and judgment. To try to answer this question cost estimating professionals were surveyed for their expert judgment on the complexity and new design values for 15 NASA science missions. The results may or may not be surprising, but will surely be interesting.2020Distance Learning Series
A 3 Market, 10 Dimension TradeDoug K. HowarthAny person, company, or government working across three or more related markets decides how to divide costs between them. Often decision makers give little thought to how those resource splits need to work in conjunction toward a common goal. Using the example of the Prompt Global Strike (PGS) initiative, this paper studies ways to optimize costs in three connected markets (air-to-surface missiles, bombers, tanker aircraft) across ten dimensions.2020Distance Learning Series
Lessons Learned Implementing EVM on Government-led Delivery EffortsJoshua TeitelbaumImplementing Earned Value Management on projects where a Government entity serves as the Lead Systems Integrator presents unique challenges and opportunities when compared to typical EVM applications on industry vendor contracts. This paper will cover lessons learned and best practices for implementing EVM on Government-led integration projects based on field experience from a team that has helped the Government with several such efforts. This will include a description of the methods and tools the team used to baseline projects, gather data from performers, and report status to stakeholders.2020Distance Learning Series
The costverse for the FlexFile: Enabling Powerful Analysis in RBenjamin Berkman, Justin CooperThe Cost and Hour Report ('FlexFile') is a new Contractor Cost Data Reporting (CCDR) format that promises to change the world of Department of Defense (DoD) cost analysis by delivering significantly more granular cost and hour data than its predecessor, the DD 1921 series of reports. The volume of the FlexFile requires a more thoughtful approach to importing, wrangling, transforming, and ultimately communicating data than Microsoft Excel (Excel) may offer. This paper introduces three R packages that help the analyst exploit the FlexFile to its fullest extent.2020Distance Learning Series
Advanced Data Analytics for Maintenance & Repair ReportingPaul Hardin, Alexander LoRusso, Tyler StaffinThe 1921-M/R (Maintenance & Repair Parts Data Report) is the DoD system for collecting actual maintenance event and repair part data in the Cost and Software Data Reporting (CSDR) system. This paper will employ the R Shiny package, which is used for the construction of interactive web applications, to demonstrate the analytical value of -M/R data. Additionally, this paper will explore the mechanics of the R Shiny framework within the environment of advanced data analytics.2020Distance Learning Series
Improving Software Estimating Relations for Army Software Sustainment DataCheryl Jones, Bradford K. Clark, James DoswellNew approaches were employed to improve Army software sustainment cost estimation: causal analysis and annualization of release data. Causal analysis examines the cause/effect relationships between factors that indicate which CERs should be derived. Converting multi-year data to annualized values has improved CERs. This presentation shows what was discovered using causal analysis and the resulting improved CERs.2020Distance Learning Series
Assuring Credibility in the Cost Estimate: Part IIHank ApgarThis presentation updates the original, presented at the 2016 ICEAA International Workshop (Bristol), which traced the maturation of cost estimating attributes and focused on cost credibility. Evidence is provided in the words of government and industry executives, estimating and engineering handbooks, professional journals, and government auditing manuals. This update incorporates the impact of popular cost drivers such as system maturity and cost growth. This presentation concludes with guidance for the estimating professional.2020Distance Learning Series
Costing Out an Air Force Software FactoryStephanie Quintal, Caitlin Burke, Kristen MarquetteThe novel concept of standing up a software factory has left even the most seasoned cost estimator scrambling for guidance. Presented by the Kessel Run cost team, this presentation will provide insights on staffing, physical locale and other hidden stand-up costs. We will discuss real world actuals on team sizing, skill mix, and phasing methods as well as labor rate analysis and acquisition support. Lastly, a template for a generic software factory will be provided.2020Distance Learning Series
Diversity in Software Estimation Approaches: Perceptions to PreferencesShashank Patil, Ria BakhtianiA strong foundation for software estimates requires effectively addressing the diversity of analytical abilities and preferences within winning teams. In this paper we discuss how perceptions and preferences play an important role at the very early stages of a bid/pursuit lifecycle. We study and present the various approaches adopted by winning teams and their usefulness against critical parameters of software estimation, namely turnaround time, accuracy, repeatability, and reproducibility. The views in this paper are based on internal surveys of SMEs and Engagement Managers at various hierarchical levels.2020Distance Learning Series
Navigating the Minefield: Estimating Before Requirements are CompleteCarol DekkersWhile cost estimation is challenging in hardware and manufacturing projects, software cost estimators face unprecedented obstacles when they are asked to estimate software development projects. Not only are these projects rife with uncertainty, requirements often shift, technologies change direction, and customers shift priorities and that's when they are based on solid software requirements! But, what happens when you need to create an estimate before requirements are complete? Learn how software cost estimators in leading organizations are responding to an ever-changing software development landscape and delivering estimates, that enable project success.2020Distance Learning Series
Software Estimating: Is the Problem Solved? Some Myths and FactsSanathanan RajagopalSoftware estimating is often seen as one of the most difficult domains. Yet nearly every project now contains software, whose dependency and complexity are growing exponentially. In this paper the author explores why people think software estimating is such a big issue, breaks some myths, and examines some facts about software estimating.2020Distance Learning Series
Is this Schedule Credible?Jonathan ShriquiA schedule is the life and breath of any project. As such, it should be constructed with rigor and discipline. But how does one differentiate a credible schedule from a poorly constructed one without being an SME? The US Defense Contract Management Agency (DCMA) has created a scheduling test to objectively determine a schedule's integrity. In this presentation we will review the key concepts, pros and cons, and limitations of this scheduling test. Please join me and find out whether your project schedules are transparent, or there is more to them than meets the eye!2020Distance Learning Series
Advanced Estimating Methodologies for Conceptual Stage DevelopmentChuck AlexanderThis research paper presents statistical techniques and cost analysis that significantly enhance legacy technology development estimating methodologies. Techniques leveraging independent variables that reflect a comprehensive set of cost drivers relating to technology scale, complexity, type, maturity, and development difficulty are presented. Highly tailored solutions including uncertainty are produced that vastly expand and refine earlier development estimating models. General R&D framework relating key milestones, TRLs and cost benchmarks is constructed and woven into an integrated solution.2020Distance Learning Series
Leveraging the Wisdom of Crowds in Estimating Army SW SustainmentChristian SmartThe use of modern regression and machine learning techniques can improve predictive accuracy compared to traditional log-transformed ordinary least squares, as well as resolving issues with bias and transformation. The combination of multiple models in an ensemble and cross-validation can further increase accuracy. These techniques are discussed in detail and are applied to an extensive set of software sustainment data for 192 Army systems. Results include models based on release type, software changes, and categories.2020Data Management & Machine Learning Category Winner
Augustine's Law: Are We Really Headed for the $800 Billion-Dollar Fighter?Brent M. JohnstoneAugustine's Law famously proposed fighter aircraft costs are growing so rapidly that by 2054 buying a single tactical aircraft will consume the entire defense budget. Is the situation really so dire? This paper examines the trend in U.S. fighter costs and relates them to generational changes in aircraft design and manufacture. It also examines the new jet fighters of the 2000s to see if Augustine's Law is really unfolding as its author originally thought.2020Modeling Category Winner and Best Paper Overall
13 Reasons a Cost Estimate During a Concurrent Engineering Study Could Go WrongAndy BraukhaneDuring early phase spacecraft design, the concurrent engineering (CE) approach is proven to be very efficient. But the condensed and iterative nature of CE sessions can also make life hard for a cost estimator. This work discusses 13 problem areas experienced or observed mainly during one-week, inter-disciplinary space system design studies and provides practical examples on how to tackle them, e.g. how to handle rapid data changes, wrong expectations and a diverse engineering team.2020Processes & Best Practices Category Winner
But Wait, There's More! Using SFPA for Your Cost, Schedule & Performance NeedsKatharine MannDo you need to estimate software size? Do you want to add value to your program beyond the LCCE? Simple Function Point Analysis (SFPA) can help! We discuss how analysts can engage Program Managers to use SFPA not just for cost estimating, but for scheduling and progress tracking of software development programs. Real DHS Programs and Policies are used to illustrate the benefits of Simple Function Points to the entire organization.2020Software & Agile Category Winner
In Search of the Production Steady State: Mission Impossible?Patrick McCarthyLearning Curves are a vital tool for cost estimators when predicting the number of direct labor hours required for a production run. One challenge of utilizing learning curves is predicting when no additional improvement can be expected, otherwise known as the steady state of the production run. This paper addresses different variables to consider when analyzing data to determine when improvement is likely to cease and the steady state of the production run will commence.2020Analytical Methods & Strategies Category Winner
What's the Big Deal? Is Agile Software Development Really Different in the DoD Acquisition Environment?Katelyn BarbreIncreasingly, more projects are leaning toward the use of Agile as a way of mitigating the cost, schedule, and performance risks of complicated development efforts. Due to the complicated acquisition requirements and environment of the DoD, the advantages of Agile versus non-Agile projects are blurred. This research will compare and contrast cost and technical data from multiple DoD acquisition programs to better understand the impact of Agile software development in the DoD.2019Agile
Scrum Agile Software Metric Analysis for AF Information SystemsKyle Davis, Elizabeth Ashwood, Alex Smith, William LanePrograms transitioning from waterfall to agile software development are driving change to the cost analysis process and the way we develop metrics to predict future performance. This case study discusses research on an AF C2ISR portfolio of information system projects following the Scaled Agile Framework (SAFe). Continuous development and requirements flow introduce additional challenges. This research will discuss the team's evolving approach, its study of productivity metrics over time, and lessons learned from the initial research.2019Agile
Agile Management for Rapid AcquisitionMaureen DeaneAccording to PARCA in the April 2018 Agile Program Managers Guidebook, "Agile philosophies promote rapid incremental product deliveries, provide flexibility to respond to changing requirements, and advocate close customer collaboration." While Rapid Acquisition programs may receive reprieve from the procedural requirements defined by the DOD 5000.01, program managers and the cost estimating community maintain their responsibility to manage programs effectively. This paper presents how Agile program management best practices can be applied to Rapid Acquisition.2019Agile
Understanding Federal Sector Agile Productivity: A Benchmark StudyKevin McKeel, Sheila DennisSoftware estimation in the federal sector is challenging, relying heavily on team productivity. We will present the results of an Agile productivity benchmarking study (2017-2018) that was performed across a variety of diverse federal projects from multiple agencies. In addition to quantitative metrics, we captured the effects of federal acquisition constraints on how Agile is implemented, drivers of successful projects, Agile estimation best practices, and other key findings that were part of the analyses and the results.2019Agile
Pitfalls to Avoid in Agile FSM Productivity MeasurementsRoopali Thapar, Carol DekkersUsing functional sizing on Agile projects requires integrating the delivery process with the measurement process; any gaps between the two can defeat the whole purpose of measurement. This paper describes a few key things to keep in mind when doing function point (FP) counting for sprints or releases, and offers recommendations on how to benchmark productivity for the same.2019Agile
Agile Estimation Challenges When Starting a New Team for a New ProductEric van der VlietThis presentation tells the story of an experienced Cost Estimator with an SPC4 certification who starts the development of a completely new product with a new team lacking sufficient Agile experience. The budget of the program is fixed and its duration is two years. Because of the innovative character of the product, it is impossible to start with a complete backlog, although the program must still be managed, controlled, and of course estimated...2019Agile
Don't Just Use Your Data ... Exploit It!Adam James, Jeff Cherwonik, Brandon BryantThe Data Age is here. Data are being collected at an exponential rate and cost analysts are struggling to exploit it. Limited structured contextual information and clunky formatting are significant barriers to efficient use. This paper demonstrates the importance of establishing a modern data strategy that considers how analysts identify, access, and use data, in the context of an example exploitation of a large CSDR data set for the Army's Stryker Family of Vehicles.2019Analysis & Modeling
Production System Cost Modeling Within an MBSE EnvironmentDan Kennedy, Karen MourikasModel-based Systems Engineering (MBSE) incorporates digital models to represent system-level physical attributes and operational behavior throughout the system life-cycle to support product development. To date, many MBSE efforts have focused on technical requirements with little emphasis on cost. Integrating cost models into MBSE provides visibility into cost impacts of design decisions. This presentation explores optimizing production-system design, manufacturing processes, and operations, by integrating various internal and industry production-system cost models into an MBSE environment.2019Analysis & Modeling
An Upgrade to Anderlohr's Retrograde Method for Broken LearningTommie MillerBreaks in production cause havoc for managing and estimating program costs. Estimators must consider the loss of efficiencies that will result from the break as well as the expected rate of learning after the break. Anderlohr's Retrograde Method is a popular technique for estimating the cost of broken learning, but it suffers from a weakness that arises when the learning slope changes at the break. This paper describes this weakness and suggests a robust solution.2019Analysis & Modeling
The Beginning of the End of Traditional Analogous 'Bottoms-up' EstimatingChris PriceTraditional bottom-up estimating methodologies are under fire due to inconsistency and inaccuracy. Leveraging advanced data-driven techniques is key for accurate, rapid cost estimating. Based on a recent endorsement by the DCAA in its 2018 Audit Manual, estimating methodologies other than 'bottom-up' are now acceptable. We will present a disruptive process for transitioning from a traditional costly, time-consuming analogous 'bottom-up' Basis of Estimate (BOE) bidding process to a new, more cost-effective process.2019Analysis & Modeling
Adaptive Curve Fitting: An Algorithm in a Sea of ModelsMichael SchiavoniAdaptive Curve Fitting (ACF) is a novel technique that analyzes finite time series data such as monthly expenditures for government acquisition contracts. By intelligently fitting known resource phasing curve forms to an existing sequence of data, it generates a custom model that extrapolates the remaining values. This enables a variety of joint cost, schedule, and phasing analyses. ACF is based in theory and empirical research, shares similarities with existing techniques, and introduces several key innovations.2019Analysis & Modeling
Hybrid Cost Estimating: The Union of Macro and Micro-ParametricsDale ShermonAt the pre-concept phase of a project the nature of the solution is varied and the number of options numerous. As the project moves into the later phases, the feasibility studies begin to produce more information and the number of options starts to reduce. In a hybrid cost estimating framework, it is possible to migrate from a macro to a micro parametric cost model without the need for additional training or skills.2019Analysis & Modeling
Schedule Estimating Relationship Development Using Missile & Radar DatasetsSara Jardine, Justin Moul, Donald TrappThis paper addresses the challenges of collecting useful historical schedule data for missiles and radars. The objective was to develop schedule estimating relationships and descriptive statistics for predicting the time between major acquisition milestones based on technical parameters. Major milestones are not always clearly defined and consistent. The difficulties of collecting and analyzing schedule data are highlighted. Various schedule drivers are discussed. The results provide a cross-check on the reasonableness of projected schedules for new programs.2019Analysis & Modeling
Estimating China's Defense Expenditures (AM08)Jack BianchiWhile DoD releases detailed budget data, China releases virtually none. The Center for Strategic and Budgetary Assessments developed a straightforward, adaptable, and strategy-driven program, the Strategic Choices Tool (SCT), which allows users to modify planned U.S. military force structure and modernization spending over the next decade. We are developing an SCT for Chinese defense spending. This presentation addresses cost estimation methodologies for China's procurement, personnel, and O&M costs, and preliminary data collection and modeling efforts.2019Analysis & Modeling
The Legacy of Parametric EstimatingHenry ApgarThis paper chronicles the people, methods, and achievements over past centuries leading to today's acceptance of proven parametric methods for credibly predicting the future costs of the world's most significant tools, software, weapons, structures, and processes. One detailed example is how parametric methods were used to effectively estimate the construction cost of medieval European castles.2019Communication & Visualization
Simplifying the Estimate without Sacrificing QualityJeremy GoucherA typical Plan of Action and Milestones for an ACAT I cost estimate spans six months. This includes time to define the scope, define the WBS, research and analyze data, develop the estimate, iterate, brief, iterate again, brief again, and so on. Simplifying the estimate will result in shorter research and analysis periods, easier model revisions, easier quality assurance, and easier-to-understand presentations. This presentation will discuss various methods and processes for simplifying the estimate without sacrificing quality.2019Communication & Visualization
Data Visualization - A Product of Human DesignBenjamin KwokCost estimates are based on data analysis and statistical methodologies that capture complex behavior in mathematical terms. One of the biggest challenges estimators face is describing their analyses in a way that a decision maker can quickly grasp and apply. Data visualization is an important skill set that analysts can use to present complex results in a clear and compelling manner. This presentation will provide an overview and show the benefits of data visualization.2019Communication & Visualization
The Point Estimate Is Not the PointJack Snyder, Joe BauerThe purpose of cost estimating and analysis is to inform planning decisions and secure sufficient / timely funding to facilitate program execution. Or is it? The authors will explore how tangible and intangible roles of cost estimators can better inform design, acquisition strategy, and planning decisions early. Traceable, repeatable, credible and flexible analytical models help to paint the quantitative picture, enabling decision makers to accurately plan a successful program.2019Communication & Visualization
Clearly Communicating Your IGCE to Decision MakersChristopher SvehlakAll that work gathering data, crunching numbers and running sensitivity tests for Independent Government Cost Estimates. Not done yet - now comes communicating them. It's more art than science. Leaders need the key details and facts, but these can get hazy amidst probability distributions, parametrics, discounting, inflation and other elements. This presentation uses a real (sanitized) cost estimate, providing tried-and-true examples of getting the message across in a way that is understandable, concise, and decision-ready.2019Communication & Visualization
Comparing Cloud Costs Equitably: J-Funded Capability DevelopmentKevin Buck, John Dubelko, Matthew Griesbach, Anthony RojasGovernment agencies often base cloud investment decisions on prices quoted in published cloud calculators and rate cards that invariably apply different service offering assumptions and cost drivers. The likelihood of making decisions based on incomplete or misunderstood data is high, and misinformed investment decisions can unfortunately be quite costly. MITRE evaluated several predominant commercial and government vendor solutions and developed a suite of vignettes that highlight challenges in comparing price quotes on an apples-to-apples basis.2019Computing
The 11th Commandment: Thou Shalt Migrate to the CloudEmily Hagerty, Orly Olbum, Brian FlynnAnd God gave unto Moses on Mt. Sinai and to General Mattis in the Pentagon the 11th Commandment: Migrate to the Cloud. This paper presents research that supports mission owners, program offices, and cost components in their quest for the Holy Grail - cost effective and cyber-security compliant migration of legacy systems and data to the cloud. It offers innovative artifacts to support up-front requirements and trade-space analyses and back-end cloud design, build, testing and deployment.2019Computing
Predictive Analytic Estimates of Cloud Costs for Government IT SystemsRichard Mabe, Dan HarperThe White House's "Cloud Smart" strategy encourages agencies to consider life cycle costs focused on three main "pillars": workforce, procurement, security. IT funding requires two years to approve and allocate a budget, so commercial cloud rates will not be known when the budget is approved. Cost estimates must therefore capture historical workforce, procurement and security cost trends to estimate future solutions. Predictive analytic estimating approaches can provide effective estimates using calibrated historical data.2019Computing
Measurements in CyberspaceArlene MinkiewiczAddressing cybersecurity and software assurance is becoming increasingly important within the DoD and beyond. With software systems constantly talking amongst themselves and across various networks, concern is growing about the protection of our sensitive data and applications. But one must also consider how much security is too much and when the "right" degree of cybersecurity has been achieved. This paper discusses cybersecurity and presents measurements suitable for assessing the efficacy of mitigation strategies.2019Computing
Cost Analysis Needed for Blockchain EffortsHarvey ReedU.S. government offices are exploring blockchain to address key challenges in their missions. The motivation stems from blockchain enabling peer-to-peer information sharing, without a centralized authority, which in turn supports execution of processes which span organizations. Currently, government lacks tools to describe blockchain projects with sufficient consistency to support acquisition and cost analysis. This presentation proposes a blockchain descriptive framework as a first step, with a focus on cost elements and drivers for blockchain projects.2019Computing
Forecasting Future Amazon Web Services PricingHassan Souiri, Andrew KicinskiThe National Reconnaissance Office (NRO) Cost and Acquisition Assessment Group (CAAG) produces independent cost estimates to support decision making, budgeting and trade studies. Cloud service costs procured from Amazon Web Services (AWS) are becoming increasingly scrutinized. A thorough analysis was conducted to collect historical AWS prices and model the downward trend. Autoregressive time series models were fit to storage and compute service prices, resulting in annual price reduction rates to be applied to future estimates.2019Computing
"Big Data" Analytics in Operations ResearchCara Cuiule, Grady NollAs the world becomes more connected and data-driven, accumulating masses of data is becoming a more common corporate strategy. This paper discusses the current PRICE® Systems data collection methodology within the company that utilizes web scraping/crawling. The team will present both lessons learned and potential shortcomings in the current method of approach, along with plans for future endeavors.2019Data Collection & Management
A Discussion on Data Reliability: Evaluating Qualitative and Quantitative DataKevin DeStefano, Faye KimObtaining data is an integral part of any analysis and collecting good data that is accurate and robust can be challenging. The quality and validity of data can strongly influence the final cost position recommendation that is presented to accurately and effectively inform decision makers. This paper will discuss data reliability, how to evaluate qualitative and quantitative data, and how to create and evaluate metrics to track the quality of data to ensure accurate cost estimates.2019Data Collection & Management
How to Create a Cost Estimate Using Data Science and RRRrrr Studio!Jeremy EdenICEAA's effort to expose the community to data science concepts and their advantages for cost estimators has been popular. However, you want to know how to apply these tools and techniques to create an estimate NOW. Well, prepare to create your very first data science cost estimate. There will be code, there will be data, and they will be for ye to plunder and take as your own, because "not all treasure is silver and gold, mate."2019Data Collection & Management
When Data isn't EnoughKellie Scarbrough (Wutzke)How can an analyst transform messy data into valuable insights? This presentation outlines several techniques in Excel and R to clean and process large volumes of data. In R, we'll look for important relationships among our data. We'll also create a corpus to utilize an otherwise unhelpful text field and explore some text mining techniques. Finally, we'll explore how Tableau can help us visualize and communicate our findings.2019Data Collection & Management
CDMS: Developing a Database Solution for Data ManagementBenjamin Truskin, Brian Wells, Tony OcchiuzzoThe NRO CAAG has a data issue...too much for Microsoft Office products to handle! The detail and variety of data the CAAG has collected over 40 years has forced a revamp of its data management and storage systems. The CAAG Data Management System (CDMS) will be the next-generation database solution to store and normalize CAAG data. This presentation gives firsthand accounts of the challenges encountered and how the CAAG overcame them and implemented its solution.2019Data Collection & Management
Growing Maintenance Costs: Understanding How Weather Impacts MaintenanceBryan AndersonIn-service maintenance is often a major cost driver. This paper explores an innovative approach to identifying maintenance trends in the Federal Aviation Administration (FAA). Using NLP and other machine learning techniques, the FAA's RMLS maintenance text data is correlated with ITWS weather data. The session's goal is to cover one way to think about NLP in a cost setting, followed by an implementation with Python and other data science technologies.2019Machine Learning
Machine Learning Assisted Data Extraction and NormalizationJonathan Brown, Devin GeraghtyData collection and normalization is a key component of analysis. However, it often requires significant time and effort to properly extract quality normalized data from the raw datasets. Given the recent proliferation of larger datasets a more automated approach to data extraction and normalization is required. This research applies machine learning text classification algorithms and methods to automate historically manual data normalization tasks.2019Machine Learning
The Robot Forecaster--A Continued Study of Artificial Neural Network ApplicationNathan EskueIf your job includes predicting the future, then you understand the difficulties of the "unknown", of all the little gremlins (sometimes called risks) that can mess up the most elegant and well thought-out plan. There are data models and statistical methods that can help a great deal, but if the conditions are right, a relatively new tool that is as accessible as it is misunderstood might just be what you need. We will cover the basics of Artificial Neural Networks and I'll share the progress I've made since ICEAA 2018 in putting these digital brains to work.2019Machine Learning
Don't be Scared, Machine Learning is EasyMary Johnson, Dakota ShaferThus far, the field of cost estimating has explored machine learning only pedagogically. The novelty of machine learning makes it difficult to find real use cases demonstrating its efficient application to the cost estimating process. This paper and accompanying presentation will document a case study exploring the data cleaning and modeling process for a cost estimate done on DoD installations.2019Machine Learning
Machine Learning and Natural Language Processing Applications for Cost AnalysisKaren Mourikas, Jose Lemus, Enrique SerrotHow can machine learning and natural language processing enhance the traditional methods of cost analysis? By analyzing larger datasets more easily than the human brain can, and by automating many of the manual, time-consuming tasks needed to cleanse and prepare the data for analysis. This presentation examines several applications in which we integrated Natural Language Processing using python libraries, with various Machine Learning methods, in particular trees, randomization, and boosting, to improve prediction accuracy.2019Machine Learning
Beyond Regression: Applying Machine Learning to ParametricsKimberly Roye, Christian SmartCost estimating has relied primarily upon regression analysis for parametric estimating. However, regression analysis is only one of many tools in data science and machine learning, and is a small subset of supervised machine learning methods. In this paper, we look at a variety of methods for predictive analysis for cost estimating, including other supervised methods such as neural networks, deep learning, and regression trees, as well as unsupervised methods and reinforcement learning.2019Machine Learning
Data Impacts on System Readiness and CostPaul BrownOperations and support costs are becoming increasingly in focus as system readiness is addressed. This paper will discuss the impact data has on forecasting readiness and the corresponding effects on cost. Identifying "correct" data and sources, such as CSDRs and OEM data reporting systems, plays an important role in modeling both cost and readiness. Different approaches to improve readiness and their impacts on cost and data needs will be discussed.2019Methods
A New Approach When Cost/Capability Trades Matter MostJeffery Cherwonik, Adam James, Richard BazzyConsideration of life cycle costs (LCC) early in the acquisition process, specifically prior to Milestone A, is paramount to credible requirements generation and improves the probability of program success. This paper describes a parametric universal wheeled ground vehicle LCC model based on CSDR data and designed for integration into a modeling and simulation environment that produces alternative vehicle designs. The LCC model uses outputs from a requirements-driven preliminary design model to generate informed cost vs. capability/requirements trades. 2019Methods
Five Steps for Improving the Accuracy of Rough Order of Magnitude Estimates -Bell V-280 Wing Predictive Cost Analytics Case StudyZachary Jasnoff, Ross RaburnAccording to the GAO Cost Estimating and Assessment Guide, ROMs are "developed when a quick estimate is needed, and few details are available". This leads to challenges with ROMs, especially when estimates become "locked in" early, based on the least accurate data. We will discuss five enabling steps that estimators can take to improve the accuracy of ROM estimating. This includes using a predictive analytic, data-driven methodology for leveraging and extrapolating limited data sets.2019Methods
Intellectual Property Valuation: An OverviewCynthia PrinceIntellectual property (IP) refers to inventions, literary and artistic works, designs, and symbols, names and images used in commerce. IP is protected by patents, copyright and trademarks, which enable earning of financial benefit from these inventions or creations. While most cost estimators already know that, what do we know about IP valuation? If asked to derive a cost estimate for IP valuation, what approaches could be taken? This overview will answer that question and more.2019Methods
PAC Template - Program Acquisition Cost (PAC) TemplateHorace White, Jim Cain, Raymond GarridoOne of the goals of our young and upcoming cost organization was to create a universal cost model template. Fast forward six years, and the template we developed has introduced standardization, provided a starting point for new analysts, minimized tedious work, and reduced common errors. We would like to share the success story of the cost model template and the lessons learned through the six-year development process.2019Methods
Self-Organizing Markets And TimeDoug HowarthMarkets demonstrate statistically significant self-organization concerning how they respond to changes in prices and the product features offered to them. The nature of these self-organizing activities changes over time. What works for a market now may not work a few years from now. Being able to characterize market self-organization now and in the future is key to optimizing financial success, which this paper examines.2019Planning & Strategy
Contracting for Agile ProjectsBlaze SmallwoodThe nature of agile, with its flexible approach to requirements and stress on incorporating feedback from frequent stakeholder engagements, makes contracting for agile projects particularly difficult. This paper proposes a new idea that uses a capability-based incentive structure, which is currently being considered on upcoming Department of Defense contracts. The paper will present the mechanics of implementing this approach, its pros and cons, and a comparison to other approaches recently used on similar projects.2019Planning & Strategy
Enhance Estimator Success & Organization Competitiveness in Supplier AssessmentJohn SwarenThis presentation reviews the use of an effective methodology to perform Supplier Assessment. Estimator success and organizational competitiveness are enhanced when supplier quotations can be assessed and validated, leading to stronger negotiation positions as well as a basis for preferred suppliers. This paper offers a quantitative methodology to benchmark, monitor and evaluate potential suppliers of a customer-specified technology. Once established, a Supplier Assessment capability yields timely, consistent bid evaluations as well as ceiling prices for negotiation.2019Planning & Strategy
Application of Conjoint Regression to Requirement AnalysisJacob WalzerConjoint analysis provides a simple method of measuring how the tradeoff between different capabilities can affect the perceived value of a product. Normally associated with marketing research, it can also be utilized to help estimate the cost of developmental items based on their expected requirements. This presentation explores the best practices, common pitfalls, and validation techniques associated with conjoint regression, and includes a notional example derived from an application of this technique in a USMC tradeoff analysis study.2019Planning & Strategy
Measuring Portfolio Value for Government Programs and InitiativesGeorge Bayer Jr., Bryan AndersonTransformational government initiatives which require major capital investments or acquisitions are complex, difficult to measure, and challenging to articulate to decision-makers. Considering the complex dependencies and implementation risk of individual programs, risk-adjusted cost estimates for portfolios are often overstated. This paper examines how cost estimators and analysts measure portfolio value and generate stakeholder advocacy for major government initiatives, which require multiple acquisitions for implementation and agency policy changes to change behaviors and realize value.2019Processes & Best Practices
2020 Census Cost Estimation - GAO Audit Lessons LearnedEdward Kobilarcik, Neala JonesThe 2020 Census LCCE is a large, complex estimate with a projected $15 billion total cost. In 2016, the GAO assessed the estimate as only partially meeting any of the GAO's four characteristics of a quality cost estimate. Using the GAO's Cost Estimating and Assessment Guide, the 2020 Census LCCE team used a data-driven analysis to improve the LCCE and its supporting documentation. The result was a drastic improvement in the 2017 GAO assessment.2019Processes & Best Practices
GAO Updated, New, and Coming Best Practice GuidesJennifer Leotta, Brian BothwellThis session will go through GAO's updated, new, and forthcoming best practice guides. This includes a review of the best practices described in the cost guide, schedule guide, technology readiness assessment guide, and Agile assessment guide.2019Processes & Best Practices
Contractually Speaking: The Story of DOD Contracts & Potential ConsequencesOrly Olbum, Stephanie Lee, Peter BraxtonA contract is the legal document holding government and contractor accountable for their responsibilities and can play a significant role in cost and schedule issues throughout its lifetime. This paper utilizes the Contracts Database and CADE to investigate whether or not contract vehicles themselves contribute to these issues. We also dive into the acquisition process to determine the efficiency of different contract vehicles and how profit may impact a contract's success or failure.2019Processes & Best Practices
Business Case on the Cost/Benefit of U.S. Government Support of Contractors in order to Maintain CompetitionTodd Pardoe, John Stedge, Chad Bielawski, Jaimie SmithThe DoD is facing shrinking industrial bases in its acquisitions due to contractor mergers, specific military applications, and scarcity of materials. National security, strategic posturing, and Congressional interest make it vital to keep these industries viable and competitive. Mitigating the risk of "shrinking industrial bases" offers unique opportunities in structuring these acquisitions. This study looks at the costs and benefits associated with strategies for acquisition of defense systems within the constraints of shrinking industrial bases.2019Processes & Best Practices
Engineering the Acquisition Process: Better Value Through Mechanism DesignChristian Smart, Britt StaleyAs both the regulator and the only buyer in a market that measures over $500 billion a year in acquisitions, the Defense Department has tremendous leverage but does not make full use of it. Limited competition in the prime contractor market results in higher prices and lower quantities purchased. There have been some recent strides made to achieve better value, but much more can be done if the government will pursue strategic approaches. In this paper, we consider the use of mechanism design to achieve better value.2019Processes & Best Practices
Creating a Cost Driver S-CurveSandy BurneyS-Curve analysis is not just for Risk Analysis. It can be used for both Contractor and Government Cost Driver analysis. The presentation will: 1) describe the similarities and differences between Cost Driver and Risk Analysis; 2) provide a rationale for doing an S-curve on Cost Drivers early in the procurement lifecycle; 3) provide detailed steps to developing a Cost Driver S-Curve; and 4) display examples of Cost Driver S-Curves and additional outputs that result from the analysis.2019Risk
The 3-Point Method Redux: Estimating Cost Uncertainty Given Only a Baseline CostMarc GreenbergLeveraging a 2007 paper entitled "Estimating Cost Uncertainty when only Baseline Cost is Available," this presentation revisits the 3-point method: a scenario-based application of the Analytical Hierarchy Process (AHP). This presentation steps through an overview and notional example on applying the 3-point method to demonstrate how pessimistic and optimistic costs are derived from a given baseline cost. These three costs, in turn, serve as the requisite three parameters of the triangular distribution.2019Risk
When is Less More? Level of Detail in Cost and Schedule Risk AssessmentLaura Krepel, Zachary PirtleWhen performing cost and schedule risk assessment (C)SRA, the level of detail in the analysis schedule dictates the time to build, validate, and review the model. In this brief, we will discuss the pros and cons of using a more or less detailed analysis schedule as the basis for these models, whether 10s or 10,000s of lines. We will explore the level of effort, accuracy, and value (stakeholder understanding) of each approach through the use of real-life scenarios, based upon the authors' experience developing models for NASA and other federal agencies. An overview of the basic requirements for any SRA or JCSA, including detailed schedule for execution, monitoring, and control of the schedule to validate model critical path will also be provided.2019Risk
Risk: Intentional Interaction With UncertaintyKai Lemay, Raymond Britt, Jessica LaitiIn this session we review how Risk Management is too often perceived as a compliance exercise rather than a management tool. Because of this outlook, the vast majority of risk registers record too few risks. Restricting the number of risks recorded leads to a reliance on point estimates, an incomplete risk mitigation plan, and an absence of tangible impacts and analysis. It's not the big risk that leads your project to failure; it's the 30 little risks that occur in a unique sequence of events and spiral into a perfect storm.2019Risk
Impact of Scope Changes on Schedule GrowthDaniel Bowers, Geoffrey Driskell, Gail Flynn, Kelsey AndersonCan cost analysts predict and thus estimate schedule growth based on scope changes? Is there a relationship between the two? Leveraging from the same research and data as the 2018 ICEAA Workshop paper "Impact of Scope Changes on Software Growth" by Jonathan Brown and Gail Flynn, this research focuses on the possible schedule impacts due to pure growth and scope changes. Analysis was performed from a scheduling perspective to show the impact scope changes can have on the program schedule.2019Scheduling & Programming
The Programmatic Estimating Tool (PET)William Laing, Erik BurgessThe Programmatic Estimating Tool (PET) provides a method for adjusting cost estimates in scenarios where programs face rigid schedule and/or budget phasing constraints. PET integrates program cost, schedule, and budget phasing into a single user-friendly tool. Using historical correlation between cost, schedule, and phasing model residuals to generate a tri-variate conditional distribution, PET can be used to estimate the impact of schedule and/or phasing constraints on cost, and of cost and/or phasing constraints on schedule.2019Scheduling & Programming
A Program Manager's Guide to Reliable Subcontractor ReportingPatrick Malone, Garth EdwardsPrime Government contracts are executed with significant subcontractor content. Objective measurement of subcontractor performance can be difficult, leading to methods such as percent spent or level of effort. These methods can result in erroneous progress reporting that masks true performance. Using discrete earned value best practices provides prime contractors and Government agencies with realistic subcontractor performance for early corrective action. This paper investigates how to implement low-risk, objective EV techniques to promote reliable and effective subcontractor reporting.2019Scheduling & Programming
Blending Contract and Project Management to Achieve Cost SavingsTeresa Price, Gerald Jones, Wilfred Tagud, Andrew Drennon, Tia M BarnesThis study examines the waterfall that occurs between segregating projects and contract management, and how interdependencies drive project cost savings. This interpretive study will visit the discussion around the pitfalls of segregating these interrelated activities, and how the process can impact your agency's bottom line. It will examine how cost avoidance can be achieved and provide recommendations to assist in the minimization of cost, schedule, and risk.2019Scheduling & Programming
EVM Visualization: The Radar ToolTyler Staffin, Brian FlynnThis research presents an innovation in EVM data analytics and visualization that gives acquisition managers the potential to better understand system level risk and to laser-focus their attention on critical elements of cost. The tool transforms a massive set of disaggregated data into a manageable form, and extracts "critical" elements by filtering on a set of user-defined, rules-based criteria. The tool holistically displays absolute growth, percent growth, and percent complete in one radar graph.2019Scheduling & Programming
The 3C's of Measurement and Cost Estimating Success: Create, Confirm, ConvinceCarol Dekkers, Dan FrenchTo many, it may be obvious that good cost estimating is fundamental to software development. However, most technology organizations remain unaware of the benefits of formal cost estimating and measurement - even when they are technology-centric and use defined project management methods. This presentation promotes a 3C approach to increasing awareness, gaining acceptance, and understanding why formal cost estimating is critical to organizational success.2019Software
From Point A to Point Estimate: How Requirements Become Function PointsDan French, Carol DekkersIFPUG function point analysis has been successfully used to size software by organizations for 30+ years. However, there remains confusion as to how the process works. This presentation demystifies the Function Point Analysis process and educates those interested in how function points are counted and/or are considering the use of FP in their organizations. The presenters will discuss functional/non-functional requirements, good/poor requirements and how an FP count is developed based on the functional requirements.2019Software
Estimating and Managing Modern Software Development ProgramsBob Hunt, Dan Galorath, Ian BrownThe recent final report of the Defense Science Board (DSB) Task Force on Acquisition of Software for Defense Systems (February 2018) outlines key challenges facing Defense with respect to software acquisition. The summary report states, "Software is a crucial and growing part of weapons systems and the Department needs to be able to sustain immortal software indefinitely." Identified issues range from poor initial estimates to the Federal Government's inability to utilize modern development tools, such as the "Software Factory". This paper will discuss these issues.2019Software
Army Software Sustainment Cost Estimating ResultsCheryl Jones, James Doswell, Brad Clark, Robert Charette, Paul JanuszThe Army has conducted a study over the past six years to improve the accuracy of software sustainment cost estimates. Based on an extensive data call of 192 Army systems, data analysis revealed several types of cost estimating relationships based on release type, release rhythm, and three categories of data. Analysis of a sustainment cost risk model was also conducted. This presentation will show the study results, including what worked and what did not.2019Software
The Journey to Better ERP EstimationJon Kilgore, Jenna Meyers, Arlene Minkiewicz, Cara CuiuleSoftware estimation is never easy, especially when implementing large complicated Enterprise Resource Planning (ERP) Systems comprised of significant off-the-shelf (OTS) components combined with glue code interfaces to numerous legacy systems. Data to support such an estimate is sparse, incomplete, and often inconsistent. This paper presents the first phase of a journey toward improvements to this situation, utilizing analysis of historical data to inform future estimates, focusing on process and lessons learned.2019Software
Software Cost Estimation - Why Is It Different?Harold van HeeringenSoftware Cost Estimation is often considered the most difficult part of an integral cost estimate. Unrealistic software estimates often result in disasters. Examples are new tunnels, metro lines, or factories that can't start their operation because everything is ready except for the software. One of the main reasons for this is often an inaccurate and optimistic software cost (and schedule) estimate. Why is estimating software different, and how is ICEAA dealing with this?2019Software
Adapting Existing Cost Model to Estimate Section 804 Rapid Prototyping and RapidRick Garcia, Jesse CelisAnalogous prototype program cost data doesn't readily exist, and the development of prototype-specific cost databases is a lengthy process. Our immediate Cost Estimating needs must address estimating prototype satellites using existing cost models. Our evolving process elicits and applies explicit cost model adjustments to existing cost models to directly address the explicit differences between prototypes and operational systems, specifically in the areas of CERs to Model Commercial-Like Programs and %New Design and Heritage.2019Space & Missiles
The Efficacy of NASA's Joint Confidence Level PolicyAndy PrinceIn January 2009 NASA policy established the performance of a Joint Cost Schedule Confidence Level (JCL) analysis as the basis for project funding and external commitments. This paper examines the efficacy of that policy relative to reducing cost overruns. Projects executed before the policy implementation are compared to projects approved post-policy implementation. Various statistical techniques are used to perform comparative analyses and alternative explanations are explored.2019Space & Missiles
"Phase 0" Space Mission EstimatesMichel van PeltHow do you prepare ROM estimates for future space missions, in little time and with even fewer inputs? Based on experience preparing estimates for the very first Principal Investigator proposals for new ESA Science missions, this paper provides rules of thumb, observed high-level cost drivers and trends, checklists, comparisons with later-phase cost estimates, potential pitfalls, and general lessons learned.2019Space & Missiles
An Analytical Explanation for Vertical Integration Behavior in the MarketplaceCaleb Williams, Jack SemrauVertical integration is a costly and difficult-to-reverse strategy with significant risks, yet it has experienced a sharp rise in popularity within the satellite sector over the last decade. Building on previous research, this session demonstrates how commercial parametric tools can be used in combination with business-case analysis to understand why aerospace companies are increasingly pursuing vertical integration strategies. Additionally, it discusses how the framework for this analysis can be applied across other industries.2019Space & Missiles
Estimating Missile G&C Development Cost: An Important AdvanceJames York, Paul Hardin III, Jeffery Cherwonik, Olivia CollinsThis paper describes improved cost estimating methodologies for tactical missile Guidance and Control (G&C) Development Engineering (DE), generally the largest contributor to missile development cost and typically one of the most challenging cost elements to estimate. These methodologies represent an alternative approach to commonly used cost factors (e.g., factor of prototype production cost) which have error metrics that normally exceed 50%. The research developed CERs with error metrics below 50% and as low as 20%.2019Space & Missiles
A Comparative Analysis of Nuclear Security Enterprise EstimatesTerry Josserand, Leone Young, Wendy LeeVarious government publications recommend adopting multiple methods and approaches to enhance the quality, validity, and reliability of a program cost estimate. This study utilizes a COTS software tool and two approaches, based on different system architectures, for a Nuclear Security Enterprise estimating effort. The cost estimate results exhibit a phenomenon of near-homogeneity. Model techniques, critical parameters, and cost drivers are analyzed and evaluated to help comprehend the model behaviors.2019Space & Missiles
Quantifying Annual Affordability Risk of Major Defense ProgramsThomas J. Coonce, David TateUsing DoD Selected Acquisition Reports (SARs), the authors use a variety of statistical techniques to estimate the distribution of possible funding profiles that might result from a new program, given its initial estimates, program characteristics, and funding climate. The authors then extend this method to address the remaining annual cost risk of partially-completed programs. The results allow analysts to assess the annual confidence levels of a given program.2018Acquisition & Operations
Developing an Independent Government Cost EstimateRichard Shea, Shavaiz SaoodThe Independent Government Cost Estimate (IGCE) is an unbiased cost estimate based upon Government requirements and inputs without input from potential contractors. The IGCE is a tool used by the Government during source selection as the basis for reserving funds for the contract, for comparing costs proposed by submitting contractors, and as a guideline to determine contract proposal price reasonableness. This presentation discusses the steps in building the IGCE and the next steps in the process.2018Acquisition & Operations
Diversity of Maintenance Logs and DelayBryan Kenneth AndersonMaintenance activities play a leading role during the In-Service Management phase in the FAA's AMS. This paper explores the potential relationships between FAA maintenance activity and cost incurred by the public. The paper's objective is to determine whether there is a correlation between facility type, maintenance activity incurred by the FAA, and flight delays impacting the public that could drive reclassification of facility type.2018Acquisition & Operations
Portfolio Analysis: Estimating the UK Defence BudgetDale ShermonThis paper will consider how to build a portfolio analysis of a government budget and, once established, what questions can be considered through this analysis. I will examine the capability gaps and the cost of replacement for obsolete capabilities. As an example, the paper will consider the UK MOD budget for current and future expenditures, with lessons learned from trying to conduct this ambitious analysis.2018Acquisition & Operations
Estimating Software Sustainment CostsArlene MinkiewiczSoftware sustainment costs can make up as much as 90% of the total ownership cost of a program, yet the software industry continues to struggle with the best way to predict these costs. Traditional cost drivers used for acquisition estimates don't necessarily apply for sustainment. This paper discusses an on-going research project collecting cost and technical data from evolving software systems in an effort to determine the best sustainment cost drivers and cost estimating relationships.2018Acquisition & Operations
Tabular CARDs: Orderly Data for the Cost CommunityJeff McDowellRecent CAPE guidance has brought positive, exciting change to the Cost Analysis Requirements Description (CARD). This change is a new focus on standardized program data content via workbook templates, balanced with reduced-scope narrative content. This paper demystifies the new CARD templates by describing each of the tables and emphasizing how they were designed to satisfy enduring and recurring cost estimator needs. Their relevance to MIL-STD-881D Product Extensions and CSDR Plan Standards will also be presented.2018Acquisition & Operations
Exploring How Systems Age (and Fail) and the Impact on O&S Cost EstimatesPatrick McCarthy, Alex KrencickiSignificant time is spent estimating the development and production costs of weapon systems. However, estimating how systems age and fail during the O&S phase, as well as identifying optimal sustainment and maintenance strategies, can be every bit as challenging and have a significant impact on the estimated lifecycle cost. This paper will address obstacles associated with predicting fleet aging profiles, including dynamic failure distributions and the integration of right-censored data into the estimate.2018Acquisition & Operations
Cost and Competition in U.S. Defense AcquisitionEric LofgrenThe cost estimator has a major role in determining the price, and therefore value, of major systems acquisition in the Department of Defense. Two primary costing methodologies include 'should cost' and 'will cost' analysis, and are affected by 'must cost' realities. This paper explores the history of these costing methods and places them in a theoretical context, first with respect to the meaning of competition, and second with respect to the nature of cost.2018Acquisition & Operations
Using Army Software Sustainment Cost Estimating ResultsCheryl Jones, James Doswell, Bradford ClarkThe Army has completed an initial analysis of software sustainment cost and performance data collected from ~250 Weapons, C4ISR, and ERP systems. The analysis addresses primary resource distributions and cost estimating relationships across multiple functional domains, and establishes a foundation for efficient resource allocation decisions across the Army systems portfolio, and projected policy and process changes. The results, including the detailed statistical analysis, will be made available for use by participants.2018Acquisition & Operations
Using Sustainment CSDR DataSandra B. EnserOperations and support costs can exceed 60% of the total Life Cycle Cost of aerospace and defense systems. This work is increasingly performed by contractors rather than by organic facilities. This paper will demonstrate how the Sustainment Cost and Software Data Report (CSDR) data collected by DCARC can be used to estimate future sustainment costs, support Performance-Based Logistics business case analyses, identify maintenance and repair problems, streamline review of hours and costs in negotiations, and evaluate contractor profits.2018Acquisition & Operations
Potential Impacts of Non-Major Program Data Collection on Cost EstimatingBrandon S. Bryant, Stephanie MyrickThe FY 2017 NDAA requires the armed services to collect cost data on all acquisition programs over $100M. The requirement creates a challenge of balancing data collection standardization and flexibility for programs that have not been required to follow traditional data collection strategies. However, this new requirement is an opportunity to improve DOD cost analysis. This paper provides implementation strategies, describes how to leverage current data collection tools, and discusses potential impacts on cost estimating.2018Acquisition & Operations
Maturing Cost Estimation in a Rapid Acquisition EnvironmentJennifer Manring, Natalie FaucherDoD is moving toward rapid acquisition approaches to deploy critical capability to the warfighter quickly. Rapid acquisitions have shortened timelines and less definition than traditional acquisitions. These constraints challenge traditional cost estimation practices. Multiple case studies were conducted to understand the challenges to cost estimating processes in a rapid acquisition environment. Easily implementable recommendations are made to help ensure credible and confident cost estimates are developed, even within the constraints of these challenging environments.2018Acquisition & Operations
The Concepts of Size and Productivity in an Agile WorldHarold van HeeringenMany organizations have moved to agile software development. One of the cornerstones of Scrum, one of the main Agile methods, is to de-professionalize estimation and to use an arbitrary effort unit of measurement, the story point, to estimate projects. Because of a lack of understanding of the matter, management in these organizations believes this is the way to go, but these organizations now face huge challenges in estimation and budgeting. This presentation explains why agile has resulted in lower estimation maturity in the industry, and the ways this can change again using industry standards.2018Agile
Scaled Agile Deliveries: Do We Still Need Estimates?Eric van der VlietOrganizations prefer agile software development methodologies because of the high level of agility of requirements and the focus on delivering business value. Self-controlling teams deliver as much value as possible within the available budget. Attention to estimation is declining, which has a direct impact on the control of agile deliveries. This presentation explains how much estimation is required at each stage of a scaled agile delivery to provide the right level of delivery control.2018Agile
An Implementation of Automated Structural Design-To-Cost in a Model Based Engineering EnvironmentChristopher Price, Leonor Hagberg, Brig Bjorn, Apinut "Nate" Sirirojvisuth, Daphne BiddleThe integration of Computer-Aided Design (CAD) tools with Cost Analytic tools in COTS Integration Environments allows designers to quickly and easily estimate the cost of structural designs as they are designing, thereby directly supporting Design-To-Cost, in a Model Based Engineering (MBE) environment. This paper will demonstrate how quick and easy it is to estimate modifications to a structural assembly using such an integration environment.2018Agile
A Perfect Marriage: Mr. and Mrs. Agile-Cloud?Teresa Price, Wilfred Tagud, John Sullivan, Andrew DrennonWhat mandate has brought together, no one should tear apart. This interpretive study analyzes the relationship between Cloud computing services and Agile Methodology to determine if it's the right fit for achieving cost avoidance. In addition, the study includes experience-based input from the FAA's Chief Information Technology officers as well as other agencies. The research will conclude with the challenges and lessons learned identified when both are implemented together. I can't say 'I do' without you.2018Agile
Accurate Agile EstimationKevin McKeelEstimating software development costs in an Agile development environment is challenging due to undefined scope, lack of planning beyond the next few sprint cycles, and distrust of traditional software estimation techniques within the Agile world. This presentation will detail an IRS case study, and describe how these challenges were overcome.2018Agile
Agile Software Development Cost Risk for Information Technology ProgramsAdam LetcherThis paper addresses the issue of how to analyze and quantify cost risk within an Agile software development environment. Because of the unique nature of Agile software development, some of the traditional cost estimating and risk analysis practices should be altered. This paper proposes a process for assessing Agile development risk using a capability sizing metric and presents the governing equations for doing so.2018Agile
Using Function Points to Manage Agile Product Backlog: Fact vs. FictionDaniel French, Carol DekkersThe Agile framework continues to grow in popularity as IT organizations struggle to deliver projects on time and on budget. Agile is not the silver bullet that many think it is, and a primary reason for this is that Agile estimation methods are completely subjective. This presentation details how IFPUG function points and other rule-based size metrics can be effectively used to help Agile teams estimate sprint sizes, calculate velocity, and manage the product backlog.2018Agile
Agile Software Development Cost Factors: A Case StudyBlaze SmallwoodThe lack of data on government agile software development programs has made estimating costs for new agile development programs challenging. This paper seeks to address this challenge through a case study of several completed DoD agile projects with cost, schedule, and performance data. Several relevant metrics will be examined, including cost per story point, cost per requirement, scope growth rates, impacts of team size changes on velocity and productivity, and various others.2018Agile
Processes of Weapon Systems AcquisitionEric LofgrenOne major outcome of recent reforms in defense acquisition has been the organizational separation of research and development from production and sustainment. But is the reform simply a change in organization charts or does it have real implications for the cost estimator? This companion to a 2017 paper will build a historical understanding of the defense innovation and procurement processes and suggest a proper role for the cost estimator under the new organization.2018Comprehensive Perspectives
Investigation of the Disconnect Between the Use of CSDRs Cost and Contracting CommunitiesMarc Stephenson, Brian DavisToday, the organization of project offices separates the program manager (PM) from the contracting officer. It might seem that for this reason, the PM's cost team often fails to adequately support contract negotiations. However, this paper will demonstrate that a more fundamental problem is the disconnected structure and flow of information to each of the functional teams. It will investigate the ties and prospective improvements to bring the cost estimator back into the contracting cycle.2018Comprehensive Perspectives
Where Have All the Estimators Gone?Cris Shaw, Tom Dauber, Andrea West, Bryan Kenneth AndersonThe declining budgets, expanding oversight, sequestration challenges, and acquisition changes faced by Government programs have resulted in the increased importance of cost estimating and analysis, and a higher value of qualified cost estimators. However, like many associations and organizations, ICEAA has seen a decline in membership despite expanding the overall focus of the organization into new areas. We will identify key influencing events, analyze recent ICEAA membership trends, and recommend potential areas for membership growth.2018Comprehensive Perspectives
Being Certain About Uncertainty: Part 2Andy Prince, Christian B. SmartThis paper addresses the difficult and pervasive challenge of identifying extreme cost growth early in a project's life cycle and preventing it before it happens. The paper examines how DoD and NASA have implemented policies and practices to minimize or eliminate extreme cost growth and why those policies can sometimes fail. Finally, we propose some remedies that could help and identify some warning signs that a project may be headed for trouble.2018Comprehensive Perspectives
WBS vs CES: Navigating Different Structures for Software SystemsBrad Dahlin, Bakari DaleArmy estimates are created using either a Work Breakdown Structure (WBS) or a Cost Element Structure (CES). Determining which structure to use for your cost estimate can be a challenge. Through this presentation, the authors will explain when to use which structure, what information and level of detail is required for each, and when a WBS/CES is required.2018Comprehensive Perspectives
Introduction to the Organizational Cost Community Framework: UpdateTerry Josserand, Edwin P. Chamberlin, Leone Z. YoungOne specific area of emergent research and assessment, with respect to organizational role and responsibility overlaps and gaps, exists within an organization's ability to generate sustainable and defensible cost estimates. This paper will assist organizations facing the challenge of improving their cost estimation and analysis capabilities through the general characterization of the four interdependent cost functions, their processes, and the participants, all within the systematic Organizational Cost Community Framework.2018Comprehensive Perspectives
Have All the Cost Estimates Already Been Done? Data Science in Cost AnalysisJeremy EdenData Science and 'big data', the analysis of large quantities of consolidated and searchable information, are having great impacts on operations, industries, scientific fields, and analytical disciplines, including cost estimating. This paper will provide a basic understanding of data science, discuss some current issues and solutions for cost estimators using data science, and suggest some steps ICEAA can take to incorporate data science into current cost estimating training and certifications to capture and shape the benefits of the data science revolution in cost analysis.2018Comprehensive Perspectives
An Empirical and Visual Tale of a Cross-Country Bicycle AdventureRick Collins, Maggie Dozier, Orly S. Olbum, Paul Lanier Hardin IIIThe 'road' from Anacortes, Washington to Bar Harbor, Maine is paved with amazing landscapes, small towns, interesting people and cold beer. This 74-day cross-country cycling trip was an ideal opportunity to collect data that might explain daily riding speed. This presentation (and companion paper) describes the journey and post-ride analysis of the data using influence diagrams and constrained optimization (via Excel Solver), and will hopefully inspire others to get on a bike and experience the 'power' of cycling.2018Comprehensive Perspectives
Data Science Cost Estimating Challenges: Cut Your Time in Hack!Joe Rohner, Lorraine FeuryThe increasing velocity and variety of cost analysis data demands holistic data science approaches performed in near-real time. How can you discover multiple innovative, unbiased, cutting-edge solutions in just three days? Host your own Data Science Hackathon cost estimating challenge! Hackathons are environments that incorporate external open-source tools coupled with super-smart data scientists to solve challenges impeding success. Learn Hackathon best practices that serve as innovative catalysts and transform how your organization solves complex problems.2018Economic/Data Analysis
A Robot Brain Might Be the Best Forecasting Tool PossibleNathan EskueDo you have to predict the future as part of your job: revenue, headcount, demand, production, etc.? Is it frustrating that something always seems to mess up your estimates? The robot brain, the Artificial Neural Network (ANN), can predict patterns in the midst of chaos. I'll share its mechanics and purpose, my experience using ANNs to self-learn (what could go wrong?), and how that same technology can produce forecast accuracy better than any other method.2018Economic/Data Analysis
Demand, Recurring Costs, And ProfitabilityDouglas K. HowarthCustomers in all markets collectively abide by their self-imposed demand curves, which dictate their responsiveness to changes in price and the maximum quantities of products they can absorb. Concurrently, producers in all markets face recurring costs, which typically fall over time due to a variety of factors. Producers can effectively model demand and recurring costs before product launch. Understanding how demand curves relate to recurring costs is key to enhancing profitability, which this paper examines.2018Economic/Data Analysis
The Art of Employing Data Science to Improve Cost Data AnalysisRichard Shea, Shavaiz SaoodWhile large data sets are highly desired and used frequently, there is not a universally accepted standard format for large data extractions and transfers. This presentation focuses on gaining insights into large amounts of data, spotting inconsistencies, and getting data into a usable format. This presentation examines the possibility of putting data into a standard database format so it can be easily manipulated and cross compared against other datasets that may have comparable elements.2018Economic/Data Analysis
Robust Non-Design, Code, Test, and Integration Cost Estimating RelationshipsBritt Staley, Nicole RobertsonComputer program development (CPDEV) models are founded in accurately estimating the design, code, test, and integration (DCTI) of the computer program. Given that non-DCTI support costs can account for 50 percent or more of the estimate total, it is also critical to accurately capture these indirect costs. This paper analyzes almost two decades of historical data from nineteen CPDEV efforts to derive timely non-DCTI cost estimating relationships for use in these types of estimates.2018Economic/Data Analysis
Calibrating Use Case Points Using Bayesian AnalysisKan Qi, Anandi Hira, Elaine Venson, Barry BoehmUse Case Points (UCP) has been widely used to estimate software size for object-oriented projects. Yet, many research papers criticize the UCP methodology for not being verified and validated with data, leading to inaccurate size estimates. This paper explores the use of Bayesian Analysis to calibrate the use case complexity weights of the UCP method to improve size and effort estimates. Bayesian Analysis integrates prior information (in this study, we use the weights defined by the UCP method and the weights suggested by other research papers) with parameter values suggested by data. The effectiveness of the calibration approach has been evaluated by our empirical study of 34 use case driven projects. We found that the Bayesian estimates of the use case complexity weights consistently provide better estimation accuracy compared to the weights proposed by the original UCP method, the empirically calibrated weights, and the expert-based weights.2018Economic/Data Analysis
Improved Cost and Technical Data Collection for ContractorsGreg Kiviat, John SwarenAs a follow-on to the 2017 ICEAA Workshop presentation 'Lessons Learned in Leveraging Historical Cost, Schedule and Technical Data', the authors present results of a Part II post-study enhancement applying a standardized CARD (Cost Analysis Requirements Description) to the existing data collection process, per audience feedback. The team has applied this enhanced approach and is exploring how contractors can leverage the government standard process to improve the existing approach and gain insight beyond model calibration (e.g., CER development, analogous programs, etc.).2018Economic/Data Analysis
A Case Study for Future Munition: An Analysis of AlternativesFaye Kim, Meagan GadreaultAn Analysis of Alternatives (AoA) is essential in making a rational funding decision, particularly before Milestone A decisions. An AoA was conducted to determine the Total Ownership Costs (TOC) for 21 possible Courses of Action (COAs) for the design of a future munition. Multiple options within the future munition's three main hardware sections and software section create the COAs, and differences in these options within the COAs drive the TOC and impact the overall AoA decision.2018Economic/Data Analysis
Projecting Future Costs with Improvement Curves: Perils and PitfallsBrent M. JohnstoneImprovement curves are one of the most common projection tools used by cost estimators. Their use, however, is surrounded by perils and pitfalls. Common errors include the fallacy of 'straight edge and graph paper' projection, the dangers of recovery slopes, failure to understand how development and production environments differ, and the dangers of using learning curve slopes to measure production line efficiency. This paper examines these potential pitfalls and proposes ways to avoid them.2018Economic/Data Analysis
Diamonds in the Rough: How to Normalize Cost Accounting DataNiatika Griffin, Jason DeLorenzoEvery government agency has a cost accounting system. For cost estimation purposes, this is a wealth of actual cost data that can be used as a reference for estimates. Unfortunately, the needs of financial analysts differ from those of cost analysts. This paper explores how to normalize cost accounting system data using the FAA's Delphi system as a case study. We conclude that, given certain assumptions, the data can be normalized into usable source data and customized reports.2018Economic/Data Analysis
Applying Economic Theory to Cost Recovery in DoD Working Capital FundsKathryn Connor, Michael VasseurA DoD working capital fund must recover its costs, but some customers have questioned the cost of these services, which are perceived to be too high and include readiness and other costs not directly tied to customer demand. This study identifies and tailors best commercial practices in order to better align customer incentives with DoD goals and readiness needs. We recommend that this DoD working capital fund implement preferential pricing for those activities that contribute to readiness.2018Economic/Data Analysis
Integrating Excel Cost Models and MS Project SchedulesMelvin R. Etheridge, Jr.The FAA requires non-risk-adjusted 'resource loaded' schedules as part of its Milestone Decision process. Integrating cost and schedule risk in MS Project schedules and Excel cost models is challenging. These challenges stem from the fundamental differences between a cost estimate, which is a tree, and a schedule, which is a network, as well as from the dependency of some cost elements on duration. This paper investigates techniques for linking risk-adjusted schedules and cost estimates. It highlights the need for close collaboration between cost estimators and schedulers.2018Management, EVM & Scheduling
Adventures in Using Contractor Cost Data Reports for Wheeled and Tracked Vehicles AnalysisKimberly Roye, Jennifer ScheelThe Cost Assessment Data Enterprise (CADE) contains multiple types of contractor cost data reports (e.g. CDSRs, FCHRs, CPRs) that cost analysts may find useful in developing cost estimating relationships. Understanding the best report(s) to use depends on the analysis being performed and the optimal choice is critical to meaningful analysis. This paper will provide lessons learned to help analysts successfully navigate available cost reports and avoid common pitfalls.2018Management, EVM & Scheduling
The Living Estimate: Leveraging LCCEs Throughout the Program LifecycleSriram Krishnan, Kevin SchuttCost estimates are crucial for program planning and approval, but where does the estimate go after program approval? This paper examines how the LCCE can support and inform a program office beyond the planning stages. We will also look at the cost analyst's role in the Program Control and Business Management functions, and how s/he can add value to all aspects of the Program Control team.2018Management, EVM & Scheduling
Modern Methods for Budget-Constrained Schedule AnalysisNick DeToreThere's a need across the community for methods to optimize schedules based upon constraints imposed by the available budget. Work cannot be performed until funding is available, yet current SRA/JCL tools don't reconcile this discrepancy. The cost/schedule analysis communities understand the prevalent and vital concept of a budget-constrained schedule and yet research has been inconclusive. This paper introduces a robust method of adjusting the schedule within complex program plans in order to satisfy a budget.2018Management, EVM & Scheduling
Your Schedule is in Shambles and This is Why: A Systematic Approach to Why So Many Programs FailKai Lemay, Patrick MyersIn this presentation, we explore the root causes of why programs fail to achieve schedule targets, whether set by the programs themselves or by external stakeholders. We review how requirements, contracts and contract requirements, program management approaches (or lack thereof), scope creep, unrealistic estimates, and the failure to take a data-driven approach to decision making all come together to form a perfect storm that prevents programs from achieving their schedule targets.2018Management, EVM & Scheduling
Proactive Estimating: The Analysis of Sixth Generation AircraftDale ShermonThis paper will explore some of the options and alternatives which, as a cost community, we should be pursuing for new projects. It will examine the big, first-order assumptions we should be considering to ensure that we have a voice and that cost is considered at the forefront of the decision process. As an example, the paper will consider the options for a sixth generation fighter capability.2018Methods & Models
Design Life StudyAlex Wekluk, Colleen Adamson, Matt ReileyThe ODNI/SRA/CA division studied Design Life (DL) impact on the cost of traditional satellite acquisition strategies and associated launch costs. If launch costs continue to decrease it may be prudent to move away from constellations of extremely reliable vehicles to constellations of less reliable vehicles at an affordable price. A constellation of shorter DL vehicles tolerates a higher risk at the unit level and increases flexibility to evolve ahead of a changing technology landscape.2018Methods & Models
Commercial Applications of Predictive Analytics: Requirements-Driven ForecastingJohn SwarenOur job is often to fine-tune a predictive capability, process or system by best applying knowledge to justify key drivers for forecasting. Regardless of methodology, internally developed or licensed, the common objective is creating valid, defensible estimates based on actual historical data as well as performance parameters. This paper will examine four commercial case studies where unique Predictive Analytics methods were implemented both to leverage proprietary knowledge bases and to reflect requirements metrics, creating justified forecasts.2018Methods & Models
Estimating Future Air DominanceDavid StemEstimating the cost of aircraft programs early in development presents special challenges. The process requires a consideration of the content of the program and various methods to be employed. The Air Force Cost Analysis Agency recently provided cost advice to the Scientific Advisory Board on how aircraft programs are estimated, recent historical experience/lessons learned, and approaches to reduce life cycle costs while the program is in the conceptual design stage.2018Methods & Models
Just-In-Time Cost Estimate in a Multidisciplinary Design EnvironmentApinut "Nate" Sirirojvisuth, F. Gurney Thompson IIIPoor cost estimates cause inefficiency in a program and run the risk of increased bottom-line cost, program delay, and cancellation. One remedy is an effective cost management framework that can evolve and adjust as the program matures. In this research case study, we present the formulation of a cost-capability trade space that automates multidisciplinary collaboration, resulting in a just-in-time cost estimate for any design change or technology infusion. An aircraft redesign example will be presented.2018Methods & Models
Unmanned Space Vehicle Cost Model: Past to PresentBen Kwok, Chinson YewSpace, the final frontier. These are the cost estimates that employ the Unmanned Space Vehicle Cost Model (USCM). USCM's 45+ year mission: to assist in the development of cost estimates, to seek out new data sources and new methodologies, to boldly evolve to meet the ever changing landscape of the cost world: Take a journey into USCM's development, a model traveling at warp speeds since the 60's that has undergone major evolutions over the past decades.2018Methods & Models
Building Dynamic Cost Estimating ModelsMiranda JonesOrganizations are requiring cost estimators and analysts to be agile and deft in applying dynamic, heterogeneous data efficiently and with process repeatability. Developing versatile estimating models saves time, reduces errors, and drives consistency. This presentation will provide an overview of the 'what' and 'why' of dynamic estimating models and the techniques for building adaptable models that can be applied to any type of cost estimating discipline.2018Methods & Models
An Approach Towards Determining Value Through the Application of Machine LearningChristopher HutchingsWithin the field of data analytics, machine learning is a method used to develop algorithms that lend themselves to prediction; often known as predictive analytics. These analytical models allow 'interrogators' to expedite the production of reliable and repeatable results and uncover hidden insights through learning from legacy relationships and trends in the available data. The intention of this paper is to demonstrate a value based engineering application of this embryonic field of analysis.2018Methods & Models
Learning Rate Sensitivity ModelTimothy P. Anderson, Nichols F. BrownIn space cost estimates, learning curves are used to estimate the cost of small-quantity acquisitions. Recently, spacecraft providers have started proposing unprecedentedly large constellations. The authors have developed a methodology to test assumptions about learning rates versus proposed cost estimates, providing a data-driven assessment of whether a proposed learning rate/cost combination is feasible or even likely, and further describing the learning rate that would be necessary to meet a proposed cost estimate.2018Methods & Models
Enhancing Risk Calibration MethodsChristian B. SmartCalibration methods such as the Enhanced Scenario-Based Method allow analysts to establish cost risk analyses that are based on objective data. Some methods currently in use rely on the normal and two-parameter lognormal. Empirical data, however, indicates that a three-parameter lognormal is more appropriate for modeling cost risk. We discuss three-parameter lognormals and how to calibrate cost risk using this distribution. We compare the results with traditional calibration to two-parameter normal and lognormal distributions.2018Risk & Uncertainty
Calculating a Project's Reserve Dollars from its S-CurveMarc GreenbergA probabilistic method was developed to calculate the funds a project would need if it exceeded the project's point estimate. This conditional cost reserve is depicted as the amount of funds held in reserve 'above' the project or program. The method requires three data inputs to calculate cost reserve: (1) the project cost dispersion (measured by its coefficient of variation), (2) the project point estimate, and (3) the confidence level of the project point estimate.2018Risk & Uncertainty
Estimating the Cost of Pharmaceuticals: Managing Cost and ExpectationsThurman D. Gardner, Robert HuntHeadlines have recently been addressing the high price of pharmaceuticals. Contrary to how it may seem, greed does not drive pricing; the unknowns and the recovery of costs do. Science provides a big piece of this, but the regulatory pathway is almost as big a factor, and in some cases bigger. Estimating new products requires a solid methodology and forward-thinking approaches to adequately bound projected costs and expectations.2018Risk & Uncertainty
Risk-Adjusted Contract Price MethodologyPeter J. Braxton, Keith S. Hetrick, Orly S. OlbumTraditional risk approaches rely on program-level risk and uncertainty benchmarks from SARs, but governments and other buyers need to assess and manage risk at the contract level. For the first time, the Risk-Adjusted Contract Price Methodology explicitly models both 'off-the-shareline' risk and 'on-the-shareline' risk to present a complete and accurate distribution of final contract price. Drawing from a robust CLIN-level database of cost, fee, and price changes over time, it incorporates historical benchmarks for contract changes and other growth.2018Risk & Uncertainty
To Monte Carlo Or Not To Monte Carlo: That Is The QuestionJoe BauerIn cost estimating, the two most often used risk / uncertainty analysis methodologies are Monte Carlo simulation and Method of Moments. Have you ever stopped to wonder about the difference between the two? Fear not! In this presentation, the authors will compare and contrast the two risk / uncertainty analysis methodologies through several case studies. The authors will also share several unique methods for allocating risk dollars across the program phases.2018Risk & Uncertainty
Establishing Standards as the Basis for Effective Measurement and AffordabilityPete PizzutilloMeasurement of application development output has long been a controversial topic. Yet as contracting relationships within industry and public sector become more strategic, buyers and sellers of software development and sustainment services require consistent and effective measures of application development output to provide: objective visibility into application development output; a rational basis for Application Development and Maintenance (ADM) investment decisions; and vendor and buyer accountability supported by data, not subjective judgments.2018Software Estimating
Software Data Collection and Analysis for Proposal EvaluationKen Rhodes, Eric HippensteelSoftware data collection forms provide necessary information for the Government to assess the validity of proposed software development effort, productivity, sizing, and schedule. This paper discusses use of the forms with an RFP to validate contractor proposals and compare across multiple bids. This data collection and evaluation approach helps ensure selection of the best vendor and defend against protests, and was recognized by Defense Procurement and Acquisition Policy (DPAP) as a DoD acquisition best practice.2018Software Estimating
Cost of Software Obsolescence ResolutionsSanathanan RajagopalSoftware plays an important role in defence. Almost every project in defence has software elements with varying degrees of complexity and dependencies. This has brought its own challenges to availability-based contracts. The challenge to both contractors and suppliers is that they must have a good understanding of the whole life cost of the product and have confidence in the whole life cost model at the time of negotiation and contract signing. This paper will look into ways to estimate the cost of software obsolescence resolutions for real-time defence software.2018Software Estimating
Software Made Simple: Effort Adjustment Factors and the Accuracy of the EstimateJeremy GoucherThis research investigates the sensitivity of estimated hours for software development in relation to effort adjustment factors (EAFs). The analysis highlights the importance of performing original data analysis for new estimates, rather than relying on rules of thumb or industry standards. Accuracy and precision metrics are used to demonstrate the applicability of the ESLOC method to a variety of software projects, including agile projects. Finally, the analysis is supported with data from over thirty historic programs.2018Software Estimating
Predicting Maintainability for Software Applications Early in the Life CycleCara CuiuleMaintainability is defined as the difficulty of altering a software system's source code; thus it is tied very closely to the concept of software maintenance. The following research is an investigation of the methods used for measuring this characteristic (including the Maintainability Index). Guidance on how maintainability affects maintenance effort will be proposed, followed by a discussion of which metrics could predict maintainability early in the life cycle.2018Software Estimating
Impact of Scope Changes on Software GrowthJonathan Brown, Gail FlynnThe SEI DoD Software Factbook summarizes MDAP/MAIS SRDR data for DoD programs. The mean value reported for ESLOC growth is 106%. While accurate, the SEI's and other similar analyses capture total software growth, including the impact of scope changes. This paper introduces 'Pure Software Growth' which differentiates planned scope changes from traditional software growth. Several programs are analyzed from this perspective to show the difference between pure and total growth and the unexpected impact this could have on estimates.2018Software Estimating
Why Does Software Cost So Much? Toward a Causal ModelMichael D. Konrad, Robert StoddardHow can we control the cost of software-intensive systems? To contain costs we need to better understand which factors truly drive costs versus those merely correlated with cost. In this talk, we will share results from the application of newly developed causal discovery and modeling tools to evaluate these potential causes of code quality and effort: number of requirements, software size, schedule, team experience; programmer productivity and error-proneness; and architecture pattern violations.2018Software Estimating
A Probabilistic Method for Predicting Software Code Growth: 2018 UpdateEric M. Sommer, Bopha Seng, David LaPorte, Michael RossSoftware estimating is challenging. SMC's approach has evolved over time to tackle this challenge. Originally based on Mike Ross's 2011 DSLOC Estimate Growth Model, we've updated our model to include more recent SRDR data and an improved methodology (Orthogonal Distance Regression). Discussions will focus on non-linear relationships between size and growth, unique growth for new, modified, and unmodified DSLOC, as well as correlation between DSLOC types and future efforts to include space flight software data.2018Software Estimating
Estimating the 'Total Cost of Ownership' of Cybersecurity in an Increasingly XaaS WorldZachary Jasnoff, David Cass, Richard MabeConverging technology trends in XaaS have profound effects on how organizations evaluate decisions regarding XaaS outsourcing and hybrid deployments as more business functions move to the cloud. Most organizations have a security skills gap that XaaS and moving to the cloud can help solve, giving them choices on insourcing or outsourcing cybersecurity. This paper explores how XaaS impacts the TCO of cybersecurity and also delivers guidance on estimating the cost of the DFARS cyber policy to defense programs.2018Technology & Innovation
Modeling Technology and System Readiness Level Impacts on LCCApinut "Nate" Sirirojvisuth, Bakari DaleTechnology Readiness Levels (TRLs) are an important construct used to assess the maturity of a technology and set criteria for its inclusion in a program with an implied level of risk. In this research, we attempt to quantify the risk implications in terms of cost and schedule impacts. A new cost model will be introduced that utilizes the technology maturity assessment of a system's constituent components as a metric to determine cost, schedule, and uncertainty response.2018Technology & Innovation
Application of K-Means Clustering Methodology to Cost EstimatingJacob WalzerCluster Analysis provides an efficient method of analyzing large datasets and grouping them based on similar characteristics. While not suitable for all datasets, clustering allows analysts to easily classify data while considering all relevant variables. This presentation explores the best practices, common pitfalls, and validation techniques associated with clustering, and includes a notional example derived from an approved application of this technique utilized in a USMC estimate.2018Technology & Innovation
Machine Learning & Non-parametric methods for Cost AnalysisKaren Mourikas, Nile Hanov, Joseph King, Denise NelsonThe world of big data opens up new opportunities for ICEAA, such as machine learning and non-parametric methods. These methods are more flexible since they do not require explicit assumptions about the structure of the model. However, a large number of observations is needed in order to obtain accurate results. Hence, the use of big data. This presentation examines several non-parametric methods, with examples related to our community, and discusses opportunities and limitations going forward.2018Technology & Innovation
Cost Considerations in Refreshing Vulnerable IT NetworksJohn LeahyVirtually all IT networks must deal with the growing threat of cybersecurity intrusion yet retain sufficient features to meet mission needs. Many IT network components are close to, if not past, their end of life. This presentation will provide a high-level view of an alternatives analysis focused on the cost of a network refresh with secure, state-of-the-art components. Topics include different approaches to network refresh and assessment of stakeholder interests.2018Technology & Innovation
Social Media and Submarines: How Machine Learning and Unconventional Methods Can Change Cost EstimatingOmar Akbik, Jeffrey PincusAs technology advances and analytical techniques evolve, machine learning and big data systems stand to dramatically change the cost industry, and these changes are coming sooner than we may think. Industry requires timely, efficient and defensible analysis, and new data sources are allowing previously unexplored opportunities to meet these goals. We explore how automated algorithms can identify market trends to forecast program cost with greater accuracy, and in less time, than historically required by an analyst.2018Technology & Innovation
Agile Delphi Estimating and Analyzing Schedule/Resource Realism for Software ProjectsBlaze SmallwoodOne of the most important tasks for a software project manager is to accurately forecast the schedule and resources needed to complete the project. This paper will describe a novel approach to this problem that combines Monte Carlo-based uncertainty analysis with an innovative new technique for estimating software requirements, which utilizes an agile-style Delphi estimating methodology. This method has been successfully implemented on several government projects and is a viable option for any software project.2017Agile
Agile: You're Doing it Wrong (or how to know you're doing it right)Daniel B. FrenchAgile is a hot topic in the IT world today. If your organization is using Agile, or considering it, there are a number of things you need to know. This presentation will discuss the pros and cons of Agile, how to determine if your organization is ready for it, proper Agile implementation, hybrid methodologies, and the organizational challenges when trying to make an organization Agile.2017Agile
Agile and GAO Cost Estimating Best PracticesJennifer Leotta, Karen RicheyThis paper will examine how GAO's cost estimating process can be applied to programs that are using an Agile framework. First, it will provide a brief overview of Agile processes and methods. Second, it will examine each of the 12 steps in the GAO cost estimating process and how those steps relate to an Agile framework. Finally, it will discuss how Agile artifacts can be leveraged to fulfill cost estimating documentation needs.2017Agile
A Cost Model for Early Cost Calculation of Agile DeliveriesEric van der VlietAgile software development methodologies provide the IT industry with the flexibility they need to keep up with the faster change of business requirements. In agile software delivery, upfront detailed specifications are absent, yet investment decisions need budget input. The challenge is to build a cost model that takes essential (size) and additional cost drivers into account. This presentation explains a cost model that supports the cost calculation for agile delivery early in the process.2017Agile
Agile 'Mumbo Jumbo'Jeremy EdenYou don't put any stock in this Agile "mumbo jumbo" do you? Well, Agile is here to stay and is gaining popularity. This paper will discuss current issues and some solutions for cost estimating Agile projects, clear up common misconceptions about Agile, and introduce an Agile approach to cost estimating, including a draft Agile Cost Estimating Manifesto with suggestions on how to incorporate it into current cost estimating training, so ICEAA and its members can BE Agile.2017Agile
Beyond the Manifesto: Tracking Agile Performance in a DoD EnvironmentGordon M. Kranz, Michael ThompsonThe use of Agile methods is becoming prolific within the DoD enterprise and weapons systems development environment. Agile supports the incremental discovery of end-system capability within the constraints of DoD parameters. This paper investigates, through real-world experience, the use of earned value techniques to strategically manage the Agile development process. One element essential to success is traceability of user requirements to Agile features.2017Agile
How Should We Estimate Agile Software Development Projects and What Data Do We Need?Tom Coonce, Glen AllemanThis paper proposes and demonstrates how to estimate agile software projects using collected data from prior similar projects. The authors discuss traditional ways of estimating software projects and show why these approaches are not feasible in estimating agile projects. They demonstrate estimating agile projects using actuals from historical features. They recommend a few additions to DoD's Software Resource Data Report (SRDR) as well as a Standard Feature Breakdown Structure (FBS) to permit using this approach.2017Agile
CMMI and Agile Development: A Binary Choice?Arlene F MinkiewiczThe Capability Maturity Model Integration for Development has a long and impressive history of advancing the cause of process improvement. Agile development is a paradigm for software projects characterized by collaborative, cross-functional teams working closely with customers to deliver functionality regularly. Many say these two approaches can't work in tandem. This paper explores this question with examples of both successes and failures.2017Agile
Assessing ERP Cost, Schedule and Size GrowthHasetetsion Gebre-Mariam, Abishek KrupanandThis study will examine percentage changes in cost, schedule, and size across Milestones A, B, C, and full deployment for DoD Enterprise Resource Planning (ERP) programs. The analysis is based on 9 fielded systems collected from DoD authoritative data sources. Cost contributors, drivers, and factors by major cost elements will also be examined. Results may be used for crosschecking cost estimates or business case analyses at an early phase to inform funding decisions.2017Data Analysis
Lessons Learned in Leveraging Historical Cost, Schedule and Technical DataGreg Kiviat, John SwarenThe process of applying historical cost, schedule and technical data to develop new program estimates is often more difficult than textbooks suggest. The paper provides real-world insight to the issues that arise when capturing, analyzing and applying data for the next estimate. The goal is to integrate information from multiple products, systems and subsystems. Lessons learned include experience and best practices when working with financial, engineering, manufacturing and program management.2017Data Analysis
Technology Readiness Level (TRL) vs. Percent Development CostJames LinickAdvanced technology acquisition often proceeds in steps characterized by the TRL of systems or components. The relationship between TRL ordinal number and percent of total acquisition cost could be used as a factor in estimating the cost of new acquisitions at each phase from concept definition through production decision. A study of historical data will be performed to establish such a relationship based on a specific definition of TRLs, which vary across government agencies.2017Data Analysis
Machine Learning Approach to Cost AnalysisKaren Mourikas, Joseph A. King, Denise J. NelsonAt ICEAA 2016, the keynote speaker presented a challenge to the cost estimating & analysis community: to develop cost estimating methodologies based on machine learning. Machine learning provides an alternative to parametric modeling. At Boeing, we are experimenting with several machine learning techniques for cost estimation and analysis; Random Forest prediction methodology, in particular, has shown encouraging results. This paper describes our cost applications of Random Forest.2017Data Analysis
Technology Development Cost and Schedule ModelingChuck AlexanderA tangible need exists in the scientific, technology, and financial communities for economic forecast models that improve new or early life-cycle technology development estimating. Industry models, research, technology datasets, modeling approaches, and key predictor variables are first examined. Analysis is then presented, leveraging a robust industry project dataset, applying technology and system-related parameters to deliver higher performing parametric cost and schedule models.2017Data Analysis
DHS Cost Analysis OverviewDarrin DeReusCost estimating and analysis is a vital resource for program and project managers, regardless of agency or activity. In July 2014, the cost analysis responsibilities and personnel were transferred from DHS's Office of Program Accountability and Risk Management to the Office of the Chief Financial Officer. This presentation will provide a history of the creation of a cost analysis capability in a new agency, changes over time, lessons learned, and current successes of the Cost Analysis Division.2017Data Analysis
Production Rates: Do They Really Matter?Brent JohnstoneProduction rate is widely assumed to be an important contributor to unit cost -- higher production rates lead to lower unit costs, and vice versa. Examination of published data, however, leads to a more ambiguous picture. This paper examines the impact of rate by functional cost element, including the impact on learning curves. It concludes production rate impacts are real, but the impacts are uneven and sometimes reveal themselves in surprising ways.2017Data Analysis
Using Real Options to Quantify Portfolio Value in Business CasesGeorge BayerWith the challenges in quantifying program risks, interdependencies, and the business case impact on portfolios, cost estimators can use real options to estimate probabilistic value in business cases and assess portfolio value. Many business cases not only add value as stand-alone investments, but also provide the opportunity for subsequent investments. Cost estimators can measure this incremental program value and the impact of a specific investment on a larger portfolio by using real options.2017Economic Analysis
Discount Rate for Government InvestmentRon Beheler, Laura BarkerWe often use a discounted-cash-flow construct to evaluate the economic viability of an investment. OMB Circular A-94 provides discount-rate guidance for internal investments (Treasury rates) and for public investment/regulatory programs (7%, the rate of return on an average private-sector investment). Since any resources consumed by the government must be foregone by the private sector, we will argue that the discount rate should be based on a rate that at least proxies the value of private investment.2017Economic Analysis
Are the Rates Right? Benchmark Protection Against Escalation SWAGBrian Flynn, Brian Torgersen, Greg Mihalek, Adam JamesMedical escalation rates continue to trend higher than general inflation, with significant impact on defense firms and acquisition affordability. Modest deltas in rates translate into billions of additional costs. This research presents benchmark escalation values for employer costs of employee medical compensation. The benchmarks, in turn, support company planning and collective bargaining and liberate government cost estimates from their traditional reliance solely on FPRPs and FPRAs.2017Economic Analysis
The Rate You Want: Rethinking Cost Escalation in FAA DevelopmentSriram Krishnan, Andrew DrennonLike most civilian agencies, the FAA has used 'raw' OMB escalation indices to produce then-year costs for its life cycle cost estimates. However, as an agency which incrementally funds many development programs, could the FAA benefit from recognizing time-value-of-money principles at the estimation stage? We look at outlay data from across the agency to determine whether defense-appropriations-style weighted indexes would significantly impact then-year FAA solution development costs.2017Economic Analysis
Terminal Facility Realignment: A Business Case ApproachDeji Oladipupo, Ayn Smith, Melvin Etheridge, Richard SheproThe Federal Aviation Administration applies a business case analysis approach to the decision process for realigning and consolidating Terminal facilities. A Microsoft Excel model has been developed to produce these analyses. It uniquely displays both costs and benefits in one place and includes graphical outputs of NPV over time. A key output of the analysis shows interesting cyclical trends of NPV rising, falling, and rising again over time in almost all scenarios.2017Economic Analysis
Kennedy Space Center's Transformation to a Multi-User SpaceportTerry LambingIn the past, launch pads were used almost exclusively for government missions. To support a growing private sector space economy, NASA's Kennedy Space Center has transformed into a multi-user spaceport capable of handling the needs of a variety of companies from launch processing and operations through recovery. This presentation will provide an overview of the transformation and the NASA Business Case requirements needed to gain approval and successfully establish these unique partnerships.2017Economic Analysis
NRO CAAG Spacecraft Test Schedule Estimating RelationshipDaniel BarkmeyerThe NRO CAAG's parametric model for estimating spacecraft testing schedule duration has been updated. The updated model incorporates data from government and commercial spacecraft contracts. The intent of this presentation is to share the model with the ICEAA community. The latest model will be presented, along with a case study of a satellite program that illustrates the importance of the chosen schedule drivers to system test schedule.2017EVM & Scheduling
Fully Integrated Cost & Schedule Method (FICSM) Analysis Schedule ImplementationJonathan Brown, Benjamin T. UnruhThe Joint Agency Cost Schedule Risk and Uncertainty Handbook released in March 2014 introduced the FICSM which integrates cost, schedule and risk estimating. Key to the FICSM process is creation of an analysis schedule including duration uncertainty. This study leveraged data from an active program to develop an analysis schedule as part of building a FICSM model. This paper describes the methods used to develop the analysis schedule and duration uncertainty used for this program.2017EVM & Scheduling
Beyond RIFT: Improved Metrics to Manage Cost and ScheduleNicholas DeToreRisk-Informed Finish Threshold (RIFT) presented an innovative solution to the problem inherent in schedules that risk analysis results (time) cannot be allocated the same way as in cost models (dollars). Developing RIFT validation methods inspired an exploration into analyzing simulation data more meticulously. Methods described here provide unique insight into cost and schedule uncertainty results while introducing powerful new techniques to improve a project's potential to complete on time, on budget.2017EVM & Scheduling
Adopting DOD Best Practices in IPM to non-DOD NeedsDavid L. Wang, Daniel Schwartz, David T. ChiangNon-DoD federal acquisition agencies have their own unique acquisition requirements and acquisition culture. However, many non-DoD acquisition agencies face similar challenges, specifically in the areas of acquisition program execution and program affordability. To address these challenges, there has been increased interest to consider tailoring DoD best practices in Program Management, Earned Value Management and acquisition lifecycle schedule management. This presentation discusses a systematic approach to tailoring DoD best practices for non-DoD acquisition needs.2017EVM & Scheduling
Cost Associated with Acquisition Complexity and Differing Levels of Mission AssuranceErik BurgessIn Government acquisition programs, the cost of what we buy is largely driven by how we buy. For satellites, this includes contracting, funding, oversight, reporting, testing, parts assurance requirements, and other factors. All of these are related, and no single attribute defines the cost curve. The NRO has developed and implemented a method that quantifies the cost of these factors and circumvents the usual debate about whether or not an acquisition is streamlined or commercial-like.2017Methods
Don't Dis LOE: Modeling Production Sustaining Labor Across Multiple LotsSandy BurneySustaining labor is often estimated with little analysis, simply assuming some constant level. However, in estimating production hours on large programs spanning multiple lots, where each lot is negotiated separately, conflicts often occur over the cost of sustainment labor. This briefing will look at different approaches to estimating and modeling sustaining labor, based upon varying cycle times for completing a single unit and the number of units being procured in each lot.2017Methods
Automated Data Collection Using Open Source Web Crawling TechnologyAnna FooteFor many reasons, data gathering is often one of a cost estimator's biggest challenges. Even in cases where data is available, the logistics of data collection can be daunting. This paper explores a methodology developed by PRICE utilizing free, open source web crawling technology to automatically seek and find commodity pricing for information technology hardware and software items. This presentation shares the nuts and bolts for everyone's benefit.2017Methods
A Framework to Price and Cost IT Network ServicesJohn LeahyThe correct pricing of an IT network offering aimed toward internal users can be an involved process. Different services targeted to some, but not all, customers make proper cost allocation challenging. Considerations such as mirroring the structure, if not the rates, of commercial offerings; reconciliation of differences between cost allocations and rational pricing; and influencing customer purchasing behavior add to the complexity. An approach for costing and pricing IT services that addresses these requirements will be presented.2017Methods
Don't Get Caught in the Learning Curve VacuumPatrick McCarthyProduction environments featuring multiple end items benefit from a manufacturer designing products and processes with a high degree of commonality and standardization. When estimating labor hours using learning curves, the estimator often overlooks commonality when considering quantities, slope, production rate and lost learning. This paper will analyze alternative approaches to defining commonality and how to account for commonality using learning curves in integrated production environments.2017Methods
Modeling Hardware Development Cost in a Low TRL / Pre-Acquisition EnvironmentJack SnyderEstimating the cost of pre-concept programs, often with very immature TRLs, is always challenging. This case study highlights the use of analogous programs and predictive modeling to overcome the challenge. We will explore the strategy of finding analogous components of a program, even when the overall program is unrelated. Predictive models can easily be tailored for the inherent uncertainty of these early programs, when we are unable to find relevant data.2017Methods
Exploring the Results of Two Methodologies for Unmanned Space EstimationJohn SwarenIndependent validations of two separate predictive analytics methodologies were performed over the last two years. In both cases, common detailed assessments of past missions integrated technical and programmatic requirements to mimic a grass-roots bottom-up methodology. This presentation compares the process and top-level results from both approaches as well as explores several lessons learned.2017Methods
Deployment Cost Estimation for Electronic/IT SystemsF. Gurney Thompson IIIThis paper will discuss our research into cost estimation for the deployment of Electronic and IT systems, also known as Operational/Site Activation. The deployments can span multiple sites/locations, and often require technical studies, end user working groups, and site survey visits in addition to the actual installation activities. This paper will discuss our research approach, sizing metrics, cost drivers, model structure, and future work.2017Methods
Budgeting for Canadian Shipbuilding: Examining Predictive Cost Analytic MethodsPeter Weltmann, Zachary JasnoffPredictive cost analytics plays a major role in Canada's Shipbuilding Strategy in developing cost and budget estimates. However, these estimates must be based on sound historical cost data. This paper discusses challenges faced by the Canadian PBO in developing reasonable estimates for the JSS, AOPS and CSC shipbuilding programs based on the quality of historical data available, the methodologies employed to address these challenges, and the influence of these estimates on decision-makers.2017Methods
Decision Trees and Cost EstimatingJosh WilsonDecision trees are predictive modeling approaches that recursively partition data, and apply simple prediction models to the resulting subgroups. Decision tree based prediction approaches have the benefits of being easy to interpret and explain, requiring little data preparation or cleaning, and the ability to model complex nonlinear relationships and variable interactions. This presentation explores the applicability of decision tree based prediction methods to cost estimating.2017Methods
Cost Estimating Canada's Future Surface CombatantsRod StoryCanada has not built a new surface combatant since it completed the last of its 12 Halifax class frigates in 1996. Currently Canada is in the middle of a request for proposal to build, using a common hull, 12 new frigates and 3 new destroyers. This paper presents the first public cost estimate of these ships highlighting both the challenges that were encountered and the solutions that were used in determining a reliable cost estimate.2017Methods
Modeling the Influence of System and Application Complexity on the Cost of Cloud HostingDaniel J. Harper, Kevin BuckTo refine cost estimates produced within a Cloud Total Ownership Cost model, MITRE is now prototyping an additional modeling feature that estimates the cost impact of migration and hosting complexity. While many cloud cost models rely on a limited set of cost drivers, other factors associated with general complexity significantly influence costs and are often ignored. Within the prototype, complexity assessments are plotted using intuitive spider diagrams for ready comparisons of candidate systems and applications for hosting in various cloud solutions.2017Methods
Macro-Parametrics: Its Unique Capability and ApplicationDale Shermon, Arlene F. MinkiewiczThe Family of Advanced Cost Estimating Tools (FACET) has been used around the world as a macro parametric cost model at the early stages of a project life cycle. It has a unique capability to seamlessly transition from performance to design based estimating. FACET will also combine uncertainty in the inputs with those of the algorithm to produce a true uncertainty range. Now implemented in TruePlanning, the FACET model is easy to access and apply.2017Methods
Projecting Program Spare Parts Sustainment with Incomplete DataBryan Anderson, George BayerAnalyzing a government acquisition, the team discovered that the existing agency supply chain system lacked transparency, and historical failure data was unavailable due to implementation of a new supply chain system. The team examined how to accurately project parts demand using incomplete data by conducting a comparative analysis between data sources, using subject matter experts to estimate upper and lower bounds, and applying statistical analysis fit curves to project spare parts needs.2017Operating & Support
Sustainment Cost Data CollectionSandi EnserOperations and support costs can exceed 60% of the total Life Cycle Cost of DoD systems. Currently, the individual services have their Visibility & Management of Operating and Support Costs (VAMOSC) systems. However, with many sustainment functions performed by contractors, improved reporting on sustainment contracts is essential. This paper presents several data collection initiatives for sustainment efforts and their impacts on estimating O&S costs.2017Operating & Support
Expanding the Range of Your Data: A Small Ships Case StudyKathleen Hudgins, Robert Nehring, Elizabeth Koza, Anna IrvineWhile the Navy has comprehensive Operating and Support (O&S) data for current Navy ships, there are a limited number of smaller boats available for inclusion. With increasing technological advances and a renewed effort to reduce personnel, smaller boats are receiving increased interest. This paper explores using Coast Guard O&S data to supplement Navy data for use in estimating O&S costs for smaller boats. Topics include data sources, normalization, and comparisons of data between the services.2017Operating & Support
Army Software Maintenance Cost Estimating RelationshipsCheryl Jones, James Doswell, John McGarry, Jenna MeyersFor the past four years, the Army, under the leadership of DASA-CE, has been collecting and analyzing Army system software maintenance cost and technical execution data to support the development of more accurate cost estimation methods. The presentation will present the cost methods and cost estimation relationships developed from the analysis of the execution data sets. The results of the Army's analysis efforts, including the detailed statistical analysis will be made available.2017Operating & Support
Implementing Additive Manufacturing Technology into the Logistics Supply ChainPatrick K. Malone, Bruce FadRecent explosive growth in Additive Manufacturing (AM) or 3D printing is providing logistics supply chain economic opportunities. We investigate self-sufficient repair and maintenance capabilities for isolated environments, impacts on strategic readiness, increasing responsiveness and cost efficiencies not available in traditional supply chains. Our research will evaluate and contrast legacy logistics architectures against AM elements that will drive cost downward over lifecycles to meet current and future affordability goals within the Government and commercial organizations.2017Operating & Support
Data Driven Confidence Regions for Cost Estimating RelationshipsChristopher JarvisCost estimating models typically contain many uncertain parameters. The uncertainty of the parameter values drives the uncertainty of the model outputs. In this paper we review and compare the techniques to compute the confidence regions for linear and nonlinear regression methods. Application to two common CERs is presented with comments regarding the practical implementations and limitations within the current cost estimating environment.2017Parametrics
Moment One, PleaseChad Krause, Erik BurgessCost models constructed without statistical correlation are designed to underestimate the mean, unless they are simple sums. Any model that uses a factor or other instance of functional correlation will miss the first moment, not to mention others, unless statistical correlation among the CERs is also applied. An internal audit of 13 recent satellite cost estimates comprising 52 CERs shows the mean cost changes by as much as 9% when this correlation is applied.2017Parametrics
Modeling with Gumby: Pros and Cons of the Weibull CurveMichael Mender, Ann HawpeThe Weibull function is used to model various cost phenomena. Its popularity is driven by the significant flexibility it gains as a two-parameter function. That flexibility, while advantageous, also places estimates at risk of overfitting (i.e., a model that fits the observed data well but has little predictive power). The presentation relays theory and exhaustively reviews pros and cons for cost estimating.2017Parametrics
Maximum Likelihood Estimation for Regression of Log Normal ErrorChristian SmartThe use of Log-transformed Ordinary Least Squares (LOLS) has been criticized for the use of transformation, which results in biased estimates. LOLS is a maximum likelihood estimate (MLE) of the median when the residuals are lognormally distributed, for which we provide evidence. We discuss MLE and show how to use this technique to directly model untransformed lognormal error. We also discuss two other popular methods, ZMPE and MUPE, in the context of MLEs.2017Parametrics
Novel Manufacturing Methods ~ Characterizing the Impacts to CostZachariah Sayre, Frank CampanileAverage unit procurement costs for military aircraft are growing at a rate that far exceeds inflation. Efforts such as the Air Force Research Lab's Composite Affordability Initiative and the Advanced Composite Cargo Aircraft have demonstrated alternative manufacturing methods utilizing composites to achieve significant part and fastener count reductions. This analysis highlights the concepts of large composite unitization in fixed wing aircraft manufacturing and its resulting effects to cost.2017Parametrics
Assessing Confidence Levels in Funding and Budgeting NASA Science MissionsRobert E Bitten, Charles D. HuntNPR7120.5E requires that NASA utilize joint confidence level (JCL) analysis to set budget and funding guidelines for projects within its portfolio of missions. This paper addresses analysis that was conducted to determine the effect of different confidence levels on the performance of different cases of mission portfolios. The results show that the most effective confidence level varies depending on the case. An overview of the policy, methodology, cases and results is discussed.2017Parametrics
Assuring Credibility in the Cost EstimateHenry ApgarCredibility can be the most important attribute of a cost estimate. This paper traces the evolution of quality metrics that assess cost credibility in the words of senior government executives, industry leaders, estimating and engineering handbooks, professional journals, and government auditing manuals. The presentation concludes with recommendations for the estimating professional.2017Policy & Standards
In Theory Or In Practice? The Optimistic World of Pessimistic Cost EstimatorsErin K. BarkelIn 2015, Andy Prince won the Best Paper award for The Psychology of Cost Estimating, a piece which highlighted a number of cognitive biases which impact the work of cost estimators. Of these cognitive biases, the one that seemed to resonate most strongly with the community was optimism bias. However, not everyone is worried that the glass is half-full. Where do negativity and pessimism bias fit? How does our understanding of these biases change how we communicate with stakeholders?2017Policy & Standards
How to use Predictive Analytics for a DCAA-Compliant Estimating SystemAnthony A DeMarcoThe DCAA ensures that Defense contractors use an acceptable estimating system. They establish estimating system compliance criteria and audit contractors to determine compliance. Non-compliance can be devastating. This presentation demonstrates a data-driven estimating methodology that complies with the criteria and reduces the time and labor necessary to produce estimates. It illustrates the roles and procedures necessary for success. Six key compliance tenets are identified and key benefits quantified.2017Policy & Standards
Developing a Cost Capability RoadmapJohn FitchDeveloping cost estimating tools and databases is critical to analyst productivity and the quality of analyses. Historically, organizations struggled to systematically improve capabilities due to a lack of a long-term plan, unstable budgets, and priorities that shift with changes in leaders. This presentation walks through how one organization confronted this challenge by developing a roadmap that links the mission to requisite tools, data, and training; and a plan to achieve the roadmap.2017Policy & Standards
The Shortcomings of Cost Estimating TemplatesMeagan Gadreault, Faye KimTemplates are seen as a best practice in many industries to standardize workflow and reduce redundancy. However, templates may not be the ideal solution for cost estimating as they can suppress innovation and constrain analysts. Instead, best practices should be determined using process improvement methods and implemented to create customized estimates. This paper explores leveraging Lean Six Sigma techniques to determine alternatives to templates in order to put forth the best estimates.2017Policy & Standards
The Art of Cost: Sun-Tzu's Strategic Insight in Cost EstimationBrian A. GillespieStrategy is the process employed to determine how political purpose is translated into action. Clausewitz in his grand work "On War" argued that war serves a political purpose. So too does cost estimation, for at its most fundamental level, cost estimation attempts to determine the level of resources needed to achieve a political outcome. As one of the greatest enduring works on military strategy, what does The Art of War teach us about cost estimation?2017Policy & Standards
Coherence & Oddities: A Retrospective of Cost Estimating Publications 1978-2016Ross A. Jackson, Bradley C. BoehmkeIntellectual sedimentation can operate unobserved within professions. Assessing publication trends serves to make such conventional wisdom explicit. Our analysis of articles from the Journal of Cost Analysis and Parametrics (and its predecessors) provides insight regarding research coherence and oddities within the cost estimating community. This knowledge is essential for envisioning alternative futures and could be of benefit to those engaged in the praxis of cost estimating or its research.2017Policy & Standards
A History of Thought in Defense AcquisitionsEric M. LofgrenAs Congress debates another round of defense acquisition reform, the necessary role for the cost estimator is affirmed. But how did this role come about and what are future implications? From the famed RAND systems analyses of the 1950s to the introduction of data reporting systems still in use today, this paper will explore the rich history of thought in defense acquisition, giving a special eye to controversies and continuing challenges that affect cost estimators.2017Policy & Standards
Establishing and Implementing Cost Estimating StandardsHetal Patel, Denise Dulee, Danielle SpencerAs part of its efforts to create a world-class cost estimating capability, the Missile Defense Agency (MDA) wrote a cost estimating handbook and established an independent assessment team to ensure adherence to the handbook. As a result of this, MDA has received positive feedback from GAO in its progress towards this goal. This briefing describes in detail the processes and procedures that MDA has developed.2017Policy & Standards
Estimating Challenges and Solutions at NASA: Past, Present, & FutureCabin Samuels, Jeff BrownNASA's cost estimating landscape is fraught with unique challenges. Missions often incorporate cutting-edge technology and operate in harsh environments with no margin for error. Goddard's Cost Estimating, Modeling, & Analysis (CEMA) Office has developed robust methodologies to create realistic, transparent cost estimates for missions with little or no cost heritage. This paper discusses past and current states of cost estimation at Goddard and provides insights to shape future improvement.2017Policy & Standards
Anatomy of the Future DoD Cost EstimatorTamiko L. Ritschel, Jonathan D. RitschelThe need for a specialized, properly qualified DoD cost estimating workforce has resulted in a migration of civilian billets from a financially focused job series to the 1515 Operations Research series. We analyze the origins of this change, implications for the current workforce, and then discuss the anatomy of the future DoD cost analyst. While we focus on recent changes occurring within the U.S. Air Force workforce, the future implications are relevant to all Services.2017Policy & Standards
Masters in Cost Estimating and AnalysisDan Nussbaum, Greg MislickThis presentation provides an update on the all-distance Learning Master's Degree and/or Certificate Program in Cost Estimating and Analysis (MCEA / CCEA) offered at the Naval Postgraduate School (NPS).2017Policy & Standards
Contract Geometry Best Practices for Incentive ContractingPeter J. BraxtonWith the current emphasis on Incentive Contracting, the cost analyst plays a vital role in establishing target cost, fee, and shareline. Drawing from the Joint Contract Price Database, this paper examines actual contract geometries across a wide range of programs and their effectiveness in encouraging cost management. Using the published Risk-Based ROS methodology, it provides a framework for implementing incentive arrangements for both competitive and negotiated procurements.2017Risk
A 'Common Risk Factor' Method to Estimate Correlations Between DistributionsMarc GreenbergA 'common risk factor' method uses expert-derived risk factors to estimate correlation between two distributions. The premise is to estimate mutual information among risk factors that underlie each distribution. Approximation of such mutual information leverages properties of the joint probability distribution of a unit square. Geometric outputs of each pair of common random variables are compared to estimate common risk factor 'intersections' that are, in turn, proxies of correlation.2017Risk
The End of S-Curve Alchemy: Gold from a New SAR DatabaseTodd Andrews, Jeffrey Pincus, Brian FlynnS-curves typically address only "known unknowns." They're built upon an analysis of cost risk and uncertainty, element by element. But they fail to address the ambiguity of the curve itself, or the degree of confidence that its proffered probabilities are actually correct. This confidence factor, of course, is hard to measure. The "unknown unknowns," in other words, are not explicitly captured since their probabilities are a mystery. This research fills the gap in knowledge.2017Risk
Being Certain About Uncertainty, Part 1Andy PrinceDoing cost risk analysis is hard because we don't really know what a good cost risk analysis looks like. In this paper we will explore the challenges to doing good cost risk analysis and discuss ways to know if your cost risk analysis is any good. We will also examine the phenomena of extreme cost growth and lay the groundwork for future work.2017Risk
Integrated Cost-Schedule Risk Analysis Improves Cost Contingency CalculationDavid T. HulettThe main benefits of integrated cost-schedule risk analysis are improvement of the estimates of cost contingency and identification of the main risks to cost for mitigation purposes. The main focus will be on estimating the cost contingency needed and identifying risks to cost, which may be independent of schedule or indirectly due to schedule risk. New simulation software including iterative risk prioritization will be used to illustrate these points.2017Risk
Using Quantum Theory and Monte Carlo Multiverse to Manage RiskNathan EskueNeed to better predict your EAC? Tired of being surprised? Great! Not an expert on quantum theory or the multiverse? No problem! This hands-on session will give a high level view of how quantum theory can help us understand our EAC by viewing it as a multiverse. We'll then explore an Excel model that takes your EAC and uses Monte Carlo analysis to reveal all possible futures. Attendees get a copy of the model!2017Risk
Software Projects Estimation & ControlAlain AbranThis talk presents 1) an outline of the design of the '2nd generation' COSMIC Function Point method for measuring a functional size of software; 2) Industry evidence of its versatility & value in software project benchmarking & estimating in business, real-time, infrastructure, component, mobile, IoT, cloud software, via traditional & agile methods; 3) how it can be used for early and rapid sizing at estimation time; 4) how size measurement can be automated with very high accuracy.2017Software
Bottom Up Methods of Estimating Software SE/PM and Non-DCTI CostsJames BlackSystems Engineering/Program Management (SE/PM) and additional non-Design, Code, Test, and Integration (Non-DCTI) activities performed during software development efforts are often significant and drive estimates of total project costs. Yet, cost estimates often omit the detailed research and analysis needed to adequately model SE/PM & Non-DCTI costs. This brief will present bottom up methods useful for understanding and estimating these costs and share analysis of recent SE/PM & Non-DCTI data.2017Software
Reliable Non-Design, Code, Test, and Integration Cost RelationshipsJeremy Goucher, Brittany StaleySoftware cost estimates require ratios derived from historic cost reports for non-design, code, test, and integration (NDCTI) cost elements. Since NDCTI accounts for as much as 50% of the estimate, a comprehensive historical data set is critical to ensuring an accurate estimate. The authors have recently analyzed over ten years of actual cost data from DoD command and control systems to develop a new set of NDCTI ratios. The results also bring new insight into "fixed" versus "variable" cost.2017Software
Objective SLOC: An Alternative Method to Sizing Software Development EffortsAndrew KicinskiEquivalent Source Lines of Code (ESLOC) is the basis of methodology used by many organizations for collecting and estimating software development costs. Selecting ESLOC parameters requires insight into the software reuse. Too often data collectors are unable to verify the appropriateness of the assigned ESLOC parameters and validate their implementation. This paper examines the drawbacks of ESLOC, and presents an alternative and more objective method to estimating software development effort.2017Software
Software Effort Estimation Models for Contract Cost Proposal EvaluationWilson Rosa, Corinne WallsheinThis study will introduce regression models and benchmarks for predicting software development effort using input variables typically available at contract bidding phase. The analysis is based on 200 DoD projects delivered from 2004 to 2016. The first set predicts effort using initial software requirements along with peak staff, application domain, and other categorical variables. The second set predicts effort specifically for agile software development using data from 16 agile projects.2017Software
Software Size GrowthMarc Russo, Wilson RosaSoftware cost estimating relationships often rely on software size growth percentages. Actual delivered source lines of code (SLOC) may be predicted with categories of early code estimates such as new, modified, reuse, and auto-generated SLOC. Uncertainty distributions will be presented to represent growth by code category for use in cost modeling.2017Software
Analysis of Software Cost Estimation Models Using Normalized, Stratified DataDan StricklandSoftware Resources Data Reports (SRDR) are the DoD's effort to improve cost estimation efforts by collecting metrics on software programs. Existing SRDRs were normalized in a manner consistent with previous efforts and productivities were analyzed using three popular software cost estimation models (SLIM, SEER, and COCOMO). Calculated values were tested against model thresholds and the results were tested for accuracy. Findings indicate that some models outperform others and that schedule impacts the results.2017Software
The Journey from "Bottom-up" to Predictive Modelling BOELori SaleskiPredictive cost modeling has the potential to save significant money over conventional bottom-up bidding, with better accuracy. This paper outlines a process to implement predictive cost analytics in an organization that has been dependent on bottom-up estimating for decades. Walk with us down the path discussing implementation, change management, adoption and stakeholder challenges. Explore architecting the solution for maximum leverage and ROI.2017Software
Rapid Cost Estimation for Storms Recovery Using Geographic Information SystemsRolando A. Berraos-Montero, Steven M. F. Stuban, Jason DeverThe present study introduces a new approach to estimate the recovery costs of public property in the aftermath of a storm, by integrating geographic information systems. Estimating recovery costs for a disaster is a current concern for emergency responders. This work focuses on applying economic indicators, population, and storm event tracking to geographic information systems for rapidly estimating recovery costs. 2016Journal of Cost Analysis and Parametrics
Review of Quantitative Methods for Designing Availability-Based ContractsAmir Reza Kashani Pour, Peter Sandborn, Qingbin CuiUnderstanding the total life-cycle cost is an essential part of all sustainment contracts. Sustainment constitutes 70% or more of the total life-cycle cost of safety-, mission-, and infrastructure-critical systems. For many types of systems, availability is the most critical factor in determining the total life-cycle cost of the system. To address this, availability-based contracts have been introduced in the governmental and non-governmental acquisitions space (e.g., energy, defense, transportation, and healthcare). 2016Journal of Cost Analysis and Parametrics
Tooth-to-Tail Impact Analysis: Combining Econometric Modeling and Bayesian Networks to Assess Support Cost Consequences Due to Changes in Force StructureBradley C. Boehmke, Alan W. Johnson, Edward D. White, Jeffery D. Weir, Mark A. GallagherCurrent constraints in the fiscal environment are forcing the Air Force, and its sister services, to assess force reduction considerations. With significant force reduction comes the need to model and assess the potential impact that these changes may have on support resources. Previous research has remained heavily focused on a ratio approach for linking the tooth and tail ends of the Air Force cost spectrum and, although recent research has augmented this literature stream by providing more statistical rigor behind tooth-to-tail relationships, an adequate decision support tool has yet to be explored to aid decision-makers. 2016Journal of Cost Analysis and Parametrics
Forecasting the Unit Price of Water and Wastewater Pipelines Capital Works and Estimating Contractors MarkupRizwan Younis, Rashid Rehan, Andre J. A. Unger, Soonyoung Yu, Mark A. KnightMunicipalities and water utilities need to make realistic estimates for the replacement of their aged water and wastewater pipelines. The two main objectives of this article are to present a method to forecast the unit price of water and wastewater pipelines capital works by investigating inflation in their construction price, and to quantify the markup that contractors add to bid a project price. The Geometric Brownian Motion model with drift is used for investigation.2016Journal of Cost Analysis and Parametrics
Generalized Degrees of FreedomShu-Ping HuTwo popular regression methods for the multiplicative-error model are the Minimum-Unbiased-Percent Error and Minimum-Percentage Error under the Zero-Percentage Bias methods. The Minimum-Unbiased-Percent Error method, an Iteratively Reweighted Least Squares regression, does not use any constraints, while the Minimum-Percentage Error under the Zero-Percentage Bias method requires a constraint as part of the curve-fitting process. 2016Journal of Cost Analysis and Parametrics
Development of Unit Cost Indices and Database for Water and Wastewater Pipelines Capital WorksRashid Rehan, Rizwan Younis, Andre J. A. Unger, Brendan Shapton, Filip Budimir, Mark A. KnightThe objective of this work is to develop a unit cost database and index for water and wastewater pipelines capital works, and to estimate inflation in their construction cost. This was accomplished by analyzing tender summaries and progress certificates from the cities of Niagara Falls and Waterloo, Ontario, Canada, that span the period from 1980 to 2008, as well as using data from the RS Means construction cost database. 2016Journal of Cost Analysis and Parametrics
Using Pre-Milestone B Data to Predict Schedule Duration for Defense Acquisition ProgramsChristopher A. Jimenez, Edward D. White, Gregory E. Brown, Jonathan D. Ritschel, Brandon M. Lucas, Michael J. SeibelAccurately predicting a realistic schedule for a defense acquisition program is a difficult challenge considering the inherent risk and uncertainties present in the early stages of a program. Through the application of multiple regression modeling, we provide the program manager with a statistical model that predicts schedule duration from official program initiation, which occurs at Milestone B, to the initial operational capability of the program's deliverable system. 2016Journal of Cost Analysis and Parametrics
Dynamics of New Building Construction Costs: Implications for Forecasting Escalation AllowancesMichael T. Dugan, Bradley T. Ewing, Mark A. ThompsonConstruction projects often require multiple years to complete and the costs of supplies, materials, and labor may increase substantially during a project's time span. As a result, construction contracts often include an escalation clause to account for cost increases. This article examines the time-series properties of new building construction costs using several producer price indexes. 2016Journal of Cost Analysis and Parametrics
Balancing Expert Opinion and Historical Data: The Case of Baseball UmpiresRicardo ValerdiMany decisions benefit from situations where there exist both ample expert opinion and historical data. In cost modeling these may include the costs of software development, the learning curve rates for specific manufacturing tasks, and the unit rate costs of operating certain products. When making forecasts we are often faced with the decision to base our estimates on either expert opinion or historical data. 2016Journal of Cost Analysis and Parametrics
Using Robust Statistical Methodology to Evaluate the Cost Performance of Project Delivery Systems: A Case Study of Horizontal ConstructionDares Charoenphol, Steven M. F. Stuban, Jason R. DeverThe objective of this study is to demonstrate the application of the bootstrapping M-estimator (a robust analysis of variance [ANOVA]) to test the null hypothesis of means equality among the cost performance of the three project delivery systems (PDS). A statistical planned contrast methodology is utilized after the robust ANOVA analysis to further determine where the differences of the means lie.2016Journal of Cost Analysis and Parametrics
Multiproduct Cost-Volume-Profit Model: A Resource Reallocation Approach for Decision MakingGabriel Soares Zica Bergo, Bruna Hoffmeister Lucas, Vinicius Amorim Sobreiro, Marcelo Seido NaganoThis work addresses the problem of reallocating productive resources to maximize profit. Most contributions to the topic focus on developing or improving the Cost-Volume-Profit model to obtain solutions that provide an ideal mix of products before the data is given. In particular, some algorithms are available for the problem, such as the ones proposed by Kakumanu and Shao and Feng. 2016Journal of Cost Analysis and Parametrics
NASA Commercial Crew Cost Estimating - A Look at Estimating Processes, Challenges and Lessons LearnedLance Cole, Rick BattleTo support annual PPBE budgets and NASA HQ requests for cost information for commercial crew transportation to the International Space Station (ISS), the NASA ISS ACES team developed system development and per flight cost estimates for the potential providers for each annual PPBE submit from 2009-2014. This paper describes the cost estimating processes used, challenges, and lessons learned in developing estimates for this key NASA project, which departed from the traditional procurement approach and used a new way of doing business.2016Government Processes
Health CERs - Using an interdisciplinary approach to estimating the cost of caring for Canada's veteransErin Barkel, Carleigh MalanikReports from the Veterans and the National Defence and Canadian Forces Ombudsmen have brought attention to the prevalence of mental illness among veterans. However, there are no published estimates for the cost of treating this growing population. A key obstacle to this research is that the relevant data is dispersed across several government organizations. We fill this research gap by leveraging actuarial and epidemiological studies of the veteran population to create new cost estimating relationships.2016Government Processes
Federal Shared Services: The future Shared-First and challenges facing the Federal Shared Services Providers (FSSPs) and their customersRuth Dorr, Marilyn Fleming, Sarah PopeThis paper will discuss the progress being made in the "Shared-First" approach for sharing of administrative services. Federal Agencies are being asked to execute their missions to ever increasing standards in a resource constrained environment. Administrative shared services have been identified as a way to reduce redundancy and improve efficiency in Federal business solutions. The identification of shared services as a Presidential cross-agency priority along with the designation of several providers for administrative shared services are recent developments in the movement toward shared services within the Federal government.2016Government Processes
Lessons learned from applying cost capability curves on an Air Force AoARobert GeorgiNew direction from Air Force leadership established a requirement for the presentation of life cycle cost versus capability tradeoff analysis for all AoA final reports. This presentation examines the approach that was used for a recent AoA that was a pilot program for the Cost Capability Curve process. When appropriately implemented, cost capability analysis can provide significant advantages. There are lessons learned from attempting to implement this approach. We will also discuss examples where this approach does not apply.2016Government Processes
Mining for Cost Estimating Relations from Limited Complex DataMark Jacobs, Shawn HayesNASA's robotic Earth and space science missions cover a diverse group of projects addressing complex science objectives that include challenging implementation approaches. Progress applying Principal Component Analysis techniques covering project management, systems engineering, mission assurance, integration & test, and spacecraft subsystems is described. Supporting data analysis efforts include a large detailed set of technical and programmatic input candidates that are analyzed to identify the primary spacecraft cost drivers.2016Government Processes
Manufacturing Assembly Plan (MAP) Tool: Bridging the Gap between Performance and the Construction ProcessRichard Lee, Haitham Ghannam, Edward WalshThis presentation explains the four module build plan for the VIRGINIA Class Submarine (VCS) program and provides an in-depth look into labor hours by Section 1/2A, Section 2B/5, Section 6/7, Section 8/9, Final Assembly and Test (FAT), and Post Shakedown Availability (PSA). Further understanding these concepts will not only allow cost estimators and program managers to better understand the current construction build strategy, but it will also aid in better aligning cost estimates with reality at the deck plate.2016Government Processes
It Ain't Easy Being Green... Sustainable Manufacturing with an eye on Cost Avoidance and StewardshipKammy Mann, Jessica BoatwrightThe DoD, one of the largest energy consumers in the world, has committed to sustainability. An important element of sustainable weapon systems acquisition currently not under the purview of the DoD's strategic plan is sustainable manufacturing. This paper will leverage current research and industry best practices to identify the impact of sustainable manufacturing on cost avoidances relating to human health and environmental benefits. Recommendations will also be provided for complying with the FAR sustainability requirements.2016Government Processes
Business Case Analysis WizardLauren Nolte, Cassie Capots, Marcie Pfeuffer This paper describes the purpose and design of a "Business Case Analysis (BCA) Wizard" that was created for the Air Mobility Command (AMC) Enterprise Learning Office (ELO). As the value of BCAs within the learning transformation at AMC has grown, ELO identified the need to develop a tool that allows "non-cost analysts" to develop high-quality BCAs. This "BCA Wizard" guides users through a standard process of conducting a BCA via a user-friendly interface.2016Government Processes
Masters in Cost Estimating and AnalysisDan Nussbaum, Greg MislickThis presentation provides an update on the all-Distance Learning Masters Degree and/or Certificate Program in Cost Estimating and Analysis (MCEA / CCEA) offered at the Naval Postgraduate School (NPS).2016Government Processes
Methods Used in Pricing and Conveying NASA Assets for Use by Commercial ProgramsJames Roberts, Torrance LambingDifferent agreement types, including EULs, Use Permits, Land Leases and others, are used to convey NASA assets no longer needed by programs such as the Space Shuttle and construction of the Space Station. By law, some situations prohibit NASA from pricing facilities in competition with the local market. Costs are estimated either via market-based or actual cost methods, according to circumstances. This paper will explain and clarify how these costs are estimated and pricing is established.2016Government Processes
ATC Zero: Estimating the True Impact of the Chicago Control Center FireKellie Scarbrough (Wutzke)For seventeen days, the Chicago Air Route Traffic Control Center went offline after a contractor set fire to the intricate communications network that controls some of the busiest airspace in the country. Workers scrambled around the clock to restore functionality to the center. This paper seeks to determine the true economic impact of the event, including the costs of time, operations, and environmental impacts.2016Government Processes
Where Does Your Cost Estimate Go?Emily Stafford, Christopher MetzYou have spent several months developing your cost estimate and presenting it to the customer. Now what? This paper will track what happens to your cost estimate after it has been delivered to the customer. It will follow the estimate through the budget formulation process, ultimately becoming a part of the President's Budget. It will also discuss the cost estimator's role during the budget process, and the influence of the budget process on future estimates.2016Government Processes
Facilitating Predictive Cost Analytics via Modelling V&VJohn SwarenWe are often asked to "fine tune" estimating software to unique situations. That "calibration" effort is key to Verification & Validation of parametric models. In this presentation, we show an example of using Predictive Analytics to develop a method to predict a Software-estimating model's inputs as a function of Key Performance Parameters (KPPs). We demonstrate how a calibrated knowledgebase can best be used to V&V an estimating approach reflective of past and future requirement metrics.2016Government Processes
Hyperbolic Discounting and Defense AcquisitionsEric LofgrenCost estimation has largely been approached as an engineering problem which can be solved using data and techniques. While these are undoubtedly invaluable considerations, the broader production process is an inherently open-ended system subject to human interactions which are ill-defined and non-deterministic. This paper explores how behavior and institutions affect cost outcomes through the avenue of hyperbolic discounting, a time-inconsistent model of personal valuations. It argues that evidence for hyperbolic discounting in both government agents and contractor management has negative effects on cost estimates, and it proffers some solutions based on commitment schemes.2016Government Processes
Analysis of Alternatives from a Cost Estimating PerspectiveKirby Hom, Kenny RaglandSubsequent to the Weapon Systems Acquisition Reform Act of 2009, there have been other significant DoD policy changes that require Service sponsors to conduct an Analysis of Alternatives (AoA) prior to Milestone A. "About three-quarters of a program's total life cycle cost is influenced by decisions made before it is approved to start development" [GAO-09-665]. From a cost estimating perspective, a successful AoA rests with the study approach, overall costing methodologies, key insights from trades analyses, and lessons learned.2016Government Processes
The Navy Modernization Program: Estimating the Cost of Upgrading AEGIS Guided Missile CruisersJeremy GoucherThe Navy is well into its 20 year, $16B (CY10$) plan to modernize 84 AEGIS warships. This cost estimate covers eleven of the planned ships and accounts for maintenance and upgrades to the ship and the combat systems. The estimate leverages recent contract and shipyard performance data and interviews with engineers, resulting in a detailed study and recommended cost savings initiatives. The methods and data in this estimate will assist ship modernization cost efforts across the fleet for the foreseeable future.2016Government Processes
Cost Estimating Challenges in Additive ManufacturingJoe Bauer, Patrick MaloneAdditive manufacturing possibilities are virtually endless: from rapid prototyping, to printing food, to recreating human tissue. This presentation highlights the challenges of estimating 3D printing processes and offers some solutions to the estimator. Challenges include: amortizing non-recurring tooling across multiple programs, capturing software-centric nature of 3D printing, printing speed, and printing media limitations. This presentation highlights the use of existing cost estimating tools and areas for future development.2016Methods & Models
Long Term Impact of Ship Concepts on Operating and Support Cost AffordabilityEric Buller, Stuart Merrill, Bryan MillerAchieving affordability can be challenging especially once a program enters full rate production. Developing accurate Operating and Support (O&S) cost estimates that reveal tradeoff impacts and sustainment risks provides decision makers with affordability analysis during key acquisition events. We intend to demonstrate through analysis the impact of legacy and new ship concepts on O&S costs and program affordability. We will examine the cost tradeoffs for each ship concept in each O&S cost element.2016Methods & Models
The Peculiar World of Sole Source Contracting; Maybe There is a Better WaySandy Burney, Shawn LarsonIn 2014, 21% of U.S. Government contracts were sole source, accounting for 31% of total contracted value. They typically require negotiation and certification of all cost estimates, which can take up to 18 months. This light-hearted presentation will explore the peculiar world of developing and negotiating sole source estimates, including differences from budgetary and competitive estimating, as well as offer suggestions for achieving fair and reasonable sole source prices for less time and money.2016Methods & Models
Migration of Microsoft Excel Tools to Next Generation Platforms: "Can You Hear the Footsteps"Jeremy EdenAnalysis tools often remain deployed for extended periods, even as platform migrations accelerate to keep up with rapidly changing technology and end-user demands. This paper provides specific examples of platform migration challenges, resulting failures, and how they can be best addressed. It will also provide specific best practices for ICEAA member take-away that can be followed to help ensure smoother transitions across platforms.2016Methods & Models
Generalized Degrees of Freedom (GDF)Shu-Ping HuMinimum-Percentage Error/Zero-Percentage Bias (ZMPE) method is commonly used for multiplicative-error models. But ZMPE users do not adjust degrees of freedom (DF) for constraints included in the regression process. This generates misleading ZMPE CER fit statistics and underestimates the CER error distribution variance. Hence, ZMPE results are incomparable with those derived without constraints. This paper details why DF should be adjusted and proposes a Generalized Degrees of Freedom measure to compute fit statistics for constraint-driven CERs.2016Methods & Models
Bottom Up Methods of Estimating Depot Level RepairablesTim Lawless, James Black, Dave GoldbergDepot Level Repairables (DLR) costs comprise a significant share of total Operations and Support costs as they capture the scope of repairing and/or replacing failed hardware items. This paper will provide detailed descriptions of bottom up methods to estimate DLR costs using programmatic technical baseline documentation and acquisition costs as well as supply system requisitions and repair price data sources. Emphasis will be placed on producing data-driven analysis to inform cost estimates and estimate customers.2016Methods & Models
Seven Degrees of Separation: The Importance of High-Quality Contractor Data in Cost EstimatingCrickett PettyThe popular notion that any two people are linked to one another on average via a chain with "six degrees of separation" is based on a relatively small sample. Using massive data sets, researchers have since discovered that the average degrees of separation is closer to seven. This highlights the need for complete and thorough data in any analysis, and cost estimating in particular. Despite its importance the collection of high-quality contractor cost data is challenging. Processes for meeting these challenges are presented.2016Methods & Models
Interviewing Subject Matter ExpertsMelissa TeicherAs estimators, we often have to obtain technical information from Subject Matter Experts (SMEs) in order to provide thorough and accurate analysis. But sometimes, getting the information you need is easier said than done. This paper discusses the steps for interviewing SMEs in order to make the most of their time and your time. It explores the benefits of building a cooperative relationship of mutual respect and the "dos" and "don'ts" of SME consultations.2016Methods & Models
The Impacts of Design Change on Reliability, Maintainability, and Life Cycle CostAndreas Viberg, Oskar Tengo When making decisions about a design change to a technical system, it is essential to understand how it affects reliability and maintainability, as that directly impacts both overall performance and cost of operations and maintenance throughout the life cycle. LCC analysis based on modeling and simulation of relevant scenarios is an effective way to gain that understanding. This paper presents an approach for such analysis and illustrates its successful application in a recent project.2016Methods & Models
A Prime Contractor's View Using JCL Tools and TechniquesChip WoodsFor several years now, Lockheed Martin has been integrating cost and schedule data into a single model to perform integrated analysis. This paper will discuss Lockheed Martin's JCL process, the tools used to develop JCL curves, and how this analysis can be used to shape and steer the program, including modeling funding constraints.2016Methods & Models
Class List of Characteristics: Changing the Landscape of Estimating by Removing BiasPaul BreonCLOC (Class List of Characteristics) is a coding system wherein each digit of the code indicates one of the cost drivers for the product/activity. CLOC gives the ability to "call a thing by what it really is," creating a universal language that can be utilized for any purpose. CLOC does not include biasing factors such as quantity and cost; these are used in conjunction with the code to perform any kind of analysis needed.2016Methods & Models
Achieving Affordability through high fidelity ROM cost estimatesEric BullerAchieving affordability can be challenging, especially if early decisions are made without proper cost estimates. Developing high fidelity ROM cost estimates and identifying capability tradeoffs during the requirements generation phase can be critical to the success of achieving long term affordability. We intend to demonstrate through analysis that performing high fidelity ROM cost estimates with capability tradeoffs allows decision makers to make informed choices on mission requirements that achieve affordability for future programs.2016Methods & Models
Building an Integrated Aerospace Electronics Development Parametric Model from the Ground UpDavid BloomThis paper presents the results of a multiple year effort to create a business specific set of integrated parametric cost models for complex electronics hardware development in such a way that all disciplines and activities that participate in that development process have a piece of the model and a stake in the data collection that drives the model. This "ground-up" approach provides the added benefit of cost allocation guidance once the development is approved.2016Parametrics
Using Historical Cost Reports to Derive System Specific FactorsJocelyn Tague, William BanksNon-Recurring Engineering (NRE) is often a cost driver for new, complex systems and must be given high attention during Milestone B cost estimating. Our team developed an approach for developing system-specific Cost Estimating Relationships (CERs) using actual costs from combat weapon systems with shared capabilities, resulting in a credible, defensible, yet flexible approach that can be replicated for a wide range of estimates.2016Parametrics
Improving Parametric Software Cost Estimation Through Better Data CollectionBob KouryIn 2010 PRICE Systems LLC, in conjunction with PEO STRI, agreed to improve the data being collected to support software development cost estimation in the command. The paper will compare and contrast the experiences of two other collection efforts with those of the original effort. The paper will identify core findings which may be used universally to improve the quality, quantity, and applicability of data being collected.2016Parametrics
A Regression Method for Sparsely-Populated DatasetsDaniel BarkmeyerA method is presented for including sparsely-populated independent variables in a regression analysis without introducing subjectivity or statistical bias. The method is shown to reproduce the result of a traditional regression method for a fully populated dataset, and to converge to the same result on a sparsely-populated dataset as that dataset's sparseness decreases. Application of the method to develop a new, empirically-derived model for spacecraft testing schedule duration will also be presented.2016Parametrics
Improvement Curves: An Early Production MethodologyBrent JohnstoneLearning slope selection is a critical parameter in manufacturing labor estimates. Incorrect ex ante predictions lead to over- or understatements of projected hours. Existing literature provides little guidance on ex ante selection, particularly when some actual cost data exists but the program is well short of maturity. A methodology is offered using engineered labor standards and legacy performance to establish a basic learning curve slope to which early performance asymptotically recovers over time.2016Parametrics
The NRO CER Analysis ToolDonald MacKenzieThe NRO Cost and Acquisition Assessment Group (CAAG) uses its CER Analysis Tool, CERAT, to inform CER development. CERAT analyzes CERs for sensitivity to influential data points (IDPs). The impacts on estimated costs for a target data point when data points are removed are the primary outputs. The changes in recorded CER constants when data points are removed also help in assessing overall CER "stability." CERAT also provides several other aids to CER developers.2016Parametrics
A Parameter Driven WorldJeffrey PlattenParametric estimating is gaining acceptance. DFARS requires DOD contractors to maintain an estimating system. Parameters are the characteristics, features, or measurable factors that define a product or program. What makes a good parameter? What are the pros and cons of parametric cost estimating? How do you develop and maintain a parametric estimating database/model? What does the future hold, beyond just estimating? How do you gain parametric control over touch labor, support labor, and parts costs to improve affordability?2016Parametrics
Generating a Semi-Automated "Similar To" Basis Of Estimate from a Complex Parametric Hardware Cost Model for AntennasDanny Polidi, David BloomThis paper discusses the development of a "Similar-To" Basis Of Estimate (BOE) generation tool used in conjunction with a Complex Parametric Antenna Cost Model. Starting with the parametric model based on quantifiable sizing mechanisms which are designed to quickly and accurately calculate the "top-down" cost for all engineering and operations disciplines and functions required for antenna development and test, cost is generated, "similar to" programs are identified, and a BOE can be auto-populated.2016Parametrics
Macro-parametrics and the applications of multi-colinearity and Bayesian to enhance early cost modellingDale ShermonThis paper will consider the spectrum of parametric cost models from cost estimating relationships (CER) to micro and finally macro-parametric models. This will lead to a description of the Family of Advanced Cost Estimating Tools (FACET) parametric suite of models and their top-down capability to estimate costs incorporating the application of multicollinearity and Bayesian techniques.2016Parametrics
A CER Development Process for Spares EstimatingCheryl WilsonThis presentation shows the direction of a group of Spare Parts Estimators and how we are advancing our statistical skills and CER development processes. It shares the progress in our CER development journey attained through education, research, analysis and teamwork.2016Parametrics
Essential Views of the Integrated Program Management Reports (IPMRs)Thomas Coonce, Glen AllemanDuring this session, the authors will describe key views of the IPMRs and other data that will allow a government stakeholder to a) understand the cost, schedule and technical status on applicable contracts, b) investigate the nexus of performance problems, and c) observe likely future problems. The presentation represents a "synthesis" of existing traditional earned value management metrics and other metrics designed to provide an integrative picture to assist in keeping the program "green".2016Program Management
Meeting Fiscal Constraints: the Evolving Role of Performance MeasurementSteve Green, Kevin Davis, Kurt Heppard Performance measurement tools can reflect the strategic intent of management and help them prioritize initiatives to meet strategic objectives. In fiscally austere times, the Balanced Scorecard can help organizations address these tough decisions. Applying concepts from the Balanced Scorecard to cost analysis and budgeting, this paper develops an integrated aligned system of strategic goals, performance metrics, and cost parameters at a university facing pressure to manage costs. There may be lessons learned for the DoD.2016Program Management
The graph is always greener on the other side: examples of how to interpret and fix bad visual representations of dataStephen KetchamIn my role performing independent reviews of estimates, I've seen a quality spectrum of different types of graphs, charts, and forecasts. These visuals summarize key points and remain in decision-makers' minds, often providing the convincing point to win funding approval for the project. Unfortunately, they can also be poorly crafted, employ scare tactics, or even frame the data in a misleading way. This presentation provides bad examples of visuals, and discusses possible improvements.2016Program Management
Seeing What We Need to See vs. What We Hope to See: Visualizing Integrated Cost and Schedule Data for Earned Value AnalysisBrian LeachWhen applied properly, data visualization can serve analysis by showing us what is otherwise difficult to see (and not always what we hope to see). With the emergence of the UN/CEFACT XML standard, integrated cost and schedule data can be shown in exciting new ways. This paper will examine and explore innovative new ways to show integrated cost and schedule data which have practical applications in earned value management system analysis.2016Program Management
Different Incentives in Government ContractingJennifer LeottaThis paper shows that stock prices, rather than profit sharing contracts, are used as the primary incentive for executives of publicly traded government contractors. Looking at the reaction of companies' stock prices to public and private sector news, this paper shows that stock prices do not capture "bad news" from both public and private sector customers compared to overall market indicators, and so would make poor contract incentives. Finally, alternate methods of contract incentives will be explored.2016Program Management
Agile Project Management Controls: Issues with EVM and a Viable AlternativeOmar Mahmoud, Blaze SmallwoodAs more government software programs adopt agile practices, program managers are trying to find effective ways to monitor and control cost and schedule. The challenge is that traditional project control techniques, like EVM, are not well suited to deal with the requirements fluidity and new terms/metrics that come with agile projects. This paper will describe the inherent issues with using EVM on agile projects and offer a recommendation for a more effective solution.2016Program Management
The Psychology of Cost EstimatingAndy PrinceResearch into human psychology has yielded amazing findings into how we process information and how we use information to make decisions. Cost estimators can use psychology and behavioral economics to improve not only our cost estimates but how we communicate and work with our customers. By understanding how our customers think, and more importantly, why they think the way they do, we can have more productive relationships and greater influence. The end result will be better decisions by the decision makers.2016Program Management
Cost ManagementGeorges TeologlouCost Management plays a key role in project management and is a growing management trend given current worldwide financial and economic constraints. The Cost Management solution was built in line with the recommendations of the GAO (US Government Accountability Office), which stated that a large number of Defense projects are at risk of experiencing cost overruns and schedule delays. GAO found that the two major causes of project failure were, first, an inaccurate initial estimate and, second, poor subsequent cost control.2016Program Management
Using Total Lifecycle Cost as the Basis for Proposal Cost EvaluationRyan TimmThis paper discusses benefits and obstacles of using the TLCC of an entire system instead of the bid price for the purpose of proposal cost evaluation. An example acquisition of a space vehicle (SV) ground system explains the impact to TLCC of portions of the ground system developed by the SV manufacturer, SV operations, external software interfaces, government furnished IT infrastructure, and legacy sustainment until transition. Using TLCC allows higher-level requirements and enables innovative architectures.2016Program Management
A Maturity Analysis of the American and European Cost CommunityMark GilmourThe Cost Engineering Health Check (CEHC) is a cost capability benchmarking tool used to identify strengths and weaknesses within organizations' costing functions. In 2014 the authors of this paper conducted a CEHC at the ICEAA conference in Denver. That workshop used e-voting to anonymously gather the views of the audience in attendance. This paper presents a summary of these results and contrasts them against data gathered from European workshops, providing a snapshot of the international costing capability.2016Program Management
Presenting EstimatesMelissa TeicherDecision-makers rely on cost estimators to guide them in making the most educated judgments on what is best for the program or organization. A thorough and well-organized presentation can make the difference between an easy decision and a difficult one. This paper discusses how public speaking skills contribute to an effective estimate presentation. It also delivers tips and tricks on how to make an estimate presentation more visually pleasing and easier to understand.2016Program Management
Using Stochastic Optimization to Improve Risk PrioritizationEric DrukerMinimizing the cost of complex programs is critical for government agencies trying to meet their missions in today's fiscal environment. Identifying cost-savings measures is currently a slow, manual procedure in which an analyst must make an educated guess as to which risks to mitigate. This paper will outline a methodology for using the emerging field of stochastic optimization to automate the identification of cost-savings measures on complex programs.2016Risk
Why Are Estimates Always Wrong: Estimation Bias and Strategic MisestimationDan GalorathThe biggest source of estimation error usually comes from people, either by accident or strategically. This is a disaster because viable estimates are core to project success as well as ROI determination and other decision making. Most people don't know how to estimate. Those who do estimate are often optimistic, full of unintentional bias, and sometimes strategically mis-estimating. This paper discusses the issues of estimation bias and strategic mis-estimation as well as how to mitigate these issues.2016Risk
A "Common Risk Factor" Method to Estimate Correlations between DistributionsMarc GreenbergA "common risk factor" method uses expert-derived risk factors to estimate correlation between two distributions. The premise is to estimate mutual information among risk factors that underlie each distribution. Approximation of such mutual information leverages properties of the joint probability distribution of a unit square. Geometric outputs of each pair of common random variables are compared, leading to an estimated 'intersection' of each pair of common risk factors. This intersection is then assumed to be a proxy for correlation.2016Risk
Data-Driven Guidelines for Correlation of Cost and Schedule GrowthSidi Huang, Mark Pedigo, Chris ShawWhen running Monte Carlo simulations, one of the most difficult aspects to account for is correlation between the independent variables. Ignoring correlation generally yields overly optimistic analysis results; industry standard practice suggests that when no additional insight is available, a default value of 0.2 can be used. This study analyzes historical NASA program data to develop data-driven guidelines for correlation. Moving forward, this study can be replicated for other industries and organizations with critical schedules.2016Risk
Risk Mitigation: Some Good News after the Cost / Schedule Risk Analysis ResultsDavid HulettThe results of your cost-schedule risk analysis say you are likely to overrun both schedule and budget. Are you out of luck? No. Do not forget the risk mitigation phase. Risk mitigation typically proceeds from a prioritized risk list developed from analysis of the risk and uncertainty-loaded schedule. Complete mitigation of risks is not usually feasible, and uncertainty is usually deemed to be irreducible. Some good news comes from the risk mitigation actions.2016Risk
A Mathematical Approach for Cost and Schedule Risk AttributionFred KuoThe level of sophistication in cost estimating has progressed substantially in the last decade. This is especially true at NASA where, spearheaded by the Cost Analysis Division (CAD) of NASA Headquarters, cost and schedule risks in terms of confidence level have been codified. In this paper, the author employs the portfolio approach in calculating risk attribution, first applying it to a portfolio of cost risks and then extending the same concept to schedule risk analysis.2016Risk
Triangular Distributions and CorrelationsJennifer LampeTriangular distributions are often used in estimating cost risk because the math is relatively simple. The concept of an estimate having a lower limit, an upper limit, and a mode is easy to understand. We show the math behind triangular distributions and correlations. What does it mean to say that two risks are 30% correlated? How do risk analysis programs handle triangular distributions and correlations? How can you use random numbers in Excel to generate a Monte Carlo simulation for any defined triangular distribution?2016Risk
The Impact of Selected Assumptions and Core Tenets on Schedule Risk Assessment Results (A Progressive Model Comparison)James QuilliamIn the quest to ensure the sound representation of schedule risk assessment (SRA) simulations, this case study will provide a progressive model comparison of SRA assumptions and core tenets. The elements of this approach focus on: methodology and tools; the progressive assumptions and core tenets applied; and conclusions and lessons learned for practitioners. This will greatly enhance the understanding and confidence that leadership and project teams have in schedule risk assessment results.2016Risk
Wine as a Financial InvestmentMichael Shortell, Adiam WoldetensaeThe purpose of this presentation is to look into wine as an investment and analyze its characteristics as a financial asset. The goal is to examine vintage wines through time, analyze their profitability, and predict how investing in wine could be a good source of diversification. Our objective is to detect the types of wines that are a good source of investment and their associated risk. We aim to identify a good investment portfolio for a wine collector in order to maximize returns and minimize the associated risk.2016Risk
Beyond Correlation: Don't Use the Formula That Killed Wall StreetChristian SmartRisk models in cost estimating almost exclusively rely on correlation as a measure of dependency. Correlation is only one measure (among many) of stochastic dependency. It ignores the effect of tail dependency, when one extreme event affects others. This leads to potential outcomes that do not make sense such as a program with a large schedule overrun but no cost overrun. This method has been widely blamed for the financial crisis a few years ago. The copula method is introduced as a way to overcome these deficiencies.2016Risk
Connecting the Dots: Integrating the risk cube with the POEChristina SnyderDoD guidance instructs program managers to include risk and uncertainty analysis in their cost estimates. Program managers also maintain a risk cube that tracks known program risks and assigns a likelihood and consequence. Risks, whether programmatic, technical, or schedule-related, will have an effect on the overall program cost. This paper investigates the linkage between identified risks and POE cost and shows how to translate the risk cube into dollars.2016Risk
Methodology for Integrating Risk Mitigation Activities into JCL AnalysisJames Taylor, Justin HornbackThe intent of this white paper is to outline an improved methodology for capturing risk mitigation activities for both cost and schedule in a Joint Confidence Level (JCL) analysis and assessing mitigation effectiveness.2016Risk
The Continual Pursuit of the One True Software Resource Data Reporting (SRDR) DatabaseRemmie Arnold, Peter BraxtonThis paper will demonstrate the ability for one database to store Software Resource Data Report (SRDR) source data in its original form, as submitted by WBS element and reporting event. This database allows evaluations, interpretations, and annotations of the data, including appropriate pairing of Initial and Final reports; mapping of SLOC to standard categories for the purposes of determining ESLOC; normalization of software activities to a standard set of activities; and storage of previous assessments.2016Software Estimating
Agile Estimation Using Functional MetricsThomas CagleyAgile methods have matured and are now being integrated into many different approaches to the development of software. Estimation has been problematic for all methods, from agile to plan-based, and therefore it tends to be a lightning rod for experimentation and synthesis such as is described in this paper. Agile Estimation Using Functional Metrics presents a path for integrating the discipline found in functional metrics with the collaborative approaches found in agile parametric estimation.2016Software Estimating
Put some SNAP in your estimating modelDavid LambertAn effective estimating model is dependent upon having the capacity to accurately size the problem domain for the solution being developed. Function Point Analysis is a widely accepted industry and ISO standard for sizing functional requirements; however, it does not take into account the sizing of non-functional requirements. Learn how the newly released Software Non-Functional Assessment Process (SNAP) can provide a more complete and accurate assessment of project size.2016Software Estimating
Forecasting Software MaintenanceAlea DeSantisProviding a defensible estimate for software maintenance can be complicated. By overlaying historical data with lifecycle events one can derive event driven factors and use trend analysis to forecast future Software Problem Reports. Topics addressed are: common-cause variation, special-cause variation, severity level, complexity level, and decay rate. This method moves away from using a general defect rate calculation to a more complete product-specific calculation.2016Software Estimating
Effective use of Function Points for Analogous Software EstimationDaniel FrenchAnalogous estimation is often used to estimate software projects, especially in the early phases, because it is believed to be quicker and easier than other techniques. However, many organizations do not employ this technique effectively and correctly. One pitfall is improperly applying the technique and basing these estimates on Software Lines of Code (SLOC). This presentation will discuss common mistakes made when using analogous estimates, as well as how the use of IFPUG function points can greatly improve this estimation technique.2016Software Estimating
NASA Software Cost Estimation Model: An Analogy Based Estimation MethodJairus HihnSoftware development activities are notorious for their cost growth. While there has been extensive work on improving parametric methods there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we will describe the methodology being used in the development of a NASA Software Cost Model using data mining clustering algorithms and evaluate its performance by comparing it to estimates from COCOMO II, a calibrated COCOMO II, linear regression, and K-nearest neighbor models.2016Software Estimating
Estimating Agile Software DevelopmentBob HuntThe use of Agile Software Development continues to grow in large and small Federal programs. These programs often utilize Story Points as a key size metric. This presentation will provide an update on Agile Cost Estimating techniques and emphasize how to utilize Story Points as a key size metric.2016Software Estimating
How Much Does Software Maintenance Cost?Cheryl Jones, James Judy, John McGarryTo accurately estimate, budget, allocate, and justify the resources required to meet evolving mission and service affordability requirements across the system life-cycle we must be able to answer the fundamental question: How much does software maintenance cost? This presentation provides a summary of the findings from an ongoing U.S. Army study that is focused on developing improved software maintenance estimation processes, cost relationships, and associated models.2016Software Estimating
Exploring DoD Software Growth: A Better Way to Model Future Software UncertaintyNicholas LanhamThis presentation highlights trends and estimating relationships derived from the April 2014 SRDR data. This analysis provides the estimating community with several new variables to better model future software growth uncertainty. In addition, the analysis includes growth by contract type, domain, and program type as well as introduces a multivariate model to predict final hours. All of the relationships included within this slide set have been derived from the best 400+ data points, were approved by DCARC, and are DoD-specific.2016Software Estimating
Considerations in Data Center Migration Life Cycle Cost Estimating: A Joint Information Environment (JIE) PerspectiveDennis McKeonThe potential benefits of consolidating data centers have led to the creation of initiatives such as the DoD Joint Information Environment (JIE). The goal is to reduce IT infrastructure costs by consolidating system support functions, improve hardware utilization, and lower energy and facility costs. This paper is from the perspective of an organization migrating to a consolidated data center. It will describe the migration steps, cost estimating approach, benefits, risks, and organizational challenges from their perspective.2016Software Estimating
Early Phase Software Cost and Schedule Estimation ModelsWilson RosaSoftware cost estimates are more useful at early elaboration phase, when source lines of code and Function Points are not yet available. This study introduces effort estimation models using functional requirements as a size predictor and evaluates development schedule using estimated effort based on data from 40 military and IT programs delivered from 2006 to 2014. Statistical results show that estimated functional requirements and peak staff are significant contributors and that estimated or actual effort is a valid predictor of development duration.2016Software Estimating
Software Size GrowthCorinne WallsheinNCCA will present their findings including cost estimating relationships (CERs) of software size growth using selected subsets. CERs will be compared using initially estimated parameter distributions against CERs using software size. Other parameters include requirements, staff size, hours, and duration. As the 2014 update of the Joint Agency Cost and Schedule Risk and Uncertainty Handbook advises use of standard error for uncertainty bounds, NCCA will compare bounds for recommended CERs, by selected data subsets.2016Software Estimating
At the Intersection of Technical Debt and Software Maintenance CostsArlene MinkiewiczWard Cunningham introduced the technical debt metaphor in 1992 to ease business leaders' understanding of the cost of making poor decisions during software development. While this author has seen many different definitions of technical debt, it is basically an indication of the amount of 'should fix' issues that remain in production code. This paper reports on an on-going research effort to understand technical debt and its implications on software maintenance cost and effort estimation.2016Software Estimating
Software size measures and their use in software project estimationHarold van HeeringenUsing the right size measure is extremely important for software cost engineering purposes. However, the number of software size measures is growing rapidly and there is a lot of confusion in the industry with regard to the methods that should be used. The advantages and disadvantages of the major software size measures are discussed and recommendations are given on what size measures should be used to estimate the different types of software systems.2016Software Estimating
The ABCs of the UCF: A look at Section L and Section MJennifer LeottaThis paper will examine the government cost analyst's role in the acquisition/contracting process. Specifically, it will look at the Uniform Contract Format (UCF) and how to create effective language for Section L that provides clear directions for offerors and includes all information necessary to complete a thorough proposal evaluation. It will also look at developing Section M language for varying contract types (cost plus, fixed price, etc.) that is consistent with the Federal Acquisition Regulation (FAR).2016Acquisition
A Weapons Acquisition Case Study: Cost Overruns and Schedule SlipsRandy Bowen, David L. Peeler, Jr.This story is true. Certain details have been omitted to "stump the audience" and protect the guilty. The trials and tribulations of an acquisition program cost overrun are explored for persistent lessons and future applications. This saga includes materiel difficulties, workforce issues, congressional interest, funding and schedule perturbations. The aim is to educate, as the persistence of fundamental acquisition problems and the insignificance of acquisition reforms are framed.2016Acquisition
GAO-COFC Concurrent Bid Protest Jurisdiction: Are Two Fora Too Many?James LinickThe Government Accountability Office (GAO) and the Court of Federal Claims (COFC) are the bid protest fora where a disappointed offeror may challenge an agency's procurement decision. Critics claim that concurrent bid protest fora, which have differences in jurisdiction, standards of review, and remedies, create inefficiency in the procurement system, e.g., serial protests. This presentation examines whether concurrent bid protest fora are healthy for the US federal procurement system.2016Acquisition
Early Stage Cost Estimating for Radars and SensorsJeremy GoucherThe majority of costs for programs are locked in even before a program enters production, which makes accurate early stage cost estimates vital for effective resource management and program success. This study proposes a method for analyzing combat system cost prior to a complete requirements description. The data set includes old, new, big, small, ground, sea, air, domestic, and foreign systems. The result is a model that requires limited data and is widely applicable.2016Acquisition
Federal Agency Independent Government Cost EstimateCassandra M. Capots, Lauren Nolte, Lavanya YeleswarapuTo achieve mission success in today's constrained fiscal environment, federal agencies must establish a fair and reasonable cost baseline to make informed decisions. This baseline is commonly called an Independent Government Cost Estimate (IGCE). We have developed a customizable process that forms a starter kit for any agency that wishes to quickly and effectively perform IGCEs. We discuss best practices, lessons learned, and implementation tips for both DoD and civilian agencies.2016Acquisition
Observation of Production Rate Effect observed in Historical Weapon Systems CostWilliam Banks, Timothy LawlessProduction Rate Effect has been observed in several weapon systems programs' historical cost data during the procurement phase of the acquisition life-cycle. The main contribution of this paper is to illustrate real life observations of production rate effect and the economies of scale shown when purchasing end items from major defense contractors.2016Acquisition
Analyses of Alternatives for Space SystemsR. Alex WeklukThis presentation provides an overview of recent experiences performing satellite analyses of alternatives, including technical and cost trades. Our team developed a scalable mechanism to perform satellite sizing using physics and stochastic modeling. Once modeled, an array of costing methods is applied to develop repeatable and defensible estimates. The team made great strides in presenting the large number of nuances to decision makers.2016Business Case Analysis
Training Cost Analysts, a Cohesive Pedagogical Framework for SuccessKammy Mann, Danielle LainezHow do we train cost analysts? This paper will take a critical look at peer-reviewed academic research regarding pedagogy, or the method of teaching. Leveraging existing knowledge, current best practices and resources available, a framework for teaching and training will be proposed to the cost community. By establishing a consistent methodology for educating our professional workforce we can ensure that new members of the cost analysis field have the tools and skills to succeed.2016Business Case Analysis
3D Printing in the DoD; Employing a Business Case AnalysisNicole Santos, Richard Shea, Robert Appleton3D Printing is the family of technologies that enables users to produce items on demand from CAD files or 3D scanning. The potential benefits to military logistics include cost savings, weight reduction, and responsiveness to the warfighters' needs. To demonstrate and measure the benefits in the Department of Defense (DoD), a rigorous Business Case Analysis (BCA) will identify benefits and challenges to implementation including evaluating its costs, risks, and benefits.2016Business Case Analysis
Fixing the Flawed Nature of DoD BCA'sWilliam G. WilliamsonThe government is not a business that is profit-oriented. Monetary business metrics as a means of determining what items to consume are inherently flawed for the government since metrics such as Return on Investment assume the organization is a provider of the goods or services, not a consumer. A better approach for the government would be to measure the areas that the government most values, i.e. schedule and requirements. Using those areas as anchors, a best value metric can be developed, pointing the government to smarter acquisition decisions.2016Business Case Analysis
Developing a Dynamic Expense-Volume-Profit Model to Determine Break-Even pointWilliam KentAnalysts must be able to determine when a program will be profitable. This insight is used to drive contract negotiation, pricing, and strategy. Many factors need to be evaluated: expenses, revenue, and volume of sales. These factors must be determined based on limited historical data and extrapolated over a time horizon. This paper introduces a framework for analyzing these factors using learning curves, regression, and financial statement analysis to determine break-even.2016Business Case Analysis
ROS vs. ROC: Government Can Lower Price by Raising FeeSandy BurneyMany Government Contractors measure profitability using Return on Sales (ROS), which includes Fee in the denominator. The FAR regulates fee percentages or Return on Cost (ROC). In negotiating sole-source contracts, this metric difference can result in the Contractor negotiating a higher risk adjusted cost and price to offset lower fees offered by the Government. The briefing provides a mathematical example of the tradeoff between Cost and Fee for a CPFF contract.2016Business Case Analysis
Bridging the Gap between Capital Investment Valuation in Private & Public SectorGeorge Bayer, Jr.The public and private sectors have different valuation techniques and considerations when evaluating capital budgeting decisions. Understanding the difference between private and public sector capital investment analysis (discounted cash flow, cost/benefit, tax impacts, stakeholders) helps decision-makers make better informed decisions. In the cost estimating community, understanding investment distinctions makes us better stewards of information and more effective Finance professionals.2016Business Case Analysis
Begin With a Joke? What John Oliver Can Teach Us About Communicating Cost EstimatesErin K. BarkelAre you having a hard time holding management's attention when you explain the results of your estimates? It's time to change your approach. Consider this: millions of people tune-in and click to watch John Oliver talk about cost estimates gone wild, and they stay tuned-in for 20 whole minutes. What can you learn from a comedian about explaining cost estimates? Maybe you should start with a joke.2016Business Case Analysis
Liars! Why Cost Estimators and Budgeters Inflate and Underestimate Costs!Travis Winstead, Ann HawpeProjects and programs regularly suffer from poor cost estimating techniques that result in insufficient budgets and unanticipated cost overruns. To issue a contract or set a realistic budget, the cost analyst must verify and validate the mountain of cost information and determine a fair and reasonable price for a contractor's goods and services. This paper will discuss strategies to assist program analysts in reviewing cost estimates and strategies used to support Government negotiations.2016Management, EVM & Scheduling
Fiscal Sustainability of Canada's National Defence ProgramPeter WeltmanThis report provides PBO's estimate of the fiscal gap between the status quo budget allocations and the cost of sustaining Canada's status quo national defence forces. Parliamentarians may wish to examine scenarios that will reduce or eliminate the gap between the cost of maintaining the current force structure and the amount of funding being allocated to paying for it.2016Management, EVM & Scheduling
The Performance Metrics Model and StudyDerreck Ross, Haitham Ghannam, Richard C. LeeThe Performance Metrics Model and Study (PMMS) is a comprehensive technique for selecting a To-Complete Performance Index (TCPI) performance metric. The PMMS uses historical data of completed products to determine the best EVM estimation technique for similar products that are in progress. The result is an auditable and repeatable approach that increases the objectivity of an Estimate at Completion. This approach can be applied across any program that has historical data.2016Management, EVM & Scheduling
Cost Consistency and Completeness as an Impossible ExerciseDavid L. Peeler, Jr.In previous papers, Hilbert's Problems were used to propose and revisit Hilbert's Problems for Cost Estimating. This paper employs Gödel's theorem with respect to Hilbert's application to cost. What can we learn about ourselves as estimators, and where can we exert the greatest impact with our estimates? Using Gödel's two theorems of undecidability as a catalyst, we explore the effect and utility of exacting math and other notions on cost estimates specifically and programmatics generally.2016Management, EVM & Scheduling
Preventing Program Management Pitfall Using Portfolio EstimatingNick Morales, Christopher DewberryEstimating the effort, time, and resources needed to complete project activities is one of the most challenging tasks that project managers must face. Along with the inherent uncertainty associated with managing these activities, additional management uncertainty occurs when there are multiple incremental efforts associated with that project. In order to handle the complication of managing several program increments, each with their own budget lines, portfolio estimating is a method that helps to provide program managers with a fully comprehensive overview of the resource requirements across multiple projects. Program managers can use the portfolio estimate as a tool to help make informed management decisions at the appropriate level and identify how those decisions impact the entire program.2016Management, EVM & Scheduling
Putting Schedule Quality Checks to the TestEric M. LofgrenAnalysts often use the 14-Point Assessment for Schedule Quality as a benchmark for determining the overall reliability of a schedule. But how much of the variation in schedule error can be explained by the results of the 14-Point check? This paper will use actual defense contract data to find the correlates of schedule reliability, measured using both the accuracy and the timeliness with which the schedule slip is predicted.2016Management, EVM & Scheduling
Provide Agile Analysis to Stakeholders Using Category Tagging ApproachEric HongIt is becoming more and more important for organizations to use data analytics for improved decision-making to create a competitive advantage and/or operate efficiently. This paper will describe the challenges in creating and implementing a strategy. It will identify potential reporting requirements and how to leverage existing work products as input into a standard structure feasible to perform analysis for different stakeholders. The proposed solution will increase the accuracy and agility of an organization's analysis.2016Management, EVM & Scheduling
Applying Earned Value to Agile Development ProgramsBob Hunt, Michael ThompsonAgile Software and Agile Development continue to dominate the Federal acquisition arena. This presentation will address current trends in estimating and applying Earned Value to Agile programs.2016Management, EVM & Scheduling
EVM's Potential for Enabling Integrated Cost-Risk ManagementDavid R. Graham, Bryn TurnerIntegrating the cost estimating, EVM analysis and technical risk management disciplines to realize comprehensive cost-risk management has proven elusive. Recently developed EVM cost-risk tools offer the potential to successfully enable integrated cost-risk management. The paper will illuminate the process of how integrated cost-risk management can be realized through cooperation of cost estimating, EVM analysis and technical risk management disciplines and briefly describe these new EVM cost-risk tools.2016Management, EVM & Scheduling
Conducting Root Cause Analysis for Should Cost ManagementNancy A. R. Droz, Robert L. HernandezRoot-cause analysis (RCA) is typically performed after a technical failure or after a program anomaly is found. However, the technique is key to the development and selection of effective initiatives to reduce program cost. RCA is a critical aspect of successful Should Cost Management execution. The presentation will provide a how-to approach for applying root cause analysis during the evaluation of program data to develop more effective cost reductions. Lessons learned will be presented.2016Management, EVM & Scheduling
Organization Cost Estimation as a Human Intensive Systems Engineering ProblemDavid Bloom, Robert WrightThis paper will compare and contrast the difference between Intensive and Non-Intensive Human Interactive Systems. Further, this paper will investigate the role of Governance, Culture, Process and Tools (GCPT) in each of the two systems, Intensive and Non-Intensive. Finally, this paper will apply the GCPT Framework to the system of Organizational Cost Estimation and provide examples of results that Raytheon has seen in its effort to improve the accuracy, affordability and accountability of the current bidding/estimation process.2016Management, EVM & Scheduling
Beyond CSDRs: Collecting Contractor Cost Data for Detailed Cost EstimatesTim LawlessQuality cost estimates are based on actual costs, and for the Department of Defense, usually are acquired through Cost and Software Data Reports (CSDRs). However, CSDRs may not be available, complete or consistent for quality analytical purposes. This paper examines the collection and use of detailed contractor actual costs for estimates, such as with Priced Bills of Material and invoices, and their analytical benefits, perhaps supplanting CSDRs as the preferred data source for cost estimators.2016Methods & Models
Crew and Space Transportation Systems Cost ModelRichard WebbAs part of the MSFC Engineering Cost Office's new Program Cost Estimating Capability (PCEC) suite of cost estimating tools and capabilities, we are developing the Crew and Space Transportation Systems Cost Model (CASTS), a new, unique cost model for use in estimating crew and space transportation systems. This paper will provide an overview of the capabilities, estimating approach, historical database, and key features of CASTS as well as plans for future improvements.2016Methods & Models
The Space Situational Awareness Cost ModelJames Smirnoff, Brennen WoodruffToday there are over a thousand operating satellites orbiting the earth providing a variety of critical functions such as communications, navigation, weather sensing, reconnaissance, and astronomy. As the space environment becomes more crowded, Space Situational Awareness (SSA) is increasingly important. This paper describes a new tool called the Space Situational Awareness (SSA) Cost Model which provides an objective and data-driven framework to estimate the cost of ground-based electro-optical components of future SSA architectures.2016Methods & Models
Estimating the Costs of Future Air-to-Ground WeaponsTom SandersThe weapons acquisition community has been acquiring air-launched weapons in a slowly-evolving manner for years. While technologies experience great leaps forward, acquisition strategies evolve at a slower pace. The AF Research Lab is evaluating revolutionary weapons designs that facilitate newly-emerging acquisition strategies...thereby presenting real challenges to cost estimators. This paper documents techniques for estimating costs of those future weapons.2016Methods & Models
Challenges in Estimating the Development Cost of Microcircuit TechnologyChristopher Price, Gurney ThompsonElectronic Technology continues to change at a rapid pace. Significant changes in the level of integration of electronic components dictate that circuit designs using these parts be redesigned every three to five years. Estimating the development cost of these new designs can be challenging. This paper will discuss these challenges in more detail, and describe some methods and tools available to rapidly and accurately estimate the development, production and O&S cost of these new technologies.2016Methods & Models
Workload-Adjusted Labor Cost ModelMichael MenderGovernment estimators typically use Forward Pricing Rate Agreements (FPRAs) to estimate future labor costs. FPRAs make assumptions about future workload, and can be inaccurate as realized workload often differs from projections. To enable understanding of this uncertainty, the Naval Center for Cost Analysis (NCCA) developed a lagged regression model that predicts future overhead costs as a function of workload, allowing NCCA to quickly and accurately assess the impact of Navy procurement decisions on acquisition programs.2016Methods & Models
Using Physics Based Reliability Methods for Should Cost EstimatesPatrick K. MaloneValue-based should cost methodologies supporting Better Buying Power tenets can help realize significant MDAP cost savings over the life cycle. The use of Physics Based Reliability (PBR) techniques in program development is described to meet affordability goals, identify cost drivers affecting reliability, availability and maintainability of weapon systems, and demonstrate PBR advantages over the classical reliability growth model with actionable plans to maintain technological superiority.2016Methods & Models
Delphi Methodology for Cost EstimatingCole J. Kupec, IIThe Delphi methodology can be a practical and valuable tool for cost estimators. This paper explores its proper implementation based on current research to include group selection, question formulation, statistical analysis, and the survey process. Special attention is paid to tailoring the Delphi method to cost estimating applications. A case study is presented for example purposes. The history of the Delphi method as well as modifications to the traditional Delphi method are also presented.2016Methods & Models
Beyond Anderlohr: An Alternate Approach To Estimating Production BreaksBrent JohnstoneEstimating the cost impacts of production breaks has long been problematic. Use of the Anderlohr method is widely accepted, but requires a significant degree of estimating judgment and can produce substantially different answers based on the individual user's assumptions. This paper suggests an alternate empirical methodology based on recent research on organizational learning and forgetting.2016Methods & Models
Estimating the Cost of the Aegis Flight III Test PlatformJeremy GoucherThe Flight III upgrade to the Aegis Destroyer program brings unparalleled new capabilities to the surface Navy fleet which will require in-depth testing prior to deployment. For live fire testing, the Navy is considering four unique options to procure, install, and test the required systems on an unmanned vessel. This paper discusses the available data, methodology, and results of the cost estimate. The paper will also discuss some of the technical challenges associated with each option.2016Methods & Models
Footprints in the Sand: A Conversational Approach to Basis of EstimateFrank R. FlettThe Basis of Estimate (BOE) is the mechanism by which the cost estimator or cost proposal preparer hopes to convince the evaluator or reviewer that the method he/she took in preparing the estimate was "reasonable" (not the same as "correct"). This paper will present a technique of writing BOEs as if they were a part of a conversation with the evaluator/reviewer, who is, after all, a human being who will early on develop a distinct impression of the "goodness" of the end result.2016Methods & Models
Department of Defense Commercial-Off-The-ShelfHeather L. BrownNCCA partnered with DoD Enterprise Software Initiative (ESI) to periodically receive commercial-off-the-shelf (COTS) software orders with the intent of improving software cost estimating. As of today, more than 250,000 orders from the Army, Navy, Marine Corps, and other federal entities have been recorded. Together, these represent over six billion dollars in sales. NCCA examines this data and categorizes various product descriptions as licensed products, maintenance and support, or services in order to present statistical trends.2016Methods & Models
Naval Sea Systems Ship Command Logistics Requirements Funding Summary ToolJeremy EdenThe United States Navy has focused on planning and estimation of operations and support costs for programs. A task force created by Secretary Stackley (Assistant Secretary of the Navy for Research, Development & Acquisition) to identify actions for achieving this goal recommended creating a logistics tool similar to that developed by the United States Marine Corps (USMC). This paper demonstrates the tool, development approach, cost estimating methodologies, interface, and deployment to the next generation.2016Methods & Models
How Predictive Analytics is Improving Parametric Cost EstimationAnthony A. DeMarcoPredictive analytics is defined as the use of data, statistical algorithms and machine-learning techniques to identify the likelihood of future outcomes based on historical data. This paper will explore the history and parallels of parametric estimating and predictive analytics. It will highlight how cost management professionals can use current predictive analytics tools, methods and talent to improve their estimates and their success rates.2016Parametrics
Multiple Regression: Prediction & Confidence Intervals DemystifiedStacy M. DeanWe all know the value of prediction/confidence intervals as a measure of the certainty in the best fit line for a regression equation; however, once we get past bivariate regressions how to actually calculate these statistics gets a little murky. This paper attempts to de-mystify multivariate confidence/prediction interval calculation as well as provide some fun facts on the origin and history of these infamous intervals.2016Parametrics
The Collinearity Kill Chain: Detect, Classify, Localize, NeutralizeBrian Flynn, Adam JamesThis paper examines the issue of multicollinearity from the fresh perspective of a statistical warfighter. A kill-chain of steps is offered to defeat this elusive foe. Heuristics, chi-squared tests, t-tests, and eigenvalues are the Aegis, Predator, Bradley, and Ma Deuce 50-caliber statistical equivalents used to detect, classify, localize, and neutralize the problem.2016Parametrics
Dialing for Dollars: Improving Cost Estimating Accuracy Through Data MiningBrittany Holmes, James Glenn, Christian SmartThis paper shows the benefits of leveraging obligation and expenditure data from DoD Enterprise Resource Planning (ERP) Systems to supplement cost data from CCDRs or IPMRs. One benefit is that cost trends and funding patterns can be analyzed with more precision because the ERP data is updated real-time vice monthly or annually. Another benefit is that more comprehensive CERs can be created because ERP systems capture all cost vice only contracts with cost reporting CDRLs.2016Parametrics
Develop PRESS for Nonlinear EquationsShu-Ping HuPredicted residual sum of squares (PRESS) and Predicted R2 statistics are commonly used in regression analysis outside the cost analysis industry to determine (1) how well the model predicts new observations and (2) if the model includes more independent variables than necessary. Currently, these two leave-one-out statistics are only available for ordinary least squares. This paper develops these two statistics for nonlinear equations and demonstrates their value in practical applications.2016Parametrics
Including Escalation in Cost Estimating RelationshipsChristopher JarvisPrior to developing a Cost Estimating Relationship (CER) the data should be normalized for effects not explained by the assumed CER model. This typically includes the effects of inflation and/or escalation, which have recently generated considerable discussion within the DoD. In this paper we demonstrate two common CER models and augment them to include an escalation term. Discussion will be made regarding the reliability of the solutions of the augmented models.2016Parametrics
Dangers of ParametricsAndy PrinceWhat if our models are not solving our estimating problems, but instead are the source of our problems? The purpose of this paper is to address this question. We will look at what a cost model is, and what it isn't. We will examine how cost models appeal to our need for certainty and help us create a good story for our cost estimate. We will take a look at a simple cost model to see what can go wrong when we trust the model over trusting the data. Finally, we will identify specific actions.2016Parametrics
Development of an Additive Manufacturing Cost ModelF. Gurney Thompson, III, Grady NollThis paper discusses our research and development of a cost model for additive manufacturing (AM), also known as 3D printing. Working with academia and multiple AM companies, we investigated cost drivers, collected cost and technical data, derived AM CERs, and developed an approach for including AM components within a larger parametric cost estimate. This paper discusses the research approach, data collection, results, cost model implementation, and future work.2016Parametrics
X-Planes to X-Wings ~ Developing a Parametric Cost ModelSteve SterkIn today's cost-constrained environment, NASA needs an X-Plane database and parametric cost and schedule model that can quickly provide rough order of magnitude predictions of cost and schedule from initial concept to first flight of potential X-Plane aircraft. This paper takes a look at the steps taken in developing such a model and reports the results. The challenges encountered in the collection of historical data and recommendations for future database management are discussed.2016Parametrics
Integrating Cost Estimating and Data Science Methods in RJosh Wilson, Laura BarkerData science is a growing, interdisciplinary field about the extraction of knowledge from data. It emphasizes the utilization of statistics and advanced analysis techniques to answer questions in a well-documented and reproducible manner. R is a free, open source programming language that is popular among data scientists. This presentation discusses the benefits of using R to supplement traditional cost estimating and data analysis practices.2016Parametrics
AoAs the Right WayAmanda Wilson, David Macdonald, Mariam UzunyanThe follow-ons to two large space acquisition programs from the Space and Missile Systems Center completed the AoA process last year. They were set apart from past AoAs and AoA-like studies with the leadership and involvement of OSD CAPE and the AFCAA teams, with broad community involvement strengthening the teams. This change resulted in benefits, challenges and lessons learned; various visual tools were created to highlight risks, opportunities, and investment strategies to senior leaders.2016Parametrics
Cost Estimating Relationship Development HandbookJohn Fitch, Adam James, Alfred SmithDeveloping appropriate and well-defined parametric relationships is a foundational requirement for Cost Estimators. Current training for analysts focuses on various ordinary least squares (OLS) regression techniques, but ignores logical decision sequences employed by good analysts applying OLS and omits methods beyond OLS. The CER Handbook includes these techniques in a flow-chart format and guides analysts along each step in developing a parametric relationship.2016Parametrics
Cost Risk for Firm Fixed-Price ContractsDavid Biron, Christian SmartThe terms "cost risk" and "firm fixed-price contracts" seem contradictory. By design the contractor bears all risk under a firm fixed-price (FFP) contract. In spite of this, overruns often occur, and contractors have recourse to a Request for Equitable Adjustment (REA) when cost grows beyond the contract value. We present statistics on cost overruns for FFP contracts, some of which are significant, and show how to model risk for such contracts.2016Risk & Uncertainty
Modeling Prediction Intervals Using Monte Carlo Simulation SoftwareJames Black, Qing Qing "Q" WuThe use of a prediction interval (PI) is a simple method of quantifying risk and uncertainty for a Cost Estimating Relationship (CER) derived from an Ordinary Least Squares (OLS) regression. Yet, few cost estimators implement PIs in their estimates despite their frequent use of CERs. This presentation will provide a step-by-step tutorial for modeling a PI for an example CER using Monte Carlo Simulation software and will identify the beneficial impact on the coefficient of variation (CV).2016Risk & Uncertainty
Introducing RIFT to Protect Your Uncertain ScheduleNicholas DeTore, Peter FredericThere are industry-accepted methods for allocating cost risk and uncertainty analysis results to detailed WBS elements; schedule results cannot be allocated the same way since duration behaves differently than cost. We present an innovative solution to this issue. The RIFT algorithm calculates a threshold date, for any task or milestone, that if exceeded puts the probabilistic project finish date in jeopardy. RIFT provides a new tangible metric to guide decision makers.2016Risk & Uncertainty
Risk vs. Uncertainty - What's the difference?Melvin R. Etheridge, Jr.After years of applying "risk", many in the profession still do not understand the difference between program risk and cost uncertainty in cost estimates. Determining confidence levels goes beyond simply applying the effects of programmatic risks. Simply put, risks have a probability of occurrence, while we know that quantitative inputs have some degree of uncertainty and must be treated as random variables. This paper explores analytical methods in Excel-based cost models using Monte Carlo simulation.2016Risk & Uncertainty
A 'Common Risk Factor' Method to Estimate Correlations between DistributionsMarc GreenbergA 'common risk factor' method uses expert-derived risk factors to estimate correlation between two distributions. The premise is to estimate mutual information among risk factors that underlie each distribution. Approximation of such mutual information leverages properties of the joint probability distribution of a unit square. Geometric outputs of each pair of common random variables are compared to estimate common risk factor "intersections" that are, in turn, proxies of correlation.2016Risk & Uncertainty
How Regression Methods Impact Uncertainty ResultsBoyan Jonov, Shu-Ping HuLog transformation and weighted least squares are commonly used to develop multiplicative error cost estimating relationships. We objectively compare the two regression techniques and provide a sound defense of log-linear ordinary least squares (LOLS) to counter arguments against its use. We then demonstrate how uncertainty modeling varies substantially based on regression method. Lastly, we establish criteria to use so that LOLS leads to a justifiable uncertainty assignment.2016Risk & Uncertainty
The Zone System of Uncertainty AnalysisJeff McDowellThis paper presents a methodology for selecting cost uncertainty distributions and their dispersion inspired by the photography Zone System. The method seamlessly addresses five groupings of uncertainty across a sliding complexity scale. The distribution shapes and their parameters are drawn from patterns developed from the AFCAA Cost Risk and Uncertainty Metrics Manual (CRUAMM) body of work.2016Risk & Uncertainty
Cloud Total Ownership Costing: Considering the Technologies, Costs and BenefitsDan Galorath, Steven WoodwardCloud abounds with promises of drastically reduced costs, which are all attractive. However, Cloud computing can be ineffective in terms of technology, affordability and total cost of ownership in many situations. The assumption that cloud technology reduces government costs is often flawed. The benefits from cloud computing relating to time-to-market, savings, innovation and agility should be analyzed, providing balanced perspectives for the cloud opportunities presented.2016Software & IT
Data Collection for Agile ProjectsBlaze SmallwoodAgile software development projects produce a unique set of metrics, such as points and velocity, that can provide interesting insight into project progress. However, collecting this data requires specialized mechanisms, since no established standards exist, like SRDRs. This paper will discuss the types of data that are useful to collect for agile software development projects and mechanisms that have been used to collect them for several government projects.2016Software & IT
Apples and Oranges: a Presentation and Analysis of Results of Cloud Cost Calculators and Rate CardsDaniel J. HarperA recent effort for an Army customer examined over a dozen calculators and rate cards for estimating storage and hosting costs for cloud applications. This presentation will provide an overview of several calculators and tools, guidance for cost estimators on interpreting IT-centric inputs, and a discussion of similarities and variation in results. We will also present a cloud complexity plotter which provides a visual tool for explaining cloud cost and complexity drivers.2016Software & IT
A Primer on Agile Software Development for Cost AnalystsEric Hawkes, Cole J. Kupec IIAs Agile software development continues to become more common, cost estimators are compelled to adapt their methods to the Agile approach. This paper begins by explaining what Agile really is and how it differs from the common waterfall methodology. It also presents two case studies that detail how Agile was successfully implemented in a DoD acquisition program. Lastly, the paper proposes four innovative cost estimating methods for an Agile environment.2016Software & IT
The Cost of CyberAnn Hawpe, Jeffrey VothA digital revolution is changing the nature of warfare and the cyber domain is now a top priority for all MDAP and MAIS programs. As the continued advancement of cyber warfare changes the future of land, air, sea, and space over the next decade, the cost community will need to consider the challenges facing defense programs and how to estimate the cyber security requirements within the construct of a reformed acquisition process.2016Software & IT
Maturing the Economic Aspects of Agile Development in the Federal GovernmentJennifer ManringAgile development has emerged in the commercial domain as a leading software development methodology, with growing adoption across federal agencies. This new approach for software development challenges traditional acquisition areas including cost estimation, economic analysis, and system engineering. This research provides a deeper understanding of the economic implications of agile development for the Federal Government and provides insights needed to understand the value proposition of an agile development environment compared with a traditional waterfall environment.2016Software & IT
Using Predictive Analytics for Cost Optimization Across Cloud WorkloadsZachary Jasnoff, David A. CassAs organizations move to the cloud, estimating costs proves difficult, presenting many challenges. Once estimated, cloud costs must be optimized taking into account workloads. Organizations failing to estimate cost based on workloads will either underestimate or over spend on cloud services, often with disastrous results. A framework for optimizing costs (spanning IaaS, PaaS, and SaaS) and determining the cost/benefit between public clouds, private clouds and hybrid clouds will be presented.2016Software & IT
SURF Process Summary & Initial Findings: A Deeper Focus on Software Data QualityNicholas Lanham, Corinne Wallshein, Ranae P. WoodsAs a result of the joint Software Resource Data Report Working Group (SRDRWG), several recommendations were generated by the supporting, multi-agency SRDRWG members to the Cost Leadership Forum, one of which led to the development of the first joint-agency SRDR Validation and Verification (V&V) guide as well as the SRDR User Review Function (SURF) subgroup. This study summarizes the results and data-quality improvement areas identified from 319 V&V comments generated by the SURF team.2016Software & IT
Cloud Solutions - Infrastructure, Platform or Software - Where should you go?Arlene F. Minkiewicz, Ashley HoenigkeCloud computing technologies promise to create some new and interesting challenges for the cost estimating community. How does one determine the effort and costs associated with migration of existing applications to the cloud? What challenges do the various cloud solutions (IaaS vs. PaaS vs. SaaS) present? This paper presents a case study outlining the migration activities and cost implications for migrating the same application on each of the three cloud solution models.2016Software & IT
Process-Related Effort and Schedule Estimating Relationships for Software Cost EstimatingNicholas Lanham, Corinne Wallshein, Wilson RosaThe Naval Center for Cost Analysis will present comprehensive, updated findings of software size growth factors, effort estimating relationships (EER), and schedule estimating relationships (SER) with subsets of Department of Defense Computer Software Configuration Item records. This presentation focuses on software size (new, modified, reused, auto-generated, and total code). Subsets include maturity, application and super domain, language, contract type, and operating environment.2016Software Sizing
Estimating Small Scale Software Integration Efforts with a Large Scale Code BaseJoe BauerEstimating the cost and effort for testing and integration is often one of the most challenging aspects of software estimation. Integration effort depends greatly on the components under evaluation. When relatively small capabilities are introduced into large existing code bases, the integration impact due to the existing code may surprise stakeholders even when requirements are reduced. This presentation highlights software integration modeling for a major defense acquisition program.2016Software Sizing
Alternatives to SLOC in the Modern EnvironmentJeremy GoucherSource Lines of Code (SLOC) based estimating relies heavily on subject matter expert input. However, viable alternatives to SLOC are few. By mapping deliverables to functionality and functionality to effort, the authors propose an alternate technique with reduced reliance on SLOC. Agile development and some recent innovations represent alternatives as well. The authors will present an overview of the topic with examples, required data inputs, and estimating methodology.2016Software Sizing
Agile and Function Points: A Winning CombinationDaniel B. FrenchMany users of Agile incorrectly believe that projects do not need to adhere to proper project management practices; however, Agile does not advocate abandoning metrics or PM processes. The most critical aspect of software development is software size. Function Points can be effectively used with Agile development, and this presentation will demonstrate the benefits of using FP in many aspects of project management including estimation, schedule and product backlog.2016Software Sizing
Cost Element Structure for Cloud Migration Business Case AnalysisLorena AguilarIn 2010, the White House released its initial "cloud first" policy. Agencies have since struggled to identify all of the costs associated with cloud migration, and recent audits have criticized agencies for not properly assessing cloud-related costs and benefits. We have developed a cloud migration cost element structure (CES), which simplifies and standardizes cost data collection and estimating. We will present the structure and its applicability to business case and cost benefit analyses.2016Software Sizing
Collaborative Scoring: An Innovative Approach for Sizing Software ProjectsBlaze SmallwoodA major challenge facing government organizations is developing defendable cost and schedule estimates for software development projects, driven by adequately estimating software size. This paper will describe a viable new method for sizing software requirements that utilizes a collaborative scoring methodology similar to those used during agile sprint planning. It has been successfully implemented on several government projects and is a viable option on all software projects.2016Software Sizing
Cost Estimate Credibility from a Government PerspectiveHenry ApgarThis original paper describes a search for the definition of "Cost Estimate Credibility." The author documents his review of government-agency estimating handbooks, contemporary textbooks, previous ICEAA papers and journal articles, and the opinions of contemporary experts. The paper concludes with a concise definition which can be used by government and contractor estimators for their cost models and cost estimates.2016 Int'lGovernment Perspectives
NASA's X-Plane Database and Parametric Cost Model V2.0Steve SterkIn 2015 the idea of creating an Armstrong X-Plane Cost Model was conceived. Initially the data being used was leveraged from Dr. Joe Hamaker's Quick Cost v5.0 and verified and supplemented by Armstrong's summer intern Aaron McAtee (PhD). After several peer reviews, it was suggested to go back and take a hard look at the data collected. Our objective is to produce an X-plane database that ensures that all the data is as accurate, traceable and far-reaching as possible.2016 Int'lGovernment Perspectives
NATO Agency Transformation: Creating a Centralized Cost Estimating CapabilityCandace MahalaA new NATO Agency is born, transforming multiple agencies into one, changing culture, and implementing a centralized robust cost estimating capability. This is the journey of how to grow and cultivate cost estimating from inception, integrating into projects as a value added asset, maturing over time and eventually becoming a fundamental part of the culture in the acquisition and execution processes.2016 Int'lGovernment Perspectives
Master of Cost Estimating & Analysis BriefGreg MislickGreg Mislick will provide an update on the all-distance learning Master's Degree and/or Certificate Program in Cost Estimating and Analysis (MCEA / CCEA) offered at the Naval Postgraduate School (NPS). Entering its sixth year, the MCEA program has proven to be a success for both the students who take the program and the DoD Services and other US Government Executive Agencies in which the students work. Expert cost estimates are key underpinnings for critical government processes dependent on credible and reasonable cost estimates, from budgeting to cost-benefit and financial analyses.2016 Int'lGovernment Perspectives
OSCAM - US Navy's Operating and Support Cost Estimating ToolStephen CurramOSCAM is a simulation based tool for operating and support cost estimating that has been used by the US Navy for 20 years. It plays an important role in submissions for US DoD procurement milestones. An overview and brief history of OSCAM will be given, with examples of equipment programmes that have used OSCAM. It will conclude with lessons learned, including insights on the type of data required for operating and support cost estimating.2016 Int'lGovernment Perspectives
NASA Project Cost Estimating Capability: New Analyses for Spacecraft EstimatingBrian Alford, Frank A. (Andy) PrinceThe NASA Project Cost Estimating Capability (PCEC) is a free, publicly-available parametric tool for estimating the cost of spacecraft and transportation systems for scientific and human exploration of our solar system and beyond. In development since late 2013, PCEC replaces the NASA / Air Force Cost Model (NAFCOM). This presentation will discuss the new overall philosophy and data analysis underlying PCEC, the challenges encountered during development, and provide a high-level overview of the model.2016 Int'lGovernment Perspectives
The 10+/-2 Factors for Estimate SuccessAndy NolanA study of completed projects across Rolls-Royce revealed the factors that lead to accurate estimates. This has enabled us to develop an Estimate Readiness Level (ERL) assessment method to determine the likely accuracy of an estimate. Success is no accident, and estimate accuracy can be predicted from the outset of a project. This paper will cover the factors for estimate success and the development of the ERL assessment.2016 Int'lManagement
Is it Worth It? The Economics of Reusable Space TransportationRichard L. WebbMuch has been invested in reusable space transportation systems in the belief that reusing system elements is cheaper than expending costly hardware each flight; a view not held by all industry stakeholders. This paper will define the metrics by which stakeholders measure financial "goodness," provide examples of application of the metrics to investment decisions, and offer observations on their potential influence on current and future decisions regarding space transportation system investment.2016 Int'lManagement
What Does A Mature Cost Engineering Organisation Look Like?Dale ShermonCost Engineering Health Check: How Good Are Those Numbers? is Dale Shermon's second book, co-authored with Dr Mark Gilmour. The book forms the basis of the paper, and the presentation will consider what a mature cost engineering organisation looks like. It will cover the essential requirements for a credible and justified cost forecast.2016 Int'lManagement
Using Public Data for Validation and Winning BidsAlexander KingTo win bids, cost forecasting involves more than just creating a reasonable baseline. This presentation will show how public data can be combined and triangulated to ensure our predictions are competitive as well as reasonable. The presentation will show how parametric estimates, past contract data, financial statement data and statistical data can combine to make powerful metrics for productivity and competitiveness.2016 Int'lManagement
OSA: Cost and Schedule Saver or Driver?Victoria Cuff, Brian D. FerschOpen Systems Architecture (OSA) employs a modular design and utilizes consensus-based standards for its key interfaces, with full design disclosure [Office of the Deputy Secretary of Defense, Systems Engineering, Initiatives]. A challenge with OSA is the ability to continuously design modular systems amid obsolete technology and parts, an evolving technological environment, and emerging threats. Examining past successes and failures, this study will analyze means to reduce costs and increase success.2016 Int'lManagement
The Value of R&D - a Real Options Analysis ApproachJohn Shimell, Neil DaviesThis study sought to apply Real Options Analysis (ROA) to answer the question "what value is achieved from the MOD Chief Scientific Advisor's (CSA) £400m annual budget?" Typically this has been reviewed retrospectively via case studies and surveys. This study developed a tool based on an amended Black & Scholes formula, combined with a Monte Carlo analysis of the "tree" of R&D, to determine a predictive value for the R&D spend of the CSA.2016 Int'lManagement
Costing Aircraft Availability and 50-Shades of Grey WaterAndrew Langridge, Anna FooteData gathering is often one of a cost estimator's biggest nightmares for many reasons. Even in cases where data is available, the logistics of data collection can be daunting. This paper discusses a methodology that PRICE has developed to use open source web crawling software to regularly and repeatedly update guidance on commodity prices for Information Technology items with the push of a button. Uses of this methodology and further directions will be presented.2016 Int'lModels & Methods
Successful Cost Estimating with T1-EquivalentsGrady Noll, F. Gurney Thompson IIIThis paper discusses our research and development of a cost model for additive manufacturing (AM), also known as 3D printing. Working with academia and multiple AM companies, we investigated cost drivers, collected cost and technical data, derived AM CERs, and developed an approach for including AM components within a larger parametric cost estimate. This paper discusses the research approach, data collection, results, cost model implementation, and future work.2016 Int'lModels & Methods
Facilities Cost Estimates Drivers in the Oil and Gas Field DevelopmentLinda Newnes, David Peacock, Ettore Settani, Jon WrightIn aerospace and defence, research has focused on estimating the cost of delivering an outcome, e.g. an available aircraft. Our research shows that focusing on the product alone, using product reliability/failures, does not reflect the cost of delivering an outcome. In this paper we show why this is the case and how input-output modelling and systems engineering can be used. We then demonstrate how the approach can be used in estimating the cost of waste water treatment.2016 Int'lModels & Methods
The Signal and the Noise in Cost EstimatingChristian B. SmartWe seek to extract signal and eliminate noise when building models with historical data to predict future costs. One common problem is normalization, which is necessary when making comparisons but can inject noise when used in modeling. We discuss kernel smoothing and distribution fitting as ways to avoid overfitting peculiarities in historical data and the important issues of cross-validation and parsimony as ways to validate models and avoid the lure of overfitting.2016 Int'lModels & Methods
CER Issues And SolutionsKurt BrunnerCost Estimating Relationships (CERs) are widely accepted as an effective cost estimating tool. However, there are issues with CER development and application that should be addressed. This briefing will explore those probable shortcomings and how they may be alleviated, with the goal of refining our cost estimating capabilities.2016 Int'lRisk Analysis
Should You Care About How Good An Estimation Process?Alain AbranThis presentation is based on the author's 2015 book on Software Project Estimation (Wiley & IEEE Press) and illustrates how organizations (including in the car industry) that have collected their own data using international standards have built their own estimation models and developed a key competitive advantage through improved software estimation capabilities. Examples will include data from a large European car manufacturer and a small software organization using Agile.2016 Int'lRisk Analysis
A Parametric Model for the Cost Per Flight HourMichail BozoudisThe parametric estimating technique provides timely cost estimates for "unknown" systems, utilizing cost estimating relationships derived from historical datasets. This case study describes the development of a parametric model that estimates the cost per flight hour (CPFH). The cost is derived as a function of the aircraft empty weight and the engine's specific fuel consumption. As an example, the F-35A CPFH is estimated under the hypothetical scenario that it is operated by the Hellenic Air Force.2016 Int'lRisk Analysis
The Way From a Parametric Estimate to a CAD-Driven CalculationJoachim Schoeffer, Herbert SpixThe industrial requirements for quick, rough costings (e.g. for budgeting) and for exact cost breakdowns down to the last screw or processor are as old as the two methodologies themselves. To satisfy both camps we need different solutions that address both needs. Ideally, both scenarios are integrated and interface with each other. In this presentation the authors show a way of integrating parametric and CAD-driven bottom-up calculation.2016 Int'lRisk Analysis
Outing the OutliersAlan R. JonesInappropriate inclusion or exclusion of data that may or may not be regarded as outliers may make a project unachievable or uncompetitive. This paper looks at some of the issues involved and what tests are available to support an informed decision process rather than relying on a random guess on what is, or is not, an outlier. The paper will compare the performance of these tests against a number of sample data sets.2016 Int'lRisk Analysis
Inherent Risk in Spreadsheets (IRiS)Alan R. JonesMany spreadsheets may be certified as being error-free on the day they are tested, but some are more prone to future errors due to the manner in which they have been designed or compiled.2016 Int'lRisk Analysis
Software Estimating Model Using IFPUG Standard Sizing MethodsChristine GreenA major input for any cost model for software projects is the scope and effort of the project, regardless of technology or type of project process. IFPUG Sizing Standards methods can be utilized in a software estimating model, thereby establishing a valuable input for the cost models produced that includes both scope control and effort estimates.2016 Int'lSoftware
Cloud Solutions - Infrastructure, Platform or SoftwareArlene F. Minkiewicz, Ashley HoenigkeMigrating capability to the cloud comes with several planning and management challenges. How does an organization determine the right capability to migrate and the right platform for the migration? What challenges do the various cloud solutions (Infrastructure vs. Platform vs. Software as a Service) present? This paper presents a case study outlining the migration activities and other implications for migrating the same application on each of the three cloud solution models.2016 Int'lSoftware
Measurement of Software Size: Contributions of COSMIC to Estimation ImprovementsAlain Abran, Charles Symons, Frank VogelezangThis talk presents 1) an outline of the design of the 2nd generation COSMIC method for measuring a functional size of software; 2) industry evidence of its practical value in software project performance measurement and its accuracy in estimating; 3) an account of the method's full automation, with very high accuracy, for real-time embedded software specified in Matlab-Simulink and developed through a world network of software contractors.2016 Int'lSoftware
The Monitoring and Early Warning Indicators for a Software ProjectChristine GreenThis presentation will focus on control through monitoring and early warning indicators of a software project's scope, estimate, and performance - all factors that strongly influence the cost and budget.2016 Int'lSoftware
Introduction to Software Obsolescence Cost Analysis FrameworkSanathanan RajagopalAlmost every project in the Defence sector involves software with varying degrees of complexity and dependencies. Whilst various research and studies have been conducted on system obsolescence, and tools have been developed to cost system and component obsolescence, no major research has been undertaken to develop a framework to estimate the cost of software obsolescence. This paper will discuss the Software Obsolescence Cost Analysis Framework developed as part of this research.2016 Int'lSoftware
Applying Earned Value to Agile Development ProgramsBob Hunt, Michael ThompsonA general movement to Agile software development processes is one method used to address and potentially resolve software acquisition problems. The movement to Agile software development, however, introduces several unique problems for EVM assessments. This paper will discuss key elements associated with: Agile/hybrid Agile software development; establishing a process for developing the technical, cost, and schedule baseline; and applying Earned Value Management to Agile programs.2016 Int'lSoftware
Hubble Space Telescope and Its Color PhotographsDavid PineThe Hubble Space Telescope has taken thousands of "color" photographs that have excited people the world over. This presentation examines the "Mirror Problem" and then how scientists change the photons being collected by the telescope into the phenomenal photographs that have thrilled the public. Understanding the color photographs begins with understanding what color is, and how Hubble's cameras operate to eventually produce pictures.2016 Int'lSpace Systems
Satellite Mass GrowthDonald MacKenzie, Erik BurgessA new study of satellite mass growth on 21 programs shows that equipment type and design maturity are drivers. Unlike current AIAA guidelines, acquisition strategy is shown to be a significant driver, with competitively awarded programs experiencing much higher growth than sole-source awards, even for units with the same design maturity. Overall, mass growth allowance recommended by AIAA for every equipment type is much lower than the actual growth experienced in this dataset.2016 Int'lSpace Systems
QuickCost 6.0Bob HuntQuickCost 6.0 has just been released by Galorath Federal and NASA. QuickCost 6.0 is an all-new implementation of QuickCost (QuickCost 1.0 was released in 2004). The philosophy behind the model is that accurate cost and schedule estimates of automated spacecraft missions can be obtained parametrically, in a quick and easy fashion, by estimating at the top 11 NASA WBS elements and using only a few input variables. QuickCost 6.0 is based on data from the NASA ONCE CADRe database.2016 Int'lSpace Systems
Halfway to AnywhereFabian EilingsfeldA 'tariff zone map' is proposed for space transportation. Using parametric analysis, applying the paradigm of terrestrial public transport, it displays specific energy cost depending on total energy delivered to payloads. Every space transportation system leaves its own distinctive footprint when its performance data is plotted on the proposed map. It visualizes performance and cost trends with an emphasis on affordability, which helps rational discussion of project options and roadmaps.2016 Int'lSpace Systems
The SSCAG LegacyHenry ApgarActive partnership in this ICEAA Symposium marks the final milestone of a spectacular forty-year journey of the Space Systems Cost Analysis Group. SSCAG provided a unique international forum for space-systems cost analysts. Over those years, more than 300 skilled cost estimators, business analysts, program managers, space engineers, and logisticians filled the international venues, at more than 100 scheduled meetings, to deliberate major issues associated with government and commercial development.2016 Int'lSpace Systems
Using Budgeted Cost of Work Performed to Predict Estimates at Completion for Mid-Acquisition Space ProgramsC. Grant KeatonConventional techniques to predict estimates at completion using earned value management data generally underpredict the end costs for space acquisition programs. This research assesses the accuracies of earned value management analysis methods to forecast estimates at completion for on-going space system acquisitions.2015Journal of Cost Analysis and Parametrics
An Economics Approach to Fixing the Fare of the Parking Lot Service in Bogota Using Price Cap RegulationJorge A. PerdomoThis article presents an application of the Ramsey Pricing approach to establish the price cap, or pricing ceiling, per minute for the parking lot service in Bogota (Colombia), using the microeconomic framework for price fixing from a cost analysis (total fixed costs, total variable costs, average total cost, and marginal cost) of the provision of parking lot service in Bogota.2015Journal of Cost Analysis and Parametrics
Predicting the Likelihood of Cost Overruns: An Empirical Examination of Major Department of Defense Acquisition ProgramsAlan K. Gideon, James S. WasekThis article provides a method for predicting the cost of a major acquisition program five years after program development approval. The authors' work extends the effort begun by Asher and Maggelet by adjusting for changes in median cost growth factors for each acquisition domain. Procurement average unit costs at program approval and five years afterward were compared for 101 major United States Department of Defense acquisition programs.2015Journal of Cost Analysis and Parametrics
To Cost an Elephant: An Exploratory Survey on Cost Estimating Practice in the Light of Product-Service-SystemsEttore Settanni, Nils E. Thenent, Linda B. Newnes, Glenn Parry, Yee Mey GohBusinesses now contracting for availability are regarded as part of a paradigm shift away from the familiar product-and-support business model. The main difference is that such businesses eventually commit to providing a service outcome via a product-service-system. The research presented in this article investigates how current cost estimating practice relates to the idea of having, as the point of focus for the analysis, a product-service-system delivering service outcomes rather than a product.2015Journal of Cost Analysis and Parametrics
Parametric Scale-Up Cost Factors for Conventional and Micro-Scale ToolsDonald S. Remer, Kerry M. ChinThis article estimates cost scale-up factors for micro-scale tools, including high speed steel micro tools, micro thread hand taps, and various types of mesh and wires. Many of these tools initially have a positive scale-up factor where the material costs directly correlate with the cost of the tool; they then transition to a negative scale-up factor at the micro-scale where manufacturing is more difficult and becomes the determining factor for the prices of the products. 2015Journal of Cost Analysis and Parametrics
Using Earned Value Data to Forecast the Duration of Department of Defense Space Acquisition ProgramsShedrick Bridgeforth, Jonathan Ritschel, Edward D. White, Grant KeatonTraditional earned value management techniques to predict final costs of space acquisition programs are historically inaccurate. A 2015 study by the Air Force Cost Analysis Agency (Keaton 2015) sought to improve the accuracy of the cost estimate at completion for space system contracts through a linear relationship between budgeted cost for work performed and time.2015Journal of Cost Analysis and Parametrics
Bending the Cost Curve: Moving the Focus from Macro-level to Micro-level Cost Trends with Cluster AnalysisBradley C. Boehmke, Alan W. Johnson, Edward D. White, Jeffery D. Weir, Mark A. GallagherBending the cost curve has become the ambiguous jargon employed in recent years to emphasize the notion of changing unwanted cost trends. In response to the planned $1 trillion Department of Defense budget reduction over the next six years, the Air Force has launched its own Bending the Cost Curve initiative in an effort to reduce cost growth. 2015Journal of Cost Analysis and Parametrics
A Literature Survey and Future Directions for Product Development: A Focus on Conceptual Design StageSatish Kumar Tyagi, Xianming Cai, Kai YangThe product development process follows a sequence of activities, methods, and tools to design and develop a high-quality product. An optimized strategy for single-generation product development may not be the best choice when multi-generation scenarios are considered. In this article an attempt has been made to study the nature of multi-generational product development and compare it with that of single-generation product development to derive validating arguments.2015Journal of Cost Analysis and Parametrics
Pioneers of Parametrics: Origins and Evolution of Software Cost EstimationRicardo ValerdiThis article provides a historical account of the development of the field of software cost estimation, specifically the area of parametrics, through information obtained during interviews of 13 pioneers in the field. Cost model developers, users, and practitioners were interviewed with the intent to capture their views on the interplay between cost estimation research and practice.2015Journal of Cost Analysis and Parametrics
Time Phasing Aircraft R&D Using the Weibull and Beta DistributionsGregory E. Brown, Edward D. White, Jonathan D. Ritschel, Michael J. SeibelEarly research on time phasing primarily focuses on the theoretical foundation for applying the cumulative distribution function, or S-curve, to model the distribution of development expenditures. Minimal methodology prior to 2002 provides for estimating the S-curve's parameter values. Brown et al. (2002) resolved this shortcoming through regression analysis, but their methodology is not specific to aircraft and does not consider aircraft-specific variables, such as first flight.2015Journal of Cost Analysis and Parametrics
Covered with Oil: Incorporating Realism in Cost Risk AnalysisChristian B. SmartWhen Jimmy Buffett sang the words "All of those tourists covered with oil" in his song "Margaritaville," he probably never imagined that this phrase might apply to crude oil instead of suntan lotion. Both the cost and the environmental impact of the 2010 oil spill in the Gulf of Mexico were much worse than anyone had expected or could have predicted. It was, in the words of financial writer Nassim Taleb, a black swan: "an unexpected event with tremendous consequences."2015Journal of Cost Analysis and Parametrics
Modifying Duration-Based Costing to Illustrate the Effect of Fixed CostsAnne-Marie Teresa LelkesThis study introduces Modified Duration-Based Costing (MDBC) as an alternative to Duration-Based Costing (DBC) developed in Lelkes and Deis (2013). Both DBC and ABC have a tendency to treat fixed costs as though they were variable. This study expands on the DBC model by showing an alternative way of dealing with fixed costs instead of treating them as variable costs. This study uses analytical methodology and simulations to analyze MDBC relative to an Activity-Based Costing (ABC) system.2015Journal of Cost Analysis and Parametrics
Trade Space, Product Optimization, and Parametric AnalysisDouglas K. HowarthThis article shows how to bound, build, and assemble trade spaces for product optimization. The advent of computerized tools that describe available trade spaces has changed not only the nature of optimized product design, but that of parametric cost studies as well. Because these tools allow broader analysis, engineers can produce many more potential designs. 2014Journal of Cost Analysis and Parametrics
Applications of a Parsimonious Model of Development Programs Costs and SchedulesDavid Lee, Carol Dezwarte, Stephanie Sigalas-Markham, Jeremy EckhauseA model of the cost and schedule of a development program, characterized by three non-dimensional parameters, gives means for estimating the cost and schedule impacts of constraining funding below planned levels, as well as for assessing the realism of the costs and schedules of planned programs. In contrast to models of the Norden-Rayleigh-Weibull class, the model explicitly considers specific components of cost, and captures the distinction between a development program's value (i.e., the things delivered) and its cost (i.e., the money paid to acquire the value).2014Journal of Cost Analysis and Parametrics
Software Industry Goals for the Years 2014 through 2018Capers JonesThis article discusses 20 quantitative targets for software engineering projects that are technically feasible to be achieved within a five year window. Some leading companies have already achieved many of these targets, but average and lagging companies have achieved only a few, if any. Software needs firm achievable goals expressed in quantitative fashion. 2014Journal of Cost Analysis and Parametrics
COTECHMO: The Constructive Technology Development Cost ModelMark B. Jones, Phil F. Webb, Mark D. Summers, Paul BaguleyA detailed analysis of the available literature and the aerospace manufacturing industry has identified a lack of cost estimation techniques to forecast advanced manufacturing technology development effort and hardware cost. To respond, this article presents two parametric Constructive Technology Development Cost Models (COTECHMO). The COTECHMO Resources model is the first and is capable of forecasting aerospace advanced manufacturing technology development effort in person-hours.2014Journal of Cost Analysis and Parametrics
Engineering Systems: Best-in-Class/Worst-in-ClassDonald M. BeckettMeasurement's goal is to help assess performance to determine which methods are productive or counterproductive. Metrics are tools used to identify and implement practices that lower costs, reduce time to market, and improve product quality. But process improvement is not accomplished through measurement or metrics alone.2014Journal of Cost Analysis and Parametrics
Comparing Lifecycle Sustainment Strategies in an Electronic Component Obsolescence EnvironmentKenneth D. Underwood, Jeffrey A. Ogden, Matthew T. McConvilleRapid advancements in technology and the diminishing lifecycle of electronic systems have complicated the sourcing and sustainment activities of many organizations as suppliers of original components go out of business or refuse to produce obsolete products. This article explores diminishing manufacturing sources and material shortages as well as the obsolescence costs and reliability issues associated with electronic components.2014Journal of Cost Analysis and Parametrics
Cost Risk Allocation Theory and PracticeChristian B. SmartRisk allocation is the assignment of risk reserves from a total project or portfolio level to individual constituent elements. For example, cost risk at the total project level is allocated to individual work breakdown structure elements. This is a non-trivial exercise in most instances, because of issues related to the aggregation of risks, such as the fact that percentiles do not add. 2014Journal of Cost Analysis and Parametrics
Activity-Based Parsimonious Cost SystemsShannon Charles, Don HansenAccurate product costing information is instrumental in effective decision making, especially for product related decisions, such as product mix and product emphasis. A body of literature suggests that activity-based costing plays an important role in providing accurate product costing information. However, activity-based costing systems are also more complex due to the number of cost drivers identified, compared to traditional single cost driver systems.2014Journal of Cost Analysis and Parametrics
An Assumptions-Based Framework for TRL-Based Cost and Schedule ModelsBernard El-Khoury, C. Robert KenleyThe Technology Readiness Level scale has been used to assess progress and provide a framework for developing technology. Many Technology Readiness Level-based cost and schedule models have been developed to monitor technology maturation, mitigate program risk, characterize transition times, or model schedule and cost risk for individual technologies as well as technology systems and portfolios.2014Journal of Cost Analysis and Parametrics
Development of a Product Life-Cycle Cost Estimation Model to Support Engineering Decision-Making in a Multi-Generational Product Development EnvironmentXianming Cai, Satish TyagiThe research under product development domain identifies and implements new concepts in design, planning, manufacturing, testing, and service sectors to keep up with the market demands. Many product developments are multi-generational in nature and may require redesigning the product at each generation because an optimized strategy for a single generation may not be the best option in the multi-generation scenarios. 2014Journal of Cost Analysis and Parametrics
A Process for the Development and Evaluation of Preliminary Construction Material Quantity Estimation Models Using Backward Elimination Regression and Neural NetworksBorja Garcia de Soto, Bryan T. Adey, Dilum FernandoDuring the early stages of a project, it is beneficial to have an accurate preliminary estimate of its cost. One way to make those estimates is by determining the amount of construction material quantities that are required and then multiplying the estimated construction material quantities by the corresponding unit cost. One advantage of making estimates in this way is that it allows for the segregation of quantities and costs. 2014Journal of Cost Analysis and Parametrics
On the Shoulders of Giants: A Tribute to Prof. Barry W. BoehmJo Ann Lane, Daniel D. Galorath, Ricardo ValerdiProf. Barry Boehm's life's work is full of contributions to the software engineering and systems engineering disciplines. This article presents Prof. Barry Boehm's work in the context of the giants on whose shoulders he stands, as well as the people he has mentored to carry on his work. Much of Prof. Boehm's work described in this article focuses on his key contributions to the software and system development industries, as well as the enduring legacy he has established with his industry affiliates and students.2014Journal of Cost Analysis and Parametrics
NPS and AFIT Master's Degree in Cost Estimating and AnalysisDaniel Nussbaum, Greg MislickThis presentation provides an update on the joint, all-distance-learning Master's Degree in Cost Estimating and Analysis offered at the Naval Postgraduate School (NPS) and the Air Force Institute of Technology (AFIT).2014Business and Art of Cost Estimating
A Comprehensive CES and BCA Approach for Lifelong LearningKevin Cincotta, Darcy LilleyThe Air Force Air Mobility Command (AMC) Enterprise Learning Office's (ELO) mission is to transform AMC into a premier Air Force learning organization, achieve learning through optimum approaches, and develop Mobility Airmen into lifelong learners who demonstrate institutional Air Force competencies with a positive approach to managing their own learning. In this context, learning has three main components: training, education, and experience. ...2014Business and Art of Cost Estimating
BOE Development: Scope Evaluation and CriteriaMichael Butterworth, Demetrius PradoCurrent bases of estimate (BOEs) do not succinctly identify how the scope of the estimate was derived and how the actuals used for build-up, analogous, and parametric methodologies fit the technology and product under consideration. We believe that one of the many problems leading to poor BOE development revolves around the failure to identify the scope of work and to understand the effort needed, in a particular skill mix, to perform the task. ...2014Business and Art of Cost Estimating
Long Term Affordability through Knowledge Based Bid & ProposalZachary JasnoffAccording to a 2013 GAO report, "positive acquisition outcomes require the use of a knowledge-based approach to product development that demonstrates high levels of knowledge before significant commitments are made. In essence, knowledge supplants risk over time." Oftentimes, acquisition proposals that are not knowledge-based introduce significant risk and, while seeming reasonable in the short term, cannot sustain long-term affordability. ...2014Business and Art of Cost Estimating
What Happens to a Cost Estimate When It's Done?William Barfield, David BachWhat happens to a cost estimate when it is done, or "finished?" Cost estimates are used to support a wide variety of financial decisions. When estimates are done and the decision has been made, is the estimate still useful after the decision, or does it become "shelf-ware?" We surveyed the international cost community to determine how we develop, document, use, and archive our various kinds of cost estimates. ...2014Business and Art of Cost Estimating
Update On The Cost FACTS (Factors, Analogies, CER's & Tools/Studies) Group - Enhancing KM By Leveraging Enterprise Social NetworkingDaniel Harper, Ruth DorrCost FACTS is a community-driven initiative to bring together the cost estimating community across MITRE, Government agencies, the contractor community, and academia to share reusable, non-restricted cost information and to encourage dialogue throughout the worldwide cost community. Cost FACTS provides an easily accessible repository of reusable cost estimating FACTS, i.e., Factors, Analogies, Cost Estimating Relationships (CERs), and Tools/Studies. ...2014Business and Art of Cost Estimating
Space Shuttle Cost Analysis: A Success Story?Humboldt MandellIn the aftermath of the highly successful Apollo lunar program, NASA struggled for a few years to find a meaningful program that would satisfy long-range national space strategies while reflecting the realities of the rapidly changing political environment in the nation. The Space Shuttle emerged from the need to lower the costs of orbital cargo delivery for construction of a space station and enabling Mars exploration, but it was also highly constrained by DOD requirements ...2014Business and Art of Cost Estimating
A Balanced Approach to Meeting Fiscal ConstraintsSteve Green, Kevin Davis, Kurt HeppardThe effective and systematic use of cost and budgeting information is a critical component of strategic planning and decision making in most organizations. The Department of Defense's (DoD) current operational environment, scarce resources, and conflicting stakeholder expectations are resulting in extreme fiscal constraints. The result is the need to reconsider missions and goals, reassess priorities, entertain force structure alternatives, and ultimately reduce budgets. ...2014Cost Management
Cost Overruns and Cost Growth: A Three Decades Old Cost Performance Issue within DoD's Acquisition EnvironmentLeone YoungFor the last three decades, the US Department of Defense (DoD) has encountered program performance issues such as inaccurate and unrealistic estimates for its acquisition programs, and its efforts to eliminate the cost overrun and cost growth phenomenon have been deemed ineffective as well. This paper examines and consolidates multiple government reports and studies pertaining to DoD cost performance that has been viewed as unsatisfactory and problematic. ...2014Cost Management
Supplier Cost/Price Analyses - Best Practices for evaluating supplier proposals and quotesMike MardesichDuring proposal planning, preparation, and review, an important but often overlooked aspect is the evaluation of a supplier's proposal or quote. Requirements in the Federal Acquisition Regulations (FAR) provide for certain visibility into suppliers' proposals and quotes, depending on the value of the proposal/quote. Additionally, it is critical to note the FAR puts the onus of supplier cost (or price) analysis on the prime contractor. ...2014Cost Management
Innovative Business Agreements and Related Cost & Pricing Methods at NASA in Support of New Commercial ProgramsTorrance Lambing, James RobertsThis paper and presentation, focusing on Kennedy Space Center, will discuss changes and new methods of pricing and estimating the costs of NASA facilities and services to be provided to outside entities for use in new Commercial Space endeavors. It will also give an overview of new NASA regulations and documents that establish policy and guidance for entering into Agreements and how they are priced under the various types of Agreement currently being used at NASA. ...2014Cost Management
The Other RCA: Restaurant Cost AnalysisPeter BraxtonThe Weapon Systems Acquisition Reform Act (WSARA) of 2009 highlighted the importance of Root Cause Analysis, or RCA, but its conduct remains shrouded in mystery. In illustrating the central role of risk and uncertainty analysis in cost, Dick Coleman often made the provocative pronouncement to the effect of "You can't stand outside a restaurant with a menu, and the people you'll be dining with, and a calculator and get within 10% of the final bill, so what makes you think you can estimate a complex multi-billion-dollar acquisition program with that precision?!" ...2014Cost Management
Intelligence Mission Data Cost Methodology GuidebookEugene Cullen, III, Matthew SchumacherIn 2013, Booz Allen Hamilton developed and authored the Intelligence Mission Data Cost Methodology Guidebook (IMD CMGB) for the Defense Intelligence Agency's (DIA) Intelligence Mission Data Center (IMDC). This guidebook is the official costing manual for life-cycle mission data planning, required by Department of Defense Directive (DoDD) 5250.01 "Management of IMD within DoD Acquisitions," and defines costing methodologies, procedures, and processes that result in OSD Cost Assessment & Program Evaluation (CAPE) and Government Accountability Office (GAO) compliant cost estimates that are required by DoD Acquisition Systems. ...2014Cost Management
Achieving Affordable Programs NRO Cost Analysis Improvement Group (CAIG) Support of Cost Driver IdentificationLinda Williams, Pat Kielb, Eileen DeVillier, Jay MillerThis paper will describe the approach taken over the last several months to identify cost drivers and summarize findings that senior leaders found helpful to the decision making process. In summary, as budget reductions trigger asset affordability reviews, the NRO CAIG has been able to assist in identifying total organizational cost drivers and provide a framework for future cost reduction opportunities.2014Cost Management
Godel's Impact on Hilbert's Problems Or Cost Consistency and Completeness as an Impossible ExerciseDavid PeelerIn a previous set of papers, Hilbert's Problems were used as a construct to propose, and recently to revisit the status of, a list of Hilbert's Problems for Cost Estimating. This paper similarly applies Godel's insights into Hilbert's attempts to the cost estimating community. What can we learn about ourselves as estimators, and where can we exert the greatest impact with respect to the use of our estimates? Using Godel's two theorems of undecidability as catalyst, we will explore the effect and utility of exacting math and other motions on cost estimates specifically and programmatics generally.2014Cost Management
A New Cost Management Policy for Your Organization: An Integrated Approach?Tom Dauber, Woomi Chase, Ken OdomDeveloping a robust Cost Management Policy is a key driver to the success of any organization, regardless of size or industry. The policy should ensure cost control measures that are valid and effective, risks are mitigated, solutions are delivered on time, and profits/ROIs are maximized. The Cost Management Policy should be a systematic approach to managing cost through-out the life cycle of a program through the application of cost engineering and cost management principles. ... 2014Cost Management
Right Sizing Earned Value Management for Your ProjectGordon KranzEarned Value Management (EVM) is a program management tool that provides data indicators that can be used on all programs to enable proactive decision making throughout the program lifecycle and facilitate communication across the program team. Each program has unique attributes that should be considered when determining program management and reporting requirements, including, but not limited to, contract size and type, scope of work, complexity, risk, technology maturity, and resource requirements. ...2014Cost Management
Big Data Meets Earned Value ManagementGlen Alleman, Thomas CoonceWhen the result of an action is of consequence, but cannot be known in advance with precision, forecasting may reduce decision risk by supplying additional information about the possible outcomes. Data obtained from observations collected sequentially over time are common. Earned Value Management is an example where project performance data (BCWP) is collected from the status reports of planned work (BCWS) and a forecast of future performance is needed to manage the program. ...2014Earned Value Management
Don't Let EVM Data Mislead YouSteve SheamerEVM data is a popular data source for cost estimators and for good reason; in theory, it should provide most of the data needed to develop an estimate for a program. For completed programs, it provides historical costs by WBS and for programs that are in work it provides a measure of the work completed, work remaining, and forecast of the work remaining. But during a period of frequent cost overruns, estimates built using EVM data often fail to forecast the extent of program overruns...2014Earned Value Management
Trust but Verify - An Improved Estimating Technique Using the Integrated Master Schedule (IMS)Eric LofgrenIt has long been the wonder of management why the Integrated Master Schedule (IMS) fails to give advanced warning of impending schedule delays. The oft-touted Government Accountability Office (GAO) 14-Point Check for Schedule Quality analyzes schedule health using key metrics, leading one to assume that such a test authenticates schedule realism. Why, then, do practitioners find themselves caught off guard to slips when their IMS appears in good health? ...2014Earned Value Management
A Cure For Unanticipated Cost and Schedule GrowthGlen Alleman, Thomas CoonceFederal programs (DoD and civilian) often fail to deliver all that was promised, many times cost more than estimated, and are often late. Delivering programs with less capability than promised while exceeding planned cost and schedule undermines the Federal government's credibility with taxpayers and contributes to the public's negative support for these programs. ...2014Earned Value Management
Unleashing the Power of MS Excel as an EVM Analysis ToolAllen Gaudelli, Steve SheamerWhat do you do if you need to analyze or report on EVM data and you don't have access to (or can't afford) the latest industry software? Nearly everyone has a very powerful analysis and reporting tool on their desktop with the flexibility and capability to consolidate cost, schedule, and risk drivers into a single view. In this presentation, we will show you how to leverage and manipulate the inherent capabilities of Microsoft Excel to build interactive EVM dashboards ...2014Earned Value Management
Design to Cost: Misunderstood and misappliedErin Barkel, Tolga YalkinThe Canadian Department of Defence maintains that concerns over cost overruns are overstated because it adopts a design to cost approach. According to the US Government, design to cost "embodies the early establishment of realistic but rigorous cost targets and a determined effort to achieve them." From the beginning of a project to its completion, "[c]ost is addressed on a continuing basis as part of a system's development and production process." ...2014Earned Value Management
Testing Benford's Law with Software Code CountsChuck Knight, Chris KaldesWhen analyzing a data set, common thinking may lead one to suspect that the leading digit of each data point would follow a uniform distribution, where each digit (1 through 9) has an equal probability of occurrence. Benford's law, to the contrary, states that the first digit of each data point will conform to a nonuniform distribution. More specifically, it states that a data point is more likely to begin with a one than a two, a two more likely than a three, a three more likely than a four, and so on. ...2014Information Technology
Improved Method for Predicting Software Effort and ScheduleWilson Rosa, Barry Boehm, Ray Madachy, Brad Clark, Joseph P. DeanThis paper presents a set of effort and schedule estimating relationships for predicting software development using empirical data from 317 very recent US DoD programs. The first set predicts effort as a function of size and application type. The second predicts duration using size and staff level. The models are simpler and more viable to use for early estimates than traditional parametric cost models. Practical benchmarks are also provided to guide analysts in normalizing data.2014Information Technology
Costs of Migration and Operation in the CloudArlene MinkiewiczAt one level cloud computing is just Internet enabled time sharing. Instead of organizations investing in all the Information Technology (IT) assets such as hardware, software, and infrastructure they need to meet business needs, cloud computing technology makes these resources available through the Internet. Cloud computing allows an organization to adopt a different economic model for meeting IT needs by reducing capital investments and increasing operational investments. ...2014Information Technology
How I Continued to Stop Worrying and Love Software Resource Data ReportsNicholas LanhamThis presentation highlights the trends and cost estimating relationships derived from detailed analysis of the August 2013 Office of the Secretary of Defense (OSD) Software Resource Data Report (SRDR) data. This analysis was conducted by Nicholas Lanham and Mike Popp and provides follow-on analysis to the August 2012 SRDR brief developed and previously presented by Mike Popp, AIR 4.2. ... 2014Information Technology
Mobile Applications, Functional Analysis and Cost EstimationTammy PreussThis presentation will demonstrate how to derive cost estimates at different stages in a project's lifecycle by using function points and the advantages of using an FP based size estimate over a SLOC based estimate. The intended audience is software cost estimators, project managers, and anyone who is interested in software measurement.2014Information Technology
In Pursuit of the One True Software Resource Data Reporting (SRDR) DatabaseZachary McGregor-DorseyFor many years, Software Resource Data Reports, collected by the Defense Cost and Resource Center (DCARC) on Major Defense Acquisition Programs (MDAPs), have been widely acknowledged as an important source of software sizing, effort, cost, and schedule data to support estimating. However, using SRDRs presents a number of data collection, normalization, and analysis challenges, which would in large part be obviated by a single robust relational database. ... 2014Information Technology
Optimizing Total Cost of Ownership for Best Value IT Solutions: A Case Study using Parametric Models for Estimates of Alternative IT Architectures and Operational ApproachesDenton Tarbet, Kevin Woodward, Reggie ColeBecause of a variety of architectures and deployment models, Information Technology (IT) has become more and more complex for organizations to manage and support. Current technology IT system architectures range from server based local systems to implementations of a Private Cloud to utilization of the Public Cloud. Determining a "best value architecture" for IT systems requires the ability to effectively understand not only the cost, but the relative performance, schedule and risk associated with alternative solutions. ...2014Information Technology
Estimating Hardware Storage CostsJenny Woolley, William BlackEstimating Commercial-off-the-Shelf (COTS) hardware storage volume and cost requirements can be challenging. Factors such as storage type, speed, configuration, and changing costs can potentially lead to estimating difficulties. This is especially true when a Redundant Array of Independent Disks (RAID) configuration is implemented. Due to the multiple attributes that can vary within each RAID level, as well as other factors that may influence the total storage volume needed, developing relationships for estimating long-term storage costs can become complicated. ...2014Information Technology
Relating Cost to Performance: The Performance-Based Cost ModelMichael Jeffers, Robert Nehring, Jean-Ali Tavassoli, Kelly Meyers, Robert JonesFor decades, in order to produce a cost estimate, estimators have been heavily reliant on the technical characteristics of a system, such as weight for hardware elements or source lines of code (SLOC) for software elements, as specified by designers and engineers. Quite often, a question will arise about the cost of adding additional performance requirements to a system design (or in a design-to-cost scenario, the savings to be achieved by removing requirements). Traditionally, the engineers will then have to undertake a design cycle to determine how the shift in requirements will change the system. ...2014Information Technology
Lessons Learned from the International Software Benchmark Standards Group (ISBSG)Arlene MinkiewiczThis paper will introduce the ISBSG and the database available from the ISBSG. It then provides details of the data driven approach applied to develop these templates - discussing research approach, methodology, tools used, findings and outcomes. This is followed by a discussion of lessons learned, including the strengths and weaknesses of the database and the strengths and weaknesses of the solutions derived from it. While particularly relevant to software estimators, this paper should be valuable to any estimator who lacks data or has data they are not quite sure what to do with.2014Information Technology
Software Maintenance: Recommendations for Estimating and Data CollectionShelley Dickson, Bruce Parker, Alex Thiel, Corinne WallsheinThe software maintenance study reported at ICEAA in 2012 and 2013 continued to progress in 2013 in spite of the high data variability. This presentation summarizes the past years' software maintenance data collection structure, categorizations, normalizations, and analyses. Software maintenance size, defect, cost, and effort data were collected from Fiscal Years (FY) 1992 - 2012. Parametric analyses were performed in depth on available variables included in or derived from this U.S. Department of Defense software maintenance data set. ...2014Information Technology
An Update to the Use of Function Points in Earned Value Management for Software DevelopmentMichael Thompson, Daniel FrenchThis presentation describes the opportunity that was presented to the team and how the recently completed pilot program was developed and implemented to address it. The authors will address how effective the pilot program was as far as identifying and resolving issues, measuring earned value, as well as the challenges and lessons learned with the development, implementation, and sustainment of the FP based EVM process.2014Information Technology
The Federal IT Dashboard: Potential Application for IT Cost & Schedule AnalysisDaniel HarperFederal agencies have experienced a growing demand for rapid turnaround cost and schedule estimates. This need is increasing as the pressure to deploy systems rapidly mounts. The push for Agile SW development compounds this problem. A critical component in cost estimating is the data collection of costs for the various elements within the estimate. Analogous programs constitute a robust source for credible estimates. The problem is how to find analogous programs and how to capture the cost of elements within those programs at a sufficiently detailed level to use in a cost estimate and in a timely manner so that the cost data is still relevant. ...2014Information Technology
Trends in Enterprise Software Pricing from 2002 to 2011Ian Anderson, Dara LoganOne of the biggest challenges in the cost estimating community is data collection. In the Information Technology (IT) cost community, technology is always evolving, while the data capturing it tend to be scarce and difficult to use in building solid cost models. Fortunately, NCCA learned that the Department of the Navy (DON) Chief Information Officer (CIO) has been collecting benchmarking measures, including pricing, since 2002 under the Enterprise Software Initiative (ESI) Blanket Purchasing Agreements (BPAs). ... 2014Information Technology
Estimating Cloud Computing Costs: Practical Questions for ProgramsKathryn ConnorCloud computing has garnered the attention of the Department of Defense (DoD) as data and computer processing needs grow and budgets shrink. In the meantime, reliable literature on the costs of cloud computing in the government is still limited, but programs are interested in any solution that has potential to control growing data management costs. ...2014Information Technology
The Agile PM Tool: The Trifecta for Managing Cost, Schedule, and ScopeBlaze Smallwood, Omar MahmoudThe growing number of DoD software projects that are adopting an "Agile" development philosophy requires cost estimators to not only adapt the methodologies and metrics they use to estimate software development costs, but also re-think their models to give PMs the information they need to effectively manage these programs. The Agile PM Tool is one manifestation of this trend as it provides a logical, dynamic approach for helping the government effectively manage the cost, schedule, and scope of their "Agile" projects.2014Information Technology
Which Escalation Rate Should I Use?Nathan HonsowetzConducting life cycle cost estimates requires time frames of 10, 20, even 30 years, and with such long time frames it's important to use appropriate escalation indices. Escalation can have a significant impact on cost estimates, especially estimates with longer time frames. However, often cost estimators insert a "standard" escalation index into their models without considering whether that index is appropriate for their estimate. ...2014Life Cycle Costing
Ground Vehicle Reliability Analysis Using the Mean Cumulative FunctionCaleb FlemingThis paper surveys and outlines the fundamental MCF methodologies and explanations detailed in-depth in Wayne Nelson's Recurrent Events Data Analysis for Product Repairs, Disease Recurrences, and Other Applications. Applying the MCF to vehicle maintenance data reveals recurring component failure behaviors, develops new guidelines for interpretation, and assists in data normalization and validation.2014Life Cycle Costing
Cost Overruns and Their Precursors: An Empirical Examination of Major Department of Defense Acquisition ProgramsAlan Gideon, Enrique Campos-Nanez, Pavel Fomin, James WasekThis paper proposes a model of acquisition program future cost for two specific acquisition domains - aircraft and ships - that takes into account the non-recurring developmental costs defined at program approval and each domain's historic tendencies to exceed planned program cost. Technical and non-technical reasons for these observations are discussed. ...2014Life Cycle Costing
System Utilization: An In-depth Method of Modeling and Measuring Military Manpower CostsOmar MahmoudEstablishing defendable cost estimating methodologies for capturing military manpower costs is a key component in any Program Life Cycle Cost Estimate. With a proven and systematic approach to estimating military manpower costs, a program can be confident in selecting a proper course of action among competing alternatives when conducting an EA, appropriately designate their ACAT level, avoid the pitfalls that lead to over/under estimating or double counting costs, and above all obtain a high level of confidence from their resource sponsor and Milestone Decision Authority.2014Life Cycle Costing
Integrating Sustainability into Weapon System Acquisition within the Department of DefenseWalt Cooper, Remmie ArnoldDoD acquisition and logistics professionals use the term sustainment to describe the support needed to operate and maintain a system over its lifetime. In the context of the DoD acquisition process, sustainability involves using resources to minimize mission, human health, and environmental impacts and associated costs during the life cycle. This paper will present a draft version of "DoD Guidance - Integrating Sustainability into DoD Acquisitions," initial findings from pilot studies, and the challenges and road ahead. ...2014Life Cycle Costing
Cost Analysis & Optimization of Repair Concepts Using Marginal AnalysisJustin WoulfeOPRAL is an analytical model for determining the optimal repair locations and spares allocations in a multi-level hierarchical support organization to optimize Life Cycle Cost. With this model, the analyst can either treat repair decisions as fixed and given as input parameters, or using the OPRAL algorithm, evaluate several different repair strategies in order to find the optimal one, considering all aspects of life cycle cost. ...2014Life Cycle Costing
Military Construction Cost EstimatingNicole Barmettler (Sullivan)An informative presentation on construction cost estimating specifically dealing with military facilities. Within this topic, the author will identify, define, and explain the cost methodologies and cost adjustment factors considered when developing construction cost estimates for general military facilities. Project costs will be exemplified by illustrating a breakdown and walkthrough of the process. The author will specifically discuss the process involved in a five year facility acquisition timeline that is usually required for a typical major military construction effort, which is defined by a project cost exceeding $750,000.2014Methods and Models
Cost and Performance Trades and Cost-Benefits AnalysisSteven IkelerThis paper will discuss the basics of Trades Analysis and the C-BA. It will include what the cost analyst should do to prepare and what information to collect during the Trades Analysis. One observation is that non-traditional Work-Breakdown Structures need to be considered and the analyst should model potential second and third order effects beforehand. ...2014Methods and Models
Lessons Learned from Estimating Non-Standard Equipment Returning from Overseas OperationsMichael MetcalfThis paper explores challenges in estimating the cost of retaining this equipment. Items that present unique challenges include: the type classification and full materiel release processes; repair, reset, and upgrade; short- and long-term storage; and knowledge retention and loss of wartime experience. We will also explore funding challenges; moving targets in number and configuration of retained systems; the transition of O&S from wartime contractor-based to peacetime organic; training; and system disposal and divestiture.2014Methods and Models
Weibull Analysis MethodErik Burgess, James Smirnoff, Brianne WongThe NRO Cost and Acquisition Assessment Group (CAAG) develops time-phased estimates for space systems in support of milestone decisions, budget formulation, and other program assessment tasks. CAAG relies on parametric budget-spreading models based on historical data to provide objective analysis and credible information before contract award or early in a program when there is little or no execution history available. ...2014Methods and Models
Study of Cost Estimating Methodology of Engineering DevelopmentMyung-Yul LeeThe purpose of this paper is to study the actual engineering labor hours for the FSED program and the PE/PI project and to create a parametric estimating model for a weapon system, especially cargo aircraft. The C-17 final assembly facility in Long Beach, California closes in 2015. It is worthwhile to review historical data for the C-17 FSED program and PE/PI project, as both include engineering development efforts relevant to estimating methodology research.2014Methods and Models
Validation and Improvement of the Rayleigh Curve MethodMichael MenderThe presentation begins with a brief overview of the Rayleigh method, followed by an explanation of R-squared and the issues with non-linear functions. We then discuss the method for linearizing the Rayleigh function and then using it for linear regression. We then provide a brief demo of the tool we developed and then move on to a discussion of our results pertaining to the performance of the Rayleigh method. ...2014Methods and Models
Rotorcraft Cost Model Enhancements for Future ConceptsF. Gurney Thompson IIIThis paper will discuss the ongoing research efforts to improve upon existing rotorcraft cost estimation capabilities, from both a cost estimating relationship (CER) update and a model development perspective. We will share our approach, many of our findings, and any lessons learned. Efforts currently underway include data collection, updates to existing cost models and their CERs, adding support for new aircraft types and technologies, and the addition of new analysis capabilities to better understand total ownership cost.2014Methods and Models
Kill Vehicle Work Breakdown StructureJennifer Tarin, Christian Smart, Paul TetraultThis paper provides an alternative to Appendix C: Missile Systems for inclusion in MIL-STD-881C, the Department of Defense standard for Work Breakdown Structures (WBSs). The Missile Defense Agency (MDA) produces interceptors that are similar to missiles with the exception of the payload element. Where Appendix C defines the payload element with a limited set of WBS elements, the MDA interceptor payload, referred to as a kill vehicle, includes a large collection of significant WBS elements. ...2014Methods and Models
Meet the Overlapping Coefficient: A Measure for Elevator SpeechesBrent LarsonYou've seen this picture before... a plot of two overlapping probability distributions. You may have created one with an overlay chart. Typically this graphic contrasts two cost distributions so as to illustrate similarity, difference, or change. However, seldom seen is a number that quantifies the overlap or area shared by both distributions. ... 2014Methods and Models
Excel Based Cost Roll Up MethodMatthew LeezerThis paper will show the user how to create a custom function (sumlowerlevel) using Visual Basic and apply this function in Excel to generate a report that will save time and increase the accuracy of estimates. The method uses the lookup function to capture the cost of purchased components and subsystems and uses the custom function for all make assemblies to roll up the cost of the purchased parts. ...2014Methods and Models
The Role of Cost Estimating in Source SelectionAnnette BarliaThe analysis will focus on the process of developing an IGCE and then utilizing it to evaluate vendor proposals for the acquisition of new technology. It will demonstrate that a strong IGCE facilitates source selection. If the cost estimate is developed the right way, organizations will have more leverage during contract negotiations with vendors, and the acquisition will run smoothly and meet program goals. ...2014Methods and Models
Automated Costing to Identify Opportunities to Reduce Direct Material SpendJulie Driscoll, Dale MartinThis session will cover how technology is automating the costing process through integrating costing solutions with CAD, PLM, and ERP; pulling information about components directly from CAD files; and using an intelligent cost engine that evaluates manufacturing routings for feasibility and cost effectiveness. We will look at how these solutions enable automated batch costing of components and are used by manufacturers to support cost reduction projects. ...2014Methods and Models
Moving Beyond Technical Parameters in our CERsEric Druker, Charles HuntOne of the frequent criticisms of JCL analysis (integrated cost and schedule risk analysis) has been that the results typically exhibit coefficients of variation (CV) that are orders of magnitude less than those seen in parametric estimates of similar scope. During a recent NASA research task examining how parametric estimates can be linked to program management artifacts, the research team stumbled upon a characteristic of our Cost Estimating Relationships (CERs) that almost certainly leads our parametric estimates to have higher than necessary CVs. ...2014Parametrics
Using Dummy Variables in CER DevelopmentShu-Ping Hu, Alfred SmithThis paper explains the reasons for using dummy variables in regression analysis and how to use them effectively when deriving CERs. Specific guidelines are proposed to help analysts determine if the application of dummy variables is appropriate for their data set. This paper also demonstrates some common errors in applying dummy variables to real examples. An application using dummy variables in splines (to derive the fitted equation as well as the intersection) is also discussed.2014Parametrics
Bayesian Parametrics: Developing a CER with Limited Data and Even Without DataChristian SmartThis paper discusses Bayes' Theorem, and applies it to linear and nonlinear CERs, including ordinary least squares and log-transformed ordinary least squares.2014Parametrics
Tactical Vehicle Cons & Reps Cost Estimating Relationship ToolCassandra Capots, Jeffery Cherwonik, Adam James, Leonard Ogborn When estimating Operating and Support (O&S), it is reasonable to assume that as reliability increases, consumable and reparable parts ("cons and reps") cost should decrease (less frequent repairs), while as vehicle price increases, parts cost should increase (more expensive parts). Developing a dataset to support cost estimating relationships (CERs) for the Army's Tactical Vehicle fleet is a significant challenge. ...2014Parametrics
Unmanned Aerial Vehicle Systems Database and Parametric Model ResearchBruce Parker, Rachel Cosgray, Anna Irvine, Brian Welsh, Patrick Staley, Praful PatelThis handbook documents the first two years of research sponsored by NCCA and ODASA-CE. With the inclusion of UAS in the United States' (U.S.) military arsenal, the government has a desire to understand the components of a UAS including the air vehicle, GCS and payloads, the development and production process, and the O&S implications of these systems. ...2014Parametrics
Building a Complex Hardware Cost Model for AntennasDavid Bloom, Danny PolidiThis paper discusses the development of a Complex Antenna Cost Model based on quantifiable sizing mechanisms which are designed to quickly and accurately calculate the "top-down" cost for all engineering and operations disciplines and functions required for antenna development and test. ...2014Parametrics
ESA Project Office Cost ModelHerve JoumierThis paper describes the definition and implementation of a Project Office parametric cost model aimed at defining a reference manpower allocation based on fair judgement and rational modelling. It has been developed to improve the cost estimation capability of ESA, providing outputs that are used by agencies for comparison with contractors' proposals. ...2014Parametrics
Improving the Accuracy of Cost Estimating Relationship for Software SystemsDavid WangIn this paper, we leverage recently published results on the statistical characterization of schedule and cost risks to analyze the prediction accuracy of CERs for software systems. Our analytical and empirical statistical analyses of actual code size growth data suggest that the statistics of the code size estimate can also be characterized by fat-tail distributions. ...2014Parametrics
Hybrid Parametric Estimation for Greater AccuracyWilliam RoetzheimThis talk will discuss hybrid parametric estimation based on HLO catalogs, and give examples of the application and accuracy of this technique within organizations including the State of California, Halliburton, IBM, Procter and Gamble, and multiple top 25 financial institutions.2014Parametrics
Linking Parametric Estimates to Program Management Artifacts (LPEPM)Mike Smith, Ted Mills, John SwarenA common fate of parametric cost and schedule estimates is that they fall into disuse as a Project's own artifacts (e.g. Work Breakdown Structure (WBS), budget, schedule, risk lists, etc.) are created and mature. Parametric estimates typically do not map cleanly to WBS or schedule-derived artifacts, allowing a sense among Project Managers (PMs), rightly or wrongly, that "parametric estimates are fine, but they don't reflect my project." ...2014Parametrics
Impact of Full Funding on Cost Improvement Rate: A Parametric AssessmentBrianne Wong, Erik BurgessThe NRO Cost and Acquisition Assessment Group (CAAG) currently houses data collected from various U.S. Government organizations, including the Department of Defense and NASA. These data points are pooled with NRO data and used in Cost Estimating Relationships for space hardware, which underpin CAAG estimates for major system acquisition programs, aiding in the development of programs and budgets. ...2014Parametrics
Developing R&D and Mass Production Cost Estimating Methodologies for Korean Maneuver Weapon SystemDoo Hyun Lee, Sung-Jin Kang, Suhwan KimIn this research we have attempted to establish a CER development process to meet the current need, and found certain cost drivers for the Korean historical maneuver weapon system data, using forward selection, stepwise regression, and R-square selection. We have also developed a CER model for production labor costs, using a learning rate, which has generally been applied to estimate valid production labor costs. ...2014Parametrics
Affordability Engineering for Better Alternative Selection and Risk ReductionMarlena McWilliams, Bob KouryThis paper will outline the process and steps for implementing affordability in your estimating environment, to understand system requirements versus system costs and affordability, and to provide best value by identifying and accepting the most affordable, feasible, and effective system or alternative. The need to evaluate and assign a best value is essential to both the government (DoD) and the contractors supplying systems/alternatives to the government.2014Risk
Critique of Cost-Risk Analysis and Frankenstein Spacecraft Designs: A Proposed SolutionMohamed Elghefari, Eric PlumerIn this paper, we present a historical data driven probabilistic cost growth model for adjusting spacecraft cost Current Best Estimate (CBE), for both earth orbiting and deep space missions. The model is sensitive to when, in the mission development life cycle, the spacecraft cost CBE is generated. The model is based on historical spacecraft data obtained from the NASA Cost Analysis Data Requirements (CADRe) database. ...2014Risk
Risk Adjusted Inflation IndicesJames BlackIt is often observed that Office of the Secretary of Defense (OSD) inflation rates are different than prime contractor specific inflation rates seen in Forward Pricing Rate Agreements/Proposals (FPRAs/FPRPs) and in commodity group composite rates (e.g. Global Insight indices). Yet, it is a standard practice in many cost estimating organizations to use OSD inflation rates for escalating costs in estimates without giving consideration to a range of different possible inflation rates. ...2014Risk
Improved Decision Making with Sensitivity AnalysisBlake BoswellIn this study, we review common applications of SA methods to project estimation including a description of each method as well as its advantages and disadvantages. Additionally, we explore the topic of Global Sensitivity Analysis (GSA), which is a process for measuring the overall contribution of uncertain model inputs to variation in model outputs and is a popular technique for model validation in engineering and life sciences. ...2014Risk
Excel Based Schedule Risk and Cost EstimatesWilliam EvansThe emphasis on robust cost and schedule estimating solutions has resulted in the creation of multiple solutions for analysts and clients. Excel based integrated cost and schedule risk is only one methodology for solving client problems. Incorporating cost and schedule risk in Excel leads to an increased ability to audit and trace the schedule and cost risk methodology throughout an Excel based PLCCE, improving the confidence and robustness of the estimate. While there are hurdles to the implementation of an Excel based schedule risk solution, when combined with form controls, the benefits to PLCCE auditability and usability are immense.2014Risk
Using Bayesian Belief Networks with Monte Carlo Simulation ModelingMarina DombrovskayaOne of the main aspects of creating a Monte Carlo simulation cost estimate is the accuracy in defining uncertainty and risk parameters associated with the cost components of the model. It is equally important to assess and accurately represent inter-dependencies between uncertain variables and risks, which are measured via correlation. Since oftentimes historical data is insufficient for a rigorous statistical analysis, both probability distribution and correlation are commonly estimated via a subject matter opinion. ...2014Risk
Expert Elicitation of a Maximum Duration Using Risk ScenariosMarc GreenbergThis paper provides a brief review of a current method to elicit a most-likely commute time, a "practical maximum" commute time, and risk factors that contribute to commute delays. The paper continues by showing how these risk factors can be organized into an objective hierarchy of risk factors, leading to the creation of a customized risk work breakdown structure (WBS). ...2014Risk
Quantifying the Necessity of Risk Mitigation StrategiesJames Northington, Christopher Schmidt, Chuck KnightThis paper will begin by highlighting flaws with the current risk management process, walk through the new proposed methodology for risk mitigation, and provide a quantitative example of the process in action using raw data. In the end, the proposed methodology will provide a greater understanding of program risks, a measurement of the importance of implementing a risk mitigation strategy, a measurement of the mitigation strategy's subsequent impact, and a quantitative measurement of benefit for Program Managers to defend their risk mitigation strategies.2014Risk
NASA JCL: Process and LessonsSteve Wilson, Mike StellyOur paper will describe JCL implementation and address the creation, implementation, evolution, inherent benefits, inherent issues, its ultimate place among program management's decision-making toolset, and hard recommendations for organizations hoping to wage successful JCL campaigns. Real-world examples will be referenced, including those from the Constellation, Commercial Crew, and Orion spacecraft development programs. ...2014Space
NASA's Phasing Estimating RelationshipsChad Krause, Erik Burgess, Darren ElliottCost and schedule estimating in support of budget formulation is limited when cost phasing is not considered. As a result, NASA's Office of Evaluation (OE) Cost Analysis Division (CAD) initiated a review of historic mission funding profiles for the purpose of corroborating current phasing profiles and optimizing future budgeting performance. Actual expenditures by year, technical parameters, and programmatic information were compiled and normalized from NASA's extensive library of CADRe (Cost Analysis Data Requirement) documents for programs since 1990. ...2014Space
Developing Space Vehicle Hardware Nonrecurring Cost Estimating Relationships at the NRO CAAGRyan Timm, Jan SterbutzelThis paper builds on our 2012 SCEA conference briefing that described the NRO CAAG approach to developing Space Vehicle (SV) hardware Cost Estimating Relationships (CERs) for Nonrecurring (NR) engineering. These CERs are developed from the NRO CAAG's cost database of more than 2300 space hardware boxes, and can stand as alternatives to other popular parametric tools, like the nonrecurring CERs in USCM or NAFCOM. ...2014Space
A Next Generation Software Cost ModelJairus Hihn, Tim Menzies, James JohnsonIn this paper we will summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods. We will then describe the methodology being used in the development of a NASA Software Cost Model that provides an integrated effort, schedule, and risk estimate, as well as identifying the changes in the project characteristics that are most likely to improve a given project's cost-schedule performance and risk exposure.2014Space
The NASA Project Cost Estimating CapabilityAndy Prince, Brian Alford, Blake Boswell, Matt PitlykThe paper begins with a detailed description of the capabilities and shortcomings of the NAFCOM architecture. The criteria behind the decision to develop the PCEC are outlined. Then the requirements for the PCEC are discussed, followed by a description of the PCEC architecture. Finally, the paper provides a vision for the future of NASA cost estimating capabilities.2014Space
NASA Instrument Cost Model (NICM)Hamid Habib-Agahi, Joseph Mrozinski, George FoxThe NASA Instrument Cost Model (NICM) includes several parametric cost estimating relationships (CERs) used to estimate the development cost of instruments for NASA's future spacecraft. This presentation will cover the challenges associated with creating cost models in an environment where data on previously built instruments is 1) sparse, 2) heterogeneous, and 3) book-kept differently by the various NASA centers and support institutions. ...2014Space
New Air Force Integrated Baseline Review (IBR) Process - A Quick Reaction Capability (QRC) PerspectiveJames Ross, Martin Levitan, Christine Bolton, Barbara MeyersSince the early 1990s, the U.S. Air Force has been using Integrated Baseline Reviews (IBR) as a key technique for early assessment of the efficiency and effectiveness of baseline plans, resource allocations, scheduling and costs. Much of the earlier IBR doctrine does an excellent job describing what is required to complete these assessments, but little detail had been written about the specific processes, procedures, steps and techniques required to undertake these assessments. The January 2012 "Air Force Integrated Baseline Review (IBR) Process Guide", developed by SAF/AQXC personnel, is one of the best documents to date to address this concern. A drawback of this document is that the described IBR Process is based primarily on experiences with, and lessons learned from, the Acquisition Category (ACAT) I KC-46 Tanker Program...2013Earned Value Management
The Use of Function Points in Earned Value Management for Software DevelopmentBen Netherland, Mike Thompson, Dan FrenchIn this presentation, the authors detail their efforts in the development of an EVM methodology for a government software development project utilizing the International Function Point Users Group (IFPUG) function point software sizing metric. Traditionally it has been difficult to apply Earned Value Management (EVM) criteria to software development projects, as no tangible value is earned until the software is delivered to production...2013Earned Value Management
Measuring and Managing Organizational Performance in the GovernmentKellie Scarbrough (Wutzke)In this presentation, the author details the challenges that government agencies face when attempting to measure organizational performance and presents a general methodology for successfully implementing a robust performance measurement program. The majority of government agencies do not operate like private industry and face unique challenges when defining meaningful metrics and collecting performance data...2013Earned Value Management
Understanding Requirements for Subcontract EV Flow Down and ManagementMark InfantiDo you use Earned Value Management (EVM)? Do you have a government customer that is requiring you to use EVM and report the data monthly? In either of these situations, if you have a subcontractor (SC), you need to know how they will report to you. You need to have a plan for how you will incorporate their data into your data for a complete program understanding. If your SC reports to you at the cost level, how do you add the fee to their report? How do you add your company's general and administrative (G&A) overhead to their costs? Do you share your Integrated Master Schedule (IMS) with them or just the key milestones?...2013Earned Value Management
Cost Estimation and Earned Value IntegrationSissy Gregg, Michelle EhlingerTraditionally, cost estimators and earned value analysts work separately from one another, focusing on their own individual discipline. We have found synergies by merging these disciplines together resulting in stronger analysis through improved methods, more robust data sets, and new tools. The benefits of cost estimating (CE) and earned value (EV) management integration are realized throughout the acquisition lifecycle and enhance senior leadership decision making and improve acquisition outcomes. This paper presents the proven benefits of CE/EV integration, including more accurate cost estimates, cost estimating relationships, improved program management support, and cross program assessments...2013Earned Value Management
Time is Money: The Importance and Desired Attributes of Schedule Basis of EstimatesJustin Hornback"Time is Money" is a maxim made popular by Benjamin Franklin. It reflects the long understood importance of schedule requirements and their impact on cost. While cost bases of estimates (BoEs) have gained in popularity among industry communities, schedule BoEs are at least as important as cost BoEs, if not more so, but have yet to reach the same level of recognized importance within the industry...2013Earned Value Management
Estimating Cost To-Go Without Stable EVM DataPeter Frederic, Ronald K. LarsonInitial independent estimates for development programs are typically created using parametric methods. As a program progresses, it becomes important to separate sunk cost from cost to-go. In the very early stages of a program, it is acceptable to estimate cost to-go by simply subtracting the reported sunk cost from the parametrically-estimated total cost. However, as the program continues and significant progress accumulates, we can no longer assume that dollars spent equate directly to technical progress. It is then necessary to understand progress to-date in order to estimate cost to-go...2013Earned Value Management
The [Whole] Truth about ANSI-compliant EVMSMichael NosbischOver the past several years, there have been multiple presentations by government agency representatives and consultants alike that have characterized an EVMS that complies with the ANSI/EIA 748(B) Standard as simply "good project management," and as such should not be that difficult for most successful government contractors to implement. It is true that the basic concepts and principles of EVM can absolutely be likened to sound project management. However, the costly, time-intensive process of implementing, validating, and then maintaining an "ANSI-compliant" EVM system is far and above what any contractor would do if it were not a contractual requirement that is also fully reimbursable. ...2013Earned Value Management
Cloud Computing Starter Kit: Cost and Business Case ConsiderationsJennifer Manring, Raj AgrawalCloud computing is a major trend in the information technology (IT) industry that is affecting both commercial companies and Federal Government agencies due to its promise of agility, scalability, and cost savings. Prior to investing in cloud computing, decision makers are asking "what is my return on investment (ROI)?" However, there is no one-size-fits-all answer to this question, and caution should be exercised when applying generic rules of thumb and quick assessment cost and economic models. Costs, benefits, and risks must be carefully assessed for each situation...2013eTrack
Got a Defense Business System? The effect of the new defense business system policy on cost estimating, tools, and fundingCharles GuThe Department of Defense (DoD) budgets over $7 billion a year for business system investments and has mandated that all information technology systems must be assessed and certified through a formal process before funding can be obligated in FY 2013. This new law, Section 901 of the Fiscal Year 2012 National Defense Authorization Act (FY2012 NDAA), significantly expands the scope of systems requiring certification to include any business system with a total cost in excess of $1M over the period of the current future-years defense program, regardless of type of funding or whether any development or modernization is planned...2013eTrack
How to Estimate and Measure "the Cloud" and Make COCOMO Cloud EnabledDavid Seaver, John RosbrughCloud and Big Data have become the new buzzwords in the commercial, federal and DoD technology arenas. This presentation will: 1. Define Cloud & Big Data 2. Identify what's new and different with the Cloud and Big data. 3. Identify the cost drivers for cost estimates...2013eTrack
Utilization of Visual Basic in Cost Estimating and Analysis Tools - Anyone Can CookJeremy EdenAs collaborative environments become more prevalent in all industries, the cost estimating and analysis industry is no exception to this movement. Increasing amounts of pressure are put upon cost estimators and analysts to develop tools that are low cost, fast, robust in design, have long-standing methodologies, do not require proprietary software or licenses, and are easy to use by both the advanced cost estimator looking for maximum control and the novice simply trying to diligently support the early stages of a development program...2013eTrack
Methods to Analyze Services Portfolio Cost Drivers and EfficienciesValerie Reinert, Virginia WydlerThis presentation provides a statistical methodology used to evaluate the "health" of a services portfolio through performance data analysis and cost analysis, with the goal to identify cost drivers and efficiencies for performance improvement. Managing a Federal government portfolio is a complex endeavor. Portfolio Managers have a major responsibility to evaluate whether their portfolio is operating efficiently and effectively. Thus, they need to use analytical procedures that, when executed, will evaluate the "health" of the portfolio and identify areas for increased efficiency and effectiveness. If the analysis is not data-driven, this leads to a lack of proper business case analysis and cost benefit analysis to weed out bad actors in the portfolio...2013eTrack
Touch Labor Estimate ModelingMichael Yeager, Lyle DavisTo support its mission for the F-35, the production cost estimating team developed touch labor estimate models which include the flexibility to run production rate effects, loss of learning, commonality adjustments, affordability initiatives and outsourcing impacts, multiple learning curve break point analyses, and estimates for touch labor in hours or by realization. Given the scrutiny of the F-35 Lightning II program by the Department of Defense and Congress, detailed and accurate cost modeling allows for better budgeting and more credibility within the services themselves and with the American public...2013eTrack
Innovative Business Agreements and Resulting Cost & Pricing Methods at NASA in Support of New Commercial ProgramsJames Roberts, Torrance LambingOn April 15, 2010 President Obama delivered a speech at Kennedy Space Center in which he outlined his new vision for the U.S. space program. Emphasis was placed on enabling the exploration of Space by Commercial entities instead of by the Government. Since that time, NASA's role has changed in many instances from being a program manager (overseeing development of space launch hardware and conducting space exploration missions) to one of support and a provider of space-related facilities and infrastructure...2013eTrack
Capacity Cost Model: Balancing the Demand for Software Changes Against the Supply of ResourcesChris KaldesA Program Executive Office (PEO) within the Department of Defense (DoD) is responsible for multiple information systems that support the Services (Army, Navy, and Air Force). This PEO is not only responsible for Operations and Maintenance (O&M) of the information systems, but also for implementing system change requests that are made by the Services' Senior Service Representatives (SSRs). The PEO was relatively satisfied with the status quo, but the executives within the PEO recognized that there were three problems with doing business today. (1) Currently, there was no way to scientifically indicate/prove to the SSRs what software changes they could promise...2013Information Technology
Cloud Computing and Big Data - What's the Big Deal?Arlene MinkiewiczAt one level, cloud computing is just Internet enabled time sharing. Instead of Information Technology (IT) organizations investing in all of the hardware, software and infrastructure necessary to meet their business needs, cloud computing makes access to them available through the Internet. On the one hand, cloud computing allows an organization to adopt a different economic model for meeting IT needs by reducing capital investments and increasing operational investments. On the other hand, cloud computing enables the capture, storage, sharing, analysis and visualization of huge amounts of data in multiple formats from multiple media type: Big Data analysis...2013Information Technology
ODASA-CE Software Growth ResearchKevin Cincotta, Lauren Nolte, Eric Lofgren, Remmie ArnoldFor several years, the Office of the Deputy Assistant Secretary of the Army for Cost and Economics (ODASA-CE) has used a single growth factor to account for size growth in weapon system software development estimates. This factor is invariant to program characteristics and may, for example, lead to excessive growth estimates for large programs, whereas experience suggests that these grow less in percentage terms than their smaller counterparts. Over the past year, ODASA-CE worked with Technomics, Inc. to research improved methodologies for incorporating growth in software estimates...2013Information Technology
Developing A Business Case for CloudCynthia O'Brien, John BellCommercial organizations and their customers have embraced cloud-based data management solutions with astonishing speed. Government agencies are not far behind, spurred on by the Office of Management and Budget's "Cloud First" Federal IT policy urging agencies to look to cloud-based solutions whenever possible. Government decision makers are moving to cloud solutions not only because of pressure from OMB and end users; they are also intrigued by the promise of dramatic cost savings. But can a decision to pursue a cloud transition be based on the promise of cost savings alone? There have been numerous examples where transitions to cloud have not yielded the 40% or more expected cost savings...2013Information Technology
Estimating Real-Time Software Projects with the COSMIC Functional Size Measurement Method and the ISBSG RepositoryH.S. van HeeringenNowadays, software project estimation of administrative software systems has evolved into quite a mature stage. Standardized functional size measurement methods, like IFPUG and NESMA, are used to measure the functionality that the software application is going to offer to the user, and this functional size is considered to be the main cost driver for administrative software projects. These methods are suitable for project estimation because they measure the functional user requirements (which should be described before the project starts), instead of the actual delivered product (for instance source lines of code, which can only be measured after the project has finished)...2013Information Technology
The COSMIC Functional Size Measurement Method: An IntroductionH.S. van HeeringenThe COSMIC functional size measurement method is a relatively new and not very well known method to measure the functional size of pieces of software. The main advantages of this method over the existing methods (like IFPUG and NESMA function point analysis) is that COSMIC is designed to also measure other types of software than the traditional administrative software. COSMIC can therefore be used to measure real-time, embedded, infrastructure software, as well as administrative software. Another advantage is the continuous scale that is used in the COSMIC method, which makes it possible to accurately measure the difference in size between different functional processes...2013Information Technology
Software Maintenance Cost Estimating Relationship Development for Space SystemsRyan TimmMethods development for Space Vehicle (SV) cost estimating has mainly focused on acquisition cost, as acquisition makes up the majority of a SV's lifecycle costs and has high interest from oversight agencies and Congress. Tightening budgets and increased space vehicle life expectancy have bolstered the need to further refine methods for estimating SV maintenance costs...2013Information Technology
Estimating Software Development Costs for Agile ProjectsBlaze Smallwood, Omar MahmoudAddressing the need to more rapidly develop and field capabilities for the warfighter, more and more software-centric DoD programs are transitioning towards an industry trend called "Agile" software development. While "Agile" software development is geared towards producing usable software products more rapidly than traditional waterfall or incremental methods, it also requires more flexibility with managing requirements. The main challenge this has created for program managers is figuring out how to effectively manage scope, cost, schedule, and performance in this flexible, fast-paced development environment in which requirements are more fluid. In turn, cost estimators have been challenged to develop new data collection approaches and estimating methodologies to more effectively estimate software costs for these "Agile" programs...2013Information Technology
Domain-Driven Software Cost, Schedule, and Phase Distribution Models: Using Software Resource Data ReportsWilson Rosa, Barry Boehm, Brad Clark, Joe Dean, Ray MadachyInstead of developing cost and schedule estimation models with many parameters, this paper describes an analysis approach based on grouping similar software applications together called Productivity Types. Productivity types are groups of application domains that are environment independent, technology driven, and characterized by 13 product attributes. Consideration is also given to the operating environment in which the software operates. Over 196 actual software projects from DoD's Software Resource Data Reports (SRDRs) were fully inspected and analyzed to produce a comprehensive set of Cost Estimation Relationships, Schedule Estimation Relationships, Software Productivity Benchmarks, and a Best Practice Data Normalization Guide. Analysis results will be discussed in this presentation...2013Information Technology
Software Maintenance: Recommendations for Estimating and Data CollectionCorinne Wallshein, Bruce Parker, Vanessa Welker, Thomas Harless, Peter BraxtonEntering a period of fiscal austerity, it becomes more important than ever to estimate and consider operating and support (O&S) costs, which represent the lion's share of life cycle cost (LCC) for most platforms, during acquisition. Given the ubiquity of software in today's complex programs, a key component of O&S is software maintenance. This paper will discuss our work to understand the process of software maintenance, data collected and normalized to date, and resultant benchmarks for use in developing and cross-checking software maintenance estimates. It represents an update to "Software Maintenance Data Collection and Estimating Challenges" (Welker, et al., SCEA/ISPA, 2012)...2013Information Technology
Estimates of Unit Cost Reductions of the F-16 Fighter as a Result of U.S. Arms Export ProductionAndrew A. Yeung, Keenan D. Yoho, Jeremy ArkesArms exports have increasingly become an attractive option for reducing escalating unit costs of new weapon systems to the United States Department of Defense. However, while there is no lack of conjecture, there is little data that show weapon system costs to the United States actually decrease when the same weapon is sold to a foreign buyer. 2013Journal of Cost Analysis and Parametrics
Accuracy Matters: Selecting a Lot-Based Cost Improvement CurveShu-Ping Hu, Alfred SmithThere are two commonly used cost improvement curve theories: unit cost theory and cumulative average cost theory. Ideally, analysts develop the cost improvement curve by analyzing unit cost data. However, it is common that instead of unit costs, analysts must develop the cost improvement curve from lot cost data. An essential step in this process is to estimate the theoretical lot midpoints for each lot, to proceed with the curve-fitting process.2013Journal of Cost Analysis and Parametrics
A Macro-Stochastic Model for Improving the Accuracy of Department of Defense Life Cycle Cost EstimatesErin Ryan, Christine Schubert Kabban, David Jacques, Jonathan D. RitschelThe authors present a prognostic cost model that is shown to provide significantly more accurate estimates of life cycle costs for Department of Defense programs. Unlike current cost estimation approaches, this model does not rely on the assumption of a fixed program baseline. Instead, the model presented here adopts a stochastic approach to program uncertainty, seeking to identify and incorporate top-level (i.e., macro) drivers of estimating error to produce a cost estimate that is likely to be more accurate in the real world of shifting program baselines. 2013Journal of Cost Analysis and Parametrics
Comparison between the Mix-Based Costing and the Activity-Based Costing Methods in the Costing of Construction ProjectsLeandro Torres Di Gregorio, Carlos Alberto Pereira SoaresAfter bibliographic research on costing methods in civil construction, a presentation of the mix-based costing method, and an application of the activity-based costing method to the costing of civil construction projects based on other authors, a possible application of mix-based costing was sought. This method allows the distribution of costs and indirect expenses to products without the subjectivities and uncertainties typical of traditional apportionment, by means of analyses of different production scenarios.2013Journal of Cost Analysis and Parametrics
Galaxy Charts: The 1,000-Light-Year View of the DataRobert Nehring, Katharine Mann, Robert JonesThis article presents a new kind of chart, called a Galaxy chart, which combines the strengths of other chart types. A Galaxy chart displays an entire Cost Element Structure on a single sheet of paper, showing all of the elements, their relationships, and their costs in a visually appealing way. Each child cost element is in orbit around its parent, with its children in orbit around their parent. The size of each cost element is directly proportional to its magnitude. 2013Journal of Cost Analysis and Parametrics
Feasibility of Budget for Acquisition of Two Joint Support ShipsErin K. Barkel, Tolga R. YalkinThe mandate of the Parliamentary Budget Officer is to provide independent analysis to Parliament on the state of the nation's finances, the government's estimates, and trends in the Canadian economy, and, upon request from a committee or parliamentarian, to estimate the financial cost of any proposal for matters over which Parliament has jurisdiction. 2013Journal of Cost Analysis and Parametrics
Can DoD Inflation Indices and Discounting Processes Be Improved?Kathryn Connor, James DrydenCurrently the DoD is facing an uncertain budget environment. This will have an impact on what the DoD can spend for acquisition programs and sustainment of major weapons systems. Current practices for inflation and discounting skew program affordability, especially during operations and sustainment. In this presentation, we look at how well current inflation indices and discount rates serve programs today and whether there are strategies to improve the accuracy of these estimates. After examining the experience of several major weapons systems we have identified potential policy changes and strategies for cost estimators to employ on inflation and discounting. We believe that these can improve a program's understanding of long run affordability and potential risks associated with inflation and discounting.2013Life Cycle Cost
Lessons Learned from the Joint STARS Analysis of Alternatives for Cost and Risk AnalysesDaniel Mask, David StemThe Joint Capabilities Integration and Development System (JCIDS) plays a key role in identifying the capabilities required by the warfighters to support the National Security, Defense, and Military Strategies. The Analysis of Alternatives (AoA) is a documented evaluation of the performance, operational effectiveness, operational suitability, and estimated costs of alternative systems to meet a capability need that has been identified through the JCIDS process. As weapon system complexities and their interdependencies between other systems increase in the collaborative warfighter space, this evaluation of mission systems and technologies has become more and more challenging...2013Life Cycle Cost
Analysis of Large O&S Proposal: Lessons Learned!James LinickRecently, a $20M contract was awarded for weapon system O&S. The contractor's proposal was subjected to technical evaluation and extensive negotiations ensued. Over 100 BOEs were evaluated for multiple CLINs. Proposal cost components were labor, materials and ODCs. This presentation will describe how a final agreement was reached, focusing on: 1) Proposal construction, what CLINs and BOEs are, and how sub-contractor work was credited, 2) BOE technical evaluations, the delineations and weighting of on-site and off-site work, 3) Forward price rate agreements, what they are, and the difference between them and final negotiated labor rates, 4) Cost to price calculations, how they are performed and what are Fringe, G&A, Overhead, Fee and Cost of Money and 5) How final contract price and weapon system CONOPS cost estimate compared...2013Life Cycle Cost
The Dynamic Economic Model: A Flexible Approach to Investment Analysis with AnalyticaKevin King, Jason MehrtensGiven the current economic crises facing the United States, government spending is being placed under more intense scrutiny. As a result, major investments are being more closely analyzed and pressure to be cost effective is of the utmost importance. In the world of cost estimation, the scrutiny is translating to increased demand for cost metrics prior to the establishment of clear scope requirements. For MCR, uncovering new tools and strategies to adapt to this type of environment has become a top priority. One way MCR improved versatility while maintaining our valued reliability has been to create economic cost models using Analytica®, a modeling software designed to enhance the capabilities of the modern spreadsheet tool...2013Life Cycle Cost
The Forgotten Costs and Uncertainty Analysis within the DoD Acquisition EnvironmentRoberto Perez, Elizabeth EvansDespite the best efforts of the Department of Defense (DoD) to capture all costs associated with the life cycle of acquisition programs, this goal continues to be elusive. A primary reason for this shortcoming in cost estimating is the lack of time spent on estimating operations & sustainment (O&S) costs compared to the time and effort spent estimating research, development, and procurement costs. The implications of this imbalance are significant given that operations and support makes up 60 to 70 percent of an acquisition program's life cycle cost...2013Life Cycle Cost
Life Cycle Cost Estimate (LCCE) AssessmentsCasey Trail, David Brown, Colleen CraigThis presentation summarizes a scorecard tool and criteria that are used to conduct assessments. The criteria, which are based on the GAO Cost Estimating and Assessment Guide, include four characteristics, which are further broken into 13 criteria and 75 sub-criteria. Lessons learned and key findings are presented as examples of LCCE characteristics associated with both positive and negative evaluation outcomes. The presentation is of interest to government and contractor cost estimators, as well as anyone from the cost community wishing to leverage best practices and tools related to life cycle cost assessment. Participants with cost assessment experience are encouraged to share their experiences and lessons learned...2013Life Cycle Cost
Construction vs. Systems Acquisition Cost Estimating: A Comparative AnalysisTom Sanders, Steve Essig, Tim AndersonThe continued push to improve DOD's acquisition processes includes several thrusts: regulatory streamlining, improving use of incentives, and workforce optimization. These are not new. To deal with these concerns, the acquisition community has a unique opportunity. DOD Cost estimators have recently received a strong "vote of confidence" in the acquisition community; new positions have been created and existing positions have been protected. That is not necessarily so in the construction industry, another large employer of cost estimators. The construction industry has undergone a transformation over the past six years, beginning with the housing collapse in 2006 and enduring through the years of the Great Recession...2013Management
The Joint Integrated Analysis Tool (JIAT): Making Data Sharing EasierNiatika Griffin, Melissa Cyrulik, John McGahanThe silver lining in the current DoD budget turmoil is that it provides us with an opportunity to reexamine how we operate and look for ways to be more efficient. In times of belt tightening, improving access and distribution of established resources is imperative. We are challenged to ultimately improve how we share cost resources. For cost professionals, these resources include databases of cost and technical information, libraries of resources, and approved models that can be used rather than re-invented. The Office of the Deputy Assistant Secretary of the Army - Cost & Economics (ODASA-CE) has invested in developing a tool that facilitates secure access to data needed by the cost analyst...2013Management
Strategic Value of the Business Case Analysis (BCA)Eric BullerThe impact of sequestration has become the main topic of discussion in almost all circles in the Department of Defense since the Budget Control Act passed into law in 2011. In response to sequestration-driven budget reductions, the DoD has been interested in exploring potential cost reductions in weapon system operations. Sustainment generally makes up 60% to 80% of a program's total lifecycle cost. In a defense world seeking increased efficiencies and budget savings it is imperative to examine effective sustainment strategies...2013Management
Back to the Big Easy: Revisiting Hilbert's Problems for Cost EstimatingPeter Braxton, Richard ColemanAt the International Congress of Mathematicians at the Sorbonne in Paris in 1900, German mathematician David Hilbert boldly put forth a list of 23 theretofore unsolved problems in mathematics, which subsequently became quite influential in 20th-century research efforts. At the Joint SCEA/ISPA Conference in New Orleans in June 2007, the authors audaciously emulated Hilbert with a list of 21 problems for cost estimating and risk analysis...2013Management
Net Present Value (NPV): The Basics and the PitfallsKevin Schutt, Nathan Honsowetz One of the essential metrics of an economic analysis is the Net Present Value (NPV) measure. While it is a fairly simple calculation, made even more routine and efficient by modern technology, be wary of pitfalls in its application. This presentation will cover the basic components and methodology of NPV and discuss the pitfalls and easily-made mistakes in its application. These include incorrect discount rate formulas, inconsistent treatment of inflation, incorrect understanding of the discount rate, incorrect application of Excel's "=NPV()" formula (hint: it's not really "Net"!), and including irrelevant cash flows. Come learn about Net Present Value and avoid these potentially career-limiting mistakes!2013Management
Fifteen Undeniable Truths About Project Cost Estimates, or Why You Need an Independent Cost EstimateTim AndersonAll project cost estimates are wrong, some more so than others. It is exceedingly difficult to accurately estimate the cost of a large project (e.g., major defense acquisition project). Reasons for this are manifold, but the primary causes seem to be general uncertainty, insufficient technical detail, unanticipated design/technical changes, funding and schedule perturbations over the acquisition life cycle, other programmatic "difficulties," and otherwise attempting to predict the future with imperfect information. The end result is that project cost estimates tend to be too low, with early estimates being the most likely to be understated...2013Management
Cost Engineering Health Check ... How Good are your Numbers?Dale Shermon, Mark GilmourHigh quality cost estimating gives a business leader confidence to make rational financial decisions. Whether you are a business leader or a cost estimating manager, you have a vested interest in understanding whether you can depend on your organisation's ability to generate accurate cost forecasts and estimates. But how can business leaders be confident that the cost information that they are being provided is of high quality? How can a cost estimating manager be sure that their team is providing high quality cost information?...2013Management
Sailing Blind: Data-Driven Estimating at the Concept Phase, a Case Study of Canada's Joint Support Ship ProjectTolga Yalkin, Erin BarkelCanada's lack of recent experience in naval shipbuilding and the unique nature of the National Shipbuilding Procurement Strategy (NSPS) present unique estimating challenges. A methodology was needed that was at once data-driven and defendable while at the same time able to address shipbuilding realities at a very high level conceptual stage. This paper outlines a case study where PRICE Systems and the Canadian Parliamentary Budget Officer (CPBO) developed a methodology and model to address these challenges for the Joint Support Ship (JSS) project...2013Management
Supplier Cost/Price AnalysesDavid Eck, Todd BishopUnder United States Federal Government contracts, an important but overlooked aspect is the audit and evaluation of a subcontractor's cost proposal. Requirements in the Federal Acquisition Regulations (FAR) provide for certain visibility into a subcontractor's cost proposal depending on the value of the proposal. Additionally, it is critical to note the FAR puts the onus of subcontractor cost (or price) analysis on the prime contractor. The expectation is that the same level of detail that the prime includes in its proposal should be included as support to a subcontractor's cost proposal. Effectively the prime must perform subcontractor cost analysis in accordance with the FAR...2013Management
National Reconnaissance Office (NRO) Experiences with the Implementation of Will Cost/Should Cost ManagementSissy Gregg, Greg Lochbaum, Erik BurgessThe objective of this presentation is to address how to assess and incorporate contract performance in an Estimate to Complete, for the purpose of determining a reasonable amount of margin and how that margin should be managed and controlled. The discussion will go beyond cost and schedule variance to forecast total program cost including future requirements, other government costs, etc...2013Management
Avoiding Pitfalls When Applying Learning to your EstimateAndrew BusickMost cost estimators understand the basics of learning curves, but we sometimes fail to recognize the implications of the way that we apply learning. This presentation points out some major pitfalls and mistakes that can be made when dealing with learning. We address both deriving learning curves from historical data and applying learning to a cost estimate. Additionally, we recommend best practices to use when dealing with these issues...2013Methods and Models I
Cost Effective Analysis: The Role of Discounting in Government InvestingBrandon ShepelakIn order for the government to best fulfill its obligation to efficiently spend tax dollars, it uses various analysis methods to determine which investment to choose. Cost effectiveness analysis is often the preferred analysis method when an investment is not being judged as to the value added but to determine the least expensive spending alternative. According to the Office of Management and Budget (OMB) in circular A-94, an investment is cost effective when, comparing competing alternatives, its costs are the lowest as expressed in present value terms for a given amount of benefits (OMB, 2011)...2013Methods and Models I
Learning Curve Analysis of Small Data Sets - Spacecraft Bus Cost Improvement AnalysisBrian Welsh, James YorkRecent initiatives to implement cost savings across DoD have produced a number of potential opportunities. Within the Space Portfolio one such opportunity is to acquire satellites more efficiently through block buys and limiting production breaks. To quantify these opportunities, we can analyze historical cost experience. This paper describes three approaches used to pool cost improvement data from multiple programs, allowing for regression of all data simultaneously. The approaches are (1) scaling based on weight, (2) normalizing to a common theoretical first unit, and (3) the use of program binary ("dummy") variables...2013Methods and Models I
Building an Analysis Schedule: Lessons Learned from the SGSS ProgramMatt Blocker, David JacinthoConstructing an integrated cost and schedule uncertainty model involves bringing together important program elements: cost, risk, and schedule. The model's core is the schedule; the linking of costs and risks to the schedule is what allows the implementation of time-dependent costs and, ultimately, an analysis that jointly considers cost and schedule. It follows, then, that one of the first, and often largest, difficulties in constructing an integrated cost and schedule model is the development of the schedule backbone...2013Methods and Models I
A Novel Non-Recurring Production CER MethodologyLisa Hackbath, Raymond CovertProduction costs are generally categorized as either non-recurring or recurring. Typically non-recurring costs include tooling and pre-production activities, among others. Cost estimating relationships (CERs) are generally developed first for recurring hardware costs, and then non-recurring CERs are developed as a function of recurring hardware costs...2013Methods and Models I
Data Collection and Analysis Supporting Defendable Cost EstimatesArlene MinkiewiczCost modeling and estimation has a long and interesting history in the Aerospace and Defense industry starting around the time of World War II. All sorts of mathematical and experiential models have been proposed and used over the years to help with bidding, planning, proposing and executing contracts. While general purpose models are useful, more and more industry and government cost professionals are asking for models built with data very specific to their industry and their organization. Unfortunately many organizations do not have the infrastructure, processes or tools for collecting project data efficiently...2013Methods and Models I
Getting (and sharing!) the FACTS: Factors, Analogies, CERs and Tools/StudiesDaniel Harper, Ruth DorrOne of MITRE's corporate values is "People in partnership. MITRE values [...] partnership with the government, collaboration within and without..." We at MITRE have been charged via our corporate goals to "Apply technical and engineering excellence," by bringing to the customer the best thinking possible by "[...] tapping into a deep technical base, both within MITRE and globally, across the breadth of industry and academia." ...2013Methods and Models I
Aircraft Operation Cost Models: Lessons Learned!James LinickIn FY10 the Air Force Cost Analysis Agency (AFCAA) asked one of its cost support contractors to study the processes in place for producing specific aircraft cost products and implement improvements, specifically concentrating on format consistency, maintainability and automation. The periodically produced products to be studied were: 1) Aircraft Operation and Sustainment Cost Estimates 2) Cost per Flying Hour Risk Models 3) Jet Engine Maintenance Procedures versus Time on the Wing...2013Methods and Models I
Implementing a Data Collection Tool to Estimate and Analyze SpendingThomas Brooks, Daniel MaskAs budgets and resources shrink, the effort to better understand spending has become a priority to many decision makers within the Marine Corps. Since Operation Iraqi Freedom (OIF) began in 2003, military organizations have been able to request and receive immediate Overseas Contingency Operations (OCO) funding without following traditional procurement processes. This readily available funding has led to an emphasis on developing and maintaining operational capability in lieu of program estimates and resource management. The lack of quality budget estimates and use of financial tracking mechanisms has made it difficult for Marines to strategically allocate budget cuts based on efficiencies or priorities; more often than not, budget reductions are distributed uniformly across the organization...2013Methods and Models I
The Dashboard Tool: Taking Cost Analysis to the Next Step By Combining Costs with Capabilities to Evaluate COAsJohn KoAlthough life cycle cost estimates can provide a comprehensive view of the costs for a program, cost estimates do not provide the entire picture needed to make sound business decisions because they only focus on the costs for a program. To determine how programs are actually performing, the system's capabilities must be linked with the program's costs to properly quantify how programs meet their requirements with respect to cost. Booz Allen has developed an Excel-based tool called Dashboard that estimates both the program's costs and the capabilities achieved for a program...2013Methods and Models I
Realizing the True Cost of Energy - Keeping the DoD GreenJohn KoIn recent years, the demand for energy as well as the volatility of energy costs for the Department of Defense have prompted individual programs to monitor these costs closely in order to identify new ways to control them. In response to spikes in energy costs in recent years, major programs are now evaluating different alternatives to improve fuel efficiency and ultimately reduce the military's dependence on fuel...2013Methods and Models I
Cost Estimating Tips: Learn Tricks in Excel and Best PracticesEric HongThis presentation for the 2013 International Cost Estimating and Analysis Association (ICEAA) Conference presents Microsoft Excel tricks and best practices for cost estimating. Many of the presentations at the conference are insightful and show the amazing cost estimating products that are being developed, but the knowledge gained from those presentations usually cannot be applied on a daily basis. The goal for this presentation is to provide attendees with more efficient and effective ways of modeling...2013Methods and Models I
Galaxy Charts: Depict and Color Your WBS in a Meaningful WayRobert Nehring, Katharine Mann, Robert JonesFor centuries, we have searched for new ways to display our thoughts and ideas in ways that will allow the viewer to easily digest and understand our point of view. Displaying quantitative data in interesting yet meaningful ways is no different. In fact, discerning how to display your entire Work Breakdown Structure (WBS) succinctly and clearly has proven to be very difficult. One answer to this challenge is called the Galaxy Chart, which shows both relationships and magnitudes on a single chart...2013Methods and Models II
Computing Fully Burdened Costs of Energy - Fuel (FBCE-F)Walt Cooper, Richard LeeThe Defense Science Board, in studies conducted in the early years of the past decade, concluded that the DoD was not according sufficient emphasis to the consumption of energy in the battle space. Based on those studies, DoD leadership has elevated the importance of seeking opportunities to deliver more capable forces that consume less energy. The Joint Capabilities Integration and Development System now requires an energy Key Performance Parameter. Pilot studies have been conducted to establish a framework for the computation of the fully burdened costs of energy (FBCE) that, by statute, must now be used to inform cost, schedule and performance trade decisions in Analyses of Alternatives (AoAs). Based on those pilot efforts, the Assistant Secretary of Defense (Operational Energy Plans and Programs) has recently published a framework for computing FBCE. A detailed and thorough understanding of this framework is essential for cost analysts supporting AoAs...2013Methods and Models II
Estimating Alternatives for Joint Future Theater Lift (JFTL)Robert GeorgiUS military operations in Afghanistan and Iraq are making military transport aircraft work at rates not foreseen just a decade ago. At the same time, the existing inventory of transport aircraft is limited in its ability to transport heavy cargo to austere unimproved landing zones where military forces routinely operate. The US Air Force (USAF) with considerable support from the US Army conducted the Joint Future Theater Lift (JFTL) Technology Study (JTS) to consider how best to equip its theater lift fleets beyond 2020 to address this need. The JTS was established by merging the US Army's Joint Heavy Lift (JHL) and the USAF's Advanced Joint Air Combat Systems (AJACS). The JTS analyzes the cost, risks, and operational effectiveness of theater lift technology alternatives that address capability shortfalls identified in the JFTL Initial Capabilities Document (ICD); an essential capability shortfall being the delivery of a combat-configured medium weight armored vehicle (up to 36 tons) into austere, short, unimproved landing areas without ground handling equipment. One of the alternatives included a hybrid airship design...2013Methods and Models II
Ground Radar Expenditure Phasing AnalysisRick GarciaRealistic budget submissions are critical to a program's success. Budget realism relies on a full understanding of the system's requirements so that a comprehensive cost and schedule estimate can be developed that can inform a realistic budget position. In addition to developing a realistic estimate, the budget position must also reflect a realistic phasing of budget dollars across an appropriate schedule, so that sufficient budget dollars are available at the correct time in a program's acquisition lifecycle to help the program achieve success. Most existing expenditure phasing models are Rayleigh or Weibull-based distributions and were developed on a variety of DoD space and ground programs and are not commodity specific to terrestrial radar systems...2013Methods and Models II
Improving Cost Estimates in a Medical Acquisition EnvironmentMark Russo, Kimberly WallaceEstimating costs for drug development has proven difficult over the years. Difficulties stem from the overlapping of the Department of Defense (DoD) acquisition process and the Food and Drug Administration's (FDA's) review process, small data pool, industry close hold of cost data, the inherent difficulties in the drug development process, and the lack of using best practices in cost estimating. Over the past couple of years the medical cost community has been working towards improving their cost estimates. These efforts included the establishment of a standard drug development work breakdown structure to ensure that data could be collected and compared in a standard way. A medical cost model was then created in order to use the available data to develop program estimates...2013Methods and Models II
Incorporation of Program Priorities in Funding-Constrained EnvironmentsLeon Halstead, Michael Land, Derrick Kuhn, Bryan ShraderThe recent era of declining Federal (and organizational) budgets has given new impetus to the ability to prioritize funding in an efficient and effective manner. While there are multiple methodologies to pursue these budget reductions, there is a tendency for both allocations and reductions of funding to be top-down, i.e., Organization-driven, with a less defensible framework to support Program priorities. This approach heightens the risk that Program goals and objectives will be negatively impacted by underfunding, owing to lessened consideration of Program-specific environmental factors...2013Methods and Models II
Quantifying Uncertainty in Early Lifecycle Cost Estimation (QUELCE)Robert Stoddard, Robert Ferguson, Dennis Goldenson, Jim McCurley, Dave ZubrowEarly lifecycle cost estimation continues to grow in importance within both industry and the U.S. Government. In the commercial arena, shrinking business and product development cycles demand more rapid and early cost estimates. Within the U.S. Defense Department, cost estimates are now mandated prior to Milestone A approval within the DoD Acquisition Lifecycle. One of the primary challenges with early lifecycle cost estimation remains the incongruence of input data required by existing cost estimation models and the data available early in the lifecycle. Recent research into capability-based cost estimation offers improvements but ignores much of what may be viewed as program execution change drivers. These program execution change drivers embody the root causes of changes in program performance leading to changes in the cost of the program. As a result, our team developed the QUELCE method as an innovative approach to elicit domain expert knowledge about the possible future scenarios of change driver behavior for a given program...2013Methods and Models II
Sensitivity Analysis in Cost-Benefits AnalysisSteven IkelerThe Army has been using Cost-Benefits Analysis (C-BA) for all new requirements with cost impact for two years. As with the Analysis of Alternatives (AoA), a sensitivity analysis of the C-BA results is required. The C-BAs frequently use weighted decision matrices to combine the benefits into a single Cost-Benefit Index. This greatly simplifies the sensitivity analysis and facilitates allocation of cost to the individual benefits. The author reviewed the sensitivity analysis in 80 C-BAs. This session will address some of the techniques for presenting sensitivity analysis, the basic math behind the sensitivity analysis and allocating cost to benefits and generalizing the results to more complicated problems.2013Methods and Models II
Probabilistic Technology Investment Ranking SystemPeter FredericThis paper describes the Probabilistic Technology Investment Ranking System (PTIRS), a new multi-tool cost estimating system developed for the NASA Environmentally Responsible Aviation (ERA) Project at Langley Research Center. The ERA Project is charged with developing new technologies for commercial transport aircraft that will reduce fuel consumption, emissions, and noise. The PTIRS tool assesses the integrated, end-to-end benefits of a new technology and allows the benefits to be weighed against the cost of maturing, certifying, and ultimately implementing the technology in a generation-after-next (N+2) subsonic transport system. The tool is both deterministic and probabilistic: it allows inputs with ranges of uncertainty described as statistical distributions and it uses Monte Carlo simulation to produce results with statistically described ranges of uncertainty. The tool produces a two part answer: 1) the life cycle cost impact and noise, emission, and performance benefit of implementing and operating the new technology, and 2) the cost and time required to mature and certify the technology to a readiness level appropriate for full-scale development...2013Methods and Models II
Rapid Generation and Optimisation of Ship Compartment Configuration based on Life Cycle Cost and Operational EffectivenessAidan Depetro, Rhyan HoeyIt has been well established that the majority of Life Cycle Cost (LCC) is incurred during the in-service period. Among other factors, this is strongly linked to the design of the ship and the decisions made during the early design phase. In particular, compartment configuration can have a significant effect on LCC. Poorly considered compartment configuration and hull selection can result in hydrodynamic inefficiencies which significantly increase energy consumption and hence fuel costs. Associated space limitations, inadequate or non-existent removal routes and other accessibility problems may result in expensive equipment overhaul and replacement procedures, invasive removal methods, longer maintenance availabilities and increased maintenance costs. Current design methods and decision analysis techniques focus mainly on the trade-off between operational effectiveness and acquisition cost rather than LCC...2013Methods and Models II
Data-Driven Estimating - Quantifying Electronics Complexity Using Public DataF. Gurney ThompsonThe parametric cost estimation industry is being challenged to increase the defendability of their estimates, mainly through comparison to analogous data. However, companies are more protective than ever of their high quality cost data. While this data is often obtainable, it usually comes with strings attached (non-disclosure agreements, etc.) that preclude its usage in defending your estimate. At the same time, government and military databases with cost and technical data are being exposed to the public, and while the noise introduced by granularity issues and the mapping process of this data presents its own set of challenges, this trend also contains the key to a solution...2013Methods and Models II
Affordability Analysis: The Role of Process, Cost and ROI Modeling In Improved Program PerformanceDan GalorathAffordability analysis as part of decision making may be the biggest edge of the decade for both commercial organizations and DoD / government organizations. Affordability has been addressed in the past but never to the level it is today. In an IT context companies struggle to increase profits and often view IT as a necessary evil: one that consumes resources rather than contributes to the bottom line. However, IT can be a significant contributor when IT decisions are made after modeling affordability in multiple dimensions.2013Methods and Models II
Applying Cost Analysis to the DoDAFMichael ButterworthCost analysts in the aerospace community today are being challenged to accurately estimate new and on-going program efforts that will be integrated into the Department of Defense Architecture Framework (DoDAF). The DoDAF provides a foundational framework for developing and representing architecture descriptions that ensure a common denominator for understanding, comparing, and integrating architectures across organizational, Joint, and multinational boundaries. It establishes data element definitions, rules, and relationships and a baseline set of products for consistent development of systems, integrated, or federated architectures...2013Methods and Models II
The Role of Value Engineering In Affordability AnalysisQuentin Redman, Robert Koury, Joseph Bobinis, Paul Tuttle, Kevin Woodward, Hein B.A. de JongThe purpose of this white paper is to describe the role Value Engineering plays within the affordability process. The paper is not a step by step "How To Conduct or Execute" Value Engineering (VE) but is a discussion of the context, input, setup, execution hints, and output of Value Engineering in support of conducting affordability analysis and management. As such it is important to understand the concept of affordability within the Systems Engineering paradigm. This paper is designed to provide insights, lessons learned, and suggestions for using Value Engineering in the affordability process...2013Methods and Models II
Adding Process Tailoring to Product Size for Better Cost EstimationDavid Bloom, Chris FudgeGenerally, when one is looking to create a parametric model to predict the costs of an activity, one looks to size relationships associated with that activity. For example, if one were to develop a parametric model for estimating the development of a Field Programmable Gate Array (FPGA), then, because most of the work to complete an FPGA revolves around software code development, building a parametric model around new code, modified code and reused code (with other size relationships) would probably be a good place to start...2013Parametrics
Evaluating Cost Relationships with Nonparametric StatisticsCaleb FlemingParametric statistics are often used to analyze data, evaluate hypotheses, and determine the significance of a given set of inputs. Most commonly, cost estimators use the following parametric testing measures for significance: Pearson's product-moment correlation, z-tests, t-tests, and Analysis of Variance (ANOVA) tests. While each of these tests is subjectively appropriate to evaluate varying scenarios, each test also objectively depends on the validity of a fixed set of assumptions. Though the assumptions differ by test, all root themselves in the general idea that the data is contained within an adequately large sample and follows a particular probability distribution with estimable parameters...2013Parametrics
Fit, Rather Than Assume, a CER Error DistributionShu-Ping HuAnalysts usually assume a distribution (e.g., normal, log-normal, or triangular) to model the errors of a cost estimating relationship (CER) for cost uncertainty analysis. However, this hypothetical assumption may not be suitable to model the underlying distribution of CER errors. A distribution fitting tool is often used to hypothesize an appropriate distribution for a given set of data. It can also be applied to fit a distribution to...2013Parametrics
Developing a Military Aircraft Cost Estimating Model in KoreaSung Jin Kang, Dong Kyu KimA parametric cost estimating model is very useful for the weapon acquisition process. But it requires validated and normalized historic cost data to develop the Cost Estimating Relationships (CERs). Statisticians seek hundreds of data points to assure a good statistical fit between CERs and the supporting data points comprised of cost, technical, and program information...2013Parametrics
Improving Program Affordability Through The Application of Data AnalyticsDavid Wang, Austin LeeIn the current budgetary environment, program affordability is a key concern. Improving program execution and minimizing schedule delays and cost growth are keys to improving program affordability. However, there is a lack of quantitative analysis of schedule risks, and a lack of understanding of the root causes of schedule delays and cost growth. Consequently, most common affordability improvement suggestions, centered on relaxing test and verification requirements and streamlining engineering processes, may not address the true root causes. The effectiveness of these affordability improvement suggestions is uncertain...2013Parametrics
Portfolio Management: A Parameter ApproachAndy NichollsMy paper outlines one high-level method of managing a portfolio of future projects from pre-concept into acquisition phases, examines current practices and toolset requirements, and considers the limitations of the current toolset and the possibility of including probabilistic risk calculations via individual program risk registers...2013Parametrics
On General Purpose Model CredibilityWendy Lee, Evin StumpIn the aero/defense community and elsewhere, cost estimation of proposed large scale hardware projects by prospective bidders or sponsors is often done using general purpose parametric models, i.e., models based on statistical analysis of a collection of historical data. Of interest in this paper are the parametric models that provide estimates of costs of hardware development and production for a wide variety of hardware, especially hardware that may be used in many situations and environments...2013Parametrics
Benefits of Integrating Schedule and Cost Risk AnalysisRafael HartkeProject managers have the arduous task of running their projects within budget and deadlines, all while facing a wide range of risks such as price and labor cost uncertainty, supplier performance and reliability, technological challenges, weather, and currency exchange rates. Thus, assessing the likelihood and impact of risks is of utmost importance to manage not only the project itself, but also the expectations of the various stakeholders...2013Risk I
Modeling the Risk and Uncertainty of Inflation Rate ProjectionsBrian Flynn, Peter Braxton"Experience in controversies such as these brings out the impossibility of learning anything from facts till they are examined and interpreted by reason; and teaches that the most reckless and treacherous of all theorists is he who professes to let facts and figures speak for themselves, who keeps in the background the part he has played, perhaps unconsciously, in selecting and grouping them, and in suggesting the argument post hoc ergo propter hoc." Alfred Marshall...2013Risk I
Probabilistic Mass Growth UncertaintiesDarren Elliott, Eric PlumerMass has been widely used as a variable input parameter for Cost Estimating Relationships (CER) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBE) of masses of space instruments as well as spacecraft, for both earth orbiting and deep space missions at various stages of a project's lifecycle...2013Risk I
Deciphering JCL: How to use the JCL Scatterplot and IsocurvesEric DrukerAs the use of integrated cost and schedule risk analysis (ICSRA) methodologies rapidly expands across U.S. government programs there has been increasing confusion on how to interpret its seminal result: the Joint Confidence Level scatterplot and its associated isocurve. Over the past few years, the following statements have been made about JCL:...2013Risk I
Understanding the Results of an Integrated Cost/Schedule Risk AnalysisJames Johnson, Darren ElliottThe recent rise of integrated risk analysis methods has created a new opportunity for complex projects to understand the dynamic inter-relationship of cost, schedule, and risk. NASA has been implementing Joint Confidence Level (JCL) analysis of cost, schedule, and risk for all major projects since 2009 in an attempt to proceduralize and codify the requirements for an integrated risk analysis product...2013Risk I
Top-level Schedule Distribution ModelsPeter FredericIn the initial stages of a development project, it is sometimes necessary to build a summary-level schedule for planning and budgeting purposes before the day-by-day details of the project are fully defined or understood. However, when uncertainty assessments are performed on schedule networks containing few activities, the distribution forms chosen for individual activity durations can have a significant impact on the overall results. It is therefore important to choose uncertainty distribution forms that accurately represent the behavior of the sub-network of activities represented by each summary activity...2013Risk I
Analysis Schedules: Danger at a Higher LevelJustin HornbackA government organization charged with evaluating the adequacy of schedule and funding of large projects using project-provided schedules and cost estimates receives a detailed (~2000 line) schedule capable of analysis with limited modification or summation. Analysis conducted using the detailed schedule revealed interesting results when compared with the high-level schedule used in previous evaluations of the same project, when a detailed analysis was not available...2013Risk I
Cost of Mission Assurance for Space ProgramsErik Burgess, Joe Frisbie, Michelle Jones, Chad KrauseStandards and strategies for assuring successful space missions have evolved in recent years, with some programs moving toward traditional government standards and others moving to industry's commercial standards. This has prompted questions from several government leaders about the cost of mission assurance. Although there are decades of engineering studies, test data, and on-orbit failure data to support detailed evaluation of the technical merits of various approaches, there is very little on the cost side of the equation. The National Reconnaissance Office Cost Analysis Improvement Group (NRO CAIG) has begun to address this by focusing on two of the more costly areas of mission assurance: environmental testing and high-reliability electronic parts...2013Risk I
Joint Cost Schedule Risk and Uncertainty HandbookDuncan Thomas, John Fitch, Alf Smith, Jeff McDowellThe Naval Center for Cost Analysis (NCCA) has created a Cost Schedule Risk and Uncertainty Handbook (CSRUH) that clearly presents simple, well-defined cost risk and uncertainty analysis processes that are repeatable, defendable and easily understood. The CSRUH is based upon the Air Force Cost Analysis Agency (AFCAA) Cost Risk Uncertainty Handbook published in 2007. It updates key processes to address cost risk and uncertainty methods with an emphasis on providing guidance to capture the impact of schedule uncertainty and the risk register while avoiding "double counting"...2013Risk II
A Step-Wise Approach to Elicit Triangular DistributionsMarc GreenbergAs the federal government acquires less mature, more advanced and more complex systems, there is an ever-increasing burden on the cost analyst to employ methods of eliciting requirements, schedule and cost uncertainties from one or more subject matter experts (SMEs). Arguably, the most common technique a cost analyst uses today to elicit such data is to ask each SME for the lowest, most likely and highest value which, consequently, produces a triangular distribution...2013Risk II
Robust Default Correlation for Cost Risk AnalysisChristian SmartCorrelation is an important consideration in cost risk analysis. Exclusion of correlation from cost risk analysis results in the de facto assumption that all risks are independent. The assumption of independence leads to significant underestimation of total risk. However, figuring out the correct correlation values between work breakdown structure elements can be challenging. For instance, it is difficult to estimate the exact correlation value between the structures and thermal protection subsystems in a cost risk estimate...2013Risk II
Use of the Risk Driver Method in Monte Carlo Simulation of a Project ScheduleDavid HulettIdentifying the root causes of project schedule and cost risk requires that the risk to the project schedule is clearly and directly driven by identified and quantified risks. In the Risk Driver Method the risks from the Risk Register drive the simulation. (As a side note, we find that the Risk Registers are not complete: during the interviews to collect risk data, the interviewees introduce important risks that are, surprisingly, missing from the Risk Register.) The Risk Driver Method differs from older, more traditional approaches in which 3-point (low, most likely and high) estimates of the activity durations are applied directly to activity durations...2013Risk II
Base Realignment and Closure (BRAC) Savings and Acquisition RiskPeter Braxton, Kevin Cincotta, Richard LeeThe Government Accountability Office (GAO) recently released the study Military Base Realignments and Closures: Updated Costs and Savings Estimates from BRAC 2005 (GAO-12-709R, June 29, 2012). Its appendices contain a wealth of risk data, with initial estimates (2005 BRAC commission) and final costs (Fiscal year 2011 DOD budget) for 175 distinct BRAC initiatives. Applying an innovative method for modeling within-program risk and uncertainty using cross-program data, this paper derives cost growth factors (CGFs) and coefficients of variation (CVs) for BRAC initiatives. Furthermore, the pattern in these data is astoundingly similar to that found in major defense acquisition program (MDAP) data, a strong confirmation of this modeling approach (meta analysis)...2013Risk II
Cost Contingency Analysis using PolytopesBohdan KaluznyContingency is defined as a possibility that must be prepared for: an event that may occur but is not likely or intended. In the realm of cost estimation, contingency refers to a reserve above and beyond a baseline estimate that would be tapped if one or more unexpected events led to higher program costs. Ideally, the selection of a contingency amount should be based on achieving a particular level of confidence derived from cost uncertainty and risk analyses, but often a mere rule-of-thumb percentage (e.g. 15% of the baseline estimate) is applied...2013Risk II
Correlation Matrices RevisitedSteven IkelerIn a cost risk analysis, risk distributions are sometimes assigned at a low level and then combined, most commonly using Monte Carlo simulation. When using this technique, it is important to use a realistic correlation matrix. It is usually impractical to know all the pairwise correlations. This session will address some of the techniques for completing a credible correlation matrix at differing levels of input information and certainty. Additionally, the session will show examples and review the mathematics behind legitimate correlation matrices. Past sessions have discussed these topics and typically deal with correction techniques for expert-based correlation matrices. This session will be different since it deals less with correcting expert based correlation matrices and more with dealing with uncertainty in the correlations or incomplete correlation matrices. It will also address an extension of the problem to a parametric analysis problem.2013Risk II
Optimization of Cost and Performance in Complex Systems Using System Dynamics ModelingMichael Polly, Benjamin HarrisAnalysis of trade spaces between cost and performance is critical to win new contract pursuits in today's highly competitive environment. The overall goal in a program is to design and produce a system that maximizes both performance and affordability. In attempting to balance cost and performance, traditional cost models only allow for experimentation with one variable at a time. This can be very cumbersome on large systems, and inhibits how much of the trade space can be explored under time constraints. Altering one variable at a time also limits the number of cost estimating relationships that can be incorporated into the analysis...2012Earned Value Management Track
Joint Analysis of Cost and Schedule (JACS) - A New Tool for JCL AnalysisAntonio Rippe, Rey CarpioNASA has recently implemented a policy that requires a program/project to be approved at a 70% Joint Budget and Schedule Confidence Level (JCL), which is to be generated from models that integrate cost and schedule. To comply with the policy, a tool framework is needed that can conduct a risk and uncertainty analysis on schedule logic as well as integrate cost uncertainty analysis to account for phasing and costs that are dependent on overall duration. NASA is currently using several platforms, but has seen some issues in extensibility, cost integration, and overall performance...2012Earned Value Management Track
"The Answer is 5": Observations on Cost & Schedule in Small Defense ProgramsBrian Fersch, Wesley Tate, Colleen LeonardCost & schedule overruns have been an enduring issue in government acquisitions. Our largest programs are typically dissected and their flaws illuminated with 20/20 hindsight. DoD's smaller programs, though comprising the majority of the budget and quantity of programs, do not get this same attention. Over the course of the past year this study team collected historical and current cost and schedule data on numerous programs at similar points in each program's acquisition lifecycle. The end result is a database that allowed the team to examine how cost estimates and schedule expectations changed from year to year for the same events...2012Earned Value Management Track
WBS Development: Rules, Aberrations and MIL-STD-881C Implementation ChallengesMichael MetcalfIssued on October 3, 2011, Military Standard 881C (MIL-STD-881C) provides the framework for developing Work Breakdown Structures (WBSs) for use throughout the acquisition process, as well as a standard set of structures that fit eleven commodity types. These WBSs are used for management, system engineering, and cost estimation by a variety of stakeholders. Though comprehensive, the MIL-STD does not address some of the complex details encountered when developing a weapon system WBS, and there are many special cases and exceptions that could not be captured in the document...2012Earned Value Management Track
Integrated Project Management (IPM) - Transforming Data into InformationAndrea Mozzo, Bruce KoontzEarned Value Management (EVM) integrates cost, technical, and schedule performance. EVM is not just tracking Schedule Performance Index (SPI) and Cost Performance Index (CPI), but has been transformed into Integrated Project Management (IPM). IPM involves three key areas: Managing cost, schedule, and technical performance within constraints; initiating effective cost, schedule, and performance tradeoffs when constraints are not achievable; and continually evaluating progress to predict and mitigate problems...2012Earned Value Management Track
Critical Chain Applied to an MRO - Surface Repair FacilityHoward RainerThe project management methodology of Theory of Constraints, called Critical Chain, is examined for application to Scheduling, Planning and Control (SP&C) for a defense-related Maintenance Repair Operation (MRO). The opportunities for Critical Chain, as an extension of Theory of Constraints (TOC), have been reported for well over a decade. With this proven record comes a proposal for similar application in the SP&C at the Surface Repair Facility (SRF) at Cecil Field, Florida. The SRF has established reporting systems such as Earned Value Management (EVM) - mentioned here to distinguish the differences between execution and reporting in SP&C.2012Earned Value Management Track
Successful Implementation of an Over Target Baseline/Schedule from a Government PerspectiveJoseph C. Annunziato, Daniel W. DonaldsonIn recent years, Over Target Baseline/Schedule implementations have been accomplished with varying degrees of success. Using the Over Target Baseline (OTB) and Over Target Schedule (OTS) Handbook as a guide, this presentation will address the proper way to implement an Over Target Baseline/Schedule from a Government perspective. The purpose is to provide both government and industry professionals a general understanding of the processes and decisions that must be considered when implementing an OTB and to assist the earned value management (EVM) community in the understanding and implementation of this important management tool. It will also incorporate examples and lessons learned from recent OTB/OTS implementations...2012Earned Value Management Track
Accepted Standards and Emerging Trends in Over Target Baseline (OTB) ContractsSimon DekkerOver Target Baseline (OTB) projects or programs are those that have run significantly over cost and require formal reprogramming (essentially a complete replanning of the project) to help the contractor regain management control over the effort. The OTB process has been well documented and has become an established part of Earned Value Management practice. Much of the literature to date has focused on OTBs from the contractor perspective, including the steps to take in order to propose and implement an OTB, and the proper channels and occasions for engaging the customer in the process...2012Earned Value Management Track
EVM for the Rest of UsJavier SloninskyEarned Value Management (EVM) is the practice of evaluating cost and schedule data to measure productivity and progress throughout the lifecycle of a project. EVM, however, often has the stigma of being overly complex and burdensome to implement in a useful way. This session will discuss best practices in implementing a simple yet robust earned value management solution for organizations that benefit from progress measurement and the ability to forecast cost performance, but don't need to comply with ANSI 748, the EVM standard often used by government agencies and contractors. We'll discuss topics including the integration of required data, developing key performance indicators (KPI) and metrics, application of rules of credit, updating of percent complete, and automated EV reporting.2012Earned Value Management Track
A New EVM Performance Index: The MRPIMichael NosbischIt has long been understood that the accuracy of the cost performance index (CPI) is closely tied to how management reserve (MR) has been historically utilized on a specific project/program. However, a formal assessment of MR usage is still not consistently performed in conjunction with CPI calculation and subsequent analysis. This presentation will offer a potential solution to this issue, by proposing a "Management Reserve Performance Index," or "MRPI." Once calculated, the author will demonstrate how the MRPI can be then used to develop a "risk-adjusted IEAC" that provides a more accurate assessment of a project's or program's overall cost performance.2012Earned Value Management Track
From Bid Package to Detailed EVM Baseline in One Easy StepKaren StiffYour company has put together a cost estimate and basic schedule along with the technical elements as a bid package for a Request for Proposal for a Government contract requiring Earned Value Management (EVM). The proposals have been evaluated, discussions have been held, and now it's time for the award. Congratulations, your company has won the contract! Now the manager assigned to the program has the task of setting up an EVM baseline for the program. Why is it so hard to do a detailed schedule and EVM baseline, when you've got the schedule and cost that the company bid? Why doesn't this baseline look in any way similar to the bid package cost and schedule? Should it look similar?...2012Earned Value Management Track
Touchpoints between the SCEA Cost Estimating Body of Knowledge (CEBoK®) and the Project Management Institute (PMI) Body of Knowledge (PMBoK)Leon HalsteadThe Government Accountability Office (GAO), in its publication of the "GAO Cost Estimating and Assessment Guide" in May of 2009, focuses extensively on the importance of Cost Estimation and its contribution to overall project success. While this publication offers specific technical examples of cost estimation and analysis capability, e.g. pre-Program approval cost estimates/baselines, Affordability Analysis, et al, there is less focus on how these technical examples "fit" into the larger Project Management context...2012Informational Sessions
A Structured Approach to CCEA® CertificationMichael Mahoney, Rick Beavers (Presenter)The structured approach to SCEA certification details the methods used by Lockheed Martin Chelmsford to prepare employees for the CCEA examination. In 2010, three of us set out to earn certification at the CCEA level and all three of us passed on the first attempt. The session will introduce a generic schedule for studying the CEBoK modules including the time allotted for each, an approach to taking sample tests, and lessons learned. We'll discuss what worked and what didn't and share tips and tricks.2012Informational Sessions
The EVP Certification: A Bridge Between SCEA and AACEMichael NosbischIn the 1990s, SCEA and AACE International, formerly known as the Association for the Advancement of Cost Engineering, not only were both members of the now defunct Joint Cost Management Association, but they also maintained a cooperative agreement that intended for members of the two associations to "benefit from increased knowledge of the activities and services available from the other organization." While that agreement unfortunately did not extend beyond 1995, there has recently been a joint effort to renew a similar type of agreement...2012Informational Sessions
The Fractal Nature of Cost Risk: The Portfolio Effect, Power Laws, and Risk and Uncertainty Properties of Lognormal DistributionsChristian B. SmartCost risk can be added to the list of the many phenomena in nature that follow a power-law probability distribution. Both the normal and lognormal, neither of which is a power-law distribution, underestimate the probability of extreme cost growth, as shown by comparison with empirical data. This situation puts the widely debated portfolio effect into further dispute.2012Journal of Cost Analysis and Parametrics
Prediction Bounds for General-Error-Regression Cost-Estimating RelationshipsDr. Stephen A. BookEstimating the cost of a system under development is essentially trying to predict the future, which means that any such estimate contains uncertainty. When estimating using a cost-estimating relationship (CER), a portion of this uncertainty arises from the possibility that the cost-estimating form to which regression analysis is applied may be the incorrect one. 2012Journal of Cost Analysis and Parametrics
Comparison of Cumulative Average to Unit Learning Curves: A Monte Carlo ApproachTrevor P. Miller, Austin W. Dowling, David J. Youd, Eric Unger, Edward D. WhiteCumulative average and unit cost learning curve methodologies dominate current learning curve theory. Both models mathematically estimate the structure of costs over time and under particular conditions. While cost estimators and industries have shown preferences for particular models, this article evaluates model performance under varying program characteristics.2012Journal of Cost Analysis and Parametrics
Here, There Be Dragons: Considering the Right Tail in Risk ManagementChristian B. SmartThe portfolio effect is a common designation of a supposed reduction of cost risk achieved by funding multiple projects (the portfolio) that are not perfectly correlated with one another. It is often relied upon in setting confidence-level policy for program or organization budgets that are intended to fund multiple projects. The idea of a portfolio effect has its roots in modern finance, as pioneered by 1990 Nobel Memorial Prize in Economic Sciences recipient Harry Markowitz (1959). 2012Journal of Cost Analysis and Parametrics
A Case Study on Target Cost Estimation Using Back-Propagation and Genetic Algorithm Trained Neural NetworksAdil Salam, Fantahun M. Defersha, Nadia F. Bhuiyan, M. ChenCost estimation of new products has always been difficult, as only a few attributes are known. In these situations, parametric methods are commonly applied, using an a priori determined cost function whose parameters are estimated from historical data. Neural networks, in contrast, are nonparametric, i.e., they attempt to fit curves without being provided a predetermined function. 2012Journal of Cost Analysis and Parametrics
Enhanced Scenario-Based Method for Cost Risk Analysis: Theory, Application, and ImplementationPaul R. Garvey, Brian J. Flynn, Peter J. Braxton, Richard C. LeeIn 2006, the scenario-based method was introduced as an alternative to advanced statistical methods for generating measures of cost risk. Since then, enhancements to the scenario-based method have been made. These include integrating historical cost performance data into the scenario-based method's algorithms and providing a context for applying the scenario-based method from the perspective of the 2009 Weapon Systems Acquisition Reform Act. 2012Journal of Cost Analysis and Parametrics
Preliminary Study of Palm-Oil Biodiesel Life Cycle Cost Analysis in IndonesiaSidhartha Sahirman, Dr. Saparso, Agus Sarjito, Mokhtar Awang, Lr. Shaharin A. SulaimanRecently, biodiesel has been gaining popularity and acceptance as an alternative fuel for diesel engines. In the last few years, the Indonesian government has taken a special interest in the development of this renewable energy, the use of which can help reduce oil imports. Biodiesel has a bright future in this country, as Indonesia is the world's largest producer of crude palm oil (CPO) - a desirable feedstock for biodiesel production. Biodiesel has the potential to become a significant industry sector in Indonesia...2012Life Cycle Cost
Life Cycle Costs in CBANiatika Griffin, Ronako CarsonCost Benefit Analyses (CBAs) are of growing importance to the acquisition process. The Vice Chief of Staff for the Army signed a memo dated December 20, 2009 stating that, in this era of constrained resources, Army leaders need to be responsible stewards of government funds. Any unfunded new requirement and/or newly expanded program proposal must be accompanied by a thorough CBA. The CBA must identify the total cost proposal to the Army in the form of life cycle cost, the benefits that will result, and the bill-payers that would be used to pay for the unfunded or new requirement...2012Life Cycle Cost
Designing a Conceptual Framework for Estimation and Analysis of Total Ownership CostF. Gurney Thompson III, Robert KouryIn recent years, the push for greater efficiency and productivity in Defense spending has yielded an increased focus on affordability analysis. Understanding and estimating Total Ownership Costs (TOC) is key in assessing affordability, and the cost community must adapt to support TOC estimation. This paper discusses the development of a conceptual framework for estimating TOC in support of a broader audience, from the acquisition community to program managers and even as a decision support tool for entities such as Congress, DoD Financial / budgetary community, and G-8 Program Analysis & Evaluation...2012Life Cycle Cost
DCARC Contractor Sustainment Cost Data CollectionSandra B. EnserIn the past, Contractor Sustainment (including Contractor Logistics Support (CLS), Contractor Performance Based Logistics (PBL), and other contractor sustainment efforts) was omitted from DoD centralized O&S databases, or was captured as an annual total with little or no supporting detail. For many programs, Contractor Sustainment costs are a major cost driver. They represent an increasing percentage of operating costs. As DoD faces major budget reductions, valid and timely Contractor Sustainment cost data supports analysis which enables Services to use declining operations funds effectively...2012Life Cycle Cost
Tactical Wheeled Vehicle (TWV) Fuel Economy Improvement Breakeven AnalysisRaymond KleinbergWith the rising costs of and attention towards fuel usage in the Department of Defense (DoD), efforts are being made to procure more fuel efficient vehicles and modernize the Army's existing Tactical Wheeled Vehicle (TWV) fleet. This research is part of an ongoing effort to help program managers make decisions regarding how much investment can be made into new technologies in order to see benefits within the estimated useful life of individual systems and the fleet as a whole.2012Life Cycle Cost
Discrepancy Report Prioritization and Software Maintenance ImpactsJennifer Woolley, Kyle ThomasIn the current budget-constrained environment, programs and organizations throughout the Intelligence Community (IC) are searching for new strategies to reduce costs. Rather than sacrificing new capabilities, maintenance is often one of the first areas of reduction. Many organizations perform maintenance of developed software based on the submission of Discrepancy Reports (DRs), which are ranked according to their level of severity/importance. One approach to cutting maintenance costs involves focusing only on higher-level DRs while leaving lower-level DRs postponed or unaddressed. This strategy allows for a reduction in personnel because fewer DRs must be completed...2012Life Cycle Cost
Software Sustainment: Pay Now or Pay LaterArlene MinkiewiczWith today's budget constraints the Department of Defense (DoD) is likely to build fewer new systems. In light of this, the systems that we have today will need to be stretched further. This means more time and effort will be going into maintenance. And since many systems fielded today rely on software to deliver much of their mission-critical capability, more time and money will be devoted to software sustainment...2012Life Cycle Cost
How VAMOSC VIEWS Can Help You!Brian Welsh, Walt Cooper, Paul Hardin, Elizabeth Koza, Robert NehringSeveral recent developments underscore the importance of identifying and analyzing operating and support (O&S) costs for weapon systems. This brief will present the VAMOSC VIEWS, innovative graphical displays of O&S cost data that serve as an invaluable tool for the cost analyst. In response to current economic conditions, we are entering an environment of reduced budgets within the Department of Defense (DoD). Legislative initiatives, starting with the Weapon Systems Acquisition Reform Act (WSARA) of 2009, are aimed squarely at increasing the visibility and use of O&S costs in weapon systems decision-making...2012Life Cycle Cost
Estimating Software Maintenance Costs for U.S. Army Weapons SystemsCheryl Jones, James Judy, Brad ClarkThe emerging defense economic environment is characterized by scarce resources, extended system life cycles, and evolving mission capability requirements. As a result, the Army's investment in software maintenance, sustaining engineering, and operational support efforts is coming under increased scrutiny by decision makers at all levels. Of particular interest is the validity and defensibility of the estimates used to determine, allocate, and evaluate the value of software maintenance funds...2012Life Cycle Cost
Software Maintenance Data Collection and Estimating ChallengesVanessa Welker, Peter Braxton, Wilson Rosa, Corinne Wallshein, Joe DeanEntering a period of fiscal austerity, it becomes more important than ever to estimate and consider operating and support (O&S) costs, which represent the lion's share of life cycle cost (LCC) for most platforms, during acquisition. Given the ubiquity of software in today's complex programs, a key component of O&S is software maintenance. This paper presents the results of a research study co-sponsored by the Air Force Cost Analysis Agency (AFCAA) and the Naval Center for Cost Analysis (NCCA) to collect software maintenance data from government support activities and development contractors to enable high-fidelity cost estimates for software maintenance...2012Life Cycle Cost
Job Satisfaction: The Link to Retention and the Correlation to Age, Gender and Organizational PositionDarrin L. DeReusIn an attempt to balance resource availability and workload, leaders in the United States Air Force have attempted to manage budgets by fluctuating manpower levels. There is minimal research on multiple affiliations (active duty military, government civilians and contractors) and on the effects of manpower reductions on the organization. This study collected data on career anchors and satisfaction levels to find the similarities and differences of multiple affiliations in the United States Air Force. The results of this study showed that there were correlations between career anchors and satisfaction scores...2012Management
Using Treasury Securities to Develop Inflation IndicesC. Tyler Cunningham, Joseph ParisiThis paper examines inflation assumptions in Department of Defense cost estimates. Professional guidance allows and encourages the development of custom inflation indices, but there are few resources detailing the methodologies to develop those indices. We address that need by laying out detailed steps to develop defensible, custom, risk-adjusted inflation indices using techniques that leverage analysis of the spread and volatility between traditional US Treasury Securities and Treasury Inflation-Protected Securities. 2012Management
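The Treasury-spread technique this abstract describes can be sketched numerically. The yields and function names below are illustrative assumptions, not figures from the paper: the market-implied ("breakeven") inflation rate is approximated as the spread between a nominal Treasury yield and the real yield on a TIPS of comparable maturity, then compounded into a simple index.

```python
# Hedged sketch of the breakeven-inflation idea; all numbers are
# illustrative placeholders, not data from the paper.

def breakeven_inflation(nominal_yield, tips_yield):
    """Approximate market-implied inflation as the spread between a
    nominal Treasury yield and the real yield on a comparable TIPS."""
    return nominal_yield - tips_yield

def inflation_index(rate, years, base=1.0):
    """Compound a constant annual rate into a simple index series
    (element t is the escalation factor applied to year-t costs)."""
    return [base * (1 + rate) ** t for t in range(years + 1)]

rate = breakeven_inflation(0.042, 0.018)   # 2.4% implied annual inflation
index = inflation_index(rate, 5)           # six index points, base year = 1.0
```

A risk-adjusted variant, as the paper suggests, would widen this point estimate using the observed volatility of the spread rather than treating it as a constant.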
A Holistic Approach to Multiyear ProcurementsAnn Hawpe, Travis WinsteadMulti-year contracts have the potential to save the Government a substantial amount of money if they are executed on stable programs in an environment with stable requirements and funding. With the current economic pressure for defense programs to do more with less, multi-year contracts are a good option for programs that meet these criteria. The largest amount of savings is the result of procuring a larger quantity of parts than would be procured with a year-over-year contract structure, thus realizing economies of scale. Savings are also captured when the labor hours to develop and submit year-over-year proposals are factored into the calculations...2012Management
Social Media ApplicationsMarc Wear, Michelle Ehlinger, David Pearson, Sara WiseThis paper explores the impact of adopting social media tools in the work environment as a means to improve communication, collaboration, and productivity for cost and earned value analysts. Public and private entities have taken steps to adopt social media platforms to benefit their organization; this paper demonstrates how cost analysts can implement these tools to improve methods and processes...2012Management
Building a DHS Cost Estimating & Analysis Center of ExcellenceKatie Geier, Colleen Craig, David Brown, Kevin CincottaThe Department of Homeland Security (DHS) Cost Estimating & Analysis (CE&A) Center of Excellence (COE) serves as the Department's primary advocate and resource for cost estimating and analysis. Formed in October 2011, the CE&A COE's primary goal is to take a proactive, streamlined approach to improving the Department's resources and skills in cost estimating and analysis. This presentation highlights the roles and responsibilities of CE&A COE and summarizes the work accomplished to date and planned future efforts for accomplishing the following objectives:...2012Management
Cloud Computing: Federal Mandates and the DoDHeather Nayhouse, Eric LumsdenThe Federal CIO has laid out a plan to implement cloud computing in all federal agencies and consolidate 800 data centers by 2015. The Office of Management and Budget has required agencies to default to cloud computing solutions wherever those solutions are secure, reliable, and cost-effective. In response, the Department of Defense has launched a number of pilot programs as well as its own strategic plan to foster its deployment of cloud computing systems. Several case studies of cloud solutions within the Department of Defense are already available. The DoD is analyzing these cases for guidance in a larger push into cloud computing systems...2012Management
A Case Study in Broad System Analysis: DoD Spectrum Reallocation Feasibility Study, 1755-1850 MHzBrian Wilkerson, Guenever AldrichWith the increasing demand placed on the electromagnetic spectrum by the proliferation of web enabled cell phones, the President has begun an initiative to sell 500 MHz of dedicated bandwidth from the federal government to the wireless industry. The 1755-1850 MHz band was selected as the first band under this initiative for a study of the cost and technical feasibility of relocating DoD assets that use these frequencies. The study was incredibly broad as all services were involved, and the systems affected include Precision Guided Munitions, Point to Point Microwave, High Resolution Video, Multiple Radio Systems, Electronic Warfare, Air Combat Training, Telemetry, Satellite, and Small Unmanned Aerial Systems...2012Management
Using the Tools of Persuasion to "Sell" Your EstimateJennifer KirchhofferWhat is the difference between a good cost estimator and a great cost estimator? Both have robust estimating skills and knowledge, but only great cost estimators are able to garner acceptance and usage of their estimates from even their harshest critics. The ability to "sell" the estimate is key to the utilization of the estimate in the proper context by the decision makers. Our jobs often entail delivering news that our customers and leaders are not anxious to hear. Many factors work against us as we try to garner support and acceptance of the estimates...2012Management
Capabilities Based Portfolio Analysis (CBPA)Paul Gvoth, Brad EllisTotal cost is no longer the singular driver of defense acquisition program economic decision making. Defense programs and projects must be evaluated according to their contributions to overall monetary and non-monetary Return on Investment (ROI) relative to the capabilities of a Family of Systems (FoS), System of Systems (SoS) or as a part of a portfolio of products that together meet documented operational requirements. This is needed because future defense strategies, including Smart Defence are being driven by budget cuts...2012Management
NRO Program Assessments - Best Practices and Lessons LearnedGreg Lochbaum, Linda Williams, Lisa Keller, Ken OdomThe National Reconnaissance Office (NRO) performs program assessments for a variety of purposes: the management of acquisition programs; portfolio analysis for resource allocation; and accountability to NRO's oversight organizations - Execution to Oversight. The NRO Cost Analysis Improvement Group (CAIG) recently stood up a Program Assessment group to continuously improve the NRO's program assessment process to become more efficient, consistent and standardized. This presentation provides the NRO's approach to each level of the Execution to Oversight pyramid...2012Management
How does a $20B Program Become a $30B Program? Lessons Learned in Production Cost ManagementSteve Sheamer, Allen GaudelliHow does a $100M ship become a $200M ship or a $75M fighter jet turn into a $125M fighter jet? As we have seen recently with several high profile acquisition programs, systems integrators and cost estimators continue to struggle when estimating the production costs of development systems. This presentation focuses on production-related cost drivers and why contract incentive-fee structures are often part of the problem, then details steps for better management of these production costs...2012Management
Case Study of Army Cost Management: Managing Variability in Analytical Cost Products for an ACAT ID ProgramTomeka WilliamsArmy Cost Management requires an understanding of the level of commitment the Department of the Army has made to cost management, a commitment that extends beyond the Program Management Office (PMO) through the TRADOC and HQDA levels. Requiring any program that is a new start, or a major upgrade of an existing program, to submit a Cost-Benefit Analysis based on guidance from the VCSA is one of the unique cost management products the Army has instituted. This is now the first step to securing a viable and executable acquisition program within the Department of the Army...2012Management
Trying To Do Too Much with Too Little: How Poor Portfolio Management Can Lead to Schedule Delays and Cost OverrunsChristian SmartGovernment organizations often try to accomplish too much with the resources they are provided. The genesis of new projects in government organizations often begins with the search to spend idle funds. An organization's leadership may discover that it has a relatively small amount of money that can be used to research a new technology or to begin initial development of a new project. New ideas to improve systems, develop better, newer ones, or to advance scientific understanding abound. To sell leadership on these ideas, project management provides success-oriented, optimistic projections...2012Management
IT Service-Based Costing: Standardizing Provisioning and Servicing IT ResourcesEmily Jessen, Paul BrownIn today's economic environment, there is a strong focus on constraining spending of appropriated funds to meet the mission, particularly within the Intelligence Community and Department of Defense (DoD). Part of the DIA's mission includes the delivery of IT products and services. In the past, IT services have been provided to non-Core customers for free, but in the new fiscal environment, the DIA does not have sufficient resources to continue to operate in this manner. In compliance with the Economy Act of 1932 and DoD Financial Management Regulations, the DIA has been taking steps to quantify the IT services it deploys in order to recover costs of those services...2012Management
Modeling R&D Budget ProfilesErik BurgessPrior research sponsored by the National Reconnaissance Office (NRO) and published in the Journal of Parametrics (Summer 2006) has become the principal method for developing space and ground segment budget profiles for NRO programs. Since then, the practice of budgeting to independent cost estimates has been signed into law for the intelligence community. The resulting scrutiny and reliance on budget-phasing models has motivated several improvements that should be relevant to the estimating process for other DoD commodities. The presentation first reviews the mathematical formulation of expenditure profiles that are based on historical data, and how some of the resulting accuracy metrics have been useful in defending annual budgets. New models for space and ground systems are presented. Since the annual budget authority required to support an expenditure stream is heavily dependent on a program's execution rates (i.e., outlay rates), we also present a new approach for estimating what those outlay rates will be. This is a new area of responsibility for most cost estimating organizations, yet its impact on annual budget requests can be significant.2012Management
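The kind of expenditure-profile model this abstract describes can be sketched with a simple S-curve. The Weibull functional form and the parameter values below are assumptions chosen purely for illustration; the actual NRO models are fit to historical program data and may use a different form entirely.

```python
import math

def weibull_spend_profile(total, years, shape=2.0, scale=0.6):
    """Spread a total cost across annual increments using a Weibull
    cumulative expenditure curve, renormalized so the annual shares
    sum to 1. shape/scale are illustrative placeholders, not fitted
    parameters from any historical data set."""
    def cdf(t):
        return 1.0 - math.exp(-((t / (scale * years)) ** shape))
    norm = cdf(years)
    return [total * (cdf(t + 1) - cdf(t)) / norm for t in range(years)]

profile = weibull_spend_profile(100.0, 5)  # five annual increments summing to 100
```

Converting such an expenditure stream into annual budget authority would then require layering on outlay-rate assumptions, which is the new estimating responsibility the abstract highlights.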
Evaluating the Life Cycle Cost and Effort of Project Management for Complex Systems Development ProjectsLeone YoungAcross industries, systems development effort has become more costly and complex in nature over the decades, and the costs of project management associated with systems development effort have shown upward growth as well. The estimation of systems development cost and effort has been studied by many scholars and researchers in several different areas such as hardware, software, integration and systems engineering, but the area of project management cost and effort has had very limited exploration, and the literature remains devoid of project management costing methodologies to date...2012Management
Defense Cost & Resource Center (DCARC) Executive Overview BriefMike AugustusThe OSD stood up an independent office to manage the CCDR (Contractor Cost Data Report) requirement in the late 1990s. Today, that office is known as the Defense Cost and Resource Center (DCARC), which is part of the Office of the Secretary of Defense (OSD) Cost Assessment and Program Evaluation (CAPE). The original charter has been expanded to accommodate the collection of not only cost data but software metrics data as well, from industry on ACAT 1 Major Defense Acquisition Programs and Major Automated Information Systems (MAIS) programs. Today the Cost and Software Data Report (CSDR) requirement spans over 140 MDAP ACAT 1C/1D programs and a large number of MAIS 1AM and 1AC Programs...2012Methods and Models I
One NASA Cost Engineering (ONCE)Eric Plumer, Mike BlandfordThis presentation is the first rollout of NASA's ONCE (One NASA Cost Engineering) database to the SCEA/ISPA community. Concept and early prototyping of the One NASA Cost Engineering (ONCE) database began several years ago, and it has been developed to fully automate the CADRe data in a GUI database allowing for easy search and retrieval of data for the cost community. By automating the retrieval of CADRe data, this tool is greatly improving the space cost community's ability to quickly acquire the data needed to improve the quality and credibility of its cost estimates...2012Methods and Models I
Galaxy Charts: The 1,000-Light-Year View of the DataRobert Nehring, Katharine Mann, Robert JonesThe old adage "A picture is worth a thousand words" is one that we have all heard and often try to use to our advantage. This paper applies the principles of visual display of information as advocated by Edward Tufte and others to develop an innovative graphic that will prove invaluable to cost estimators and consumers of their estimates everywhere...2012Methods and Models I
Building an Agile, Collaborative Environment for Capturing Productivity Based Cost Model DataDavid Bloom, Wanda Grant, Mason WexlerUnderstanding and analyzing productivity within an Aerospace electronics development organization requires continuous oversight and adjustment. For example, in digital electronics, a corollary to Moore's Law says that the capability of digital processor components doubles every 18 months. This would imply that the cost to design a set of digital electronics products to deliver a specific state-of-the-art capability would change (become less) as a function of the ability to incorporate more features on any one component...2012Methods and Models I
United States Marine Corps Logistics Requirements Funding Summary (LRFS) Cost Estimating Tool (CET) - A Quick Cost Estimator for Logisticians Part IICharles GuSimilar to a Life Cycle Cost Estimate (LCCE), the Logistics Requirements Funding Summary (LRFS) captures Integrated Logistics Support (ILS) related costs for a program throughout its life cycle. The LRFS provides visibility into the logistics requirements of a program which is needed during Program Objectives Memorandum (POM) and budget submissions. Additionally, it helps to plan and quantify requirements, identify and defend funding, and serves as the ILS input to the LCCE...2012Methods and Models I
Modeling Potential Cost Savings by Synchronizing Commercial Derivative Acquisition & Lifecycle ProgramsBrad BoehmkeSynergy is defined as "the combined or cooperative action of two or more stimuli for an enhanced effect" (Razzetti, 2009). As Department of Defense (DoD) system acquisition costs continue to grow and commercial derivative programs become more accepted, it is critical that the Air Force look for cost savings in every possible way. With commercial derivative programs becoming more popular in defense acquisition, there may be acquisition programs running concurrently that have system commonalities and could benefit from synchronizing acquisition efforts. By leveraging commonalities in requirements, the Air Force and the Department of Defense may be able to "obtain cost savings through acquisition and logistics planning" (Poulin, 2010)...2012Methods and Models I
Zero Based Review MethodologyMartha Wells, Peter MeszarosThe days of incrementally funding activities based on past performance are over at DIA. Increased budget pressure throughout the federal government has created an environment for improvements to the way activities are funded. DIA is approaching this increasingly challenging budget environment with a new programming methodology - Zero Based Reviews. The Zero Based Review (ZBR) process incorporates programmatic analysis and activity-based costing to approach budgeting from a right sizing methodology. Rather than reviewing the enterprise based on current structure, the ZBR takes a topical approach to program review to determine if the right resources are being used in the right way for the right mission...2012Methods and Models I
Cost Benefit Analysis OverviewDarrell HamiltonThere are many different requirements to do some type of cost and benefit tradeoff analysis as part of the acquisition process, IT investment decisions, performance based logistics decisions and budget change proposals. Most of them have nuanced definitional differences and are known by acronyms such as CBA, EA, BCA, AoA, COA and several other terms. Every one of those analysis requirements is aimed at demonstrating, or helping others decide, what the most reasonable and cost-effective decision is...2012Methods and Models I
Accuracy Matters: Selecting a Lot-Based Cost Improvement CurveShu-Ping Hu, Alfred SmithThere are two commonly used cost improvement curve (CIC) theories: unit cost (UC) theory and cumulative average cost (CAC) theory. Ideally, analysts develop the CIC by analyzing unit cost data. However, it is common that instead of unit costs, analysts must develop the CIC from lot cost data. An essential step in this process is to estimate the theoretical lot midpoints (LMP) for each lot, to proceed with the curve-fitting process. LMP is generally associated with UC theory, where the midpoint is always within the lot. The more general lot plot point (LPP) term is used in the context of both the UC and CAC theories...2012Methods and Models I
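Under unit cost theory, the lot midpoint the abstract refers to can be computed exactly: it is the unit number whose theoretical cost equals the lot's average unit cost. The sketch below assumes the standard power-law learning curve T1 * u^b with b = log2(slope); the function name is illustrative, not from the paper.

```python
import math

def lot_midpoint(first_unit, last_unit, slope=0.9):
    """Exact lot midpoint under unit cost theory: the unit number u*
    such that u***b equals the lot's average of u**b over all units in
    the lot, where b = log2(slope) for a given learning-curve slope.
    Consistent with UC theory, the result always lies within the lot."""
    b = math.log2(slope)
    units = range(first_unit, last_unit + 1)
    avg = sum(u ** b for u in units) / len(units)
    return avg ** (1.0 / b)

mid = lot_midpoint(1, 10, slope=0.9)  # midpoint of a 10-unit first lot
```

Regressing lot average costs against these midpoints then reduces the lot-data problem to an ordinary unit-curve fit, which is the curve-fitting step the abstract describes.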
Developing Standardized Cost Element Structures for the United States Marine CorpsJeremy EdenThis presentation explains the development of a Cost Element Structure (CES) for use in United States Marine Corps (USMC) Logistics Requirements Funding Summaries (LRFS). Industry and the Assistant Commander for Life Cycle Logistics formed an Integrated Product Team (IPT) to create a new CES. The IPT comprised logisticians and cost estimators who worked together over the course of one year to create the new USMC LRFS CES...2012Methods and Models I
Enhancing Excel-Based Cost Models with PivotTable ReportingBlaze Smallwood, Omar MahmoudA major challenge facing DoD cost estimators is developing Program Life Cycle Cost Estimates (PLCCEs) that serve as more than just an acquisition reporting "check-in-the-box" document. The ideal is to simultaneously create a dynamic, flexible tool that the program office can use to support their day-to-day costing needs, from answering a wide-range of cost-related data calls to integrating PLCCE outputs into their Program, Planning, Budgeting, and Execution (PPBE) processes. For DoD cost estimators that utilize Microsoft (MS) Excel to create their PLCCE models, a valuable tool that can be leveraged to achieve these goals is Excel's PivotTable reporting capability...2012Methods and Models I
An Approach to Improving Cost Estimating and Budget Integration in Federal ProgramsMichael Noonan, Kristin Jackman, Eric HongOver the past few years, political standoffs in Washington, D.C. have delayed or prevented passage of Federal Appropriations Bills and significantly reduced Federal spending. As Agency budgets continue to shrink, programs must increasingly compete with each other for a slice of dwindling resources. Program and Departmental managers responsible for resource decisions are demanding more detailed and agile resource justifications which can accommodate multiple budget scenarios. In response, program offices require tools that efficiently translate estimated program costs into flexible and defensible budgets...2012Methods and Models I
A Canadian F-35A Joint Strike Fighter Cost Estimation ModelBohdan L. KaluznyThe F-35 Joint Strike Fighter (JSF) is a single-engine, stealthy (radar-evading), supersonic multi-role fighter. Canada intends to purchase 65 F-35A conventional takeoff and landing (CTOL) variant jets for delivery between 2016 and 2022. In order to secure production, Canada will likely have to commit to procurement as early as 2012. Until recently, Canada relied on the United States (U.S.) JSF Program Office (JPO) for projected costs. In early 2011, the Department of National Defence (DND) F-35A cost estimate came under public scrutiny as a result of a Parliamentary Budget Officer report (Canada's equivalent to the U.S...2012Methods and Models I
Will-Cost and Should-Cost Management: It's Not Business As UsualZachary JasnoffIn April 2011, Under Secretary of Defense for Acquisition, Technology & Logistics Ashton Carter issued the Memorandum: "Implementation of Will-Cost and Should-Cost Management". The memo defines implementation of Should-Cost and Will-Cost management for all ACAT I, II and III programs and lists "Selected Ingredients of Should Cost Management". Thus, each organization involved with these programs must successfully deal with the challenges of planning, coordinating and managing Should-Cost/Will-Cost programs and have the necessary tools to quantitatively manage them through their life cycle...2012Methods and Models II
DoD Contracts Database and Interactive ToolBrian OcteauGood cost estimates, analyses and assessments should consider all sources of data available. One of those sources - contract data - is often overlooked, and sometimes misused. For many, the collection, compilation, and interpretation of contract data can be an unwieldy and time consuming venture. However, once compiled, the data can be an invaluable resource for cost analysts, providing insight relative to cost, schedule, and technical growth, highlighting data completeness gaps inherent in other data sources, and revealing data trends and/or patterns...2012Methods and Models II
Rapid Business Case Development Using Macro-Based Excel ToolAndrew HutchinsonPurpose: To present a tool that creates business cases based on user-entered customizations such as the start and end year of the investment. With the increasing scrutiny of federal government budgets, the development of business cases for future investments will take on added significance. A best practice within industry is the development of business cases to analyze the net present value of potential investment opportunities. This analysis answers how much value potential initiatives will create within your organization and allows for an "apples to apples" comparison of multiple investment opportunities...2012Methods and Models II
An Intuitive Application of Cost Risk Analysis to a LRFSBlake BoswellThis presentation explains the implementation of an Uncertainty Modeling Capability (UMC) into a USMC LRFS. Similar to Life Cycle Cost Estimate (LCCE), the Logistics Requirements Funding Summary (LRFS) captures Integrated Logistics Support (ILS) related costs for a program throughout its life cycle. The LRFS provides visibility of logistics requirements for Program Objectives Memorandum and budget submissions. Additionally, it helps to plan and quantify requirements, identify and defend funding, and serve as the ILS input to the LCCE...2012Methods and Models II
Creating a Framework for Cost Analysis: The Cost Analysis Support Tool (CAST)Eric LumsdenThere are many redundancies encountered when creating new cost models, and many operations that can be automated. CAST is a tool created by TASC that provides a framework to accelerate the process of building a cost model. Estimators are able to execute processes at the click of a button that would otherwise be very time consuming to create from scratch or complete by rote. They are able to transfer over work from other projects to eliminate redundant operations...2012Methods and Models II
Identifying the Cost Capabilities of the DoDAF Architecture FrameworkHolly A. H. Handley, Resit Unal, Andreas TolkThe Department of Defense Architecture Framework (DoDAF) is a key tool used by the United States engineering and acquisition communities to describe the overall structure for designing, developing, and implementing systems. In 2009, version 2.0 of DoDAF was released. This version represented a major shift in focus from output products, or documentation, to the collection of the underlying data that represents the architecture. This version describes a new data model, the DoDAF Meta Model (DM2); the DM2 provides the mechanism needed to collect, organize, and store data. While many pre-defined models reside in the DoDAF, none of them are considered cost models of the system...2012Methods and Models II
EDGARS CB - A New Memory Tool for CostersWilliam BarfieldAre you befuddled by what it takes to do a cost estimate? What kind of information is used and when is it needed so that a defendable cost and benefits estimate can be provided to the investment decision authority? Since beginners and professionals alike rely on tricks-of-the-trade to help get their job done, a new and easy memory aid is presented...2012Methods and Models II
Significant Reasons to Eschew Log-Log OLS Regression when Deriving Estimating RelationshipsStephen A. BookLog-Log Ordinary Least Squares (LLOLS) regression, considered in the 18th and early 19th Centuries as the best (and, in fact, the only) method for fitting nonlinear algebraic relationships of the form y = ax^b to data sets of (x,y) pairs, has a number of serious defects that make it far from adequate for CER development in the 21st. No other option was available 200 years ago, but the advances in computing power and techniques of statistical optimization available to us today leave no reason to stick with an obsolete method. In the 21st Century, we insist that 21st Century engineering technologies be applied, so why would we continue to develop CERs to estimate them using 18th Century statistical methods?...2012Methods and Models II
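The procedure this abstract criticizes is easy to state concretely: log-log OLS fits y = a * x^b by regressing log(y) on log(x), which minimizes squared error in log-space rather than in unit space. A minimal stdlib sketch of that transform-and-fit step (function name is illustrative, not from the paper):

```python
import math

def fit_loglog_ols(xs, ys):
    """Fit y = a * x**b by ordinary least squares on (log x, log y).
    Note: this minimizes squared error of log(y), not of y itself --
    exactly the distortion the paper raises against LLOLS for CERs."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(xs)
    mx, my = sum(lx) / n, sum(ly) / n
    b = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    a = math.exp(my - b * mx)
    return a, b

# On exact power-law data the transform is harmless and recovers a and b;
# the defects appear once the data carry additive (unit-space) noise.
a, b = fit_loglog_ols([1, 2, 4, 8], [3 * x ** 0.8 for x in [1, 2, 4, 8]])
```

The modern alternative the paper advocates would instead minimize error in unit space directly, e.g. via nonlinear or iteratively reweighted least squares.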
Valuation in Cost Estimating: Taking a Page from the Investment Banker's PlaybookKevin SchuttValuation is "the process of estimating what something is worth". It is commonly undertaken in investment banking to support activities such as initial public offerings of company stock and mergers & acquisitions. The need for valuation analysis arose in a cost estimation setting in order to satisfy Office of Management and Budget budgetary treatment criteria, yet valuation techniques in general may be a useful addition to the cost estimator's toolkit, facilitating price analysis and cross-checks of cost estimates. Likewise, cost estimation techniques may inform valuation analysis. While cost estimation generally results in market-independent estimates, valuation provides the viewpoint of what the market will bear. This presentation will provide an overview of four major valuation techniques: Price multiple; Replacement cost; Comparable transaction; and Discounted Cash Flow. A case study in valuation will be presented.2012Methods and Models II
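Of the four techniques listed, Discounted Cash Flow is the most mechanical and translates directly into code. The sketch below is a generic textbook DCF, with illustrative figures, not material from the presentation:

```python
def discounted_cash_flow(cash_flows, rate):
    """Present value of a stream of future cash flows at a constant
    per-period discount rate; cash_flows[0] arrives one period from now."""
    return sum(cf / (1.0 + rate) ** (t + 1) for t, cf in enumerate(cash_flows))

# A single payment of 110 one year out, discounted at 10%, is worth ~100 today.
pv = discounted_cash_flow([110.0], 0.10)
```

As a cross-check on a cost estimate, such a PV gives the market-facing "what it is worth" view to set against the estimator's market-independent "what it costs" view.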
Cost Estimating Training for Non-Cost EstimatorsJeremy EdenThe effort to improve the accuracy and fidelity of cost estimates is critical given budget constraints. As this effort continues, non-cost estimators (program managers, engineers, logisticians, etc.) are more frequently facing demands for preliminary cost estimates prior to the development of formal cost estimates. These preliminary estimates are done in order to streamline the cost estimating process and guard against risk for a program in development. While most non-cost estimators have the information and tools available to address early program cost estimating needs, many do not have the skills and experience necessary to make good use of the resources at their disposal...2012Methods and Models II
Just-In-Time Cost AnalysisJohn KoBooz Allen developed and maintains a robust Life Cycle Cost Estimate (LCCE) for the acquisition of the Urgent Universal Need Joint Mine Resistant Ambush Protected (MRAP) vehicle program, the Defense Secretary's top acquisition priority. Building and maintaining the LCCE was challenging because the MRAP Program differs from other urgent-needs Acquisition Category I-D programs in two ways. First, to accommodate the aggressive and accelerated schedule, the MRAP Program conducted an open competition, which required a review of several designs from various contractors in a short period...2012Methods and Models II
Estimating Alternatives for Joint Future Theater Lift (JFTL)Robert Georgi, Bruce PurselUS military operations in Afghanistan and Iraq are making military transport aircraft work at rates not foreseen just a decade ago. At the same time, the existing inventory of transport aircraft is limited in its ability to transport heavy cargo to austere unimproved landing zones where military forces routinely operate. The US Air Force (USAF), with considerable support from the US Army, conducted the Joint Future Theater Lift (JFTL) Technology Study (JTS) to consider how best to equip its theater lift fleets beyond 2020 to address this need...2012Methods and Models II
Exploring Methods of Conflating Data from Various Data SourcesAshley MosesWhen creating a cost estimate, it is not out of the realm of possibility that one might be required to combine data from multiple sources to get an accurate estimate. Coleman, Braxton, Druker, et al. have presented research on which methods might be the best to combine opinions given by subject matter experts (SMEs). They have also discussed what adjustments are needed to make the estimates provided by SMEs more accurate, as studies suggest that SMEs will tend to underestimate the range of possible data values when appropriate feedback is not given to them...2012Methods and Models II
Volatility and Cost EstimatingJennifer LeottaThis paper will examine the properties and uses of implied volatility, stochastic volatility, and historic realized volatility. Further discussion will focus on what applications an assessment of market volatility brings to the field of cost estimating through the application of a volatility range derived for fuel prices at varying intervals over the course of a generic program's life cycle.2012Methods and Models II
Introducing Reliability and Availability Requirements into TOC ModelsEvin J. Stump, Wendy W. LeeTwo important system requirements are reliability and availability. Reliability is the probability of no disabling failures over a certain span of time, while availability is the ratio of the time the system will actually fulfill its operational expectations to the total time it is expected to fulfill them. Both of these are often included in system specifications...2012Parametrics
Estimating Relationship Development Spreadsheet and Unit-as-an-Independent Variable RegressionsRaymond P. Covert, Noah L. WrightMCR has constructed an estimating relationship (ER) development spreadsheet based on the zero percent bias, minimum percent error (ZMPE) regression technique to help with more credible and efficient development of cost improvement curves and cost estimating relationships (CERs). The CER-development method accommodates linear, power and triad functions with single and multiple technical and dummy independent variables. Furthermore, the ER development spreadsheet may be modified to accommodate other functional forms that may be of value in particular contexts...2012Parametrics
Are Parametric Techniques Relevant for Agile Development Projects?Arlene MinkiewiczAgile software development practices are predicated on the following tenets, introduced in 2001 in the Agile Manifesto [1]: individuals and interactions over processes and tools; working software over comprehensive documentation; customer collaboration over contract negotiation; and responding to change over following a plan2012Parametrics
Comparing Bottom-Up and Top-Down Estimating Approaches in a Custom Cost ModelDonald L. Trapp, Noah L. Wright, Lisa HackbarthDeveloping a cost model involves several steps such as data collection, normalization, cost estimating relationship (CER) development, model building and documentation. Choosing an appropriate level of detail for each CER is an important consideration. This is influenced by several factors such as user needs, data availability, and the statistical behavior of the data. The cost modeler must sometimes choose to estimate costs for particular work breakdown structure (WBS) elements with either a bottom-up or top-down approach...2012Parametrics
Error Analysis of a Custom Cost ModelRaymond P. Covert, Donald L. Trapp, Noah WrightHow do you know how well a custom cost model performs? If it is based on historical data, then the simplest performance measurement is the model's percent standard error (PSE). The PSE measures a historically-derived model's ability to re-predict its underlying database. In this presentation we will demonstrate an analytic error analysis of a custom cost model using historically-derived cost estimating relationships (CERs) and factors. We will provide a sample model and show how we calculated the PSE of each of the CERs from the data and how we developed the error calculations for each step in the model. We will show how we used the concepts of effective correlation coefficients, propagation of errors, and the calculated variances of the sums and differences of two random variables.2012Parametrics
The Function Point Based Pricing Model: The Price is Right!Daniel B FrenchMany IT organizations face a difficult challenge managing contractual engagements with vendors for application development and maintenance services. Regardless of the pricing model or acquisition strategy, projects come in late and over-budget more often than not. Be it a time and materials, cost plus, fixed price or other contractual arrangement, both the vendor and client face the challenge of providing the functionality desired, minimizing risk, and ensuring that the vendor is able to generate a reasonable profit from the engagement...2012Parametrics
Tactical Missile Bluebook Cost Model DevelopmentDonald L. Trapp, Noah L. Wright, Lisa HackbarthAs part of a quality review process, MCR constructed a cost model based on the cost estimating relationships (CERs) and cost factors contained in the Tactical Missile Bluebook. This effort proved to be valuable in refining the existing CERs and factors and in improving the Bluebook. While building the model, the CER developers found instances where CER input definitions in the documentation needed to be more specific to better serve CER users. When comparing candidate CERs to each other in the regression development process, CER developers relied on error statistics and goodness-of-fit metrics, but these metrics provided an incomplete comparison...2012Parametrics
Tactical Missile Bluebook and Cost Model OverviewRaymond P. Covert, Donald Trapp, Noah Wright, Andrew DrennonMCR recently developed the Tactical Missile Bluebook together with a cost model based on the information contained in it. The Bluebook contains cost estimating relationships (CERs) for tactical missile development and production based on 13 programs and 153 recurring production lots. In this presentation, we show how the Bluebook is organized and how the CERs and factors were developed, including the filtering methods used to determine data points to be included and excluded from CER development, the challenges of building a useful estimating method from this compilation of data, the methods used to develop the estimating relationships, the results obtained, and significant general findings...2012Parametrics
Understanding and Measuring the Impact of Design and Systems Engineering Decisions on AffordabilityZachary JasnoffIn today's data-driven cost estimating environment, it is critical to understand the impact of design and systems engineering decisions on cost. It is also important to leverage actual cost data in developing data-driven Cost Estimating Relationships (CERs) that relate engineering design and performance parameters to cost. In addition, the ability to conduct sensitivity analysis on not only the CER, but on the entire system estimated is needed to understand the full impact on Measures of Effectiveness and Measures of Performance...2012Parametrics
Applying Parametric Cost Models as a Predictive Parameter for CMMI ComplianceDavid Bloom, Wanda Grant, Robert Wright, Gary BosworthThe Electronic Engineering Center (EEC) of Raytheon Space and Airborne Systems (SAS) recently breezed through a Capability Maturity Model Integration (CMMI) Level 5 assessment, jumping from an existing CMMI Level 3 capability without needing a separate CMMI Level 4 assessment in the process. Although this success was the result of a lot of hard work by many contributors, the bottom line is that very few business process changes were required to obtain a CMMI Level 5 organizational assessment, the hallmark of a learning and optimizing organization...2012Parametrics
Using Method of Moments in Schedule Risk AnalysisRaymond P. CovertA program schedule is a critical tool of program management. Program schedules based on discrete estimates of time lack the necessary information to provide a robust determination of schedule uncertainty and therefore the risk that the proposed schedule will be completed on time. To determine the risk that a proposed, discrete schedule will meet a particular schedule and to find the probable critical paths (i.e., the criticality index), a probabilistic schedule risk analysis (SRA) is performed. SRA is a process by which probability density functions (PDFs), usually triangular, are defined at the task level in an effort to quantify the uncertainty of each task element...2012Risk
Joining Effort and Duration in a Probabilistic Method for Predicting Software Cost and ScheduleMichael A. RossThis paper describes a data-driven method for estimating the cost and schedule of software development projects. This method correlates the estimates of cost and schedule such that constraining (perhaps reducing) the cost will impact the estimated schedule and constraining (perhaps compressing) the schedule will impact the estimated cost. This method provides these estimates of cost and schedule that are probabilistic (i.e., provide a range of possible outcomes with associated probabilities of attainment); a capability that is essential to analyzing the impact that affordability and budget constraints have on program cost and schedule and their associated risks...2012Risk
Is My Schedule Ready for Risk Analysis?Mario FountanoThe ultimate goal of schedule risk analysis is the ability to provide project stakeholders with a risk mitigation plan that allows the team to make informed decisions concerning future cost and schedule performance. However, in order to perform schedule risk analysis, a project schedule must first be valid, and provide a clear path to project completion. This study presents the steps our team takes in order to evaluate the readiness of project schedules prior to integrating a risk analysis. Often, project schedules contain flaws in duration and logic, are left un-statused, or were never developed to capture the true nature of project work. The assessment process our team follows helps to build more confidence and transparency into project work. 2012Risk
Utilizing Optimization Technique to Enhance Cost and Schedule Risk AnalysisColin Smith, Brandon HerzogAdvancements in Monte Carlo simulations have enabled cost and schedule uncertainty analysis to be conducted on large, complex programs which could never be analyzed using conventional methods. Analysts are now able to predict with increasing confidence the budgetary requirements, schedule forecast, and risk mitigation measures on projects at all points during their lifecycle. These advancements in Monte Carlo technologies and methodologies have further lowered the barrier to entry for conducting uncertainty analysis; have reduced simulation times to mere fractions of a second; and have allowed for a comprehensive, integrated cost, schedule and risk analysis...2012Risk
The Unseen: Statistical Inference with Limited DataTrevor VanAtta'Objective measurements of probability are often unavailable, and most significant choices under risk require an intuitive evaluation of probability.' -Nobel Laureates Daniel Kahneman and Amos Tversky. What are the odds of rolling a sum total of seven when tossing two dice? What is the probability of red turning up after a spin of a European roulette wheel? Most analysts, given a little time and a calculator, could answer these two questions with exact precision. For both of these questions, there is only one true correct answer. Such is the nature of probability analysis for questions that are decompositional (all possible outcomes can be determined), frequentistic (the experiment can be repeated an infinite number of times), and algorithmic (the results can be measured with numbers)...2012Risk
Applying the Pareto Principle to Distribution Assignment in Cost Risk and Uncertainty AnalysisJames R. Glenn, Christian Smart, Hetal Patel, Lawrence JohnsonSignificant effort and statistical knowledge is required in order to complete an accurate uncertainty analysis of a cost estimate. First, a representative sampling of data points must be collected in order to model appropriate distributions for each random variable in the estimate. Additionally, correlation between the random variables in the estimate must be properly assessed and applied. Finally, to generate the cumulative distribution of total costs (i.e., S-Curve) either the joint cumulative distribution function must be computed or an approximation technique such as Monte-Carlo simulation must be used...2012Risk
Inflation Cost Risk Analysis to Reduce Risks in BudgetingMichael DeCarlo, Eric DrukerFor any project there is a danger of unanticipated cost growth because inflation rates are extremely difficult to estimate. This presents a significant challenge to estimators. Predicting future inflation rates with some precision is possible, however, when the appropriate analysis is implemented. Even with previous recommendations from a major federal and commercial consulting firm, there is evidence indicating that the government has not been making adequate assessments of inflation rates for future budgeting. Without applying proper attention and techniques to the analysis and prediction of inflation rates, budgets run a higher than necessary risk for increased cost growth due to inflation prediction error...2012Risk
Diagnosing the Top Level Coefficient of Variation: An Alternative ApproachDaniel J. AndelinThe coefficient of variation (CV), defined as the standard deviation divided by the mean, is a useful metric to quantify the uncertainty inherent in a probability distribution, as it provides a relative (and thereby more intuitive) measure of spread than the variance or standard deviation. In the field of cost analysis, the CV can be used as a diagnostic to determine the level of risk and uncertainty built into an estimate and whether those levels fall within the expected range for a program of a given scope and complexity and at a given stage in its life-cycle...2012Risk
Real Data, Real UncertaintyAlfred Smith, Jeff McDowell, Lew Fichter, Wilson RosaCentral to any cost risk analysis and model are the uncertainty distributions assigned to point estimates. Ideally, the analyst will have a database of historical cost and technical information that can be used to objectively develop Cost Estimating Relationships (CERs) using approved statistical methods. From that analysis the analyst will be able to objectively define the shape and dispersion of the CER uncertainty distributions. Similarly, with a suitable database or expert opinions to draw upon, the analyst will be able to develop uncertainty distributions for the inputs to the CERs...2012Risk
Enhanced Scenario-Based Method for Cost Risk Analysis: Theory, Application, and ImplementationBryan Flynn, Peter Braxton, Paul Garvey, Richard LeeThere is a growing realization within the cost-analysis community that estimates of cumulative probability distributions of cost, or S-curves, too often understate true, underlying risk and uncertainty. Several organizations cite cases where realized program acquisition costs, or actuals, fall at the 99th+ percentile on S-curves estimated years previously. This degree of deviation from the mean is a legitimate possibility for any one acquisition program. After all, there's no such thing as an "average" program. Variation is expected...2012Risk
SAR Data Analysis, CV Benchmarks, and the Updated NCCA S-Curve ToolRichard Lee, Peter Braxton, Kevin Cincotta, Brian Flynn, Ben BreauxTo support the development of better probabilistic cost estimates, the Naval Center for Cost Analysis (NCCA) has championed the development of the S-Curve Tool, which was well received at both the 44th Annual Department of Defense Cost Analysis Symposium (ADoDCAS) in February, 2011 (1), and the joint Society of Cost Estimating and Analysis (SCEA) / International Society of Parametric Analysts (ISPA) conference in June, 2011. This paper presents ongoing research to support both continued improvement of the S-Curve Tool and greater understanding of the nature of cost growth for major acquisition programs; its mean value (risk) and variability (uncertainty); and the components thereof...2012Risk
A Systematic Approach for Empirical Sensitivity Analysis on Monte Carlo ModelsMatt PitlykA key component of risk analysis is the sensitivity analysis performed on the input variables for Monte Carlo models, with the goal of determining those variables that cause the most variation in the final distribution and identifying the best candidates for risk mitigation plans. While the standard technique of calculating the correlation coefficient between the final distribution and each input distribution is appropriate for linear models, it is not sufficient to accurately identify the largest uncertainty drivers. In the case of non-linear models, it is neither appropriate nor accurate...2012Risk
Fitting Absolute Distributions to Limited DataBlake BoswellThe choice of probability distributions is a critical component for cost risk and uncertainty modeling. When data is available, distribution fitting techniques, such as Goodness of Fit (GoF) tests and Information Criteria (IC), can be applied to determine distributions that accurately describe potential cost realizations; however, with limited data GoF tests and IC based methods provide little or no insight into the best distribution choice. Therefore, when data is limited it is standard practice in cost risk and uncertainty modeling to solicit expert opinion in the construction of triangular distributions with vertices representing the best case, typical, and worst case scenarios...2012Risk
The Importance of Software Cost Estimating Standards among a Diverse CommunityKyle Thomas, Jenny Woolley, William BlackMany cost analysis organizations within elements of the United States Intelligence Community (IC) have developed cost estimation and data collection methods that are based on years of Agency-specific historical programmatic data. These methods have been established over time, effectively creating independent approaches across the Community. While these techniques are valuable, the cost estimation and reconciliation process highlights differences in estimation methods that make analysis across the IC difficult...2012Software/Hardware
Learning Curve Analysis with EZQuant - An OverviewMichael MahoneySure you learned how to calculate learning curves by hand while studying for the CCEA exam, but what will you use to calculate them from day to day? This session will provide an overview of the freely available DCAA EZQuant regression analysis shareware. EZQuant takes minutes to learn and is easy to use. Unit and Cumulative Average learning curve calculations and projections are supported. Since DCAA recognizes the software, you don't spend time validating the tool. Natural logarithms are fascinating but EZQuant is just fast.2012Software/Hardware
Estimating for Lifecycle and Product Line AffordabilityJoAnn Lane, Barry Boehm, Ray Madachy, Supannika KoolmanojwongA significant challenge in systems engineering and acquisition is to justify investments in new systems and the evolution of existing systems and product lines. In an era of shrinking budgets, especially in the U.S. Department of Defense, choices are driven by affordability, and not just the affordability of the initial development. Affordability includes development, deployment, usage, maintenance, and retirement/disposal costs. A way to assess or estimate system or product line affordability is through Total Ownership Cost (TOC) analysis...2012Software/Hardware
Software Cost Estimating for Iterative and Incremental Development ProgramsBob Hunt, Jon Kilgore, Jennifer SwartzIterative and Incremental Software Development is a cyclic software development process that evolved in response to the weaknesses of the traditional waterfall model. It starts with initial planning and ends with deployment, with cyclic interactions in between. The goal is to deliver what the user needs at the end of the process, not only what was envisioned at the beginning of the process. Iterative and incremental development is an essential part of the Rational Unified Process, Extreme Programming, and generally the various agile software development frameworks...2012Software/Hardware
Cloud Computing - Changing the Way We 'Do' SoftwareArlene MinkiewiczIn 1961 at the MIT Centennial, John McCarthy opined "if computers of the kind I have advocated become the computers of the future, then computing may someday be organized as a public utility just as the telephone system is a public utility... The computer utility could become the basis of a new and important industry."[1] In 2006, Amazon Web Services was launched providing computing on a utility basis. Since that time the notion of cloud computing has been emerging and evolving...2012Software/Hardware
Utilization of Visual Basic in Cost Estimating ToolsJeremy EdenAs collaborative computing environments become more prevalent in all industries, the cost estimating industry is no exception to this movement. Increasing amounts of pressure are put upon cost estimators to develop tools that are robust in design, have long-standing methodologies, do not require proprietary software or licenses, and are easy to use by both the advanced cost estimator looking for maximum control and the novice simply trying to diligently support the early stages of a development program...2012Software/Hardware
A Standard Process for Software Code CountingBetsy Legg, Brian EnserSoftware Lines of Code or SLOC has long been one of the critical inputs for measuring the effort and time involved with software development projects. SLOC is by no means the only way to measure the size of a project but it is the most mature measure and is well understood by decision makers. Software cost estimating that relies on SLOC as an input depends on accurate and consistent counts. The problem is that no standard has been universally accepted for defining SLOC. Numerous code counting tools in use throughout the industry rely on interpretations of the definition of a line of code...2012Software/Hardware
COTS Estimating Metrics for Increased Cost AccuracyJoshua Patapow, Eric TiminskiThe observed cost analysis issue was that multiple Commercial Off-The-Shelf (COTS) Hardware/Software cost estimates were significantly higher than recent contract award values. Using data from those contract awards, an analytical approach and metrics were developed to ensure increased accuracy on future estimates. For COTS Hardware (HW) and Software (SW), a metric of the actual contract price as a percentage of the mean online price/list price (Part I) was developed. Additionally, an estimating relationship was developed (Part II) to calculate the cost of annual maintenance support as a percentage of the actual HW/SW initial cost and the annual maintenance support actual contract price as a percentage of a vendor quote...2012Software/Hardware
Estimation of Expedited Systems Engineering SchedulesBarry Boehm, Dan Ingold, JoAnn LaneA major objective of many organizations is to reduce the calendar time needed for a project to perform its systems engineering (SE) functions, without compromising the resulting product's needed functionality and quality attributes. This paper will present the derivation of a model to estimate the necessary calendar time to perform the project's SE functions as a function of its product drivers (size, domain familiarity), process drivers (maturity, streamlining), project drivers (team cohesion, collaboration technology support), people drivers (knowledge, skills, and agility), and risk drivers (vs. relative needs for speed, quality, and scalability)...2012Software/Hardware
A Closed-Form Solution for the Production-Break Retrograde MethodDarrell Hamilton, Brian GillespieThis article explores and discusses concepts surrounding the multi-step retrograde analysis process for learning curve production breaks that was popularized by George Anderlohr in his 1969 Industrial Engineering article "What Production Breaks Cost". Mr. Anderlohr based much of his analysis on the cumulative average curve method, but the basic principles have been widely accepted and used to perform the equivalent calculation with unit theory learning curves...2012Software/Hardware
Targeting Affordability and Controlling Cost Growth through Should-Cost AnalysisAnthony DeMarcoOn September 14th, 2010, The Honorable Ashton B. Carter, Under Secretary of Defense for Acquisition, Technology and Logistics, released a memorandum addressed to the acquisition professionals of the Department of Defense. The primary thrust of the memorandum was the current need for greater efficiency and productivity in defense spending. Secretary Carter provided guidance organized into five initiatives...2011Applications
An Application of Data Mining Algorithms for Shipbuilding Cost EstimationBohdan L. KaluznyThe North Atlantic Treaty Organization (NATO) Research and Technology Organization (RTO) Systems Analysis and Studies (SAS) 076 Panel (NATO Independent Cost Estimating and its Role in Capability Portfolio Analysis) is a working panel generating independent cost estimates for NATO systems with the aim of standardizing how NATO countries conduct cost estimation. One of the systems analyzed by the SAS-076 Panel in an ex post exercise was Her Netherlands Majesty's Ship (HNLMS) Rotterdam Landing Platform Dock (LPD), an amphibious transport dock ship that was launched in 1997...2011Applications
Selection of Data Source for Systems Contractor Labor Rates and Overheads and Their ApplicationBrian Wilkerson, Wallace RigginsSelection of the best data source can reduce the uncertainty in system contractor labor rates and overheads, facilitate improved government program budgeting, and produce savings for the government in contract negotiations. This presentation compares and contrasts the use of historical contract data vs. Forward Pricing Rate Recommendations (FPRRs) in determining contractor labor rates and overheads in the preparation of Independent Government Cost Estimates (IGCEs). The advantages and disadvantages of each approach are discussed. The development of forward pricing rates beginning in the proposal stage, through DCMA recommendation and DCAA audit to a bilaterally signed agreement is covered...2011Applications
Rolling On The Affordability River (While Managing The Acquisition Program In The Rapids)Christopher SvehlakFrom inception through execution, every acquisition program in the Department of Defense is now expected, even mandated, to incorporate aspects of affordability. This paper explains the role and importance of affordability; then, taking an empirical approach, it presents scenarios depicting actual events based on the author's observations and experience within the context of two separate program offices. Affordability, cost estimating, and financial management processes are described and detailed within the scenarios. Problem root causes and areas of strength are briefly discussed, and some topical research is offered as supporting and explanatory material throughout. Finally, affordability guidelines, relevant public law, current Department of Defense Instructions, and senior leader perspectives are presented, reinforcing the importance of affordability in acquisition programs. 2011Applications
Commercialization Activities at NASA and Resulting Cost ImplicationsJames Roberts, Torrance LambingDue to mandates of the current Obama Administration, much of the space launch activities that have traditionally been led and contracted out by NASA are being turned over to the Commercial Sector. NASA's role is changing in many instances from being a program manager - overseeing development of space launch hardware and conducting space exploration missions - to that of a supplier and manager of facilities and infrastructure to support the space development and launch activities of commercial entities...2011Applications
Reducing Maintenance Costs Using Beyond Economic Repair AnalysisJerry Le MayA Beyond Economic Repair (BER) analysis compares the cost of repairing a product with the cost of replacement, giving a company information to help decide if repairing a product is more economical than replacement. Using information from a BER analysis, repair procedures can be written so that once a pre-determined amount of time has been spent on repair without success, a product can then be replaced, spending the additional time on replacement rather than further attempts at repair. A BER analysis starts out as a prediction using anticipated repair costs for a new product to establish the number of hours spent to attempt repair before stopping repair work and replacing the product...2011Applications
Cost by Capability: Funding the Right Mission Capabilities in a Cost Constrained EnvironmentJohn ScardinoIn an environment of increasing demands on constrained resources, it has become imperative that Department of Defense (DoD) organizations be able to correctly identify the cost and proper allocation of resources across units to provide a sustained level of mission readiness. As operational units train and deploy across the globe, it is important to provide decision makers with the processes that will enable, implement, measure, and manage cost-wise readiness objectively...2011Applications
Earned ReadinessJohn Williams, John ScardinoThe Department of Defense (DoD) faces complex challenges in today's globally strategic and economic environment. During these times of increasing demands and constrained and limited resources, it is imperative that organizations be able to correctly identify the proper alignment and allocation of resources across operational units. The ability to sustain readiness and performance levels over time at lower cost is critical. At multiple levels across the enterprise, leadership must be able to determine solidified readiness and financial goals, as well as have the ability to quickly analyze the readiness and financial performance achieved in meeting those goals...2011Applications
INCOSE Affordability Work Group - Design for AffordabilityEdwin B. Dean, Joseph BobinisThe Affordability Working Group of the International Council on Systems Engineering (INCOSE) is striving to understand and promulgate affordability within the systems engineering community. The group began at the 2010 INCOSE International Conference last summer, meets monthly by telephone, and will meet for two days at the end of January 2011 at the annual INCOSE Workshop. In the interest of intersociety sharing, this paper will share the perspective and status of the Affordability Working Group meeting at the INCOSE Workshop.2011Applications
Use of JCL Data for Programmatic SuccessRey CarpioWith surging interest in Joint Confidence Level (JCL) and corresponding expectations levied on the program management communities, what is a PM to do with JCL data and information? This presentation puts emphasis on the value and use of data and information that JCL tools and processes bring to the table. Appreciating the pedigree of the inputs used in JCL development, and the rigor of the JCL process, the presentation will cover the multiple JCL products and the interpretation of the data and information. The presentation will also cover the effective way to communicate the JCL results to the program managers and the stakeholders...2011Applications
Budgeting to the MeanRick Garcia, Casey WallaceDecision makers (and policy) often require cost estimators and analysts to move to a higher percentage on the S-Curve to ensure enough budget is requested so that a program does not overrun its budget target. Although the request to budget at a higher confidence level is a pragmatic attempt to avoid overruns (under budgeting), there are other factors to consider besides simply moving to the right on an S-Curve, such as ensuring a cost estimate captures all relevant uncertainty and acknowledging acquisition changes that will impact a program after a budget has been set...2011Applications
Evolved Expendable Launch Vehicle (EELV) Discrete Event Simulation: Ensuring the Buck Results in a BangColleen Craig, Scott DeNegreAn accurate cost estimate is an essential prerequisite when considering expansion or modification of a major government system. To make an informed decision, cost must be considered in light of the expected benefit of the expenditure, to ensure the investment is justified. We describe one method of performing this analysis, using the EELV supply chain as a demonstrative example. We demonstrate the methodology of Discrete Event Simulation (DES), and its application in the cost community, as a tool to measure the current performance of and potential improvement in the EELV system...2011Applications
Multiply or Divide? A Best Practice for Factor AnalysisShu-Ping Hu, Alfred SmithFactors are commonly used in engineering build-up equations to derive cost estimates. For example, software development hours are often estimated based on an estimate of source lines of code (SLOC) divided by a productivity rate: SLOC per hour. To develop the productivity rate, the analyst collects software size (SLOC) and development time (hours) metrics from a variety of programs. Ideally, these programs have similar characteristics to the one being estimated, for instance, similar complexity, language, resources (people and tools) and environment...2011Applications
Interconnected Estimating Relationships: Their Derivation and ApplicationStephen Book, Amanda FeatherBy the term Interconnected Estimating Relationships, we mean estimating relationships for hardware and software costs, schedules, weights, and below-the-line programmatic costs that are jointly impacted by each other or by the same drivers. There are two common examples of this phenomenon: (1) cost-estimating relationships and schedule-estimating relationships, which are interconnected for many reasons, but primarily because a project's schedule is a significant driver of its cost; and (2) cost-estimating relationships for hardware and software and techniques for estimating below-the-line costs, the latter of which is typically done by applying a factor or percentage to the hardware and/or software cost estimate...2011Applications
Using the New 881 WBS/CES for ERP Acquisition: Lessons LearnedVirginia Stouffer, Gerry BelcherMIL-HDBK-881's work breakdown structure/cost estimating structure (WBS/CES) is under revision, and one of the key new WBSs is an automated information systems WBS for enterprise resource planning (ERP). We used the draft 881 to create the program office estimate for the US Coast Guard's CG-LIMS system. Even with help from key committee members in defining the categories, we encountered some difficulties in assigning ERP costs to their proper categories, in accounting for all the ERP costs, and in interpreting some of the newer subelements...2011Applications
Determining Cost Estimating Relationships for Nine FAA Solution Development ElementsWilliam BarfieldThe Federal Aviation Administration is responsible for management of the National Airspace System, which requires massive amounts of software development and maintenance. The writing and testing of large-scale software is expensive and involves many substantial costs in addition to the development of the basic software itself. In an effort to improve financial management practices within the FAA, new Cost Estimating Relationships (CERs) were determined for nine FAA Work Breakdown Structure elements pertaining to software development and delivery life cycle activities. CERs are regression equations typically based on normalized actual costs of prior analogous software development...2011Applications
Standardizing the Cost Technical Baseline - Template and TutorialDiane Butler, Jason NewmanOver the last four years, the Department of Defense (DoD) cost community has experienced changes in cost policies, regulations and implementation procedures. Decision makers want more insight into cost estimate development, improvements in estimate quality, and a better understanding of program changes that affect funding levels. As managers of cost estimators, it is our responsibility to provide guidance on how these changes impact our analysts in the program offices...2011Applications
A Methodology for Multivariate Regression on Large DatasetsMatthew PitlykLarge datasets can present a formidable challenge to the analyst. Proper analysis of a long list of potential cost drivers can require enough regression computations and scatter plots to make discovering good relationships difficult. A methodical approach helps to keep the analysis organized and ensures that nothing is overlooked. This presentation will put forth a standardized way to approach analyzing independent variables in regression and an application of automating this approach in VBA...2011Applications
Defense Acquisition University's (DAU) EVM Checklist WorkshopKim HunterAs the Department of Defense (DoD) increases emphasis on Earned Value Management (EVM) as a project management tool, more and more program offices are finding themselves having to implement and subsequently execute EVM on their contracts. Although EVM is a project management tool, EVM responsibilities are routinely assigned to the business or financial management section, which is typically staffed with financial management analysts and cost estimators who may have little or no EVM experience or training. Even if an EVM analyst is on staff, his or her experience and training can vary greatly...2011e-Track
An Improved Method for Predicting Software Code Growth: Tecolote DSLOC Estimate Growth ModelMichael RossThis paper describes the Tecolote DSLOC Estimate Growth Model, which applies probabilistic growth adjustments to single-point Technical Baseline Estimates (TBEs) of Delivered Source Lines of Code (DSLOC) for new software and for pre-existing software. These adjustments are sensitive to the maturity of the estimate, i.e., when in the Software Development Life Cycle (SDLC) the TBE DSLOC estimate is performed. The model is based on Software Resources Data Report (SRDR) data collected by Dr. Wilson Rosa of the U.S. Air Force Cost Analysis Agency (AFCAA), and provides an alternative to other software code growth methodologies such as Mr. Barry Holchin's (2003) code growth matrix.2011e-Track
Parameters in Parametric Cost EstimatingMyung-Yul LeeParametric estimating is among the most desirable and credible estimating methods because it uses actual hours from the weapon system being estimated. The purpose of this paper is to study how to create a parametric estimating model for an aircraft weapon system. The C-17 Engineering organization will be utilized for generating the estimating model...2011e-Track
Integrated Cost-Schedule Risk Analysis using Risk Drivers and Prioritizing RisksDavid HulettMany cost estimates assume that the project schedule is set in stone. A risk analysis of the cost estimate is conducted using only cost risk drivers. In this analysis the impact on cost risk of uncertainty in schedule is often ignored or incompletely considered. This presentation discusses how cost and schedule risk analysis is integrated so that cost risk is affected by both traditional cost risk elements (e.g., uncertain material costs or labor rates) and schedule risk elements (e.g., fabrication or testing that may take longer and thus cost more)...2011e-Track
Historical Trend Analysis AnalysedDale ShermonThis article describes three alternative approaches to historical trend analysis. First, the study considers the trend over time of the complexities of past systems. This results from the application of a parametric cost model (PRICE H) to the normalisation of historical project costs and to the plotting of the complexity over time. 2011Journal of Cost Analysis and Parametrics
A Probabilistic Approach to Determining the Number of Units to Build in a Yield-Constrained ProcessTimothy P. AndersonMany cost estimating problems involve determining the number of units to build in a yield-constrained manufacturing process, when it takes, on average, n attempts to produce m successes (m ≤ n). Examples include computer chips, focal plane arrays, circuit boards, field programmable gate arrays, etc. The simplistic approach to this problem is to multiply the number of units needed, m, by the expected number of attempts needed to produce a single success, n. 2011Journal of Cost Analysis and Parametrics
Statistical Foundations of Adaptive Cost-Estimating RelationshipsDr. Stephen A. Book, Daniel I. FeldmanTraditional development of cost-estimating relationships (CERs) has been based on full data sets consisting of all available cost and technical data associated with a particular class of products of interest, e.g., components, subsystems or entire systems of satellites, ground systems, etc. In this article, we review an extension of the concept of analogy estimating to parametric estimating, namely the concept of adaptive CERs - CERs that are based on specific knowledge of individual data points that may be more relevant to a particular estimating problem than would the full data set. 2011Journal of Cost Analysis and Parametrics
Use of Earned Value Management Trends to Forecast Cost RisksDaniel I. FeldmanThis article uses earned value management trend analysis to forecast trends in budget at completion (BAC) and budgeted cost of work performed (BCWP). The resulting equations are then used to solve for the expected month at completion. With the month-at-completion date in hand, the article uses trend analysis to find the estimate at completion (EAC) at that month along with the BAC at that month far in the future to solve for the variance at completion (VAC). By using variance against a baseline, the article shows how much risk this program will incur by the date at completion. 2011Journal of Cost Analysis and Parametrics
An Application of Data Mining Algorithms for Shipbuilding Cost EstimationBohdan L. Kaluzny, Sorin Barbici, Garan Berg, Renzo Chiomento, Dimitrios Derpanis, Ulf Jonsson, David Shaw, Marcel C. Smit, Franck RamarosonThis article presents a novel application of known data mining algorithms to the problem of estimating the cost of ship development and construction. The work is a product of North Atlantic Treaty Organization Research and Technology Organization Systems Analysis and Studies 076 Task Group "NATO Independent Cost Estimating and its Role in Capability Portfolio Analysis." 2011Journal of Cost Analysis and Parametrics
Estimating Cost and Schedule of Reliability ImprovementDr. David A. Lee, E. Andrew LongWe extend a well-established model of reliability growth in a reliability improvement program, the Army Materiel Systems Analysis Activity Maturity Projection Model (AMPM), to include a model of the program's cost. 2011Journal of Cost Analysis and Parametrics
A Closed-Form Solution for the Production-Break Retrograde MethodDr. Brian Gillespie, Darrell HamiltonThis article explores and discusses concepts surrounding the multi-step retrograde analysis process for learning curve production breaks that was popularized by George Anderlohr in his 1969 Industrial Engineering article "What Production Breaks Cost." Mr. Anderlohr based much of his analysis on the cumulative average curve method, but the basic principles have been widely accepted and used to perform the equivalent calculation with unit theory learning curves. 2011Journal of Cost Analysis and Parametrics
Using Earned Value Data to Detect Potential Problems in Acquisition ContractsC. Grant Keaton, Dr. Edward D. White, Eric J. UngerGovernment contractors report earned value information to government agencies in monthly contract performance reports. Though major differences may exist in the data between subsequent contract performance reports, we know of no government effort to detect these occurrences. The identification of major changes may locate and isolate problems and, thus, prevent million and billion dollar cost and schedule overruns. 2011Journal of Cost Analysis and Parametrics
A Probabilistic Method for Predicting Software Code GrowthMichael A. RossA significant challenge that many cost analysts and project managers face is predicting by how much their initial estimates of software development cost and schedule will change over the lifecycle of the project. Examination of currently accepted software cost, schedule, and defect estimation algorithms reveals a common acknowledgment that estimated software size is the single most influential independent variable. 2011Journal of Cost Analysis and Parametrics
Life Cycle Cost Growth Study - 20 Science Mission Directorate (SMD) MissionsClaude FreanerPrevious studies have concentrated on examining development cost growth, excluding launch vehicle, mission operations and data analysis. This study looks at cost growth from a Life Cycle Cost perspective, including launch vehicle and mission operations and data analysis. Costs are separated into major WBS elements to see if any elements are more likely to have cost growth than others. The different WBS elements are tracked over time with snapshots of costs at major milestones to see where in the project life cycle the cost growth occurs. 2011Management
How Cost Arises - How We Can Reduce CostEdwin B. DeanThis presentation is based upon nine years of research for NASA on how cost arises and how we can reduce cost. A summary of this research was first published as the 500+ page NASA Design for Competitive Advantage web site (1994-1998). This paper will illustrate how cost arises, suggest various means of reducing cost, and provide resources for those who desire to further this research.2011Management
Constructing a Price-to-WinFrank R. FlettWinning a competitive federal procurement requires more than just writing a good technical proposal. A win strategy that has been formulated without consideration of the competition is bound to fail. A thorough competitive assessment includes the identification of the customer's Most Important Requirements (MIRs), the assessment of our own and our competitors' strengths and weaknesses versus those MIRs, the investigation of our competitors' current market position and behavior on recent competitive procurements, and the postulation of the competitors' strategy options, probable strategy, and price for the opportunity being bid. Only then can we determine our own win strategy and our Price-to-Win (PTW)...2011Management
EELV Should Cost Review Overview and Lessons LearnedJames Smirnoff, Karen Schaben, Joe KabeisemanIn his September 14, 2010 Memorandum for Acquisition Professionals, Ashton Carter, the Under Secretary of Defense for Acquisition, Technology and Logistics, directed that the manager of each major program conduct a Should Cost analysis justifying each element of program cost... In early March of 2010, as a response to significant pending contract price increases, supplier readiness uncertainty, and unresolved Defense Contract Audit Agency (DCAA) audit issues, Michael Donley, the Department of Defense (DoD) Executive Agent for Space, directed a Should Cost Review (SCR) of the Evolved Expendable Launch Vehicle (EELV) program...2011Management
What to Know When Estimating Virtualized Environment CostsJennifer Woolley, Ryan Boulais, Sandra WilliamsVirtualization, which allows a single piece of hardware to function as multiple virtual pieces of hardware, is one of the principal trends for current IT systems. As the technology has matured over the last several years, various organizations in both the public and private sectors have made the transition to operating in a virtual environment in an effort to reduce costs and streamline data access. With server, storage, and network virtualization becoming more prevalent, cost analysts are frequently asked to estimate and assess the costs associated with virtualization...2011Management