2024 Workshop Breakout Sessions

Analytical Methods Track

Data-Driven Lifecycle Analysis to Optimize Cost, Risk, and Sustainability
Analytical Methods Track (ANM01)
George Bayer
Brian Carroll

Many government infrastructure investments adhere to a standard lifecycle to estimate program cost, plan replacement timing, and compare business cases to one another in a cost-benefit analysis. What if those lifecycle replacement timelines are inconsistent with system sustainability and are not cost-effective? Some infrastructure systems that are replaced according to an end-of-life schedule can be sustained more cost-effectively for longer periods through preventative maintenance. Our team examined multiple infrastructure program replacement timelines; analyzed operational effectiveness, cost/risk trade-offs, system redundancy, and sustainability; and recommended lifecycle adjustments based on those considerations. We reduced overall program cost by extending replacement timelines, eliminating system redundancy without compromising sustainability, and reprioritizing maintenance portfolios on critical backlogs. We document a comprehensive process for customizing program lifecycles to optimize cost, risk, and sustainability.

Keywords: Business Case, Cost-Benefit Analysis, Lifecycle, Statistics, Data Analysis, Critical Thinking, Cost Avoidance, Regression, Forecasting, Process Analysis, Monetization


Triage the Sub-Projects: Calculating and Applying Portfolio Contingency
Analytical Methods Track (ANM02)
Stephen Koellner
Nick Peeples

Risk-adjusted cost estimates are needed to understand the potential range of actual costs through execution. Cost risk analysis produces uncertainty distributions that can be used to calculate an expected cost as well as contingency, which can be thought of as the difference between the expected cost and a higher confidence level chosen for planning purposes. In a portfolio of projects, allocating uncertainty at the portfolio level will result in a different risk-adjusted cost than applying the same allocation at the project level, so it is unclear whether a portfolio should allocate and manage risk-informed contingency at the portfolio or project level. This topic will explore methods for calculating portfolio contingency, using a tangible example to demonstrate them.
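
To make the distinction concrete, here is a minimal Monte Carlo sketch in Python; the three lognormal project distributions and the 70% planning level are illustrative assumptions, not figures from the presentation:

    import numpy as np

    rng = np.random.default_rng(42)
    n_trials = 100_000

    # Hypothetical portfolio: three projects with lognormal cost uncertainty
    # (the (mean, lognormal sigma) pairs are invented for illustration)
    projects = {"A": (10.0, 0.3), "B": (25.0, 0.5), "C": (40.0, 0.2)}

    draws = {}
    for name, (mean, sigma) in projects.items():
        mu = np.log(mean) - 0.5 * sigma**2          # parameterize so E[cost] = mean
        draws[name] = rng.lognormal(mu, sigma, n_trials)

    level = 70  # confidence level chosen for planning purposes

    # Project-level: contingency (P70 minus expected cost) computed per project, then summed
    project_level = sum(np.percentile(d, level) - d.mean() for d in draws.values())

    # Portfolio-level: contingency computed once on the summed cost distribution
    total = sum(draws.values())
    portfolio_level = np.percentile(total, level) - total.mean()

    print(f"Sum of project-level contingencies: {project_level:.2f}")
    print(f"Portfolio-level contingency:        {portfolio_level:.2f}")
    # The portfolio figure is typically smaller when projects are less than
    # perfectly correlated, which is exactly the allocation question at issue.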

Keywords: Budgeting, Cost Management, Decision Analysis, Program Management, Project Controls, Uncertainty


Things Forgotten Since College – Foundational Statistics
Analytical Methods Track (ANM03)
Jordan Harlacher
Kyle Davis

Statistical analysis is one of the foundations of cost estimating, but fundamentals are easy to overlook. This presentation will help ensure that is not the case for your next estimate: we will discuss how the data collection and organization processes can form its basis. Once the relevant data has been collected and organized, the real fun begins, as the central tendencies and variability of the data can now be examined. These measures can be used to determine the most applicable distribution and assess the probability of different events occurring. We will examine the best ways to visualize different data sets, using charts and graphs to convey the information clearly to stakeholders, as visualizing the data can help reveal relationships between variables. Finally, we will touch on key statistics to look for in your regression analysis to ensure a meaningful relationship is defined.
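
As a minimal illustration of those fundamentals, the sketch below computes central tendency, variability, and a simple regression with the key statistics worth checking; the weight/cost data are invented for the example:

    import numpy as np
    from scipy import stats

    # Hypothetical dataset: component weight (lbs) vs. unit cost ($K)
    weight = np.array([120, 150, 180, 210, 260, 310, 400])
    cost = np.array([95, 118, 130, 155, 200, 226, 290])

    # Central tendency and variability
    print("mean:", cost.mean(), " median:", np.median(cost))
    print("std dev:", round(cost.std(ddof=1), 1),
          " CV:", round(cost.std(ddof=1) / cost.mean(), 3))

    # Simple linear regression and the statistics that indicate a meaningful fit
    res = stats.linregress(weight, cost)
    print(f"cost = {res.intercept:.1f} + {res.slope:.3f} * weight")
    print(f"R^2 = {res.rvalue**2:.3f}, slope p-value = {res.pvalue:.4f}")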

Keywords: Data Collection, Regression, Statistics


Stretching Purchasing Power through Improved Escalation Methods
Analytical Methods Track (ANM04)
Amanda Schwark
Matthew Siiro
Shahriar Rayhan
Carson Lo

Escalation methods ensure cost estimates adapt to economic changes and facilitate accuracy and reliability. The National Nuclear Security Administration (NNSA) chartered the Programmatic Recapitalization Working Group (PRWG) to track mission-critical equipment directly supporting weapons activities across the nuclear security enterprise (NSE). The PRWG maintains a comprehensive database of equipment above the NNSA capital acquisition threshold of $500,000. The previous escalation methodology for equipment purchase price was limited to a single equipment inflation index. Additional fidelity in price projections can be achieved by leveraging empirical price data and published indices to derive escalation rates specific to various equipment categories. This paper explores our approach to improving upon the previous escalation methodology to better inform planning and programming decisions. This approach can be leveraged whenever one broad escalation index is used to predict costs for many significantly differing data elements.
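
The core mechanic can be sketched in a few lines: derive a compound annual escalation rate per equipment category from published index values, then apply it to a purchase price. The index values below are invented placeholders, not PRWG or published data:

    import numpy as np

    # Hypothetical year-end index values for two equipment categories
    indices = {
        "machining":   {2018: 100.0, 2023: 121.7},
        "electronics": {2018: 100.0, 2023: 109.2},
    }

    def annual_rate(series):
        """Compound annual escalation rate implied by the first and last index values."""
        years = sorted(series)
        span = years[-1] - years[0]
        return (series[years[-1]] / series[years[0]]) ** (1 / span) - 1

    for category, series in indices.items():
        r = annual_rate(series)
        # Escalate a $500,000 purchase price five years forward at the category rate
        future = 500_000 * (1 + r) ** 5
        print(f"{category}: {r:.2%}/yr -> ${future:,.0f} in five years")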

Keywords: Data-Driven, Regression, Escalation


Spacecraft Design to a Cost Target
Analytical Methods Track (ANM05)
Ryan Sukley

Perfect performance of every system is critical for space missions. Identifying capable designs is a challenging process, and one that often results in exceeding cost targets. The Cost as an Independent Variable (CAIV) approach helps mitigate this issue by treating cost as a primary consideration in the design or procurement of systems. Establishing a fixed cost target sets a ceiling for the cost-versus-performance trade-off and, in the case of NASA’s in-house spacecraft, enables more cost-conscious decision making. This paper examines the application of CAIV to identify upper bounds for parameters (mass, power, quantity, etc.) early in the process of designing a spacecraft that satisfies mission requirements. It describes the process of developing, maintaining, and explaining the limitations of this capability, and addresses potential applications of the approach to other commodities.
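
The core mechanic can be illustrated by inverting a parametric relationship: fix the cost ceiling and solve for the largest parameter value that still fits under it. The power-law CER and its coefficients below are notional stand-ins, not NASA values:

    # Notional power-law CER: cost ($M) = a * mass(kg)^b
    a, b = 2.5, 0.85

    cost_target = 150.0                        # fixed cost ceiling ($M)
    max_mass = (cost_target / a) ** (1 / b)    # invert the CER for the mass upper bound
    print(f"Largest mass that fits under the target: {max_mass:.0f} kg")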

Keywords: Cost Management, Data-Driven, Early Cost, Parametrics, Space


Early-Stage Cost Growth CER Development
Analytical Methods Track (ANM06)
Gabriel Sandler
Haley Harrison

Capital acquisition projects at the National Nuclear Security Administration (NNSA) have experienced significant early-stage cost estimate growth, driven in part by early optimism and missed scope. To account for these potential scope changes, NNSA’s Office of Programming, Analysis, and Evaluation (PA&E) developed a cost estimating relationship (CER) for construction projects that relates the actual total project cost (TPC) to the early-stage scope estimate. This methodology differs from typical CERs, which model actual cost as a function of actual scope, but it reflects the scope uncertainty NNSA projects have at early stages. Three cost drivers (gross square footage, hazard category, and equipment complexity) were selected as the variables to solve for the TPC. The results of the CER were compared to another PA&E CER built with actual scope and actual costs so that early-stage cost estimate growth at the NNSA for various types of capital acquisition projects could be quantified.

Keywords: none provided


Market Dimensional Expansion, Collapse, Costs, and Viability
Analytical Methods Track (ANM07)
Douglas K. Howarth

Most government programs set out with cost caps and minimum force requirements. Commercial projects usually begin with a budget, sales targets, and specifications. All too often, in both cases, producers and customers give little thought to the changing market structures they face. When it comes to Demand, markets self-organize to form up to four boundaries each, including 1) Upper (price-limited), 2) Outer (saturation-limited), 3) Inner (efficiency-limited), and 4) Lower (margin-limited) Demand Frontiers. When new market segments appear as different product forms with enhanced functionality over existing options, as the new markets grow, the product groupings they replace may contract across one or more Demand Frontiers. This paper examines preparing for these inevitable eventualities in an N-dimensional framework.

Keywords: Functional Requirements, Modeling, Viability, Multidimensional


Comparison of UMP in the Great Recession and the COVID-19 Recession
Analytical Methods Track (ANM08)
Nathan Gallacher

This piece reviews the Unconventional Monetary Policy (UMP) used in the Great Recession (2007-09) and the COVID-19 Recession, then compares the two recessions to show how UMP changed, including differences in the tools used by the Bank of England (BoE) and the scale at which they were deployed. Notably, tools such as quantitative easing saw use in both recessions, suggesting similarities in the BoE’s aims during each. The main results show a significant increase in the use of UMP from the Great Recession to the COVID-19 Recession. At the same time, inflation outcomes were worse during the COVID-19 Recession. This suggests that the BoE’s greater use of UMP in the COVID-19 Recession may not have been as effective in controlling inflation as it was in the Great Recession.

Keywords: Decision Analysis, Government, Microsoft Excel, Modeling, Economics, Monetary policy, Unconventional monetary policy, policy analysis


 

Data Science & Machine Learning Track

Explosive Analysis: Using Data to Hold Warfare Centers Accountable
Data Science & Machine Learning Track (DML01)
Ryan Webster
Robel Semunegus
Taylor Fountain

The Joint Service Explosive Ordnance Procedure Publications program creates and maintains critical documents for the safe handling of ordnance. This effort is managed by Naval Warfare Centers. Historically, senior leadership has funded these efforts without the ability to evaluate the reasonableness of annual funding requests. Augur has recently obtained publications system data, enabling valuable analysis of historical efforts. This data is being leveraged to develop a planning calculator capable of estimating ranges of labor hours based on ordnance type, country of origin, and other complexity drivers derived through regression analysis and data visualization techniques. This tool and the accompanying insights will enable senior leadership to negotiate with warfare centers and more easily measure performance.

Keywords: Data Science, Parametric, Performance Management, Cost Tools, Regression


Maximizing Analysis of Minimalized Datasets
Data Science & Machine Learning Track (DML02)
Taylor Fountain
Obai Kamara

Many techniques exist to determine parametric relationships within large datasets. While cost estimation relies heavily on identifying such relationships, a data-scarce environment, driven by factors such as vendor proprietary restrictions, security concerns, and the uncertainty of emergent technologies, is a common barrier to implementing these techniques. This topic will evaluate common methods to analyze minimalized datasets for developing defendable cost estimates, such as complexity factors and 3-point distribution fitting, and demonstrate the statistical impacts of their underlying assumptions.
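
As one example of how those underlying assumptions matter, the sketch below fits both a triangular and a (modified) PERT distribution to the same hypothetical 3-point estimate and compares the resulting means and P80 values; all numbers are invented:

    import numpy as np

    rng = np.random.default_rng(7)
    low, mode, high = 80.0, 100.0, 150.0   # hypothetical SME 3-point estimate ($K)
    n = 100_000

    # Triangular: follows directly from the three points
    tri = rng.triangular(low, mode, high, n)

    # Modified PERT via its beta-distribution form (lambda = 4 is the usual default)
    lam = 4.0
    alpha = 1 + lam * (mode - low) / (high - low)
    beta = 1 + lam * (high - mode) / (high - low)
    pert = low + (high - low) * rng.beta(alpha, beta, n)

    for name, d in [("triangular", tri), ("PERT", pert)]:
        print(f"{name}: mean = {d.mean():.1f}, P80 = {np.percentile(d, 80):.1f}")
    # Same three inputs, different distributional assumptions, different answers.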

Keywords: Data-Driven, Methods, Monte Carlo, Parametrics, Regression, Uncertainty, Data Science


Labor Mapping in Parametric Estimates
Data Science & Machine Learning Track (DML03)
David Ferland
Tristan Judd

Contractors and Original Equipment Manufacturers (OEMs) alike often struggle to apply specific resources or labor categories to their parametric estimates. Many parametric modeling techniques produce hours by generic resource that still need to be translated into labor resources with rates and other attributes before they can be useful for analysis. I will outline a tool development framework that fills this gap and allows cost estimates to stay in sync with downstream tools like ProPricer that may compile the final estimate. This case study uses TruePlanning® as an input to the pipeline but can be applicable to most parametric sources. In cases where Bases of Estimate (BOEs, as opposed to Realistic Cost Estimates or RCEs) using proposed resource hours are still required to justify parametric estimates, the traceability and justification of these pipelines is also an important consideration.
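
In outline, the translation step amounts to joining generic hours to a priced labor-category mapping. A minimal pandas sketch follows, with invented resources, allocation shares, and rates rather than TruePlanning or ProPricer data:

    import pandas as pd

    # Generic parametric output: hours by generic resource
    hours = pd.DataFrame({
        "generic_resource": ["Engineer", "Engineer", "Technician"],
        "task": ["Design", "Test", "Assembly"],
        "hours": [1200, 400, 800],
    })

    # Mapping table: generic resource -> priced labor categories,
    # with an allocation share and a fully burdened rate
    mapping = pd.DataFrame({
        "generic_resource": ["Engineer", "Engineer", "Technician"],
        "labor_category": ["Sr Engineer", "Jr Engineer", "Assembler II"],
        "share": [0.4, 0.6, 1.0],
        "rate": [145.0, 95.0, 70.0],
    })

    priced = hours.merge(mapping, on="generic_resource")
    priced["mapped_hours"] = priced["hours"] * priced["share"]
    priced["cost"] = priced["mapped_hours"] * priced["rate"]
    print(priced[["task", "labor_category", "mapped_hours", "cost"]])
    print("Total cost:", priced["cost"].sum())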

Keywords: Data Collection, Labor, Parametrics


Data Cleaning in Python for Beginners
Data Science & Machine Learning Track (DML04)
Alexis Somers

As cost estimators, we collect large amounts of data from many sources, and it’s often messy. Cleaning and organizing the data often requires time-consuming manual effort before proper analysis can begin. Using Python to clean and manipulate data is one of the easiest ways to save time and maximize efficiency when working on cost or other data analyses. As a free, beginner-friendly, and versatile tool, Python is an excellent choice for processing and analyzing data. This session will cover how to get started using Python to create simple scripts that produce clean, organized data. We will use the pandas and NumPy libraries to clean datasets by correcting errors, reformatting data, handling missing values, adjusting for outliers, and more. The ability to create simple Python scripts can improve the quality of your cost estimates and other deliverables by improving accuracy and efficiency and saving time.
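
A minimal sketch of the kind of script the session describes, using pandas to standardize text, reformat numbers, handle missing values, and screen for outliers; the messy records are invented:

    import pandas as pd

    # Hypothetical messy input of the kind described above
    df = pd.DataFrame({
        "program": ["Alpha ", "alpha", "Bravo", "Bravo", None],
        "cost_fy24": ["1,200", "1300", "950", "9500000", "875"],
    })

    df["program"] = df["program"].str.strip().str.title()        # standardize text
    df["cost_fy24"] = (df["cost_fy24"]
                       .str.replace(",", "", regex=False)
                       .astype(float))                           # reformat numbers
    df = df.dropna(subset=["program"])                           # handle missing values

    # Simple outlier screen: flag values far above the median (threshold is illustrative)
    df["outlier"] = df["cost_fy24"] > 3 * df["cost_fy24"].median()
    print(df)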

Keywords: Cost/Benefit Analysis, Data Collection, Data-Driven, Python, Data Cleaning, Automation


Going Beyond Count-Based Methodologies with Semantic Vector Embeddings
Data Science & Machine Learning Track (DML05)
Trevor Lax
David Ferland

Machine Learning (ML) is a topic of persistent interest and a frequent buzzword because of the astounding capabilities it has shown across disparate fields. However, the complexity of ML combined with the overwhelming number of options can lead to decision fatigue and reduced understanding in new users. While much attention is duly focused on the data and the machine, occasionally the basic components of ML, such as input data type, are not properly interrogated. Indeed, a frequently used Natural Language Processing method, Term Frequency – Inverse Document Frequency (TF-IDF), simply uses counts, which cannot encode syntactic or semantic information. An alternative to TF-IDF, Word-2-Vector creates vector embeddings of the words in a corpus, instead of relying on sentence-level counts, and attempts to encode semantic information. Word-2-Vector has its own limitations, such as the need for a large corpus; however, it can allow for better performance and greatly improved flexibility.
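
The contrast is visible in just a few lines. The toy corpus below is far too small to train a meaningful model; it is shown only to contrast the count-based and embedding-based APIs (scikit-learn and gensim assumed):

    from sklearn.feature_extraction.text import TfidfVectorizer
    from gensim.models import Word2Vec

    docs = ["replace hydraulic pump assembly",
            "repair hydraulic actuator",
            "install avionics software update"]

    # Count-based: TF-IDF scores how distinctive each term is, but encodes no meaning
    tfidf = TfidfVectorizer().fit_transform(docs)
    print(tfidf.shape)                          # (documents x vocabulary terms)

    # Embedding-based: word2vec learns dense vectors so related words land nearby
    tokens = [d.split() for d in docs]
    w2v = Word2Vec(sentences=tokens, vector_size=50, window=3, min_count=1, epochs=50)
    print(w2v.wv["hydraulic"][:5])              # first five dimensions of one word vector
    print(w2v.wv.similarity("pump", "actuator"))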

Keywords: Data-Driven, Parametrics, Variables, Machine Learning


Automation and Process Improvement in Cost Estimating
Data Science & Machine Learning Track (DML06)
Anil Divvela

In the last couple of decades, there has been a wave of innovation across all industries to streamline data analysis processes and cut costs with the introduction of data science tools. I will demo simple Python use cases that automate the time-consuming, redundant tasks that every cost estimator hates!

Keywords: Python, Automation, Process Improvement


AI and Machine Learning/Data Science Tools for Cost Analysis
Data Science & Machine Learning Track (DML07)
Daniel Harper
Kevin McKeel

AI and machine learning/data science tools such as ChatGPT have taken on an expanded presence in cost analysis. For example, natural language processing (NLP) is used to automate functional software sizing in commercial models. Large Language Models (LLMs) may even have applications for cost and acquisition professionals. We will present an overview of modern usages of data science, to include machine learning, AI, and data visualization. We will also provide several use cases for applying these tools in cost estimation.

Keywords: Artificial Intelligence, Data Science, Machine Learning, NLP


Costing Web App Development for Operations Research
Data Science & Machine Learning Track (DML08)
Kyle Ferris
Eric J. Hagee

Commercial-off-the-shelf (COTS) web application development platforms empower analysts to leverage low-code environments to build comprehensive business tools. Therefore, understanding the lifecycle cost requirements to design, develop, deploy and maintain low-code web applications as both analytical and decision support tools for stakeholders is of interest to the cost community. We define web application lifecycle requirements as analogous to an overarching Data Operations Stack. The Data Operations Stack is a conceptual framework that describes data operations as a set of hierarchical requirements, from base-level IT infrastructure and tools to high-level business products. With this framework in mind, we describe web application lifecycle requirements through successive levels of the Data Operations Stack, elucidating the required personnel, tools, and capabilities integrated into each level. Finally, we discuss how an understanding of interconnected dependencies across the Data Operations Stack can be used to develop defensible cost estimates and manage resources for web application lifecycle requirements.

Keywords: Communication, Data-Driven, Data Science, Web Applications, Web Apps, R, Shiny, Programming Languages


From a Man-Month to an AI-Minute, Myth or Reality?
Data Science & Machine Learning Track (DML10)
Colin Hammond

In this session I will share some of our discoveries from using AI over the last five years that can help software cost estimators, along with our thoughts on how AI will change software development costs in the coming years. In 1975, Fred Brooks published The Mythical Man-Month, a book of software engineering observations, many of which are counter-intuitive; we pay homage to his title in this talk as we share observations and quantifications of how AI is helping to improve early software estimation. I will also share our predictions on areas where AI will help accelerate software development and its impact on software costs over the next few years.

Keywords: AI, NLP, cost estimation, software estimation


Implications of Generative AI (Artificial Intelligence) in Software Engineering
Data Science & Machine Learning Track (DML11)
Arlene F. Minkiewicz

Generative AI is positioned to revolutionize software development, with potentially far-reaching implications for productivity. Generative AI applications leverage Large Language Models to understand language, imagery, and code, then use what they have learned to generate content: answering questions, organizing multimodal information, and writing text and code snippets. A 2023 McKinsey report finds that the software development landscape is quickly changing, as Generative AI applications such as ChatGPT and GitHub Copilot have the potential to enable software engineers to complete development tasks with as much as 2x the productivity of traditional development practices. Activities such as inception and planning, system design, coding, testing, and maintenance can all be aided through applications of Generative AI. This paper will include an introduction to Generative AI in the software engineering context, followed by a discussion of productivity impacts and guidance for incorporating them into software estimates.

Keywords: Software, Machine Learning, Generative AI, ChatGPT


Distribution Free Uncertainty for CERs
Data Science & Machine Learning Track (DML12)
William King
Shaun Irvin

For this presentation we intend to introduce and demonstrate the application of conformal prediction as a tool to specify prediction intervals for any machine learning algorithm. Conformal prediction intervals offer rigorous statistical coverage guarantees without distributional assumptions and require only the exchangeability of data (a weaker assumption than independence). Moreover, generating these prediction intervals is an easy consequence of retaining the sub-models trained during k-fold cross-validation. Specifically, we intend to summarize the “CV+ for K-fold cross-validation” method (and its locally weighted variant) from Predictive Inference with the Jackknife+ (Barber, Candes, Ramdas, and Tibshirani, 2021, The Annals of Statistics), and show how conformal prediction enables distribution-free uncertainty for CERs. Additionally, we plan to discuss how this technique can be applied to direct human-in-the-loop intervention when applying machine learning models.
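
A compact sketch of the CV+ construction follows, under stated assumptions: synthetic data, scikit-learn, and a random forest standing in for whatever algorithm fits the CER. The interval endpoints follow Barber et al. (2021), whose worst-case coverage guarantee is 1 - 2*alpha:

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import KFold

    rng = np.random.default_rng(0)

    # Synthetic CER data: one cost driver with a noisy power-law response
    X = rng.uniform(1, 10, size=(200, 1))
    y = 5 * X[:, 0] ** 0.8 * rng.lognormal(0.0, 0.15, 200)

    alpha, x_new = 0.2, np.array([[6.0]])
    lows, highs = [], []

    # CV+: retain each fold's sub-model and score it on its own held-out fold
    for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
        m = RandomForestRegressor(random_state=0).fit(X[train], y[train])
        resid = np.abs(y[test] - m.predict(X[test]))   # out-of-fold residuals
        pred = m.predict(x_new)[0]                     # sub-model prediction at x_new
        lows.append(pred - resid)
        highs.append(pred + resid)

    lows, highs = np.concatenate(lows), np.concatenate(highs)
    print(f"80% interval: [{np.quantile(lows, alpha):.1f}, "
          f"{np.quantile(highs, 1 - alpha):.1f}]")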

Keywords: Methods, Modeling, Regression, Risk, Statistics, Uncertainty, Conformal Prediction


Industry Leaders’ Insights: Enhance Efficiency and Simplify Your Work Using AI
Data Science & Machine Learning Track (DML13)
Karen Richey Mislick
Greg Mislick

The modern workplace is increasingly influenced by leaders who recognize the transformative power of data analytics and AI. This presentation delves into the practical experiences and insights gleaned from industry frontrunners effectively utilizing these technologies. These leaders have not only achieved significant operational efficiencies but have also mastered the art of simplification in complex business processes. Their lessons underline the importance of strategic integration, the value of data-driven decision-making, and the transformative potential of AI-driven automation. Attendees will gain a comprehensive understanding of how top enterprises are reducing costs, streamlining operations, and fostering innovation. Drawing from real-world case studies, this presentation aims to encourage cost analysts to tap into the immense potential of data analytics and AI, turning insights into actionable strategies for enhanced work efficiency.

Keywords: AI, LLMs, machine learning, ChatGPT4


Generative AI for Government
Data Science & Machine Learning Track (DML14)
Conner Lawston

‘ChatGPT’ has been making massive waves across the world in the last year! This presentation gives an introduction to several ‘Generative AI’ models and how they can create new images, code, data, and text, seemingly out of thin air. We will look at the process of how to build these models, including their training dataset sizes and costs. Examples will be shown of how to use ChatGPT to generate Python code for you, as well as R and Power BI. After the general overview, specific examples of applications to Government will be shown (including acqbot, an AI tool for generating proposals). There will also be a demo of the ‘GURU’ bot, which was trained on the Federal Acquisition Regulation (FAR) PDF and can answer questions about PPBE, EVM, and acquisition. We will summarize the pros, cons, and potential risks of Generative AI, as well as the future outlook to come.


 

Management, EVM & Risk Track

The Cost-Risk Uncertainty Determination (CRED) Model – A New Approach
Management, EVM & Risk Track (MER01)
Cheryl L. Jones
Robert Charette, PhD
Bradford Clark, PhD
Sariyu Marfo

The objective of this model is to improve the credibility of and trust in a cost estimate by: 1) identifying, characterizing, and accounting for different cost performance factors that may be sources of risk/uncertainty and can materially impact a software sustainment and maintenance cost estimate; 2) making visible the “knowledge gap” (if any) between “what should be known” and “what is known” about the system under assessment, a gap that serves as an input for assessing the range of uncertainty associated with the estimate; and 3) fully documenting the key program issues and related performance factors that may influence the cost estimate, and why. While this presentation focuses on the software domain, the model is easily adaptable to other domains.

Keywords: Modeling, Program Management, Risk, Software, Uncertainty


Schedule Risk at Early Acquisition
Management, EVM & Risk Track (MER02)
Gabriella Magasic
Sam Kitchin

It can be difficult to construct a realistic schedule early in the acquisition lifecycle due to the limited certainty of requirements, design decisions, and other key elements of program planning. Understanding risk and uncertainty in a schedule is essential, and the GAO Scheduling Guide includes “Conducting a Schedule Risk Analysis” as one of the 10 Best Practices. A Schedule Risk Analysis (SRA) can provide quantitative insight into potential areas of delay along with associated cost impacts. However, a well-formed SRA requires clear input and structured analysis of risk events and uncertainty. In this presentation, we will discuss how to address schedule risk in low maturity projects by investigating different risk modeling techniques, reviewing existing guidance on schedule risk, and analyzing how uncertainty analysis must be interpreted and applied early in the project lifecycle.

Keywords: Program Management, Risk, Scheduling, Uncertainty, Analysis


Cost Estimation for Project Control
Management, EVM & Risk Track (MER03)
Ed Spriggs

Project control in software development is a critical responsibility of program managers and contracting officers. And although the job is a difficult one for most analysts, the inability to measure and control what is being created and tested can result in loss of stakeholder confidence and, in the worst case, a cancelled project or program. What got us here? You guessed it – agile development. The adoption of agile means less defined up-front scope and little to no requirements documentation. While that flexibility allows for more development freedom, it creates more variability in the features and functionality of the delivered product. This paper will describe the best new and existing practices for forecasting the capabilities (features) that can be delivered within a certain timeframe given the fixed parameters of cost, schedule, and development team size. We will explore innovative techniques to measure software features, even in the absence of requirements, using function points and story points, among others.

Keywords: Budgeting, Cost Management, Cost/Benefit Analysis, EVM, Function Points, Functional Requirements, Performance Management, Scheduling


Advancing EVM with a Modernized Framework
Management, EVM & Risk Track (MER04)
Aaron Everly
Corey Maples
Scott Campbell

DoD’s FY24 procurement budget is the largest in history. The cornerstone of this budget is the procurement of complex, technologically advanced systems. DoD programs require new technologies to meet end-user requirements; however, the challenges inherent in new technology often translate to significant cost growth. PMs utilize EVM analysis to make informed decisions and mitigate contract cost growth. The IPMDAR exemplifies DoD’s recognition of the need for meaningful data by requiring a modernized data schema (machine-readable format providing near real-time cost performance). Likewise, Technomics implements a modern approach to EVM using data analytics software and BI tools applied through a framework that incorporates a comprehensive view of EVM. This paper describes Technomics’ EVM Framework (EVMS Surveillance, Contract Startup, Data Aggregation, EV Analysis, and Program Management), which implements modern tools to not only reduce monthly reporting tasks but also perform powerful EV analysis that enables programmatic decisions.

Keywords: Data-Driven, Government, IPM, Performance Management, Program Management, EVM


EVM Reviews – Surveillance Reviews vs. IBRs
Management, EVM & Risk Track (MER05)
Sam Kitchin
Greg Smith

Successful Earned Value Management (EVM) implementation requires an effective Earned Value Management System (EVMS) and a well-planned performance measurement baseline. Meaningful insight into project performance can only be achieved by combining a compliant system with active planning and management of project execution. A critical method to evaluate adherence to EVM best practices is to conduct reviews. Compliance reviews and surveillance reviews are used to evaluate the sufficiency of the EVMS, while integrated baseline reviews are used to assess the reasonableness of a project baseline. This presentation will compare and contrast these types of review, demonstrating how and why they differ. Key terminology, stakeholders, artifacts, timelines, and intended results will be discussed. Real-life examples may be used.

Keywords: EVM, Function Points, Government, Performance Management, Program Management, Project Controls, Earned Value Management


Advanced EVM Analysis using Time Series Forecasts
Management, EVM & Risk Track (MER06)
Anna B. Peters
Mark W. Hodgins

The recent digitization of contractor EVM data affords cost analysts a newfound ability to execute robust statistical and data science techniques that better predict total project cost and schedule realism. Time series analysis, a well-established method in private sector finance, is one such technique. Autoregressive integrated moving average (ARIMA) models may capture the persistence and patterns in EVM data, as measured by CPI, SPI, and schedule execution metrics (SEMs). As a second option, macroeconomic regression models can measure the relationship between contract performance and external considerations, like unemployment and inflation, over time. Both techniques, moreover, may forecast future changes in EVM variables of interest, like the IEAC. This presentation will discuss how these time series models and forecasts are employed on real acquisition programs and their associated IPMDAR data, using Python-based tools to raise program analysts’ alertness to emergent acquisition risks and opportunities.
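
A minimal statsmodels sketch of the ARIMA piece, fitting a short synthetic monthly CPI series and forecasting six months ahead; the series, drift, and model order are all illustrative:

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(1)

    # Synthetic monthly cumulative CPI drifting slowly below 1.0
    months = pd.date_range("2021-01", periods=36, freq="MS")
    cpi = pd.Series(1.0 - 0.002 * np.arange(36) + rng.normal(0, 0.005, 36),
                    index=months)

    # Fit a simple ARIMA(1,1,1) and forecast the next six months of CPI
    fit = ARIMA(cpi, order=(1, 1, 1)).fit()
    print(fit.forecast(steps=6))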

Keywords: Cost Management, Methods, Regression, Statistics, EVM, Time Series Modeling, ARIMA, Vector Auto Regression


Deriving Total Project Costs from WBS Elements’ Probability Distributions
Management, EVM & Risk Track (MER07)
Prof. Dr. Rainald Kasprik

Studies on possible cost variances in major acquisition projects focus on total project costs in order to arrive at plausible project budgets with a confidence level of 80%. Different lognormal probability distributions have been worked out representing different states of uncertainty. However, these models cannot be applied when using risk management software to derive total project costs from cost probability distributions for WBS elements. Due to limited processing capacity, risk management software demands a division of the underlying probability distributions into intervals. A simple discretization of the models developed to date is not possible, as these models contain unrealistic cost growth factors. Based on simulation studies, three lognormal probability distributions are presented that meet these challenges. Finally, some practical hints are given on the minimum number of intervals that still represents the curvature of a probability distribution and on how to interpret the joint CDF’s undefined regions.
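
For the interval-division mechanics, a minimal sketch splits an illustrative lognormal (mean 100, CV 0.35, not one of the paper's calibrated distributions) into equal-probability intervals, representing each by its conditional mean so the discretized element preserves its expected cost:

    import numpy as np
    from scipy import stats

    mean, cv = 100.0, 0.35
    sigma = np.sqrt(np.log(1 + cv**2))          # lognormal parameters from mean and CV
    mu = np.log(mean) - 0.5 * sigma**2
    dist = stats.lognorm(s=sigma, scale=np.exp(mu))

    n = 10  # number of intervals the risk tool can accept
    edges = dist.ppf(np.linspace(0, 1, n + 1))

    # Conditional mean of each interval (each interval carries probability 1/n)
    mids = [dist.expect(lambda x: x, lb=a, ub=b, conditional=True)
            for a, b in zip(edges[:-1], edges[1:])]
    print(np.round(mids, 1))
    print("discretized mean:", round(float(np.mean(mids)), 1))  # recovers ~100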

Keywords: Budgeting, Modeling, Uncertainty, Discretization


 

Modeling Track

Cascading Effects – Performance Impacts of Fragile Tasks
Modeling Track (MOD01)
Tommie (Troy) Miller
Joshua Hamilton

The growing popularity of Joint Cost & Schedule Analysis has highlighted the need for quality Schedule Risk Assessments (SRAs). Modeling schedule risk and uncertainty requires an understanding of schedule networks. Network Analytics (NA) has advanced in recent years thanks to research in fields such as social networks, IT networks, and transportation networks. Key aspects of these advancements can be used in SRAs to improve our understanding of schedule risk and mature our modeling techniques. For example, epidemiologists study the propagation of diseases through a community. The techniques used to model this phenomenon may be applicable to SRAs to model the propagation of task slips through schedules. This presentation integrates classical concerns in schedule analytics, principally merge bias, with NA processes, such as node centrality measures and edge properties, to uniquely identify fragile tasks and illustrate how delays in these tasks cascade through a schedule and ultimately affect program execution.
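
Merge bias itself can be demonstrated with a short Monte Carlo: two hypothetical parallel paths feed one merge point, and the merge waits for the slower path. The triangular durations are invented:

    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000

    # Two parallel task paths with invented triangular(low, mode, high) durations, in days
    path_a = rng.triangular(20, 30, 55, n)
    path_b = rng.triangular(22, 30, 50, n)

    merge = np.maximum(path_a, path_b)   # the merge point waits for the slower path

    bias = merge.mean() - max(path_a.mean(), path_b.mean())
    print(f"E[path A] = {path_a.mean():.1f}, E[path B] = {path_b.mean():.1f}")
    print(f"E[merge]  = {merge.mean():.1f}  (merge bias of {bias:.1f} days)")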

Keywords: Decision Analysis, Statistics, Uncertainty, Variables, Schedule, Task, Merge Bias


Data-Driven Constellation Architecture Design Using Integrated Models
Modeling Track (MOD02)
W. Allen Wautlet
David Brown
Greg Thanavaro

The modern space mission landscape requires consideration of numerous trade variables to deliver optimal mission performance at low cost. Academic methods exist to address such challenges; however, practical deployment of these methods to constellation mission design remains uncommon. This paper presents a practical space mission constellation architecture approach that employs proven statistical, data science, and machine learning techniques on the products of an integrated cost and engineering modeling framework. When deployed at the early stages of constellation development, this integrated modeling framework and analysis approach provides stakeholders insight into the key design parameters that drive mission performance and cost sensitivity. Furthermore, it enables the uncovering of promising design regions in large trade spaces that can be further examined and refined by technical subject matter experts. This approach leads to better decision making earlier in the acquisition timeline and increases the efficiency of design cycles.

Keywords: Data-Driven, Modeling, Parametrics, Mission Modeling, Bus Modeling, Cost Modeling


Mission Operations Cost Estimation Tool (MOCET) 2024 Status
Modeling Track (MOD03)
Marc Hayhurst
Roshni Patel
Cindy Daniels
Lissa Jordin

The Mission Operations Cost Estimation Tool (MOCET) is a model developed by The Aerospace Corporation in partnership with NASA’s Science Office for Mission Assessments (SOMA). MOCET provides the capability to generate cost estimates for the operational, or Phase E, portion of full NASA space science missions. MOCET is a widely accepted model in the NASA community and has been used in full mission Announcement of Opportunity competitions since 2015. It received the NASA Cost and Schedule Team award in 2017 and honorable mention in the 2021 NASA Software of the Year competition. The cost estimating relationships and documentation have been implemented as a standalone Excel tool that is available within NASA and publicly through software.nasa.gov. Extended mission and Level 2 work breakdown structure costing capabilities continue to be developed, and a status update will be presented.

Keywords: none provided


A CASE for Estimate Analytics at the Enterprise Level
Modeling Track (MOD04)
Josh Angeo
Miguel Aceves

Are our estimates improving over time? What did this cost two years ago? When was the last time we reviewed this estimate? These questions, amongst many others, are why SSC FMC developed the Cost Analytics for SSC Estimates (CASE) tool. CASE includes over 175 cost estimates across 60 programs, going back as far as 2017. The tool creates comprehensive dashboards capable of analyzing programs individually and in aggregate. CASE utilizes various data sources and performs extensive data pre-processing, using Python, DAX, and Power Query, to ready the data for Power BI. Estimate data comes from a combination of POST reports, PDFs, and spreadsheets. Custom metadata tables were developed to enable parsing and other functions. Lastly, data sources comprising program actuals have recently been integrated. All of this results in a newfound capability to evaluate estimates using analytics.

Keywords: Cost Management, Data-Driven, Government, Microsoft Excel, Program Management, Story Points, Power BI


Modeling Electronic/IT System Deployment Projects
Modeling Track (MOD05)
F. Gurney Thompson III
Ben Robinson

This presentation will discuss the development and application of cost models for electronic and IT system deployment projects. The deployment projects include various technical studies and preparation activities, site survey visits, and comprehensive installation efforts across many sites. The models consider size drivers such as the amount of hardware and software systems to be installed, number of sites, scope of activities, and number of different system configurations. Project complexity can be adjusted for many system/technology intricacies and site conditions. The models have been applied successfully, with validation against actuals, in estimating deployment costs for communication network deployment projects such as data centers, air traffic control towers, and military vehicle/ship/aircraft communication systems. Additionally, these models have been applied to weapon system and train signaling equipment deployments, with model validation relying on expert judgment. This presentation outlines the model’s development and structure.

Keywords: none provided


Recipe: Homemade Pizza (or Facility Estimate)
Modeling Track (MOD06)
Kristen Marquette
Caitlin Burke
Olivia Lindsey

Have you ever wanted to “wow” your guests with a homemade pizza, but didn’t know where to start? This is how we felt when beginning our facilities estimates. This presentation will break down both recipes step by step, leaving everyone satisfied and writing rave reviews. Just as you need delicious dough, sauce, and toppings for great pizza, you need detailed structural, material, and recurring scope requirements for a great facilities estimate. We will take you through our experience with data collection spanning multiple facilities and serve up comprehensive methodologies with high fidelity. If you don’t have time to create a homemade pizza or perform your own detailed facilities analysis, you can leverage the tools and methodologies provided (as to-go slices) to build your own facilities estimate based on your specific program requirements.

Keywords: none provided


Well, That Escalated Quickly – A Novel Approach to Forecasting Escalation
Modeling Track (MOD07)
Sean Wells
Kaitlyn Hagy

Escalation rates are an important part of estimates; as such, the provenance and derivation of indices should be regularly scrutinized, yet they rarely are. This paper will compare a commonly used black-box escalation resource, IHS Global Insight, to a traceable, simplified forecasting method to determine whether a purely mathematical model delivers an improved level of forecasting accuracy. Our model relies on a curated set of Bureau of Labor Statistics (BLS) indices to develop a moving average forecast. With access to over 15 years of IHS forecasts dating back to 2006, spanning 800+ indices, this study has the unique opportunity to quantify the accuracy of IHS and moving average forecasts against historical BLS indices. Our paper will establish and explore various measures of forecast accuracy for use in creating defensible estimates. The goal is to provide a quick, transparent, and flexible way to develop tailored escalation projections without sacrificing accuracy.
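
The moving-average method and one candidate accuracy measure can both be sketched briefly; the index values below are invented placeholders, not BLS or IHS data:

    import numpy as np
    import pandas as pd

    # Hypothetical annual index values
    idx = pd.Series([100.0, 102.1, 104.9, 108.3, 111.0, 116.4, 121.9],
                    index=range(2017, 2024))

    # Moving-average forecast: escalate the last actual by the trailing
    # three-year average of year-over-year rates
    ma_rate = idx.pct_change().dropna().tail(3).mean()
    forecast = idx.iloc[-1] * (1 + ma_rate) ** np.arange(1, 4)
    print(f"trailing rate = {ma_rate:.2%}; next three years:", np.round(forecast, 1))

    def mape(actual, predicted):
        """Mean absolute percentage error, one simple forecast-accuracy measure."""
        actual, predicted = np.asarray(actual), np.asarray(predicted)
        return np.mean(np.abs((actual - predicted) / actual))

    # Backtest: hold out the final year and forecast it from the prior rates
    hist = idx.loc[:2022]
    pred = hist.iloc[-1] * (1 + hist.pct_change().dropna().tail(3).mean())
    print(f"hold-out MAPE: {mape([idx.loc[2023]], [pred]):.2%}")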

Keywords: Data-Driven, Methods


Comparative Analysis of NASA Cost Estimation Methods
Modeling Track (MOD08)
Camille Holly

NASA policy and customer expectations dictate use of various cost estimating tools depending on milestone and program maturity, regardless of level of effort or accuracy of results. This paper presents a case study of the tradeoffs of modeling the cost of an unmanned space mission using different NASA-approved parametric tools. The comparison addresses subsystem and component-level cost estimates, providing invaluable insight into the granularity of cost modeling for complex space missions and differences in results associated with more or less granular estimates. The study offers perspective on the challenges and opportunities associated with parametric cost modeling methodologies due to the varying levels of input detail, and of effort, needed to complete an estimate. It also aims to provide practical insights on the number and types of subjective decisions made when modeling costs using different approaches, and the impacts that these choices have on cost results.

Keywords: Subsystem, Component-level, Parametric tools, Cost modeling, Case study, NASA, Cost estimates


The Nuclear Option: Avoiding Critical Delays with Advanced Constraints Analysis
Modeling Track (MOD09)
Hannah Hoag Lee

NNSA construction projects are often subject to funding constraints. The ripple effect of funding shortfalls can be severe; projects are forced into suboptimal execution profiles that produce costly schedule slips with drastic mission implications. This experience is not unique to NNSA construction projects. Funding constraints occur in most government sectors, negatively impacting many types of projects’ progression, schedule, and mission. However, since inadequate funding is often unavoidable, it is imperative to use a data-driven methodology to predict schedule deviations and calculate ideal cost phasing to mitigate additional or unanticipated impacts on the project timeline. This paper demonstrates how a constrained phasing model uses historic project cost and schedule data to estimate a new project timeline based on a constrained funding profile. It also reveals how the model re-phases costs for the remainder of the project duration to generate a viable execution plan.
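
To make the phasing mechanics concrete, here is a minimal Weibull S-curve sketch; the TPC, duration, and shape/scale values are invented rather than fit to NNSA history, and a constrained version would cap each year's amount and re-phase the remainder:

    import numpy as np

    tpc, years = 120.0, 5            # hypothetical $120M total project cost over 5 years
    shape, scale = 2.0, 0.6          # illustrative Weibull parameters (scale expressed
                                     # as a fraction of total duration)

    t = np.linspace(0, 1, years + 1)             # normalized project time
    cum = 1 - np.exp(-(t / scale) ** shape)      # Weibull CDF as a cumulative spend curve
    cum /= cum[-1]                               # normalize so 100% is spent at completion
    annual = np.diff(cum) * tpc

    for yr, amount in enumerate(annual, start=1):
        print(f"Year {yr}: ${amount:,.1f}M")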

Keywords: Data-Driven, Scheduling, Statistics, Phasing, Weibull, Funding constraints


Costing a Ballistic Schedule
Modeling Track (MOD10)
Rob Carlos
Kaden Howell

Join us to explore a practical solution to recurring concerns in the DoD involving cost overruns and schedule delays resulting from program practices and schedule dynamics. We’ll address the power of Integrated Cost & Schedule Risk Analysis (ICSRA) and Joint Confidence Level (JCL) assessment from a DoD program office perspective, emphasizing its practicality. Such outputs yield more reasonable and quantifiable estimates by incorporating cost and schedule risk and uncertainty. We’ll present a case study involving a DoD ACAT IB program, discussing the lessons learned during ICSRA implementation and JCL attainment. Our presentation illustrates the impact of ICSRA and JCL, facilitating improved forecasting, early risk identification, trade space analysis, and informed decision-making. The primary objective is to provide real-world insight, based on lessons learned, quantitative analysis, and creative problem solving, into the efficacy, utility, and power of ICSRA and JCL.

Keywords: Agile, Data-Driven, Government, Monte Carlo, Risk, Scheduling, Uncertainty, Integrated Cost & Schedule, Joint Confidence Levels, Integrating Schedule & Cost Risk Analysis, Risk Based Integrated Cost and Schedule Analysis, Simplified Cost and Schedule Risk Analysis, Cost Overruns, Schedule Delays


Flavors of Commonality: Learning in a Multiple Variant Environment
Modeling Track (MOD11)
Brent M. Johnstone

Commonality, the reuse of parts, designs, and tools across multiple aircraft models, is a popular strategy to reduce program costs in commercial and military applications. But its use poses unique challenges to learning curve practitioners. This paper examines five approaches to estimating multiple variant programs using different learning curve techniques. A notional dataset is created, and the accuracy of each method is measured to highlight the advantages and disadvantages of each. This presentation should be of interest to anyone doing learning curve analysis in their cost estimates.
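
One of the simpler techniques in this family treats a later variant's common content as continuing down the first variant's unit curve rather than restarting at unit one. A minimal sketch with invented values (a 1,000-hour first unit on an 85% unit-theory curve, 50 units of variant A, then 25 common-content units of variant B):

    import numpy as np

    # Unit-theory learning curve: hours(n) = T1 * n^b, where b = log2(slope)
    T1, slope = 1000.0, 0.85
    b = np.log2(slope)

    def unit_hours(units):
        return T1 * units ** b

    # Variant B's common content continues the count after 50 units of variant A
    continued = unit_hours(np.arange(51, 76)).sum()

    # Versus treating variant B as a brand-new program restarting at unit 1
    standalone = unit_hours(np.arange(1, 26)).sum()

    print(f"continued-curve hours:  {continued:,.0f}")
    print(f"standalone-curve hours: {standalone:,.0f}")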

Keywords: Labor, Learning Curves, Manufacturing, Methods, Modeling, Commonality


Installation Cost Analysis
Modeling Track (MOD12)
Eric White

Navy IT program managers have been frustrated in recent years by increasing system installation costs. Large amounts of siloed but related installation cost data have previously proven difficult to analyze, making core problem areas hard to identify. This paper describes an innovative new solution to this problem, utilizing data visualization tools to combine related data sources and illustrate trends and relationships in visuals that are easy for program managers to consume and act upon. By dynamically expanding cost data, this visualization dashboard can express cost across time, product types, locations, and more, while also offering the ability to quickly drill into the underlying cost makeups. Not only can this tool quickly identify significant variances, it also offers explanations for those cost variances. Once the historical cost data is understood, it is then used in a cost model that accounts for the time value of money across future years.

Keywords: Installation Costs, Data Visualization, Cost Analysis


 

Processes & Best Practices Track

Scrutinizing an Organization’s Project Planning Performance
Processes & Best Practices Track (PBP01)
Sergey Kozin

Explore the intriguing world of Project Planning within a typical sustainment organization, spanning nearly a decade’s worth of estimation and execution data for dozens of special projects as PMs, Engineers, and Estimators desperately fight to defend their budgets and keep the system operating. Did we prioritize thoroughly developed requirements definitions, or wait till the 11th hour to establish them? Were schedule and scope realistic, or heavily reliant on optimism as a primary methodology? Did we find ourselves broken and send up a signal flare, or accept the shackles of a constrained budget? It is accepted that no plan or estimate is perfect, but rarely do we scrutinize and quantify the errors of our ways to encourage improvements within the process. Join this thought-provoking expedition, as we use metrics to judge the performance of planning practices, seeking insights and wisdom for the projects that lie ahead.

Keywords: Data-Driven, Performance Management, Program Management, Project Controls


Mission Class in Unmanned Space Estimating
Processes & Best Practices Track (PBP02)
John Swaren
Bryan Howe
Vivian Tang, PhD

The cost engineering community needs consistent guidelines for addressing mission assurance processes for a given space vehicle mission risk class (A, B, C, or D) based on programmatic constraints and mission needs. This presentation reviews current considerations and research. Current best practice recommendations typically relate Mission Class to an operational environment specification that conveys quality information based upon requirements. Specific end item maintenance accessibility, reliability, structuring, testing, and documentation requirements are typically driven by the mission operating environment. More user definition is needed for factoring in parts quality, test sampling, orbit ranges, and mission duration. Mission Class estimating needs to tailor component-level part quality as well as affect “informed” higher-level assembly and system charges. Modeling the operational environment should reflect specification flow-down, validation and documentation, and modification/integration of subcontracted material items.

Keywords: Unmanned Space, Mission Class


GAO Cost Guide and 10 Years of Cost Estimate Assessments
Processes & Best Practices Track (PBP03)
Ben Wilder
Jennifer Leotta

Since the Government Accountability Office (GAO) Cost Guide was released as an exposure draft in 2007, GAO has used it as criteria to review and assess agencies’ cost estimates. This presentation will look at a 10-year period (FY13-FY23) to see (1) if there are any consistent gaps in agency performance on the four characteristics of a reliable cost estimate and (2) if there has been any improvement in scores over the course of the period.

Keywords: Cost Management, Life Cycle, Performance Management, Project Controls


GAO Agile Assessment Guide: Best Practices in an Agile Environment
Processes & Best Practices Track (PBP04)
Jennifer Leotta

In 2020, the Government Accountability Office (GAO) released the Agile Assessment Guide as an exposure draft. After an open comment period, vetting comments, updating text, and applying best practices during audits, we have recently issued the updated Agile Guide. This presentation will provide an overview of the best practices discussed in the Guide and then take a deeper dive into Chapter 7: program controls such as a WBS in an Agile environment, and what we have found in recent audits of programs applying cost estimating, scheduling, and EVM best practices in an Agile environment.

Keywords: Agile, Performance Management, Program Management, Project Controls, Scheduling, Software


A Series of Unfortunate Slides
Processes & Best Practices Track (PBP05)
Shannon Cardoza
James J Monopoli

Embark on a journey through the realm of impactful presentations, where we unravel the secrets to captivating briefings. Picture this: a vivid showcase of real-life blunders that often muddy the waters of comprehension and engagement—slides lacking labels, drowning in excessive words, or confounded by chaotic transitions. Join us as we delve into the essence of strategic naming and purposeful design to craft presentations that captivate and inform. Witness the transformation with us as we reveal the magic of visuals and charts, drawn from successful briefings to Cost Chiefs, PEOs, and beyond. You’ll discover how to master the art of avoiding tricky questions by leveraging compelling visuals and enhancing your soundtrack for seamless narrative flow. Moreover, we’ll shed light on how these skills not only save valuable time and resources but become a cornerstone for professional growth — empowering you to conquer larger audiences with clarity and confidence.

Keywords: Communication, Data-Driven, Decision Analysis, Visual Story Telling


The ABC’s of Contract Proposal Pricing Evaluation & Cost Analysis
Processes & Best Practices Track (PBP06)
Christopher Svehlak

Almost every nook and cranny of the Government relies on contracts for services, support and, well, “stuff.” As a cost estimator (especially a certified one), you may not know that you probably already have the requisite base of knowledge, skills, abilities and Excel-spreadsheet-jockey talent to learn and do pricing evaluation and cost analysis of contract proposals. This presentation offers you the “what-for,” the “why,” and the “how-to-perform” to potentially add this tool to your arsenal. It will distinguish between price evaluation/analysis and cost analysis, their purposes, when each is needed, and explain cost realism and reasonableness. Then comes the nitty-gritty — how to perform a pricing evaluation and cost analysis on contract proposals. The goal: you leave with a better understanding & appreciation of the process … and perhaps even consider offering your services to the contracting department.

Keywords: Price Analysis, Price Evaluation, Cost Analysis, Contracting


The Complex Database Design Tool Belt
Processes & Best Practices Track (PBP07)
Jamie Bouwens
Tristan Judd
David Ferland

The process of designing a Dimensional Database (DDB) for complex and evolving data types can be difficult for those who have never made one before. A case study is used to demonstrate how to turn an unsustainable method of data management into a DDB using two Six Sigma methodologies: Define-Measure-Analyze-Improve-Control (DMAIC) and Define-Measure-Analyze-Design-Verify (DMADV). DMADV is a preventive method used to create a process from scratch, while DMAIC is a reactive method used to improve an existing process. We walk through answering questions such as: What is a fact in a varied, complex, and evolving data set? How do you visualize these fact tables and dimensions? How do you track time-phased data? We illustrate that these techniques are cost-saving because they reduce rework and, most importantly, enable individuals without extensive prior experience to successfully implement an operable DDB.
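
As a minimal illustration of the fact-versus-dimension distinction (with invented tables, not the case study's data), a star schema keeps measurable events in a fact table and descriptive attributes in dimension tables, with time-phasing carried by the date key:

    import pandas as pd

    # Dimension tables: descriptive attributes
    dim_program = pd.DataFrame({"program_id": [1, 2],
                                "program": ["Alpha", "Bravo"]})
    dim_date = pd.DataFrame({"date_id": [202401, 202402],
                             "fiscal_month": ["2024-01", "2024-02"]})

    # Fact table: the measurable events, keyed to the dimensions
    fact_cost = pd.DataFrame({"program_id": [1, 1, 2],
                              "date_id": [202401, 202402, 202401],
                              "actual_cost": [1.2, 1.4, 0.9]})

    report = (fact_cost.merge(dim_program, on="program_id")
                       .merge(dim_date, on="date_id"))
    print(report.groupby(["program", "fiscal_month"])["actual_cost"].sum())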

Keywords: Data-Driven, IT, Methods, Six Sigma


Context-Responsive Cost Evaluation: Dynamic Approach to Cost Estimate Reviews
Processes & Best Practices Track (PBP08)
Brittany Holmes
Christan Yarbrough

Summary not approved for public release

Keywords: Methods, Operations


Space Fence: A Cost Analysis Success Story
Processes & Best Practices Track (PBP09)
Rick Garcia

In 2007, the Air Force Cost Analysis Agency (AFCAA) was tasked with providing a Non-Advocate Cost Assessment (NACA) for the Space Fence program, among others. Drawing on the author’s role as a central cost estimator on that team, this analysis will describe the research and actions taken by the team to develop cost and schedule estimates that represented an unbiased view of the program’s most likely cost under normal conditions. The resultant cost and schedule estimates were within 5% of the eventual program actuals. This analysis will also describe the partners used for the critical independent technical assessment, as well as the successful interactions with the industry-leading companies that were competing for the contract. Lastly, this analysis will outline how our team developed a ground-radar-specific expenditure phasing model based explicitly on historical ground radar programs.

Keywords: Cost analysis, Methods


 

Soft Skills & Trending Topics Track

Convincing Leaders of the Value of COTS Tools for Quick Assessments
Soft Skills & Trending Topics (SST02)
Karen Mourikas
Denise Nelson
Terrance Evans

Multiple COTS tools and industry databases exist in our profession. But many organizations prefer to develop their own tools based on their own historical data, which then better represents their own environment. However, often the effort to develop these tools can be time-consuming. What happens when decision makers need answers immediately and there isn’t enough time to collect and analyze their own data? One approach employs COTS tools and their underlying industry data. But cost analysts often need to convince decision makers of the validity of using COTS tools. This presentation describes several use cases in which program decision makers needed information right away, issues facing the decision makers, how the cost analysis team convinced program leaders of the validity of using COTS tools, including their pros & cons, as well as surprising insights that emerged, ultimately enabling decision makers to determine feasible paths forward.

Keywords: Communication, Decision Analysis, Modeling, Performance Management, COTS, Industry Data


Priceless Culture: Crafting a Culture for the Future of Work
Soft Skills & Trending Topics (SST03)
Cassidy Shevlin
Wyatt Beaman

Priceless Culture: Crafting a Culture for the Future of Work delves into the intricate web of elements that constitute a thriving organizational culture. At its foundation lies effective leadership, setting the tone for a space where core values are not just stated but lived out daily. A unified purpose drives every team member, fostering genuine accountability across all levels. Essential to this mosaic is effective communication, ensuring that everyone is not only heard but also understood. Furthermore, the culture is enriched when leadership embraces vulnerability, showing authenticity and encouraging openness. Intertwined with all these is the spirit of gratitude, acknowledging every contribution, big or small. In an era where workplaces are rapidly evolving, crafting such a priceless culture is not merely beneficial—it’s imperative for the future of work.

Keywords: Communication, Leadership, Culture, Employee Retention


Equity and Environmental Justice in Early-Stage NNSA Planning
Soft Skills & Trending Topics (SST04)
Haley Harrison
Zachary Matheson

Recent executive orders (EO13985, EO13990, EO14008) directed federal agencies to prioritize environmental justice and reduce systemic barriers affecting minority and underserved groups. As an organization specializing in decision support for early-stage planning, the Office of Programming, Analysis, and Evaluation has developed a framework for incorporating quantifiable factors as proxies for equity and environmental justice-related factors in analyses of alternatives and early-stage planning studies. This framework will be used to inform decision-makers about potential project impacts from an equity and environmental justice-focused lens. An equity and environmental justice-informed approach to planning within the NNSA can minimize the incidence of negative environmental and health outcomes, maximize the number of opportunities available to historically marginalized groups, and contribute to greater trust in the NNSA mission within minoritized communities, thereby improving equity.


Advancing the Art of Cyber Cost Estimating
Soft Skills & Trending Topics (SST05)
Austin MacDougall
William Gellatly
Jessica Kleinman

The growth in quantity and intensity of cybersecurity threats has led to new cyber best practices, such as Zero Trust and Secure by Design. These practices present challenges when developing cost estimates for the development and maintenance of information systems. This paper examines how these topics and other new cyber trends influence costs. It evaluates the cost implications in both the design phase (incorporating cyber requirements into new system development) and the sustainment phase (cyber support for existing systems). This research also examines existing cyber frameworks and maps them to a cost element structure to drive data collection and methodology development. Finally, this paper translates cyber cost estimating lessons learned into recommended content improvements to the technical baseline documentation upon which cost estimators rely. Standard treatment of cyber in technical baselines should facilitate much-needed consistency in the composition of cyber cost estimates.

Keywords: IT, Software, Cybersecurity


Mind the Gap: Bridging the Generational Divide
Soft Skills & Trending Topics (SST06)
Jennifer Aguirre
Annie Bagay
Shannon Cardoza

Do you ever feel you’re speaking a different language than your peers? Ever struggle to relate to your IPT as they talk about recent college experiences or upcoming retirement plans? Join us as we explore the various ways each generation sees the world, whether through their own eyes or through a high-res smartphone camera. Let’s bridge that gap to reap the full benefits of working in a multi-generational environment, enabling effective connections between cost estimators and IPT members. With each wave of people come new ideas, perspectives, communication styles, and workplace preferences. This diversity can be challenging to navigate and, when not properly managed, can cause miscommunication, feelings of exclusion, disconnected goals, and failed tech baselines. When harnessed properly, it can be the superpower enabling success in team cohesion, gathering cost inputs, and delivering estimate packages. Whatever stage of life you’re in, come with us on a journey of self-discovery in the workplace!

Keywords: Communication, Soft-Skills


ChatGPT: Friend or Foe – Meet Your New EN SME
Soft Skills & Trending Topics (SST07)
Patrick Casey

ChatGPT: Friend or Foe is an insightful exploration into the capabilities and nuances of ChatGPT. Delving deep into the genesis of this AI model, the presentation tracks its evolution from inception by OpenAI to its fourth iteration. Patrick Casey, a Senior Cost Analyst at Quantech Services, candidly shares his experiences with the tool, highlighting its transformative power in various use cases for cost analysts, ranging from WBS considerations to innovative recipe creations. While celebrating its prowess, the presentation does not shy away from addressing its limitations and security concerns, urging a cautious approach. As a grand finale, attendees are treated to an entirely AI-generated TV commercial. This engaging journey demystifies ChatGPT, offering both appreciation and critical insight into this modern marvel. Beyond mere technology, the presentation invites audiences to consider the impact of AI in our lives, challenging us to harness its potential responsibly.

Keywords: Data Collection, Functional Requirements, Life Cycle, Methods, AI, Large Language Models, Use Cases


Economics of Digital Twins in Aerospace and Defense
Soft Skills & Trending Topics (SST08)
Patrick K. Malone

Defense and Aerospace systems engineering is transforming from a document-based to a digital model framework, leveraging low-cost multidisciplinary modeling, analysis, and simulation tools. With these methods, engineers can specify, analyze, design, and verify systems. Digital Twins (DTs) enable this approach: they are digital or virtual replications of physical products and processes, allowing increased speed to market and performance evaluation at reduced cost. Evaluating return on investment against the cost of developing digital twins, however, is not straightforward. This paper looks at the development of DT architectures, capabilities, and resulting life cycle cost estimates. Factors impacting DT development costs include model fidelity, design features, analytical tools, integration difficulty, scalability, and programming languages. Concepts are grouped to give practitioners tools and methods for applying digital twin concepts in solutions that maintain positive ROI and identify cost drivers.

Keywords: Cost Management, Decision Analysis, Program Management, System Engineering, Digital Twins



Software Track

Cloud Estimating in the 21st Century – Okay, well in 2023!
Software Track (SWR02)
Chris Price

Cloud deployments represent a fast-moving technology, and producing quality cost estimates for them is challenging. In the current state, cost estimates must address Kubernetes orchestrators, containers, IaaS, and PaaS. Cybersecurity is also key to cloud deployments, and modern development processes involve working in a DevSecOps environment using Agile software development approaches. This presentation will discuss these challenges and describe ways to perform quality cost estimates for cloud deployments.

Keywords: Data-Driven, Decision Analysis, Early Cost, IT, Parametrics, Cloud, Containers, Kubernetes, Orchestrators, IaaS, PaaS, DevSecOps


Simplified Software Estimation (SiSE) – Updated on Advancements and Trends
Software Track (SWR03)
Curtis Chase
Carol Dekkers

In 2019, representatives from the DHS Cost Analysis Division (CAD) presented early research findings for their Simplified Software Estimating (SiSE) approach at the ICEAA annual professional development workshop. Since then, further advancements have ensued, facilitated by the IFPUG Simple Function Point Method (SFP); revisions and expansion of the DHS CAD verb keyword lexicon; the addition of requirements risk and uncertainty considerations; and a full guidebook supporting the method. The addition of uncertainty reduces the risk associated with requirements and verb keyword ambiguities. It also gives the estimator the flexibility to create min/max/most-likely estimates for requirements that are simply vague at this early stage. As such, the sizing results take into account uncertainties related to different document writers, styles, and verb interpretations. This presentation outlines key findings and ongoing research, and (re-)introduces the SiSE approach, offering a more streamlined and accessible process.
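
As a concrete illustration of the min/max/most-likely idea (a minimal sketch, not drawn from the SiSE guidebook itself), the code below samples triangular distributions for a handful of hypothetical requirement sizes and reads an 80th-percentile total off the simulated distribution; all values are notional.

    import random

    # Hypothetical requirement sizes in Simple Function Points:
    # (min, most likely, max) triples reflecting verb-keyword ambiguity.
    requirements = [(3, 4, 6), (5, 8, 12), (2, 3, 7)]

    N = 10_000
    totals = []
    for _ in range(N):
        # random.triangular(low, high, mode) draws one triangular sample
        totals.append(sum(random.triangular(lo, hi, ml)
                          for lo, ml, hi in requirements))

    totals.sort()
    print("mean size:", sum(totals) / N)
    print("80th percentile:", totals[int(0.8 * N) - 1])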

Keywords: Early Cost, Functional Requirements, Risk, Software, Uncertainty


Agile Software Development Cost Estimating
Software Track (SWR04)
Jim Golden

This presentation will discuss agile software development cost estimating in the multi-year planning cycle. Agile software development programs focus on near-term workload and activities, with only limited future planning cycles identified. Future cycles are activated only as their start dates near. Any model-based cost estimate or predictive analysis for the budget needs to be flexible, responsive, and adaptive to the daily dynamics of Agile software development program planning and execution. For a cost estimator, integrating with the IPT for a particular program or project has always been critical to understanding requirements, gathering data, and producing a quality estimate. With agile processes being adopted more frequently across software development organizations, cost estimators and program offices are challenged even further to work closely with developers to continuously update cost estimates. Agile sprint results reveal the progress of development and consequently could affect the cost estimate and budget requests.

Keywords: None provided


How to Choose a Database Storage Model
Software Track (SWR05)
Tristan Judd
Jamie Bouwens
David Ferland

To design and implement a database solution, teams must conceptually understand how data is formatted in storage. We compare traditional ways of storing data in Excel or CSV formats with a scalable SQL database. Within a SQL database, data is typically stored in either a relational or a dimensional format, and we will explain both formats for novices with examples. A relational format may be easier to implement but is less powerful than a dimensional one. We take you through the process of analyzing the types of data a team uses and how they would be reflected in a dimensional format. The ability to query efficiently, linkage to popular business intelligence techniques, and a scalable structure make dimensional databases the preferred option for structured data storage.
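
For readers new to these terms, the sketch below sets up a minimal dimensional (star schema) layout in SQLite; all table and column names are illustrative assumptions, not taken from the presentation.

    import sqlite3

    con = sqlite3.connect(":memory:")
    cur = con.cursor()

    # Dimensional (star schema) layout: a fact table of labor actuals
    # keyed to small dimension tables. All names are illustrative.
    cur.executescript("""
    CREATE TABLE dim_project (project_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_period  (period_id  INTEGER PRIMARY KEY, fiscal_year INTEGER);
    CREATE TABLE fact_labor (
        project_id INTEGER REFERENCES dim_project(project_id),
        period_id  INTEGER REFERENCES dim_period(period_id),
        hours REAL, cost REAL);
    """)

    cur.execute("INSERT INTO dim_project VALUES (1, 'Program A')")
    cur.execute("INSERT INTO dim_period VALUES (1, 2024)")
    cur.execute("INSERT INTO fact_labor VALUES (1, 1, 120.0, 15000.0)")

    # The fact table supports efficient slicing by any dimension:
    for row in cur.execute("""
        SELECT p.name, d.fiscal_year, SUM(f.cost)
        FROM fact_labor f
        JOIN dim_project p USING (project_id)
        JOIN dim_period  d USING (period_id)
        GROUP BY p.name, d.fiscal_year"""):
        print(row)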

Keywords: Data-Driven, Data Storage


Measuring Software Development Efficiency in an Agile Environment
Software Track (SWR06)
Benjamin Truskin
Aubrey Dial
Peter Braxton
Ken Rhodes

Agile software development practices, while designed to deliver value sooner and accommodate changing requirements, are not intended to mitigate cost growth. Nevertheless, Program Managers must navigate this paradigm and control risk while ensuring stakeholder requirements are fully met. Traditional metrics used to measure growth (e.g., SLOC counts, productivity factors, requirements sell-off) are likely unavailable in Agile projects, and while recent DoD policy recognizes the need for metrics, Agile metrics are not standardized and their use for independent estimation is uncommon. This paper discusses real-world experience balancing leadership’s goals for independent analysis with the realities of an Agile environment. It will show the value of utilizing program-specific metrics and calculating useful measures such as Change Traffic and Feature (in)efficiency for producing defensible estimates, enabling better program outcomes, and providing insights that others can apply.

Keywords: Agile, Data-Driven, Government, Modeling, Performance Management, Program Management, Software


A Software Sizing Toolkit – Choosing the Right Approach(es) for Your Project
Software Track (SWR07)
Carol Dekkers
Dan French

You’ve probably heard of source lines of code (SLOC) and function points as choices for software size, but what about RICEFW, t-shirt sizing, story points, and Simple Function Points? Like the old adage “If all you have is a hammer, everything looks like a nail,” the most appropriate software sizing approach for your cost estimate may include multiple sizing methods. This presentation outlines the various units of measure available and how and when each approach is most suitable. It’s a primer for cost estimators who are new to software-intensive systems and need to understand the options available when estimating software projects.

Keywords: Data-Driven, Early Cost, Functional Requirements, Software Size


Unlocking Untapped Software Metrics Potential with Jira’s RESTful API
Software Track (SWR08)
Blaze Smallwood

Many software projects manage their efforts in Application Lifecycle Management (ALM) tools, like Jira, and these tools can capture a rich set of data, which can be a treasure trove for a cost or project analyst. However, many analysts limit themselves by simply exporting flat lists of records from the tool’s User Interface (UI), which ignores valuable data captured in the system that can further enhance various analyses. This paper will focus on Jira and explain how an analyst can access several interesting additional data sets from its RESTful Application Programming Interface (API) with appropriately structured Uniform Resource Identifiers (URI). This paper will also cover how an analyst can use Java or Python programming to parse the JSON data returned from the API and produce simple but powerful data formats that can inform metrics dashboards or cost analyses.
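
As a minimal illustration of the approach, the sketch below pages through Jira’s documented /rest/api/2/search resource and flattens the returned JSON; the host, JQL query, credentials, and custom field ID are placeholders that vary by Jira instance.

    import requests  # third-party HTTP client

    BASE = "https://jira.example.com"             # placeholder host
    JQL = "project = DEMO AND issuetype = Story"  # placeholder query

    # Jira's REST search resource returns pages of issues as JSON.
    issues, start = [], 0
    while True:
        resp = requests.get(
            f"{BASE}/rest/api/2/search",
            params={"jql": JQL, "startAt": start, "maxResults": 100,
                    "fields": "summary,status,customfield_10016"},
            auth=("user", "api-token"))           # placeholder credentials
        page = resp.json()
        issues.extend(page["issues"])
        start += len(page["issues"])
        if start >= page["total"]:
            break

    # Flatten the JSON into simple records for metrics dashboards.
    for it in issues:
        f = it["fields"]
        print(it["key"], f["status"]["name"], f.get("customfield_10016"))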

Keywords: Agile, Data Collection, Software, Jira


Risky Business: Navigating the World of Software Productivity
Software Track (SWR09)
Dave Brown
Kevin Cincotta

Size and productivity are commonly cited as the two major software development cost drivers. Logic dictates that the two are related and inversely correlated. But what is the probabilistic range of uncertainty for productivity, given a software size? What is meant by “an 80% confidence level for productivity”? Cost analysts often quantify uncertainty with an S-Curve; why can’t this be done for productivity directly? We use International Software Benchmarking Standards Group (ISBSG) data to estimate the distribution of productivity directly and provide closed-form formulas for the fitted distribution(s). We find that productivity (and, with certain assumptions, cost) can be estimated with an S-Curve directly, using built-in Excel formulas, with no need for Monte Carlo simulation. This result has significant implications for almost any software development cost estimate, and is particularly relevant to agile development efforts where time-boxed effort is generally fixed.
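
A minimal sketch of the general idea, with notional data standing in for ISBSG observations and a lognormal chosen as one plausible closed form (the paper’s actual fitted distributions may differ):

    import numpy as np
    from scipy import stats

    # Hypothetical productivity observations (hours per function point)
    # standing in for ISBSG data, which is licensed.
    prod = np.array([4.2, 6.8, 5.1, 9.3, 7.7, 12.4, 5.9, 8.8, 6.1, 10.5])

    # Fit a lognormal distribution (one plausible closed form).
    shape, loc, scale = stats.lognorm.fit(prod, floc=0)

    # "80% confidence level for productivity" = the 80th percentile of
    # the fitted distribution -- a point read directly off the S-curve.
    p80 = stats.lognorm.ppf(0.80, shape, loc=loc, scale=scale)
    print(f"80th percentile: {p80:.1f} hours per function point")
    # Excel analogue: LOGNORM.INV(0.8, mean_of_ln, sd_of_ln)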

Keywords: Agile, Risk, Software, Uncertainty, Productivity, Probability Distribution, S-Curve


Sizing Agile Software Development Programs
Software Track (SWR10)
Bob Hunt
Heather Meylemans
Denton Tarbet
Chad Lucas
Rainey Southworth

Size is a critical element in software cost estimation. As Agile has become more prevalent, the use of lines of code as a size metric for software estimation has become less accepted. This presentation will discuss and compare sizing alternatives, including “tee shirt” sizing and functional size alternatives, on large Federal programs, and will provide some emerging metrics for assessing size. Since many automated models convert functional size to physical size, the presentation will address techniques to accomplish “backfiring.” It will also address the use of Natural Language Processing and models such as Cadence and ScopeMaster, and will discuss models such as COCOMO III that directly convert functional size to hours.
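
For readers unfamiliar with backfiring, the sketch below shows the basic conversion from functional size to physical size using language gearing factors; the factor values are illustrative placeholders, not published benchmark numbers.

    # Minimal backfiring sketch: convert function points to SLOC with
    # language gearing factors (SLOC per function point). The factors
    # below are illustrative placeholders, not published values.
    GEARING = {"Java": 55, "Python": 30, "C++": 50}

    def backfire(function_points: float, language: str) -> float:
        """Estimate physical size (SLOC) from functional size."""
        return function_points * GEARING[language]

    print(backfire(250, "Java"))  # ~13,750 SLOC under these assumptions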

Keywords: Agile Software, Software Sizing


Why Care About CEBoK-S if We Don’t Build Software?
Software Track (SWR11)
Carol Dekkers

Given the increase in software-intensive programs today, it should come as no surprise to experienced cost estimators that even minor software development can render a program over budget and behind schedule. This presentation outlines the key differences in cost estimating approaches between traditional industries (hardware, facilities, systems) and software development, and why CEBoK-S knowledge is critical for today’s cost estimators. Given that close to 60% of software projects are deemed failures (over budget and/or late), with little improvement despite modern technologies, understanding the basics of software cost estimating can provide a competitive advantage for anyone involved in estimating programs for which software development is a component.



Strategy Track

From “Plan and Pray” to “Sense and Respond”: War Gaming Defense Acquisition
Strategy Track (STY01)
Alex Wekluk
Brian Flynn
Ben Bergen

“The most dangerous phrase in the language is, ‘We’ve always done it this way.'” – Rear Admiral Grace Hopper. The need for flexible and rapid solutions in the face of emerging threats warrants a radical reset in defense acquisition. NATO’s canonical post-World War II plan-acquire-pray acquisition processes lack the agility to meet a generational change in what military historian John Keegan calls the face of battle. A new paradigm is urgently needed to meet the exigencies of modern warfare with the adaptability of the best business firms: innovating and reacting at the speed of competition. This paper provides an innovative risk-driven framework for an Acquisition War Game that laser-focuses on key metrics such as scalability, logistical footprint, time-to-contract, and fungibility – to support today’s battles and near-peer competition with our enemies. This new Acquisition War Game strategy senses and responds rather than plans and prays, meeting reality head-on in an ever-changing battlespace.

Keywords: Cost/Benefit Analysis, Data-Driven, Decision Analysis, DOD/MOD, Methods, Modeling, War Gaming


FP&A: Can We Disrupt Traditional Government Cost Estimating?
Strategy (STY02)
Christopher Metz

There is tremendous value potential in the cost estimates built today across Government under the guidance of GAO’s best practices, but perhaps with varying realization. “Cost Teams” and “Cost Estimators” are sometimes viewed as simple calculators of FTEs times labor rates by those who do not understand where a cost estimate goes and how its value can increase the chance of mission success. At our relatively new DoD Agency, we set out to find the industry equivalent to “Cost Estimating” and found “Financial Planning & Analysis (FP&A).” As we stand up this competency, we have been gathering and integrating best practices from industry’s FP&A and Government’s cost estimating, along with novel ideas and contracting vehicles, to disrupt the cost estimating field, better operationalize our cost estimates, steward taxpayers’ dollars, and meet the mission.

Keywords: Government, Operations, Cost Estimating, Financial Planning and Analysis, FP&A


Portfolio Analysis Made Effective and Simple
Strategy (STY03)
Brandon Schwark
Alan Karickhoff

Effective portfolio analysis strategies rely on robust recognition of resource constraints, competing priorities, interdependencies, and executability. They transform complexity into simplicity. Our strategy details a flexible, efficient, and analytically rigorous evaluative framework that integrates complex sets of interconnected analyses to assist leadership with data-driven resource allocation. The framework offers solutions in data cleaning, optimization algorithms, and visualization tools that enable stakeholders to effectively navigate complicated portfolio landscapes. Applicability of the framework is demonstrated through a use case that details a facility construction portfolio expected to grow aggressively in the coming years. This paper addresses the complex and often conflicting portfolio objectives mentioned above and outlines their corresponding solutions.
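
One classic formulation of budget-constrained project selection, which a framework like this would generalize with interdependencies and executability constraints, is a 0/1 knapsack. The sketch below brute-forces a tiny notional portfolio; all project names and values are hypothetical.

    from itertools import combinations

    # Hypothetical facility projects: (name, cost $M, benefit score).
    projects = [("Warehouse", 40, 12), ("Lab", 65, 25),
                ("Substation", 30, 10), ("Office", 55, 18)]
    BUDGET = 120  # $M

    # Brute-force 0/1 knapsack: fine for small portfolios; real ones
    # would use MILP solvers plus interdependency constraints.
    best, best_value = (), 0
    for r in range(1, len(projects) + 1):
        for combo in combinations(projects, r):
            cost = sum(p[1] for p in combo)
            value = sum(p[2] for p in combo)
            if cost <= BUDGET and value > best_value:
                best, best_value = combo, value

    print([p[0] for p in best], "benefit =", best_value)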

Keywords: Data-Driven, Decision Analysis, Government, Infrastructure, Modeling, Statistics, Portfolio Analysis; Portfolio Optimization; Resource Allocation


Parametric Construction Estimating: Consistency through Turbulence
Strategy (STY05)
Cortney Collins
Margaret Melchor

Not all estimates are created equal, but all are necessary. How is construction estimating different from DoD estimating? Both predict costs based on agreed-upon requirements; both use historical information to develop parametrics; and both exist as living documents, updated as new information becomes available and delivered to the customer to assist with budgeting and purchasing. So how are they different? This paper will highlight some of the major disparities – from how inflation is handled, to the validity of pricing, to how the current economy factors into the estimate. We will also explore how materials could be affected by pandemics and natural disasters (hurricanes, earthquakes, etc.). All of these events can impact the prices of lumber and steel, which in turn can have estimators scrambling to update their models.

Keywords: Data Collection, Government, Methods, Microsoft Excel, Parametrics, Construction


Leveraging Cost Estimating Techniques in Price to Win Analysis
Strategy (STY06)
Darren Kreitler

Leveraging cost estimating techniques is pivotal in “Price to Win” (PTW) analysis for competitive bidding. This session delves into various techniques, from analogy-based to parametric and bottom-up estimating. By integrating these methods with PTW analysis, organizations can strategize optimally, balancing profitability with competitive pricing. Real-world applications underscore the benefits of this synergy, emphasizing the role of accurate cost prediction in securing contracts and ensuring sustainability in today’s dynamic markets.

Keywords: Price-to-Win, PTW, Price Strategy, Pricing

CEBoK® Training Track

Certification Program Overview
CEBoK® Training Track (CEB00)
Kevin Cincotta, CCEA®, SCEC
Jennifer Kirchhoffer, CCEA®, SCEC
This interactive session introduces the ICEAA certifications – Certified Cost Estimator/Analyst (CCEA®), Professional Cost Estimator/Analyst (PCEA®), and Software Cost Estimating Certification (SCEC). It covers eligibility and certification requirements, examination topics, relationships to the Cost Estimating Body of Knowledge (CEBoK®), the online exam format, and recertification requirements. This is a great opportunity to talk with the Certification Principal (Kevin Cincotta) and the VP of Professional Development (Jen Kirchhoffer) and get your questions answered.

Cost Estimating Basics, Costing Techniques, and Parametric Estimating
CEBoK® Training Track (CEB01)
Bill Barfield, CCEA®
The Basics & Techniques session provides an overview of cost estimating and analysis and the reasons for doing cost estimates, as well as four essential cost estimating techniques most often used to develop realistic and credible estimates. Additionally, we will review cost estimating products and related topics such as schedule and operations and support estimating, providing the background information and fundamental knowledge from CEBoK® Modules 1-3.


Data Collection and Normalization
CEBoK® Training Track (CEB04)
Markie Harris
Katiana (Kat) Lemmons
This session covers the Core Knowledge section of CEBoK® Module 4: Data Collection. All estimating techniques and cost estimating models require credible data before they can be used effectively. In this module we will discuss the various types and sources of data, the processes needed to collect and analyze data used in parametric applications, and data adjustment techniques.


Inflation and Index Numbers
CEBoK® Training Track (CEB05)
Peter Braxton, CCEA®
Bob Hunt
This session covers the Core Knowledge section of Inflation and Index Numbers (CEBoK® Module 5). Proper inflation analysis is essential to the success of any cost estimate or economic analysis. Calculating inflation correctly and understanding the fundamental concepts will enable you to produce cost estimates that are timely, accurate, and credible to support your program’s lifecycle needs. It will also empower you to communicate with key stakeholders on the need to adjust your financial estimates based on changes in the economy.
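
The core index-number mechanics reduce to a ratio of indices, as in the sketch below, which converts between then-year and constant-year dollars using a hypothetical composite index.

    # Hypothetical composite inflation indices (base year 2024 = 1.000).
    index = {2024: 1.000, 2025: 1.023, 2026: 1.047, 2027: 1.071}

    def to_constant(then_year_dollars, year, base_year=2024):
        """Deflate then-year dollars to constant base-year dollars."""
        return then_year_dollars * index[base_year] / index[year]

    def to_then_year(constant_dollars, year, base_year=2024):
        """Inflate constant base-year dollars to then-year dollars."""
        return constant_dollars * index[year] / index[base_year]

    print(to_constant(10.71, 2027))  # 10.0 in constant FY2024 $
    print(to_then_year(10.0, 2026))  # ~10.47 in then-year $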


Basic Data Analysis Principles and Probability and Statistics
CEBoK® Training Track (CEB06)
Kimberly Roye, CCEA®
This session discusses the analytical steps to take after obtaining a set of cost data, covering techniques for displaying data graphically and the statistical analysis of univariate and bivariate data sets (CEBoK® Modules 6 & 10). Other topics include measures of central tendency and dispersion and important probability distributions. We also introduce the concept of a random variable, Monte Carlo simulation, and the differences between the normal and lognormal distributions. Finally, we discuss hypothesis testing.
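
A minimal sketch of the central-tendency ideas, using notional right-skewed cost data to show why the normal-versus-lognormal distinction matters:

    import numpy as np

    rng = np.random.default_rng(1)
    costs = rng.lognormal(mean=4.0, sigma=0.5, size=1000)  # notional data

    # Central tendency and dispersion
    print("mean  :", costs.mean())
    print("median:", np.median(costs))
    print("std   :", costs.std(ddof=1))

    # For right-skewed (lognormal-like) data the mean exceeds the
    # median; for symmetric (normal) data they roughly coincide.
    print("skewed right?", costs.mean() > np.median(costs))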


Learning Curve Analysis 
CEBoK® Training Track (CEB07)
Troy Miller, CCEA®
This training track presentation of CEBoK® Module 7 (Learning Curves) covers the key ideas, analytical constructs, and applications of the module. Beyond the theoretical information, we will present the Module 7 study questions with the steps required to solve the problems using only a calculator, as is required on the certification exam.
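
For reference, the unit-theory learning curve covered in the module reduces to a one-line formula that is workable on a calculator; the first-unit cost and slope below are notional.

    import math

    def unit_cost(t1, slope, x):
        """Unit-theory learning curve: cost of unit x given first-unit
        cost t1 and learning slope (e.g., 0.90 for a 90% curve)."""
        b = math.log(slope) / math.log(2)  # exponent from the slope
        return t1 * x ** b

    # On a 90% curve, each doubling of quantity cuts unit cost to 90%:
    print(unit_cost(1000, 0.90, 1))  # 1000.0
    print(unit_cost(1000, 0.90, 2))  # 900.0
    print(unit_cost(1000, 0.90, 4))  # 810.0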


Regression Analysis 
CEBoK® Training Track (CEB08)
Dave Brown, CCEA®, SCEC
Kevin Cincotta, CCEA®, SCEC
This course introduces the basic concepts of regression and provides a demonstration of a simple linear ordinary least squares model (CEBoK® Module 8). This session focuses on the basics required to build and evaluate a simple linear model such as a Cost Estimating Relationship (CER). Key concepts include correlation, minimizing error, homoscedasticity, statistical significance, goodness of fit, confidence intervals, uncertainty, and analysis of variance. The better you understand these concepts, the better you will be able to make inferences about cost data and employ more complicated regression techniques.
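
A minimal sketch of a simple linear CER fit, with notional weight/cost data; the key outputs map to the concepts named above (goodness of fit, statistical significance).

    from scipy.stats import linregress

    # Notional data: weight (lbs) vs. cost ($K) for eight analogous units.
    weight = [120, 150, 180, 200, 240, 260, 310, 350]
    cost = [95, 118, 131, 150, 178, 186, 222, 250]

    fit = linregress(weight, cost)
    print(f"CER: cost = {fit.intercept:.1f} + {fit.slope:.3f} * weight")
    print(f"R-squared = {fit.rvalue**2:.3f}")  # goodness of fit
    print(f"p-value   = {fit.pvalue:.2g}")     # statistical significance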


Cost and Schedule Risk Analysis
CEBoK® Training Track (CEB09)
Mel Etheridge, CCEA®
This session will motivate the need for risk analysis and introduce the basic types and uses of risk (CEBoK® Module 9). It will focus on the practical execution of the general risk analysis process: develop a point estimate; identify the risk areas in the point estimate; determine uncertainty around the point estimate; apply correlation between uncertainty distributions; run the Monte Carlo simulation; assess the reasonableness of results; calculate, allocate, and phase risk dollars; and present the results.
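
A compressed sketch of the simulation core (point estimate, triangular uncertainties, correlation, percentile contingency), using a Gaussian copula to induce correlation; the WBS values, correlation, and confidence level are notional, and the session’s own process may differ in detail.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    N = 20_000

    # Point estimates for two WBS elements ($M) with triangular
    # uncertainty (low, mode, high) -- notional values.
    tris = [(8, 10, 15), (18, 22, 30)]

    # Induce positive correlation with a Gaussian copula:
    # correlated normals -> uniforms -> inverse triangular CDF.
    corr = np.array([[1.0, 0.6], [0.6, 1.0]])
    z = rng.multivariate_normal([0, 0], corr, size=N)
    u = stats.norm.cdf(z)

    draws = np.column_stack([
        stats.triang.ppf(u[:, i], c=(m - lo) / (hi - lo), loc=lo, scale=hi - lo)
        for i, (lo, m, hi) in enumerate(tris)])

    total = draws.sum(axis=1)
    point = sum(m for _, m, _ in tris)
    p70 = np.percentile(total, 70)
    print(f"point estimate {point}, 70th percentile {p70:.1f}, "
          f"contingency {p70 - point:.1f}")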


Manufacturing Cost Estimating 
CEBoK® Training Track (CEB11)
Pat Malone, CCEA®
The goal of the Manufacturing Cost Estimating module (CEBoK® Module 11) is to arm the student with a set of techniques used to address issues unique to estimating in the manufacturing environment. It will be our objective in this module to raise a few of the most common general issues, considerations and concerns the estimator must be aware of in a typical major manufacturing environment and to provide techniques for addressing them. Depending on time and interest of attendees, example problems can be worked as exam preparation.


Software Cost Estimating Using CEBoK-S 
CEBoK® Training Track (CEB12)
Carol Dekkers, SCEC
This session covers the core knowledge of Software Cost Estimating using CEBoK-S (all PCEA/CCEA testable topics are included). It will be of particular interest to anyone studying for the ICEAA certification exam. The session provides an introduction to the basics of the software development and maintenance processes and how to estimate the related effort. The key ideas of Software Cost Estimating include the cost drivers of size, complexity, and capability. In the sizing area, we’ll focus on understanding the physical size, functional size, relative effort measures (agile software development), and RICE(FW) objects. We’ll also discuss the primary software development paradigms – Predictive (waterfall), Predictive with modification (incremental, evolutionary, and spiral methods), Agile (iterative, scrum, SAFe), and Hybrid – and how to model them from a cost estimating perspective.


Economic Analysis 
CEBoK® Training Track (CEB13)
Kellie Benefiel, CCEA®
This session covers the Core Knowledge section of CEBoK® Module 13: Economic Analysis. It will be of particular interest to anyone studying for the ICEAA certification exam. The session provides a practitioner’s perspective for conducting an economic analysis (EA) by reviewing EA concepts, terminology, variables, and measures-of-merit. By accounting for monetized costs, monetized benefits, opportunity costs, and time-value-of-money (“discounting”), an EA enables one to calculate economic measures-of-merit.
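
The discounting mechanics reduce to a few lines; the rate and cash flows in the sketch below are notional.

    # Net present value of an alternative's cost and benefit streams.
    # Discount rate and cash flows are notional.
    RATE = 0.03  # real discount rate

    costs    = [100, 20, 20, 20, 20]  # $K by year, year 0 first
    benefits = [0,  50, 50, 50, 50]

    def npv(flows, rate):
        return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

    net = npv(benefits, RATE) - npv(costs, RATE)
    print(f"NPV = {net:.1f} $K")  # positive NPV favors the alternative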


Contract Pricing 
CEBoK® Training Track (CEB14)
Chris Svehlak, CCEA®
This session explores the basics of contract pricing (CEBoK® Module 14). We explore various contract types and the factors and considerations related to choosing a contract type. We also explore fee, shared risk, cost-price proposal preparation, the makeup of a good Basis of Estimate (BOE), and evaluation efforts. This session also provides an introduction to cost management. Some methods discussed include Total Ownership Cost (TOC), Cost As an Independent Variable (CAIV), Target Costing, and Activity Based Costing (ABC).


Earned Value Management and Cost Management  
CEBoK® Training Track (CEB15)
James Freeman, CCEA®
This session will provide an introduction to the basic concepts of earned value management (CEBoK® Modules 15 & 16), with a focus on implementation, governance, and practical application in support of a project or program. Specific topics will include basic EVM components and data elements, as well as standard earned value analysis techniques. We will use practice problems throughout the presentation to demonstrate and reinforce the basic principles of EVM.
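
The basic components and variance formulas referenced above fit in a few lines; the status-period values below are notional, and BAC/CPI is only one common form of the estimate at completion.

    # Standard earned value components for a notional status period ($K).
    PV, EV, AC, BAC = 500, 450, 480, 2000

    CV, SV = EV - AC, EV - PV    # cost and schedule variance
    CPI, SPI = EV / AC, EV / PV  # performance indices
    EAC = BAC / CPI              # one common estimate at completion

    print(f"CV={CV}, SV={SV}, CPI={CPI:.2f}, SPI={SPI:.2f}, EAC={EAC:.0f}")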


Introduction to Design to Cost 
CEBoK® Training Track (CEB17)
Dan Kennedy
This session provides an introduction to Design to Cost (DTC), including an historical overview of DTC and other related concepts such as CAIV and Should-cost. It covers what DTC is, when it is best applied, how to implement it, and why it is important. We delve into the DTC process, focusing on several key steps in the process. Various examples of DTC in practice are discussed, as well as challenges and best practices related to DTC.