Analytical Methods Track
Developing Cost Requirements for Small Nuclear Plants
Analytical Methods (ANM01)
Dr. Amritpal Agar
Small Modular Reactor (SMR) technology is approaching commercial deployment, with major investment decisions anticipated. Both the U.S. and the U.K. have committed to targets: the U.K. aims for 24 GW of nuclear capacity by 2050, and the DOE supports SMR initiatives such as the $3 billion Advanced Reactor Demonstration Program. Globally, nuclear investment aims to triple installed capacity to meet escalating energy needs. Historically, nuclear projects in the West have suffered significant budget overruns, attributed to uncertainties in design, construction, and labor costs. At the front end of programmes, it is important to capture and quantify key cost requirements for successful commercial product development. In this exploratory research, I share a novel approach to eliciting consistent and quantifiable cost requirements from technical experts for High Temperature Gas-cooled Reactors (HTGRs). Applying the Analytic Hierarchy Process (AHP) method at an early project stage can support clearer decision-making and provide greater confidence in design-for-cost.
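A minimal sketch of the AHP weighting step referenced in this abstract. The criteria and Saaty-scale judgments below are hypothetical, and the row geometric-mean approximation is used in place of the full principal-eigenvector calculation:

```python
def ahp_weights(matrix):
    """Priority weights via row geometric means (an approximation to the
    principal eigenvector of the pairwise comparison matrix)."""
    n = len(matrix)
    geo_means = []
    for row in matrix:
        product = 1.0
        for value in row:
            product *= value
        geo_means.append(product ** (1.0 / n))
    total = sum(geo_means)
    return [g / total for g in geo_means]

# Hypothetical Saaty-scale judgments over three cost requirements:
# capital cost, operating cost, schedule risk.
# judgments[i][j] = relative importance of criterion i over criterion j.
judgments = [
    [1.0,     3.0, 5.0],
    [1 / 3.0, 1.0, 2.0],
    [1 / 5.0, 0.5, 1.0],
]
weights = ahp_weights(judgments)
```

The weights sum to one and preserve the elicited ordering, giving a quantifiable ranking of cost requirements that can be compared across experts.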
Keywords: Budgeting, Decision Analysis, Early Cost, Infrastructure, Methods, Uncertainty, Variables, Nuclear, SMR.
ANM01 – Agar – Developing Cost Requirements for Small Nuclear Plants – ppt
Discount Rate Impacts on Private and Public Sector Capital Investment Valuation
Analytical Methods (ANM02)
George Bayer
The public and private sectors have different valuation techniques, motivations, and considerations when evaluating capital budgeting decisions. Understanding the difference between private and public sector capital investment analysis – discounted cash flow, cost/benefit considerations, nominal/real discounting, stakeholders, and cost of capital – helps decision-makers make better informed decisions. In 2023, the OMB updated its Circular A-94 with new nominal and real discount rates, significantly reducing the real discount rate for cost-benefit analyses. This could have unintended consequences in project valuation when including private sector stakeholders who traditionally have a higher cost of capital. In this paper, we explore the major differences in valuation, stakeholder motivations, and discount rates in public and private sector investment decisions and the impact of inflation. Understanding value drivers in both sectors is important for the cost estimating community and makes us more effective Finance professionals.
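A minimal sketch of the nominal-versus-real discounting distinction discussed above (the rates here are hypothetical, not the actual Circular A-94 values). Under the Fisher relation, discounting then-year (inflated) cash flows at the nominal rate gives the same present value as discounting constant-dollar cash flows at the real rate:

```python
def present_value(cashflows, rate):
    """PV of end-of-year cash flows at a constant discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

real_rate = 0.02          # hypothetical real discount rate
inflation = 0.025         # hypothetical inflation assumption
nominal_rate = (1 + real_rate) * (1 + inflation) - 1   # Fisher relation

constant_dollar = [100.0] * 5                            # real cash flows
then_year = [100.0 * (1 + inflation) ** t for t in range(1, 6)]

pv_real = present_value(constant_dollar, real_rate)
pv_nominal = present_value(then_year, nominal_rate)
```

Mixing the bases (real cash flows at a nominal rate, or vice versa) is a common source of valuation error when public and private stakeholders compare business cases.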
Keywords: Cost/Benefit Analysis, Data-Driven, Decision Analysis, Life Cycle, Variables, Valuation, Business Case, Economics, Discounted Cash Flow, Discount Rates
ANM02 – Bayer – Discount Rate Impacts on Private and Public – ppt
ANM02 – Bayer – Discount Rate Impacts on Private and Public – paper
National Security Space Launch Cost: Recent Trends and Estimating Approaches
Analytical Methods (ANM04)
Éder Sousa
Lisa Pelled Colabella
In 2019 the Evolved Expendable Launch Vehicle (EELV) program became the National Security Space Launch (NSSL) Program, and with that new name came many other changes in space launch providers, technology, and costs. In this briefing we provide a brief history of the NSSL and discuss recent trends, including new technology developments, providers, and cost patterns. We also discuss existing options for estimating launch costs, including available data sources, rules of thumb, and cost estimating relationships. We conclude by suggesting potential future directions to strengthen launch cost estimating capabilities.
Keywords: Methods, Space
ANM04 – Sousa – National Security Space Launch Cost – ppt
Uncovering the Hidden Triangles of Uncertainty Analysis
Analytical Methods (ANM05)
Julia Peters
Gabriella Magasic
Cost estimators require a deep understanding of risk and uncertainty to produce reliable, quality cost models. Since analysts come from diverse academic fields, they may not possess the knowledge at the intersection of statistics and linear algebra that underpins uncertainty analysis. The objective of this paper is to demonstrate this intersection using the intuitive language of geometry: we reduce the abstract problem of determining cumulative uncertainty to the more tangible problem of solving a triangle. The authors leverage the Pythagorean theorem and the Law of Cosines to geometrically explain the summations of independent and linearly correlated standard deviations, respectively. By adopting this theoretical framework for uncertainty analysis, estimators can establish a foundation from which to develop an understanding of other advanced analytical methods at the frontier of cost estimation.
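The geometric view above reduces to one formula: for two cost elements with standard deviations a and b and correlation rho, the standard deviation of the sum is sqrt(a^2 + b^2 + 2*rho*a*b), which matches the Law of Cosines with cos(C) = -rho and collapses to the Pythagorean theorem when rho = 0. A minimal sketch:

```python
import math

def combined_sigma(a, b, rho=0.0):
    """Standard deviation of the sum of two elements with correlation rho.
    rho = 0 recovers the Pythagorean (root-sum-square) case; nonzero rho
    matches the Law of Cosines with cos(C) = -rho."""
    return math.sqrt(a ** 2 + b ** 2 + 2.0 * rho * a * b)

independent = combined_sigma(3.0, 4.0)            # the 3-4-5 right triangle
fully_correlated = combined_sigma(3.0, 4.0, 1.0)  # sigmas add linearly
```

Independent elements combine like the legs of a right triangle (5.0 here), while full correlation degenerates the triangle into a line segment and the sigmas simply add (7.0).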
Keywords: Communication, Methods, Risk, Statistics, Uncertainty
ANM05 – Peters – Hidden Triangles of Uncertainty Analysis – ppt
ANM05 – Peters – Hidden Triangles of Uncertainty Analysis – paper
Decisions in Motion: Visualizing Sensitivity with Monte Carlo
Analytical Methods (ANM07)
Gustavo Vinueza
This paper introduces a practical approach to sensitivity analysis for effective decision-making. By combining Monte Carlo simulation with scenario analysis, decision-makers can highlight the changes that most affect financial metrics. A complete set of visualizations supports this process. The approach is demonstrated through two use cases: an economic project assessment and a budget model. This applied alternative generates easy-to-interpret outcomes and supports clear, actionable decisions. The visualization component also permits assessing the probability of achieving target outcomes and aids in evaluating trade-offs and deviations, enabling more informed choices. The proposed methodology emphasizes simplicity and flexibility, providing organizations with a structured way to apply sensitivity analysis and simulation, and offering practitioners a data-driven framework that helps build manageable insights.
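A minimal sketch of the kind of analysis described above: assessing the probability of hitting a target margin via Monte Carlo simulation. The triangular input ranges are hypothetical, not the paper's actual use-case data:

```python
import random

def prob_margin_at_least(target, trials=20_000, seed=1):
    """Probability that simulated margin (revenue - cost) meets a target,
    with hypothetical triangular distributions on the inputs."""
    random.seed(seed)
    hits = 0
    for _ in range(trials):
        revenue = random.triangular(80.0, 120.0, 100.0)  # low, high, mode
        cost = random.triangular(60.0, 100.0, 75.0)
        if revenue - cost >= target:
            hits += 1
    return hits / trials

p = prob_margin_at_least(20.0)
```

Re-running the simulation across scenarios (e.g., shifting the cost mode) and plotting the resulting probabilities is one simple way to visualize sensitivity for decision-makers.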
Keywords: Data-Driven; Decision Analysis; Methods; Modeling; Monte Carlo; Software; Sensitivity Analysis; What if Scenarios
ANM07 – Vinueza – Decisions in Motion Monte Carlo – ppt
Green IT Solutions for Sustainable Operations
2025 Best Paper Winner: Analytical Methods Category
Analytical Methods (ANM08)
Vivian Tang
Shane Moxley
Arlene Minkiewicz
As the urgency for environmental sustainability intensifies, this study explores how Green IT solutions can mitigate carbon emissions while driving economic benefits. It highlights the significance of understanding carbon footprint and pricing in enhancing operational efficiency and market competitiveness. Key drivers of energy costs in data centers—including facility, hardware, and software expenses—are examined, alongside the impact of geographical location and renewable energy sources on carbon footprints. The discussion outlines four actionable strategies for reducing operational costs and energy consumption: optimizing cooling systems, enhancing hardware configurations, improving software setups, and implementing advanced algorithmic techniques. Additionally, the study introduces a prototype carbon calculator, designed to estimate the carbon cost of cloud services, aligning sustainability with financial incentives. This approach demonstrates how organizations can foster greener IT practices while enhancing their bottom line.
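A minimal sketch of the prototype carbon-calculator idea, assuming a simple model of energy use times grid carbon intensity times a carbon price. All figures are hypothetical, not the study's actual factors:

```python
def cloud_carbon(kwh, kg_co2_per_kwh, price_per_tonne):
    """Estimated emissions (kg CO2e) and carbon cost for a cloud workload."""
    kg = kwh * kg_co2_per_kwh
    return kg, kg / 1000.0 * price_per_tonne

# Hypothetical workload: 10,000 kWh/month, two regions, $50/tonne carbon price.
coal_region = cloud_carbon(10_000, 0.70, 50.0)   # carbon-intensive grid
hydro_region = cloud_carbon(10_000, 0.03, 50.0)  # renewable-heavy grid
```

Even this toy model shows how geographical location dominates the carbon cost of an otherwise identical workload, aligning the sustainability case with a financial one.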
Keywords: Cost/Benefit Analysis, Data Collection, Methods, Modeling, Regression, Carbon Study
ANM08 – Tang – Green IT Solutions for Sustainable Operations – ppt
ANM08 – Tang – Green IT Solutions for Sustainable Operations – paper
Re-Tooling the Estimator’s Approach to Escalation Forecasting
Analytical Methods (ANM09)
Sean Wells
Maya Bell
Jake Cronin
Today’s DoD contracting space is flush with economic price adjustment clauses, supply chain shocks, and half-century O&S plans. This environment necessitates that estimators treat escalation forecasts with more scrutiny than ever before. However, the toolbox for reviewing the volatility in escalation forecasts is severely limited. This paper builds off prior research and takes a two-pronged approach to enhancing estimators’ ability to forecast escalation volatility, producing more impactful insights for decision makers. First, our team has developed an ARIMA model utilizing available BLS data that forecasts escalation indices seasonally, enabling identification of commodity-level trends spanning numerous periods. Second, we compare this model to industry escalation forecasts to highlight where forecasts are historically tumultuous. This modeling advances the tool suite and offers repeatable steps for estimators to develop risk guidance and tailored approaches to escalation forecasts, better informing the estimate products that rely on them.
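As a simplified stand-in for the ARIMA modeling described above, the sketch below fits a first-order autoregression to hypothetical annual escalation rates by least squares and iterates it forward. The authors' actual model, its seasonality, and the BLS data are not reproduced here:

```python
def fit_ar1(series):
    """Least-squares fit of x[t] = c + phi * x[t-1], the AR(1) special case
    of the ARIMA family."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = (sum((a - mx) * (b - my) for a, b in zip(x, y))
           / sum((a - mx) ** 2 for a in x))
    return my - phi * mx, phi

def forecast(series, steps, c, phi):
    """Iterate the fitted recurrence forward from the last observation."""
    out, last = [], series[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out

# Hypothetical annual escalation rates (percent) for one commodity index.
rates = [2.1, 2.4, 3.0, 4.1, 3.5, 2.9, 2.6, 2.8, 3.2, 3.6]
c, phi = fit_ar1(rates)
fcst = forecast(rates, 3, c, phi)
```

Comparing such model-based forecasts against published industry forecasts, period by period, is the kind of volatility review the paper advocates.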
Keywords: Data-Driven, Methods, Modeling, Risk, Uncertainty, Escalation
ANM09 – Wells – Retooling Estimators Approach to Escalation – ppt
ANM09 – Wells – Retooling Estimators Approach to Escalation – paper
Communication & Visualization Track
Chasm of Archetype: Leveraging Personality Diversity in Cost Estimation
Communication & Visualization (CCV03)
Tyler Duran
Stephen Koellner
Cost estimation is a field requiring mathematical and technical expertise, typically attracting professionals from Science, Technology, Engineering, and Mathematics (STEM) backgrounds. Individuals in these disciplines often display introverted personality traits, which can lead to a limited range of interpersonal dynamics within cost teams. However, collaboration, communication, and other interpersonal skills—often associated with extroverted personalities—are critical to team success and broader project goals. This presentation will explore strategies to harness the strengths of both introverts and extroverts, address challenges related to varying communication styles, and offer insights for managing well-rounded teams where diverse personality traits complement each other.
Keywords: Communication, Program Management, Soft Skills, Extrovert, Introvert, Team Management, Interpersonal skills
CCV03 – Duran – Leveraging Personality Diversity in Cost Estimation – ppt
Cost Estimation: A Psychological Framework for Mitigating Bias
Communication & Visualization (CCV04)
Patrick Malone
Christina Snyder
Benjamin Snyder
Developing long-lasting cost estimates early in the acquisition life cycle is difficult due to many uncertainties and distortions, cognitive biases, soft architectures, risks, and budget struggles. Common tendencies toward quick answers may overlook critical elements that could drive cost and schedule estimates beyond approved budgets. We discuss the psychology behind cost bias and provide a framework for addressing mitigations using novel methods. The result is bias mitigation that provides realistic and reliable cost inputs for better decision-making and program outcomes. We look at root causes of bias in estimates, such as customer pressures, shrinking industrial bases, skilled resource availability, and competitive pressures. We propose novel methods such as Weight of Advice and the Judge-Advisor System to recognize and implement bias mitigation. Two examples demonstrate how unbiased, high-quality, and defendable cost estimates can be developed. Future work includes unconventional biases and Bayesian models.
Keywords: Bayesian, Communication, Data-Driven, Decision Analysis, Government, Cognitive Bias, Psychology of Bias
CCV04 – Malone – Psychological Framework for Mitigating Bias – paper
CCV04 – Malone – Psychological Framework for Mitigating Bias – ppt
Psychology of Cost Estimating
Communication & Visualization (CCV05)
Ryan Porter
Data analysis is a critical part of cost estimating, but how do our brains impact the way we utilize, analyze, present, and interpret data? This briefing will dive into some of the enlightening conclusions psychologists have arrived at regarding human brains and how they handle certain scenarios that are prevalent in cost estimating. The psychology concepts include human judgement vs. data-based predictions, system 1 vs. system 2 thinking, understanding compounding, anchoring effect, halo effect, narrative fallacy, and interpersonal communication. This presentation will include some real-life examples gleaned from Nobel Prize winners and Harvard Business School professors, as well as real-life cost estimating scenarios from two ACAT I programs in AFLCMC.
CCV05 – Porter – Psychology of Cost Estimating – ppt
CCV05 – Porter – Psychology of Cost Estimating – Paper
Recruiting and Development of Cost Analysts and Consultants: A Changing Approach
Communication & Visualization (CCV06)
Del Roberts
Robert Smale
Recruiting experienced cost analysts and consultants is challenging due to their specialised knowledge and high remuneration demands. The evolving landscape of costing now requires comprehensive through-life cost management and enhanced consultancy and stakeholder management skills. To address this, Sirius Analysis adopts a unique strategy by broadening its talent pool. Alongside experienced cost analysts, we recruit individuals with backgrounds in mathematics, economics, and accountancy, leveraging their transferable data and analytical skills. Our approach includes mentoring junior staff and developing their cost engineering and analysis expertise through the ICEAA CEBoK and CCEA certification programs. Additionally, we enhance consultancy and stakeholder management skills through project management experience and people-skills development, utilising Motivational Maps and Myers-Briggs personality type assessments to foster self-awareness and optimise teamwork with clients and customers.
Keywords: Cost Management, Project Controls, Career development, people skills
CCV06 – Roberts – Recruiting and Development of Cost Analysts – ppt
Question the Requirement: Using IGCEs to Reduce Waste
Communication & Visualization (CCV07)
Ryan Webster
Wendy Cassidy
Independent Government Cost Estimates (IGCEs) serve a valuable purpose, showing the expected price of reasonable vendor bids. This ensures the government pays a fair price based on the written requirement. However, these written requirements often operate on momentum and skip a valuable validation step. Software license renewals, subscriptions, professional service hours, and warranties are just a few examples of items often bought in bulk with no process to monitor utilization. This can result in unintentional purchases ahead of need and the re-purchase of services that go unused. Augur will walk through specific observations, discuss how the IGCE process can be used to force re-validation of requirements, and suggest contract structures to more effectively control costs. Additionally, real-life examples of visualizing utilization rates and warranty break-even analysis will be shown, which have led to modified program strategies and the inclusion of valuable contract language.
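A minimal sketch of the warranty break-even logic mentioned above, with hypothetical figures: a per-unit warranty only pays for itself if the observed failure rate exceeds the ratio of warranty cost to average repair cost:

```python
def breakeven_failure_rate(warranty_cost_per_unit, repair_cost_per_failure):
    """Failure rate above which buying the warranty beats paying for repairs."""
    return warranty_cost_per_unit / repair_cost_per_failure

def warranty_worth_buying(observed_failure_rate, warranty_cost_per_unit,
                          repair_cost_per_failure):
    return observed_failure_rate > breakeven_failure_rate(
        warranty_cost_per_unit, repair_cost_per_failure)

# Hypothetical: $150/unit warranty vs. $1,000 average repair cost,
# with an 8% observed annual failure rate.
rate = breakeven_failure_rate(150.0, 1000.0)
buy = warranty_worth_buying(0.08, 150.0, 1000.0)
```

In this hypothetical case the break-even rate is 15%, so at an observed 8% failure rate the bulk warranty re-purchase fails the validation test the abstract describes.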
Keywords: Cost Management, Cost/Benefit Analysis, Data Collection, Data-Driven, Government, Methods, Performance Management, Regression, IGCE, Negotiations, Contracts, Waste
CCV07 – Webster – Question the Requirement – ppt
Justice for the Numbers: A Trial of Estimation Blunders
Communication & Visualization (CCV08)
Nikol Podlacha
James Monopoli
Get ready for the trial of the century — where numbers aren’t just crunched, they’re grilled! A cost analyst is in the hot seat, accused of wreaking havoc with a poor cost estimate, and it’s up to YOU to follow the evidence. Misused obligation vs. expenditure profiles? Full funding rules ignored? JA CSRUH violations? This courtroom is packed with shocking revelations and expert testimonies, with each twist in the trial exposing another costly mistake. You’ll see how SEPM factor miscalculations, mishandling CERs, and ignoring best practices can cause disaster if left unchecked. Every turn reveals practical insights on avoiding estimating pitfalls, and how to master the art of clear, accurate cost estimating. You’ll leave equipped with essential takeaways to strengthen your cost analysis skill set and ensure you’re never caught off guard in front of your audience — or worse, on trial yourself.
Data Science Track
Use of VBA and Python in Analytical Tools
Data Science (DSC01)
David Ferland
Ashlen Grote
Michael Giannotti
Data analytics often requires automation and repeatability to be effective, and two popular coding languages that accomplish this are Python and Visual Basic for Applications (VBA). These languages offer varying levels of processing power, user-interface generation, data visualization, and learning curve, and each has its own drawbacks. Using a side-by-side comparison, we step through tutorials and use cases for both languages in basic regression generation and analytical processing. While both languages have an established history in the field of data analytics, we also propose a lesser-known technique of combining the capabilities of both into a synergistic method of tool generation. For data analysts or cost engineers who are not familiar with one of the three methods proposed (either language alone, or the two used together), we provide the basic building blocks to help you plan your next project and examples to get you on the right track.
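A minimal sketch of the basic regression generation discussed above, in pure Python (the session's actual tooling spans Python and VBA; this least-squares fit and its data are illustrative only):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = intercept + slope * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

# Hypothetical data: production lot number vs. unit cost ($K).
lots = [1.0, 2.0, 3.0, 4.0]
costs = [3.0, 5.0, 7.0, 9.0]
intercept, slope = fit_line(lots, costs)
```

The same few lines can be wrapped as a function called from VBA (or the results written back to Excel), which is the kind of cross-language tool generation the presentation proposes.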
Keywords: Methods, Microsoft Excel, VBA, Python, Analytics
DSC01 – Ferland – VBA and Python in Analytical Tools – ppt
AI and Cost Estimation: Data Science’s Expanding Role in Cost Estimating
Data Science (DSC02)
Daniel Harper
Kevin McKeel
AI and ML tools such as ChatGPT will take on an expanded presence in cost analysis over the next five years and beyond. With terms like Natural Language Processing and Large Language Models being bandied about like pickleballs, getting our heads around this technology can be overwhelming. The goal of this presentation is to give you, the professional estimator, a primer on AI and hopefully raise your comfort level with how AI should (and should not!) be used in our world. Our presentation will also touch on AI and data visualization; the economic and environmental impact of AI (chip manufacturing, data centers); tips on AI prompting; and using AI securely.
Keywords: Artificial Intelligence, Data Science, Machine Learning, NLP
DSC02 – Harper – AI and Cost Estimation – ppt
Leverage Business Intelligence Tools for Creative Data Exploration
Data Science (DSC03)
Kyle Lowder
Ryan Lowenstein
Business Intelligence software is more than just dashboards! Leverage tools like Microsoft Power BI to take your engineering data exploration to new heights. With flexible data manipulation, robust relational architectures, powerful multi-context formulas, and easy-to-adjust interactive reports, these tools can expand your data analysis and maximize your efficiency. This seminar explains how to harness this capability quickly and repeatably. Intuitively straighten out data from messy source formats. Manipulate data with DAX formulas that far exceed spreadsheet capabilities. Quickly develop robust relationships between disparate data sources. Explore results with the library of engaging, interactive visuals (including Python plots!), and share your results as a professional, polished report. Users who complete this seminar will understand how to harness Power BI for their day-to-day efforts, including a set of further resources suited to all skill levels.
Keywords: None
DSC03 – Lowder – Leverage Business Intelligence Tools – ppt
Missing in Space: How Imputation Fills the Satellite Data Void
2025 Best Paper Winner: Communication, Data Science, and Machine Learning Category
Data Science (DSC04)
Jonathan Matkin
Daniel Newkirk
Data analysts must often deal with potentially valuable but sparsely populated datasets. Missing fields can limit the utility of a dataset for conventional regression analysis and predictive methodology development. One way to overcome this challenge is to impute the missing fields, but this approach can face concerns over the fidelity of derived data points and resulting methodologies. No prior study of space system data has extensively compared multiple imputation techniques using varying percentages of missing data and quantified their performance against known truth values. In order to assess the potential of using imputation techniques, the Space Systems Command cost research team has conducted a rigorous case study in applying multiple imputation methods to produce a single “complete” dataset of technical parameters such as weight, power, design life, etc. for a large open-source dataset of ~1900 satellites. This paper highlights the major areas of research that were undertaken, including: various imputation techniques, quantified imputation performance, limitations of multiple imputation, and lessons learned. Additionally, a real-world application is described in which an imputed dataset is used to assess the reasonableness of technical input parameters for early life cycle cost estimating for satellites.
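A minimal sketch of the validation idea described above: mask known values, impute them, and quantify error against truth. Simple mean imputation stands in for the multiple techniques the paper compares, and all data are hypothetical:

```python
import math
import statistics

def mean_impute(values):
    """Fill None entries with the mean of the observed entries."""
    m = statistics.fmean(v for v in values if v is not None)
    return [m if v is None else v for v in values]

truth = [120.0, 250.0, 310.0, 95.0, 480.0, 200.0]  # hypothetical masses (kg)
observed = [120.0, None, 310.0, 95.0, None, 200.0] # 2 of 6 values masked

imputed = mean_impute(observed)

# Quantify imputation performance on the masked cells only.
masked = [(t, i) for t, i, o in zip(truth, imputed, observed) if o is None]
rmse = math.sqrt(statistics.fmean((t - i) ** 2 for t, i in masked))
```

Repeating this masking experiment at varying missing-data percentages, and across imputation techniques, yields the truth-based performance comparison the study performs at scale.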
Keywords: Data Collection, Modeling, Imputation, Machine Learning
DSC04 – Matkin – Imputation Satellite Data Void – ppt
DSC04 – Matkin – Imputation Satellite Data Void – paper
Implementing Low Code Solutions for Financial Surveillance and Reporting
Data Science (DSC05)
Ryan Nicholos
Connor Maloney
Sebastian Rodriguez Traconis
The adoption of low-code platforms has revolutionized financial surveillance by enhancing the integration and analysis of data across enterprise systems. Traditional methods for connecting disparate financial data sources often require time, technical expertise, and manual intervention, leading to inefficiencies and operational risks. Low-code technology addresses these challenges by enabling faster development of custom applications and workflows with minimal hand-coding. This allows institutions to automate data collection and reporting, seamlessly bridging multiple enterprise data sources such as ERP systems and siloed databases. The intuitive interfaces and pre-built modules empower stakeholders to contribute to processing automation, reducing the reliance on IT teams and enabling rapid iteration to address client needs. In support of USN, our team set out to implement low-code techniques to streamline data validation, anomaly detection, and regulatory compliance by creating dashboards between two enterprise data sources.
Keywords: Budgeting, Cost Management, Cost/Benefit Analysis, Data Collection, Data-Driven, Low Code
DSC05 – Nicholos – Low-Code Platform Implementation – paper
DSC05 – Nicholos – Low-Code Platform Implementation – ppt
Dr. Strangedata, Or: How I Learned to Stop Worrying and Love O&S Data Validation
Data Science (DSC07)
Patrick Shelton
Daniel Puentes
Alexis Lewandowski
NNSA leadership, the GAO, and Congress are increasingly focused on estimates that capture the total lifecycle cost of capital projects, rather than just the acquisition phase. In response, NNSA has undertaken an effort to improve data-driven methods to estimate the Operations & Support (O&S) portion of lifecycle cost. This effort was derailed when exploratory data analysis on NNSA’s O&S database of record turned up a litany of inconsistencies, errors, and insufficiencies that precluded method development. This was a result of weak policy and nonexistent validation for how the NNSA enterprise reports data. This paper documents the challenges that had to be addressed and details the extensive data normalization and validation process conducted, before presenting the defensible O&S cost estimating relationships that were the original focus of the mandate from oversight. Lastly, this paper provides recommendations for organizations interested in making their O&S databases authoritative and functional for the purposes of cost analysis.
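A minimal sketch of the kind of record-level validation rules such an effort might apply. The field names, value ranges, and cost-element list here are hypothetical, not NNSA's actual reporting schema:

```python
VALID_ELEMENTS = {"maintenance", "staffing", "utilities"}

def validate_record(rec):
    """Return a list of rule violations for one hypothetical O&S cost record."""
    issues = []
    fy = rec.get("fiscal_year")
    if fy is None or not 1990 <= fy <= 2100:
        issues.append("fiscal year missing or out of range")
    cost = rec.get("cost")
    if cost is None or cost < 0:
        issues.append("cost missing or negative")
    if rec.get("element") not in VALID_ELEMENTS:
        issues.append("unrecognized cost element")
    return issues

clean = {"fiscal_year": 2023, "cost": 1.2, "element": "staffing"}
dirty = {"fiscal_year": 23, "cost": -5.0, "element": "misc"}
```

Running such checks at data entry, rather than during exploratory analysis years later, is the policy gap the paper's recommendations address.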
Keywords: Budgeting, Data Collection, Data-Driven, Early Cost, Manufacturing, Methods, Modeling, Operations, Parametrics
DSC07 – Shelton – Dr. Strangedata – paper
Management, EVM & Scheduling Track
Show Me the (Schedule) Metrics
Management, EVM & Scheduling (MES01)
Wendy Cassidy
Michelle Chau
With urgent needs to deliver capabilities to the fleet, program managers must focus on maintaining schedule, delivery dates, and the metrics to track program progress. Analysts can find themselves in a difficult position when vendors have a low or zero likelihood of meeting their baseline and program managers are resistant to accepting schedule slips. This topic discusses how vendors can be held accountable when schedule baselines are unrealistic, how to generate and manage to more realistic projections, and effective methods for communicating with program managers when they say “show me the metrics” and the answer is unfavorable. Specific topics to be explored include critical path and driving path analysis, critical path length index (CPLI), vendor current execution index (CEI), earned schedule metrics, and projecting more accurate vendor schedules through schedule risk analysis.
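A minimal sketch of two of the metrics named above, as commonly defined (exact definitions vary by organization; these formulas are assumptions, not necessarily the presenters'):

```python
def cpli(critical_path_length, total_float):
    """Critical Path Length Index: (CPL + total float) / CPL.
    Values below 1.0 indicate the baseline finish date is at risk."""
    return (critical_path_length + total_float) / critical_path_length

def cei(tasks_completed, tasks_forecast):
    """Current Execution Index: tasks actually completed in the period
    divided by tasks forecast to complete in that period."""
    return tasks_completed / tasks_forecast

# Hypothetical: 200 working days of critical path, 20 days of negative float;
# vendor completed 45 of 60 tasks planned this period.
health = cpli(200, -20)
execution = cei(45, 60)
```

A CPLI of 0.9 and a CEI of 0.75 are the sort of concrete, unfavorable numbers an analyst can put in front of a program manager who asks to see the metrics.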
Keywords: Performance Management; Program Management; Scheduling; Critical Path; Schedule Baseline Management; Communication
MES01 – Cassidy – Show Me The Schedule Metrics – ppt
Annie Oakleying Your Risk Cube: Management Goes Better with Analysis
Management, EVM & Scheduling (MES02)
Peter J. Braxton
David H. Brown
Robert G. Fatzinger
Sean Wells
Despite advances in both Risk Management and Risk Analysis over the past two decades, there remains a persistent divide between the two disciplines. Risk Management, with its predisposition to discrete risks (“what could go wrong?”) and central construct of the Risk Cube, tends to have a myopic focus on Issues and struggles to accurately assess and portray Likelihood and Consequence. Risk Analysis, with its predisposition to continuous risks (“what is the range of possible outcomes?”) and central construct of the S-Curve, brings the mathematical tools needed for fidelity in risk calculations but struggles to put a recognizable face on these necessary abstractions. This paper seeks to unify these two fields by drawing the best from each. It identifies fundamental deficiencies in the Risk Cube, such as lack of basis, distortion of linear scales, misleading color codes, and junk math, and remedies them with principles from Risk Analysis to better inform decision-makers.
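A minimal illustration of the "junk math" critique above: two hypothetical risks land on the same cell score of a naive 5x5 cube even though their expected dollar consequences differ by an order of magnitude:

```python
def cube_score(likelihood_level, consequence_level):
    """Naive 5x5 risk-cube score: ordinal levels multiplied together."""
    return likelihood_level * consequence_level

def expected_loss(probability, cost):
    """Expected consequence in the same units as cost (here, $M)."""
    return probability * cost

# Hypothetical risks: rare-but-severe vs. likely-but-minor.
risk_a = (cube_score(2, 4), expected_loss(0.15, 40.0))
risk_b = (cube_score(4, 2), expected_loss(0.65, 1.5))
```

Both risks score 8 on the cube, yet risk A carries several times the expected loss of risk B, which is exactly the distortion of ordinal scales and multiplied rankings that the S-curve view avoids.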
Keywords: Communication, IPM, Methods, Modeling, Risk, Risk Management, Risk Analysis
MES02 – Braxton – Annie Oakley Risk-Management – paper
MES02 – Braxton – Annie Oakley Risk-Management – ppt
The Best of Both Worlds: Combining Cost Risk Analysis with EVM
Management, EVM & Scheduling (MES03)
Murray Cantor
Christian Smart
Cost risk analysis and earned value data are typically used separately and independently to develop Estimates at Completion (EACs). However, there is significant value in combining the two to improve the accuracy of EAC forecasting. In earned value management (EVM), the EAC is a critical metric: it forecasts the effort’s total cost as work progresses and, in particular, shows whether the work is running over or under its planned budget, specified as the Budget at Completion (BAC). This paper explains how to specify the initial probability density function (PDF) and learn the later PDFs from the data tracked in EVM. We describe a technique called Bayesian parameter learning (BPL), chosen because it is the most robust for exploiting small sets of progress data and is most easily used by practitioners.
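A minimal sketch in the spirit of the paper's approach, assuming a conjugate normal model for CPI with known observation variance (the paper's actual BPL formulation may differ): a prior belief about cost efficiency is updated with observed monthly CPI values, and the learned mean feeds a standard CPI-based EAC:

```python
def posterior_normal(prior_mean, prior_var, observations, obs_var):
    """Conjugate Bayesian update of a normal mean with known obs. variance."""
    n = len(observations)
    obs_mean = sum(observations) / n
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    post_mean = post_var * (prior_mean / prior_var + n * obs_mean / obs_var)
    return post_mean, post_var

bac = 100.0                        # budget at completion ($M), hypothetical
prior_mean, prior_var = 1.0, 0.02  # initial belief about CPI
monthly_cpi = [0.95, 0.92, 0.94]   # observed from EVM reports

post_mean, post_var = posterior_normal(prior_mean, prior_var,
                                       monthly_cpi, obs_var=0.01)
eac = bac / post_mean              # CPI-based EAC using the learned CPI
```

The posterior variance shrinks as progress data accumulates, which is what makes the technique robust even on small sets of EVM observations.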
Keywords: Bayesian, Cost Management, Data-Driven, Methods, Risk, Statistics, Uncertainty
MES03 – Cantor – Best of Both Worlds – paper
MES03 – Cantor – Best of Both Worlds – ppt
Enabling Measurable Success in DoD AI Programs from Acquisition to Operations
2025 Best Paper Winner: Management, EVM, Software & Agile Category
Management, EVM & Scheduling (MES05)
Dave Cook
Kenneth Rhodes
Tim Klawa
In today’s DoD, AI programs are focusing on integrating capability and are failing to invest in the acquisition framework necessary to enable continued success. Considering how contracts specify cost, schedule, and technical performance parameters to ensure end products are directly linked to measurable and manageable criteria is critical for a program that evolves rapidly. AI program managers can optimize resource allocations by building a metric-level risk score and identifying both qualitative and quantitative impacts to cost/performance, which can transfer across contracts as training data is developed, algorithms/models are built and tested, and capabilities are deployed into operations. Grounded in recent experience with large-scale AI programs, this paper identifies gaps associated with the DoD Acquisition Pathway and provides an adaptive framework to measuring risk, as well as quality and performance metrics that reduce risk, enable real-time decision-making, and ultimately result in more successful acquisitions.
Keywords: Data-Driven, Risk, AI, DevSecOps
MES05 – Cook – Enabling Measurable Success in AI Programs – paper
MES05 – Cook – Enabling Measurable Success in AI Programs – ppt
Quantifying Schedule Uncertainty in Acquisition: A Bayesian Network Approach
Management, EVM & Scheduling (MES06)
Joshua Hamilton
Conventional schedule estimating methods, like CPM and PERT, often make simplifying assumptions about task independence and uncertainty that can substantially limit forecast accuracy and precision. Bayesian network (BN) analysis serves as an alternative technique that may overcome weaknesses of current methods by intricately modeling the interdependencies among many tasks and addressing task duration uncertainty. This study applies BN to acquisition program schedules by generating a duration probability distribution for tasks, milestones, and overall program completion, in order to estimate confidence-level predictions for schedule delays that can be updated as new information becomes available. This research finds that BN can help to identify critical tasks that are likely to cause cascading delays and maps at-risk tasks across the network. BN is a powerful, dynamic tool for improving project risk management and offers more accurate delay forecasts than conventional methods.
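A minimal sketch of BN-style schedule reasoning: a three-node network (design slip influences integration slip, and both influence delivery) evaluated by exact enumeration. All probabilities are hypothetical:

```python
from itertools import product

# Hypothetical conditional probability tables: a design slip raises the
# chance of an integration slip, and either slip raises the chance of a
# late delivery.
P_DESIGN_LATE = 0.30
P_INTEG_LATE = {True: 0.70, False: 0.20}          # keyed by design-late
P_DELIVERY_LATE = {(True, True): 0.95, (True, False): 0.60,
                   (False, True): 0.55, (False, False): 0.05}

def prob_delivery_late():
    """Marginal P(delivery late) by exact enumeration of the network."""
    total = 0.0
    for design, integ in product([True, False], repeat=2):
        p_d = P_DESIGN_LATE if design else 1.0 - P_DESIGN_LATE
        p_i = P_INTEG_LATE[design] if integ else 1.0 - P_INTEG_LATE[design]
        total += p_d * p_i * P_DELIVERY_LATE[(design, integ)]
    return total
```

Because the dependencies are explicit, the same tables can be re-queried as evidence arrives (e.g., conditioning on a known design slip), which is the dynamic updating that distinguishes the BN approach from CPM and PERT.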
Keywords: Bayesian, Data-Driven, Scheduling, Uncertainty
MES06 – Hamilton – Quantifying Schedule Uncertainty – ppt
Python Tool Development to Support Alternative EVM Reporting
Management, EVM & Scheduling (MES07)
Daniel Hearn
The cost of full EVM for this Fixed Price Incentive contract was estimated to increase the cost per AUR by nearly 6%, which exceeded what the program was willing to pay. However, cost and schedule analysis was still required to support the program, leading us to explore alternative forms of EVM that did not follow the usual Format 1-7 requirements. One way this is accomplished is by leveraging the programming language Python, which enables us to work with data sets that exceed Excel’s limitations and to automate analysis and even deliverables. This presentation will cover the history of the program and the evolution of a Python tool that began as a simple time-saving script and grew into an automated product that updates its forecast with each new data set and generates a PowerPoint brief.
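A minimal sketch of the kind of EVM calculations such a script can automate each reporting period (standard textbook formulas with hypothetical values; the program's actual tool is not reproduced here):

```python
def evm_metrics(bcws, bcwp, acwp, bac):
    """Standard EVM status metrics from cumulative values."""
    cpi = bcwp / acwp            # cost efficiency
    spi = bcwp / bcws            # schedule efficiency
    return {
        "CV": bcwp - acwp,       # cost variance
        "SV": bcwp - bcws,       # schedule variance
        "CPI": cpi,
        "SPI": spi,
        "EAC": bac / cpi,        # CPI-based estimate at completion
    }

# Hypothetical cumulative values ($M).
status = evm_metrics(bcws=50.0, bcwp=45.0, acwp=48.0, bac=200.0)
```

Wrapping this in a loop over each new data delivery, and piping the results into chart and slide generation, is how a small script grows into the automated product the presentation describes.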
Keywords: Data-Driven, DOD/MOD, Government, Modeling, Program Management
MES07 – Hearn – Python Tool Support EVM – ppt
Gotta Go Fast: Modular Open System Approach’s Impact on Schedule
Management, EVM & Scheduling (MES08)
Kayla Vogler
A modular open system approach (MOSA) and its inclusion of open architecture are among the prevailing acquisition strategies for cost and schedule management. This approach incorporates reusable, modular packages that can be incrementally added and upgraded throughout a program’s lifecycle. Few studies examine the extent to which MOSA influences cost and schedule performance, and no studies to date examine open architecture’s impact via programmatic evaluation of Earned Value Management (EVM) metrics. This session will focus on the results of my research, which investigated and compared EVM data for aircraft that do and do not employ open architecture. Overall, the findings support that the presence of open architecture is negatively associated with schedule performance around the halfway point of development contracts. It is theorized that programs adopting open architecture may be overoptimistic when estimating schedule.
Keywords: Data-Driven, EVM, Function Points, Performance Management, Regression, Modular Open System Approach, Open System Architecture
Machine Learning & NLP Track
Employing AI Agents for Cost Estimation
Machine Learning & NLP (MLN02)
Brad Clark
David Rampton
Agile’s flexibility enables rapid software releases and fosters strong collaboration, but it often lacks rigorous preplanning, making cost tracking and budget alignment challenging. Traditional solutions, like leveraging analogous systems or assigning a preplanning team to develop functional requirements, can be inefficient and costly, compromising Agile’s speed. This session introduces a groundbreaking approach using generative AI to assist or automate critical preplanning tasks, such as functional requirement development and function point counting. We’ll discuss how this AI-driven solution preserves Agile’s speed while enhancing cost estimation accuracy, providing a cost-effective way to address preplanning challenges. Real-world examples will illustrate how generative AI delivers precise cost insights without slowing the development cycle. Attendees will learn strategies for integrating generative AI into Agile workflows, streamlining processes, and achieving budget alignment.
Keywords: Agile, Methods, Parametrics, Software, AI
AI-Powered Coding: How LLMs Help Cost Analysts Build Custom Tools
Machine Learning & NLP (MLN03)
Kyle Ferris
Eric J. Hagee
The rapid advancement of artificial intelligence (AI) capabilities, particularly large language models (LLMs), establishes AI-powered coding assistance as a new industry benchmark. Despite their ongoing evolution, these emerging technologies already prove their value in the field of cost estimating, where they enable cost analysts to quickly and efficiently develop custom tools that improve analytical workflows. Through demonstration of an interactive LLM session using ChatGPT, this presentation aims to provide three key takeaways: (1.) How effective prompt engineering can fine-tune LLMs to assist in the development of cost-specific tools, (2.) The iterative nature of user-model interactions, emphasizing how high-quality user inputs lead to progressively improved model outputs, and (3.) How cost analysts with limited programming experience can leverage LLMs for code syntax support, code debugging, and the translation of natural language requirements into script templates, enhancing their confidence in building custom toolsets.
Keywords: Functional Requirements, IT, Methods, Modeling, Process Engineering, Software, AI, Data Science, Data Analytics, Large Language Models, Cost Tools
MLN03 – Ferris – AI Powered Coding LLMs – ppt
A Real Example of How I Used ChatGPT
Machine Learning & NLP (MLN04)
Rivers Jenkins
Have you seen endless presentations on how generative AI could be applied to cost estimating but no real “success stories”? This presentation will walk through a case study where ChatGPT actually helped solve a problem and informed a cost estimate. In this example, a program office asked for help generating a cost estimate for their software bug fix maintenance. Further discussion with the program office revealed 8 years of monthly software bug hours data, and data from the past 1.5 years revealed a festering backlog of 500 outstanding software bugs needing fixes. After identifying the true problem and what data was available, it was time for some ChatGPT. Note that certain distinguishing details of the program and data in this presentation are intentionally obfuscated, but the analytic steps and conclusions remain valid.
Keywords: Data-Driven, Software, Uncertainty, AI, Python
MLN04 – Jenkins – Real Example of How I Used ChatGPT – ppt
Leveraging Synthetic Data for Maximum Predictive Power
Machine Learning & NLP (MLN05)
Obai Kamara
Taylor Fountain
In cost estimation, limited data sets often constrain the effectiveness of analytical techniques, leading to less accurate predictions. This paper explores the application of Generative Adversarial Networks (GANs) for generating synthetic data to enhance estimating practices in scenarios with limited data. By examining the implications of synthetic data generation and the accuracy of Cost Estimating Relationships (CERs) derived from synthetic versus real data, we assess GANs’ potential to improve reliability in cost estimates. This presentation will provide an overview of GANs as a machine learning technique, describe potential applications for cost estimators, identify limitations of the technique, and make recommendations for near-term uses. The findings aim to illustrate the viability of synthetic data as a complement to traditional sources, ultimately contributing to more robust and adaptable cost estimation methodologies.
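To make the real-versus-synthetic CER comparison concrete, here is a minimal sketch in which a simple parametric resampler stands in for a trained GAN; the data and the `ols` helper are invented for illustration.

```python
import random

def ols(xs, ys):
    """Ordinary least squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

random.seed(7)
# "Real" data: cost roughly 2.0 * weight + 5, with noise (illustrative)
weights = [10, 20, 30, 40, 50, 60, 70, 80]
costs = [2.0 * w + 5 + random.gauss(0, 3) for w in weights]
slope_real, icpt_real = ols(weights, costs)

# Stand-in generator: resample from the fitted relationship plus noise.
# In the paper's workflow a trained GAN would play this role.
synth_w = [random.uniform(10, 80) for _ in range(200)]
synth_c = [slope_real * w + icpt_real + random.gauss(0, 3) for w in synth_w]
slope_syn, _ = ols(synth_w, synth_c)

print(f"real CER slope: {slope_real:.2f}, synthetic CER slope: {slope_syn:.2f}")
```

The evaluation question the paper poses is exactly this: how closely does a CER fit on generated data track the CER fit on the scarce real data.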
Keywords: Data Collection, Data-Driven, Methods, Modeling, Monte Carlo, Parametrics, Regression, Machine learning, CERs, data science, Generative Adversarial Networks
MLN05 – Kamara – Synthetic Data for Maximum Predictive Power – ppt
MLN05 – Kamara – Synthetic Data for Maximum Predictive Power – paper
Cost Estimation Guidance for AI Software Development Projects
Machine Learning & NLP (MLN06)
Arlene F. Minkiewicz
The rapid evolution and integration of Artificial Intelligence (AI) within software development demands updated approaches to cost estimation. Traditional cost estimation models, primarily designed for conventional software projects, often fail to address the unique complexities and variables inherent in AI algorithms. This paper provides a comprehensive framework for estimating costs in AI-based software projects, bridging the gap with insights from industry best practices, academic research and lessons learned from recent AI projects. Key cost drivers such as data preparation, algorithm selection, model training, deployment and maintenance are thoroughly analyzed. The framework introduces a multi-phased approach encompassing the many activities in a software development project, tailored specifically to AI projects. Emphasis is placed on the iterative and experimental nature of AI development, often involving extensive experimentation.
Keywords: Data Collection, Software, Artificial Intelligence (AI), Machine Learning (ML), software cost estimation, iterative development
MLN06 – Minkiewicz – Guidance for AI Software Development – ppt
MLN06 – Minkiewicz – Guidance for AI Software Development – paper
Why You Should Not Use Gen-AI
Machine Learning & NLP (MLN07)
Trevor Lax
Advances in Generative Artificial Intelligence (GAI) and Large Language Models (LLMs) have made them ubiquitous features of our lives, with AI-generated summaries even appearing at the top of web searches. Because of the immense quantity of data these models are trained on, nearly all training is unsupervised with occasional, supervised fine-tuning. While LLMs excel at mimicking human speech, the output is essentially a statistical amalgamation of context clues and is truth agnostic, so it is incorrect to speak of hallucinations or lies. Because GAI/LLMs can only be connected to truth through supervised learning, they should not be used in situations where truth matters. Encouragingly, GAI/LLMs perform splendidly on standardized questions involving summarization of material, and there are continual advancements, such as entropy-based uncertainty estimators and Explainable AI (XAI). We explore various promising and ill-advised use cases related to cost analytics while discussing underlying mechanisms and potential advances in GAI/LLMs.
Keywords: Data-Driven, IT, Risk, Statistics, Uncertainty
Modeling, Tools & Case Studies Track
Accelerating Readiness – The Cost Impact of O&S Modernization
Modeling, Tools & Case Studies (MTC01)
Alexander Bonich
Sean Wells
Donovan DeStefano
Ronit Mukherjee
The Department of Defense faces the daunting challenge of improving warfighter capabilities while effectively managing resource allocation. One way to accomplish this is via hardware upgrades during sustainment, incorporating readily available components and new technologies to reduce costs and enhance capability of deployed systems. However, current policy dictates that Operations and Maintenance (O&M) funds cannot be used for hardware upgrades. This paper examines the impacts of a proposed budget reform removing this restriction: first, by analyzing Army depot data to validate the purported benefits of acquisition contractors performing more upgrades; second, by reviewing Army ground vehicle programs utilizing newer, modular designs to understand the impact of sustainment modifications on operational availability. With a growing emphasis on sustainment reviews and O&M costs, this timely study examines whether such a reform would effectively achieve these objectives or inadvertently contribute to depot delays and fleet deterioration.
Keywords: Budgeting, Cost Management, Cost/Benefit Analysis, Life Cycle, Program Management
MTC01 – Bonich – Accelerating Readiness – ppt
MTC01 – Bonich – Accelerating Readiness – paper
Schedule Risk Using TRL Data
2025 Best Paper Overall
Modeling, Tools & Case Studies (MTC03)
Kyle Coughlin
The Aerospace Corporation has developed a statistical model for measuring the probability that a technology will advance from TRL X to TRL Y in a specific amount of time; it can be used to complement current cost-based risk models. The model was originally based on the NASA TechPort database, but it is flexible and can be generated from any data set that contains start/end TRL and start/end dates. The model can be used in acquisitions to measure which projects have the highest probability of succeeding within the project time frame. One can also invert the model’s result to determine the expected time to complete a TRL advancement. Additionally, the model can be adjusted to include natural language data such as descriptions and taxonomy. This adjustment allows a user to input the same natural language data for their projects; the model is then built on a subset of the underlying database that reflects projects similar to the user input. This allows users to consider how different types of projects differ in advancement probability.
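The core of such a model can be illustrated with a toy empirical estimator; the `records` data and `p_advance` function below are hypothetical and bear no relation to the TechPort database or the Aerospace model itself.

```python
# Illustrative records: (start_trl, end_trl, years_taken) - invented data
records = [(3, 4, 1.0), (3, 4, 2.5), (3, 4, 4.0), (3, 5, 3.0),
           (4, 5, 1.5), (4, 5, 2.0), (3, 4, 1.8), (3, 4, 3.2)]

def p_advance(start, end, within_years, data):
    """Empirical probability that a start->end TRL advancement finishes in time t."""
    # Any project that reached `end` or beyond from `start` counts as evidence
    pool = [yrs for s, e, yrs in data if s == start and e >= end]
    if not pool:
        return None
    return sum(yrs <= within_years for yrs in pool) / len(pool)

print(p_advance(3, 4, within_years=2.0, data=records))
```

Inverting the model, as the abstract notes, amounts to asking the opposite question of the same pool: for a target probability, what duration do you need? In this toy version that is just a quantile of the observed durations.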
Keywords: Data-Driven, Decision Analysis, Methods, Schedule Analysis
MTC03 – Coughlin – Statistical Modeling of TRL Levels – Paper
MTC03 – Coughlin – Statistical Modeling of TRL Levels – Ppt
Affordable Solutions – Forging the Missing Link: Projects to Portfolio
Modeling, Tools & Case Studies (MTC04)
Brian Flynn
Ben Bergen
Alan Karickoff
Nobody but nobody wants unaffordable solutions. Yet projects are frequently initiated that can’t be paid for. This leads to cost and schedule growth, program cancellations, or fielding a capability without the resources to sustain it. Affordability analysis, holistically executed, ensures that the cost of a project syncs with long-range modernization, force structure, and manpower plans, given resource constraints. Too often, however, affordability analysis devolves into an assessment of funding needs over a five-year horizon. This myopic perspective short-changes decision makers. Our research presents an innovative alternative, the Technomics Affordability Analysis Tool (TAAT). Integrating life-cycle cost estimates, assessments of risk, and evaluation of resource availability, it offers leadership a menu of options for pricing the current program of record and for adjusting other projects in the portfolio. The bottom line? Sharpened illumination of trade space, better understanding of risk, and stronger alignment of projects with strategy.
Keywords: Budgeting, Decision Analysis, Early Cost, Project Controls, Uncertainty, Cost goals, cost caps, affordability, portfolio impacts, S-curves
MTC04 – Flynn – Affordable Solutions – paper
MTC04 – Flynn – Affordable Solutions – ppt
Forecasting Price Escalation
Modeling, Tools & Case Studies (MTC05)
John C. Fries
For a commercial product currently in development in a private/public partnership, how would we forecast the movement in price once the product is available on the market? We start with the vendor’s view of the likely starting price and its expected availability date, but then what? Would economies of scale for the new product kick in and lower the price, or would surging commercial demand drive the price higher? We present a forecast analysis for the US Army Medical Materiel Development Activity’s development of an enterotoxigenic Escherichia coli (E. coli) vaccine, an effort to create an FDA-approved E. coli vaccine to protect DOD civilian and military personnel – with potential benefits to worldwide civilian populations exposed to E. coli. We show how, with a decade of price data for our selected analogous vaccine – one for typhoid – we derive a price escalation factor used to forecast the price of the E. coli vaccine.
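The escalation-factor derivation can be sketched in a few lines; the prices below are placeholders, not the actual typhoid-vaccine data used in the paper.

```python
# Hypothetical analogous-vaccine prices at the start and end of a decade
price_start, price_end, years = 40.0, 58.0, 10

# Compound annual escalation factor implied by the analogous series
annual_factor = (price_end / price_start) ** (1 / years)

# Apply it to a vendor's expected launch price to forecast a year-5 price
launch_price = 25.0
year5 = launch_price * annual_factor ** 5
print(f"annual escalation: {annual_factor:.4f}, year-5 forecast: ${year5:.2f}")
```

The substantive work in the paper is choosing a defensible analogy and price series; once that is settled, the forecast itself is this compound-growth arithmetic.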
Keywords: Budgeting, Methods, Statistics, Uncertainty
MTC05 – Fries – Forecasting Price Escalation – Ppt
MTC05 – Fries – Forecasting Price Escalation – Paper
Generating Full Monte Carlo S-Curves Without External Software
Modeling, Tools & Case Studies (MTC06)
Kevin Cincotta
Matt Griesbach
Travis Goodwin
MITRE developed the Investment Value Management Framework (IVMF) to automate many commonly performed tasks for cost analysts. It provides a rigorous, systematic process for monetary and non-monetary comparisons of costs and benefits by Course of Action (COA). Automated treatment of Cost Risk and Uncertainty Analysis (CRUA) is central to our approach. We used innovative techniques such as Normal to Anything (NORTA) and Cholesky decomposition to generate Monte Carlo random variates that faithfully replicate the analyst’s desired distributions, while preserving correlation. In its “Output Only” mode, the method generates S-curves instantaneously, without a need for Excel Add-Ins, external software, or coding. Numerical experiments show the method to be equivalent or superior to commonly used Excel Add-Ins. We will demonstrate the capability and discuss its potential uses.
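A minimal sketch of the NORTA-plus-Cholesky idea for two correlated cost elements, assuming a lognormal and a triangular marginal; the distribution parameters and correlation are invented for illustration and are not from the IVMF.

```python
import math
import random

def tri_inv(u, lo, mode, hi):
    """Inverse CDF of a triangular(lo, mode, hi) distribution."""
    c = (mode - lo) / (hi - lo)
    if u < c:
        return lo + math.sqrt(u * (hi - lo) * (mode - lo))
    return hi - math.sqrt((1 - u) * (hi - lo) * (hi - mode))

def norta_pair(rho, n, seed=1):
    """Correlated (lognormal, triangular) draws via Cholesky + NORTA."""
    random.seed(seed)
    a = math.sqrt(1 - rho ** 2)  # second row of the 2x2 Cholesky factor
    out = []
    for _ in range(n):
        z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
        zc = rho * z1 + a * z2                       # correlated standard normal
        u = 0.5 * (1 + math.erf(zc / math.sqrt(2)))  # Phi(zc), maps to uniform
        # Lognormal shortcut: F_inv(Phi(z1)) = exp(mu + sigma * z1)
        out.append((math.exp(1.0 + 0.25 * z1),       # lognormal element
                    tri_inv(u, 80, 100, 150)))       # triangular element
    return out

pairs = norta_pair(rho=0.6, n=5000)
```

Each draw keeps its intended marginal exactly while the pair retains (approximately, after the monotone transforms) the requested correlation, which is the property that lets S-curves be generated faithfully without external software.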
Keywords: Budgeting, Data-Driven, Microsoft Excel, Modeling, Monte Carlo, Parametrics, Risk, Uncertainty, Variables, NORTA, Cholesky
MTC06 – Cincotta – Generating S-Curves Without External Software – ppt
MTC06 – Cincotta – Generating S-Curves Without External Software – paper
Cost vs. Contagion: Evaluating the Impact of Public Health Spending on Pandemics
Modeling, Tools & Case Studies (MTC07)
Joseph Guy
The COVID-19 pandemic presented a century-defining challenge, forcing countries to take quick and decisive action or risk having their public health services overwhelmed. Problems arose from the variation in national doctrines, strategies and implementation methods. This study investigates these responses, evaluates their cost-effectiveness and draws conclusions on their success and future viability in the next global pandemic. Using data gathered from the COVID-19 pandemic, we build a simple set of equations within a compartment-model framework to visualise a hypothetical COVID-like pandemic and mitigation responses from the international community. Using established cost analysis methods, these alternative responses are analysed in terms of their relative cost and effectiveness. This will enable a more robust view of the cost effectiveness of mitigation strategies and support decision making in the event of another pandemic.
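The "simple set of equations" of a compartment model can be sketched as a minimal SIR system integrated with Euler time steps; the transmission and recovery rates below are illustrative, not fitted values from the study.

```python
# Minimal SIR compartment model with Euler integration (illustrative rates).
def sir_peak(beta, gamma, s0=0.999, i0=0.001, days=200, dt=1.0):
    """Return the peak infected fraction of a simple SIR epidemic."""
    s, i, r = s0, i0, 0.0
    peak = i
    for _ in range(int(days / dt)):
        ds = -beta * s * i            # new infections leave Susceptible
        di = beta * s * i - gamma * i # infections in, recoveries out
        s, i, r = s + ds * dt, i + di * dt, r + gamma * i * dt
        peak = max(peak, i)
    return peak

# A mitigation that cuts transmission (lower beta) lowers the infection peak
print(round(sir_peak(beta=0.3, gamma=0.1), 3),
      round(sir_peak(beta=0.18, gamma=0.1), 3))
```

Attaching a cost to each mitigation scenario and comparing it against the reduction in peak or total infections is the cost-effectiveness comparison the study performs.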
Keywords: Data-Driven, Government, International, Public Health, Epidemiology, Cost-Effectiveness, Alternatives Analysis
MTC07 – Guy – Cost vs Contagion – ppt
Safety Fourth
Modeling, Tools & Case Studies (MTC09)
Douglas Howarth
When building a plane, designers often begin by considering its capacity, top speed, and range. Rocket scientists studying a new launcher will be interested in its payload, total burn time of all stages, and how efficiently its engines generate thrust. In both cases, safety concerns may fall to the level of a fourth-order effect. As fourth-order effects go, though, safety is hugely important. The Tupolev Tu-204 carries more people, goes faster, and has a longer range than the Boeing 737 NG series but sells for less and has only one sale for every 80 NGs. While still very safe in absolute terms, the Tu-204 crashes 36 times more frequently than the NG. Research shows markets support an increased value for added safety. This paper will study the additional costs and returns on investment for increased safety in aircraft and launch vehicles, providing readers with a framework to optimize their programs in the future.
Keywords: Cost/Benefit Analysis, Decision Analysis, Methods, Parametrics, Space, Hypernomics, Optimization
MTC09 – Howarth – Safety Fourth – ppt
MTC09 – Howarth – Safety Fourth – paper
Terminal Descent: The Manufacturing Delay and Disruption Cycle
Modeling, Tools & Case Studies (MTC10)
Brent M. Johnstone
Another large, technically complex project comes in late to need and over cost. These failures are attributed to bad cost and schedule estimates, poor program management, or unforeseen circumstances. But is there a deeper explanation? This paper examines the cycle of delay and disruption that begins when changes from planned conditions are introduced. Large or frequent changes begin a cycle of rework and degraded performance that, once initiated, is difficult to escape. This cycle creates cost overruns and schedule delays which, if uncontrolled, can cause disaster. To illustrate the rework and performance cycle, a conceptual model is presented to apply these principles to a hypothetical aircraft development program. The model demonstrates that even relatively small changes in scope, performance or quality, or program funding can quickly push a project off the rails and result in late deliveries and higher costs.
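A toy version of the rework feedback the paper describes, assuming a fixed monthly capacity and a disruption-driven rework fraction; the model and numbers are invented for illustration and are not the paper's conceptual model.

```python
# Toy rework-cycle dynamics: each month a crew attempts `capacity` units of
# work, but a disruption-driven rework fraction sends part of the completed
# work back into the backlog.
def months_to_finish(scope, capacity, rework_frac):
    backlog, months = scope, 0
    while backlog >= 1 and months < 240:
        done = min(capacity, backlog)
        backlog = backlog - done + done * rework_frac  # rework re-enters backlog
        months += 1
    return months

# Zero rework vs. a modest 15% rework fraction
print(months_to_finish(1000, 50, 0.00), months_to_finish(1000, 50, 0.15))
```

Even this crude feedback loop shows the disproportionate effect the paper argues for: a 15% rework fraction stretches a 20-month effort to 25 months, and in a real program the rework fraction itself grows as disruption compounds.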
Keywords: Labor, Learning Curves, Manufacturing, Methods, Modeling, Scheduling, Disruption
MTC10 – Johnstone – Terminal Descent Manufacturing Delay Disruption – Paper
MTC10 – Johnstone – Terminal Descent Manufacturing Delay Disruption – ppt
Escalation and Inflation – Handle with Care
Modeling, Tools & Case Studies (MTC11)
Melissa Cyrulik
Jennifer Kirchhoffer
Lions, tigers, and bears have nothing on inflation, escalation, and real price changes – oh my!!! Is your cost model set up adequately to identify and handle all the components of escalation correctly? Dealing clearly and appropriately with inflation and escalation in your estimates is imperative. The ACEIT team has spent the last two years working with DoD and other federal clients to understand their needs in dealing with escalation throughout the estimating process and to incorporate DoD-approved escalation terminology into ACEIT 8.2. Join us to understand the key concepts in working with the most recent escalation guidance. With escalation issues properly conquered in our models, it will be a day of independence for all the Munchkins and their descendants!
Keywords: Life Cycle, Methods, Modeling, Inflation, Escalation, Real Price Change
MTC11 – Cyrulik – Escalation Inflation Handle With Care – ppt
Cost Modeling for IT Deployment Projects
Modeling, Tools & Case Studies (MTC12)
F. Gurney Thompson III
In this presentation, we discuss cost modeling for IT projects with equipment deployed both in data centers and in the field, using a Smart City implementation as a use case. Our methodology is structured around four main branches: 1) developing configurations for all deployed equipment, 2) developing software to run on servers, 3) conducting system-wide verification and validation testing, and 4) conducting deployment activities such as site surveys and physical installations. Our ongoing research focuses on creating prototype models to aid non-specialist estimators in making credible cost estimates. We are developing new models to estimate device configuration and network engineering effort for servers, supporting network hardware, and field devices. These models evaluate the system architectures and complexities of the software/operating system, networking, security, scalability/performance, and data. We aim to share our methodologies and demonstrate how these models streamline cost estimation for complex IT deployment projects.
Keywords: Infrastructure, IT, Methods, Modeling, Parametrics
MTC12 – Thompson – Cost Modeling for IT Deployment Projects – ppt
Processes & Best Practices Track
Biases in Project Estimating and Mitigation Strategies to Overcome Them
2025 Best Paper: Processes, Best Practices, and Technical Innovations Category
Processes & Best Practices (PBP02)
Brian D. Glauser
Project estimating is a critical component of successful project management, yet it is often hindered by cognitive biases and logical fallacies that distort accuracy, credibility, and reliability. This paper examines common biases, alongside logical fallacies, which negatively impact estimates. Real-world examples illustrate the costly consequences of these cognitive pitfalls, particularly from the aerospace and defense industries—where such biases have had profound negative impacts. Mitigation strategies are explored, drawing on the foundational work of Daniel Kahneman, Bent Flyvbjerg, and others. The paper emphasizes the importance of transparency, accountability, and adaptive approaches in mitigating biases. By applying these strategies, organizations can enhance estimation credibility, manage risks effectively, and better ensure project success within budget and schedule constraints.
Keywords: Bias, Decision Analysis, Program Management
PBP02 – Glauser – Biases in Project Estimating – paper
PBP02 – Glauser – Biases in Project Estimating – ppt
Behind the Scenes: A Look at Recent GAO Audits
Processes & Best Practices (PBP04)
Jennifer Leotta
Since the Government Accountability Office (GAO) Cost Guide was released as an exposure draft in 2007, the GAO has used it as criteria to assess agencies’ cost estimates. This presentation will provide an overview of the GAO process and criteria used to assess cost estimates. Then, we will examine the results and recommendations from three recent GAO reports.
Keywords: Government, Project Controls, Audit, Oversight
PBP04 – Leotta – Look at Recent GAO Audits – ppt
Challenges in Implementing GAO’s Estimating Best Practices
Processes & Best Practices (PBP05)
Michael Nosbisch
DOE large projects have been on GAO’s High Risk List since its inception in 1990, which continues to have funding implications for planned projects/programs across the Complex. To be removed from this infamous list, DOE sites/contractors need to show that their estimating policies and procedures are aligned with the best practices GAO has formalized within its Cost Estimating and Assessment Guide. Over the past year, the presenter has been conducting formal estimating training and audits in accordance with GAO’s best practices, and has identified four areas that appear to be the most challenging for management and operating (M&O) contractors to implement, whether from a cultural or technical perspective. Leveraging this understanding, in addition to the presenter’s own 35 years of project management/controls experience, the presentation will discuss each of the four challenges in detail and provide recommendations for how DOE and its contractors can effectively address them at the site level.
Keywords: Cost Management, Government, Life Cycle, Monte Carlo, Parametrics, Risk, Estimating, GAO, best practices
PBP05 – Nosbisch – Challenges in Implementing GAOs Best Practices – ppt
Interesting Results from EVAMOSC – FY25 Edition
Processes & Best Practices (PBP06)
Kimberly Roye
Shanice Kaduru
EVAMOSC is OSD CAPE’s cloud-based Operations & Sustainment (O&S) database, containing terabytes of data on over three thousand different DoD weapon systems. In this presentation, the EVAMOSC team will provide an update on newly ingested and normalized data, review interesting results – including attempts to use AI for analysis – and discuss the potential impact of big-data platforms like EVAMOSC on the cost estimating and analysis community, in both the near and long term.
Keywords: Data Collection, Data-Driven, Decision Analysis, DOD/MOD, Operations
PBP06 – Roye – Results from EVAMOSC FY25 Edition – ppt
Contract Pricing Math Forensics: The What, The How, and Some (Shhhh!) Secrets
Processes & Best Practices (PBP07)
Chris Svehlak
Want to sniff around a career field related to cost estimating? Expand your skill set? Maybe consider contract pricing. You could potentially craft and calculate competitive pricing volumes for your company, helping to win contracts. Or evaluate bid submissions for the Government’s contracting office. After a brief overview of what pricing is and what it involves, we’ll forensically dissect a plucked-from-real-life (sanitized, of course!) Request-For-Proposals labor pricing template and see how it’s populated–from labor categories to manhours to fringe, G&A, overhead and profit rates. But wait, there’s more! Some mathematical secrets will be revealed as to how companies can and do “strategically price” (yes, legally!) labor bids to best position themselves to win `lowest price/technically acceptable’ contracts without undue performance or profit risk, and how smart Government pricers can spot this mathematical maneuvering. Let’s follow the math trail.
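The rate build-up being dissected can be sketched as compounding burdens on a base labor rate; all rates below are illustrative placeholders, not any company's actuals or a specific RFP's structure.

```python
# Typical fully-burdened ("wrapped") labor rate build-up (illustrative rates).
def wrap_rate(base, fringe, overhead, g_and_a, fee):
    """Compound fringe, overhead, G&A, and fee onto a base hourly rate."""
    burdened = base * (1 + fringe) * (1 + overhead)  # direct labor + burdens
    loaded = burdened * (1 + g_and_a)                # general & administrative
    return loaded * (1 + fee)                        # profit / fee on top

hourly = wrap_rate(base=50.00, fringe=0.30, overhead=0.40, g_and_a=0.10, fee=0.08)
print(f"fully wrapped rate: ${hourly:.2f}/hr")
```

The "strategic pricing" the session teases lives in how bidders shift hours and rates among labor categories within a template like this while the bottom line stays compliant; evaluators who rebuild the math can spot the maneuvering.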
Keywords: Cost Management, Labor, Methods, Microsoft Excel, Program Management, contract pricing, RFPs, bids, cost estimating
PBP07 – Svehlak – Contract Pricing Math Forensics – ppt
Closing the Gap: Using Maturity Assessments to Improve Cost Estimation Accuracy
Processes & Best Practices (PBP08)
Esteban Sanchez
Effective budget management for public sector agencies relies on disciplined cost estimation practices. After a Government Accountability Office (GAO) audit revealed weaknesses in cost estimation controls, our team implemented the EstimaPro360 framework to establish a structured, iterative approach to enhancing maturity. Starting with comprehensive baselining, we identified critical gaps across people, processes, and tools. We then used these insights to transform ad hoc methods into continuous advancement. This session will explore how baselining and ongoing maturity assessments close control gaps, improve cost accuracy and satisfy audit requirements, ensuring programs are delivered within budget. By aligning our approach with industry standards and leveraging parametric estimation tools, we achieved measurable gains in efficiency and provided actionable metrics for continuous improvement. Ultimately, this strategy fostered a culture of accountability and precise financial planning, driving lasting benefits for program success.
Keywords: Cost Management, Government, Methods, Performance Management
Software & Agile Track
From Zero to Hero: Roadmapping in an Agile World
Software & Agile (SWA02)
Jacob Blackthorn
Kristen Marquette
“We’re Agile! We don’t need to tell you what we’re working on! It’ll be done when it’s done.” This is a common adage in the Agile community and has led to numerous failed audits. However, Agile roadmapping IS possible! Our team has worked directly with program IPTs to develop roadmaps that incorporate projected work in an iterative process utilizing past performance data. We were then able to turn these content-driven roadmaps detailing the time-phased product backlog directly into an FTE requirement, utilizing an analysis of Agile data taken directly from the software teams’ Jira tool. Join us to learn how our program went from zero to hero, as they are now considered the gold standard of Agile estimating.
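The backlog-to-staffing arithmetic at the heart of such a roadmap can be sketched as follows; the velocity and backlog numbers are hypothetical, not the program's Jira data.

```python
import math

# Hypothetical figures standing in for Jira-derived metrics
backlog_points = 1200  # time-phased product backlog from the roadmap
velocity = 60          # historical story points per two-week sprint
team_size = 8          # FTEs historically producing that velocity

sprints_needed = math.ceil(backlog_points / velocity)
weeks = sprints_needed * 2
print(f"{sprints_needed} sprints (~{weeks} weeks) at a sustained {team_size} FTEs")
```

The real work is in the roadmap itself: keeping the time-phased backlog honest against past-performance velocity so that the staffing requirement it implies survives an audit.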
Keywords: Agile, Data Collection, Data-Driven, EVM, Function Points, IT, Labor, Methods, Microsoft Excel, Modeling, Software, Statistics, Story Points, Jira, Data analysis, epics, metrics, sprint, PI Planning, SAFE, GAO
SWA02 – Blackthorn – Zero to Hero Roadmapping Agile – ppt
SWA02 – Blackthorn – Zero to Hero Roadmapping Agile – paper
Commitment Issues: From User Stories to Love Stories
Software & Agile (SWA03)
Shannon Cardoza
Annie Bagay
In this lively presentation, we will delve into the world of agile metrics through the captivating lens of the reality TV show “Love is Blind.” Just as contestants embark on a journey of emotional discovery without initially seeing one another, agile teams navigate the challenges of project estimating and prioritizing work based on collaboration and limited information. Story points are our secret weapon, allowing teams to gauge the effort and complexity of tasks—much like contestants assess their compatibility amidst uncertainty. We’ll draw parallels between the metrics that guide agile processes and the evolution of relationships, providing a unique framework for understanding story point creation and the lifecycle of stories, bugs, epics, and tasks. Join us as we demonstrate how implementing agile metrics can foster stronger team dynamics, improve collaboration, and ultimately lead to successful outcomes—both in the world of software development and in the quest for lasting love!
Keywords: Agile, Software, Space, Story Points, Love, Relationships
SWA03 – Cardoza – User Stories to Love Stories – ppt
Generative AI for Agile: Closing Cost Estimation Gaps with Efficient Preplanning
Software & Agile (SWA04)
Curtis Chase
This session addresses a key challenge in Agile software development: accurately estimating project costs as requirements evolve. Agile’s iterative nature accelerates releases and supports collaboration but often lacks the structured preplanning needed for precise cost tracking and budget alignment. Traditional solutions—like analogous systems or dedicated preplanning teams—can be inefficient and costly, limiting Agile’s benefits. This presentation introduces a new approach using generative AI to automate essential preplanning tasks, such as functional requirement development and function point counting. We’ll discuss how generative AI enhances cost estimation accuracy while preserving Agile’s speed and adaptability. By automating these processes, this AI-driven method provides a practical, cost-effective way to maintain Agile’s flexibility with dependable budget insights. Attendees will see how this technology can refine Agile workflows and improve financial precision, ensuring efficient project management.
Keywords: Agile, Functional Requirements, Methods, Parametrics, Software, AI
SWA04 – Chase – Closing Cost Estimation Gaps – ppt
Software Modernization: A Real World Example
Software & Agile (SWA06)
Chad Lucas
Nicholas Taw
Estimating the effort and schedule required for software development programs is challenging. This is even more so for programs in the early phases of development, where requirements are not always well defined and design is in a constant state of flux. Also challenging are modernization efforts, where the design of a program is changing but its operations and purpose are not. In this paper we present a real-life example of how our cost team took the initiative to develop an estimation plan for a mission-critical program beginning the process of modernization, and executed it for a customer that was unaccustomed to applying some of the accepted techniques of Agile software estimation.
Keywords: Agile, Early Cost, Modeling, Software, Simple Function Points
SWA06 – Lucas – Software Modernization Real World Example – paper
SWA06 – Lucas – Software Modernization Real World Example – ppt
Managing Software Obsolescence: Strategies and Cost Impacts in Cost Estimation
Software & Agile (SWA07)
Dr. Sanathanan Rajagopal
As software becomes essential in many long-life systems (e.g., in defense, aerospace, and manufacturing), managing software obsolescence is critical to controlling long-term costs. Software obsolescence occurs when a software component is no longer supported or compatible, leading to security risks, increased costs, and potential downtime. This presentation will outline key strategies to manage software obsolescence—like updating legacy systems, modular design, and using open standards—and discuss their cost impacts. Using case studies, we’ll show how these approaches affect total ownership costs and explore ways to include obsolescence in cost estimation, leading to better budgeting and risk control over time.
Keywords: Agile, Cost Management, Cost/Benefit Analysis, DOD/MOD, IT, Risk, Software, Software Obsolescence, Cost Estimation, Lifecycle Management, Risk Control
SWA07 – Rajagopal – Managing Software Obsolescence – ppt
Busting Myths of Software Estimation for Better Decisions
Software & Agile (SWA08)
Colin Hammond
Executives in all industries make high-value decisions about software initiatives. These decisions depend on defendable estimates of cost, time, and achievability. Estimators who understand software fundamentals, see through the myths, and employ the best techniques can deliver objective, defendable estimates that lead to better executive decisions. This presentation explores the business value of defendable software estimates, some universal software principles, common myths that will be busted, and how AI can help improve both the validity and defendability of early software estimates.
Keywords: Agile, Bias, Cost/Benefit Analysis, Early Cost, EVM, Function Points, Functional Requirements, Project Controls, Software, Story Points, AI
Technical Innovations Track
MacGyvering Cost Estimating: Equipping Estimators with a Swiss Army Knife for the AI Era
Technical Innovations (TCI01)
Sandy Burney
Cost analysts often rely on fragmented tools, leading to inefficiencies in cost estimation. This paper explores AI-enhanced cost tools that integrate data querying, machine learning, and natural language processing into a unified system. These tools streamline workflows, improve accuracy, and reduce manual errors while enhancing collaboration. AI plays a key role in cost strategy development, risk assessment, and pricing function reengineering. Technologies like Large Language Models, Machine Learning, and Retrieval-Augmented Generation optimize cost estimation. Additionally, AI-driven tools improve proposal development through automated compliance checks, competitive intelligence gathering, and proposal scoring. While challenges exist, such as data integration and organizational resistance, AI-powered cost tools provide significant efficiency gains, enabling businesses to develop high-quality, competitive proposals faster and more accurately. These advancements reduce costs and improve decision-making for both contractors and government estimators.
Keywords: Data-Driven, Early Cost, Methods, Modeling, Operations, Software, AI, artificial intelligence
TCI01 – Burney – MacGyvering Cost Estimating – ppt
TCI01 – Burney – MacGyvering Cost Estimating – paper
Cost Data Unleashed: A PowerBI AFLCMC/HN Case Study
Technical Innovations (TCI02)
Patrick Casey
Cost Data Unleashed: A PowerBI AFLCMC/HN Case Study explores the capabilities of PowerBI applied to cost analysis using the AFLCMC HN Labor Rate PowerBI Dashboard. Geared towards both beginners and advanced users, we will walk through PowerBI essentials (the Ribbon & Linkage) and delve into intermediate (One-to-Many Relationships) and even advanced (Grouping Within PowerBI) topics. Attendees will learn how the HN Labor Rate PowerBI Dashboard harnesses the power of the HN Labor Rate Database, highlighting the breadth of labor rate data filtered through customizable views while safeguarding sensitive information. We will cover the DoD adoption of PowerBI, relationships between data tables, grouping within fields, and the intended use of the dashboard in supporting cost analysis. Attendees will also see examples of practical visualizations, linking, and data exploration. This session concludes with a look at future enhancements and an invitation for further collaboration for those interested in deploying a similar solution.
Keywords: Data Collection, Data-Driven, Government, Labor, Methods, Modeling, PowerBI
TCI02 – Casey – Cost Data Unleashed – paper
TCI02 – Casey – Cost Data Unleashed – ppt
AI-Powered Cost Estimation: Translating 2D Diagrams for Obsolescence & Sustainment Solutions
Technical Innovations (TCI03)
Christopher Rush (PhD)
As industries grapple with the challenges of equipment obsolescence, the need for efficient part replacement and sustainment has never been more critical. Traditional methods of converting outdated 2D part diagrams for manufacturing are labor-intensive and costly, and often lead to prolonged downtime, impacting operational readiness as organizations stretch the lifespan of valuable capital assets. This presentation will explore using AI-powered technology to digitize 2D part diagrams and extract essential data relevant to generating accurate cost analyses. Key data, such as part dimensions, features, and tolerances, can be automatically processed, enabling procurement teams to quickly evaluate cost estimates and explore alternative supplier options. Through practical case studies, we will highlight how this approach reduces obsolescence costs, decreases lead times, and extends asset service life, ensuring organizations can maintain a state of readiness while mitigating the operational risks associated with aging equipment.
Keywords: Data-Driven, Manufacturing, Software
TCI03 – Rush – AI Powered Cost Estimation – ppt
My Dog Ate My Engineering – Empowering Excuse Free Digital Transformation
Technical Innovations (TCI04)
Logan Hartley-Sanguinett
Hannah Lee
Alex Wekluk
Margaret Davis
The National Nuclear Security Administration (NNSA) is striving to adapt and thrive in today’s engineering ecosystem by integrating digital engineering technology into all enterprise areas. Digital Transformation (DT) increases business efficiency and effectiveness, uplevels agility and resilience, and accelerates innovation. However, decision-makers are often faced with unclear objectives, few technical specifications, and overall uncertainty about what it costs to “go digital”. Our team employed a novel mixed-methods approach of cost estimating practices to give decision-makers a clear picture of how to incorporate redundancy, cloud processing, storage, and AI/ML into their architecture. We will demonstrate how we leveraged public data and Cost as an Independent Variable (CAIV) techniques to define DT within the NNSA as well as how other agencies can learn from our experience. With this unique approach, leadership can see an immediate tradeoff between value in the DT space and the cost of investments.
Keywords: Cost/Benefit Analysis, Data Collection, IT, Methods, Parametrics, Cloud, CAIV, Digital Transformation
TCI04 – Hartley – My Dog Ate My Engineering – paper
TCI04 – Hartley – My Dog Ate My Engineering – ppt
Maiden in the Tower: The Intersection of MBSE and Cost Estimating
Technical Innovations (TCI05)
Stephen Koellner
Daniel Larison
Model-based systems engineering (MBSE), and more broadly digital engineering, is becoming a more widely adopted practice within federal/defense acquisition. The motivations behind this adoption are to produce more effectively engineered systems and to reduce the likelihood of costly rework in the later stages of system design and integration. Cost estimators and other project control professionals are most effective when fully integrated within the overarching systems engineering and program management (SEPM) team. What do MBSE and project control practitioners have in common? Both are effectively jacks-of-all-trades with significant overlap in goals and processes. An unfortunate reality is that neither is regularly well integrated with the other, despite the many philosophical and procedural commonalities. A sample case study will be presented to illustrate how these skillsets can operate in conjunction to produce more accurate and credible cost estimates, as well as more dynamic and rapid evaluation of architecture/design alternatives.
Keywords: Communication, Cost/Benefit Analysis, Early Cost, Functional Requirements, Life Cycle, Methods, Modeling, Operations, Process Engineering, Program Management, Project Controls, Model Based Systems Engineering, Digital Engineering, Single Source of Truth
TCI05 – Koellner – Intersection of MBSE and Cost Estimating – paper
TCI05 – Koellner – Intersection of MBSE and Cost Estimating – ppt
Moneyball Metrics: Exploring Power BI Through MLB Player Analysis
Technical Innovations (TCI06)
Sam Lepordo
Meghan Greathouse
With recent enterprise access to Power BI, the DoD cost community has a powerful new tool at its disposal. This presentation examines how cost analysts have utilized Power BI over the past year and explores its potential uses and innovative ways to leverage its capabilities within the program office. Through the lens of the MLB, we’ll explore Power BI’s capabilities, with player salaries, contract durations, team locations, and conferences representing key elements of Defense Acquisition. This data will be transformed into dynamic dashboards, providing valuable support for cost estimating and program management by visualizing labor rates, contract costs, and more. As cost analysts, our role is to provide decision-makers with accurate, actionable insights. Power BI’s capabilities offer a shift from traditional tools, enhancing the effectiveness and adaptability of decision support and database management. Ultimately, this presentation aims to inspire new and creative approaches to data visualization and analysis.
Keywords: Data Collection, Decision Analysis, Power BI, Data Visualization, Defense Acquisition
TCI06 – Lepordo – Exploring Power BI through MLB Player Analysis – ppt
Power Platform for Workload Projections
Technical Innovations (TCI07)
Matthew McGlone
Brooke Bires
Power Platform enhances workload management and cost estimation by tackling key challenges in configuration control, collaboration, and communication. Traditional methods often involve scattered files, leading to inefficiencies and slow processing times. While VBA was once the solution, many government agencies are phasing it out for security reasons. Implementing Power Apps as a centralized repository for information improves accessibility and standardizes workflows. Leveraging DAX and Power Query within Power BI allows for real-time data processing, minimizing delays and enabling dynamic interactions. Unlike static visualizations, Power BI offers customizable reporting that supports effective comparisons and informed decision-making. These improvements will be demonstrated with a specific use-case example. Overall, attendees will learn how Power Platform enhances workload management and cost estimation by improving configuration control, collaboration, and real-time data processing, ultimately transforming operational efficiency.
Keywords: Budgeting, Communication, Cost Management, Data Collection, Data-Driven, Decision Analysis, Government, Labor, Methods, Microsoft Excel, Modeling, Operations, Performance Management, Program Management, Project Controls
TCI07 – McGlone – Power Platform for Workload Management – ppt
Generative AI’s Long-Term Impact on Cost Engineering for Manufacturing 4.0
Technical Innovations (TCI08)
Charles Orlando
Dan Kennedy
As manufacturing embraces Industry 4.0, artificial intelligence (AI) transforms cost engineering, optimizing operations and driving competitiveness. Generative AI, a next-generation advancement, reshapes cost estimation by synthesizing data from diverse sources, enabling swift adaptation to changing conditions. This presentation explores generative AI’s role in enhancing speed to market, accuracy, and profitability. We’ll cover practical applications like design optimization, real-time inventory management, and sustainability initiatives, showing how AI-driven systems streamline processes and support environmental goals. Real-world examples will illustrate how generative AI moves beyond traditional data limitations, enabling informed decisions with unprecedented precision. Ethical considerations, including data security, transparency, and human oversight, will also be discussed. Generative AI represents a transformative shift for cost engineering, positioning manufacturers to innovate and thrive in today’s complex market.
Keywords: Cost/Benefit Analysis, Decision Analysis, Manufacturing, Parametrics, Software
TCI08 – Orlando – Generative AI-Manufacturing – ppt
Estimating in an AI-Enabled Development Environment
Technical Innovations (TCI09)
Eric van der Vliet
Bhawna Thakur
Raghav Kumar
In software development, AI tools are increasingly used to enhance efficiency. These tools aid in generating test cases, streamlining documentation, supporting maintenance, and overall improving the development process. While AI has shown significant productivity gains, with improvements of up to 50%, these benefits don’t apply universally across entire projects or solutions. This presents a challenge for estimation processes, requiring high levels of accuracy in metrics to achieve reliable estimates and a clear understanding of performance gains. Additionally, timing plays a critical role, as it is essential to determine when and how specific tools are applied and with what intent. Drawing on practical experience, this presentation addresses these challenges and offers actionable guidelines to improve estimation and metrics processes within an AI-enabled development environment.
Keywords: Software, AI, Estimating, Metrics, Development
TCI09 – vanderVliet – Estimation in AI Enabled Dev Environment – ppt
Next-Gen Analogous Estimating: A Framework Grounded in Actuals, Powered by AI
Machine Learning & NLP (TCI10)
Tom Shanahan
Traditional estimating processes are still broken—too reliant on spreadsheets, SME recall, and generic templates. This presentation introduces a next-generation framework that transforms cost estimating into a data-driven, AI-powered process grounded in historical actuals. By combining conditional GANs (Generative Adversarial Networks), NLP-driven scope parsing, and continuous Monte Carlo simulation, the system generates tailored analogous estimates, automates BOE justification, and enables SME refinement in a BOE console. This isn’t just automation—it’s a paradigm shift. Attendees will see how structured and unstructured data can be harnessed through a guided application workflow to drive smarter, risk-adjusted, and repeatable estimates. The framework is built for evolution, empowering organizations to deliver estimates with greater speed, transparency, and confidence.
Keywords: Budgeting, Cost Management, Data-Driven, Decision Analysis, Early Cost, IT, Methods, Modeling, Operations, Project Controls, Risk, Software, AI, artificial intelligence
Introduction to Proposal Cost Estimating and Writing a Basis of Estimate (BOE)
Part 1: Basic Cost Estimating (BOE01)
Sandy Burney
This four-part training introduces the fundamentals of proposal cost estimating and Basis of Estimate (BOE) documentation, designed to supplement the ICEAA Cost Estimating Body of Knowledge (CEBoK®). Aimed at professionals with limited proposal estimating experience, the course focuses on developing credible, supportable cost estimates for government proposals. Core topics include estimating techniques such as analogy, parametric, and level of effort (LOE). Regulatory compliance, including adherence to FAR, DFARS, and DCAA audit standards, is also presented. Overall, it provides foundational knowledge to develop accurate, defensible cost estimates aligned with government expectations and acquisition processes.
Part 2: Data and Other Cost Estimating Topics (BOE02)
Javier Provencio
Part 2 of the cost estimating training focuses on the critical role of data in developing accurate and defensible proposal estimates. This section introduces key topics such as factored hours, inflation, and functional cost structures. It provides overviews of manufacturing and software cost estimating, highlighting unique processes like production setup, tooling, learning curves, and Agile methodologies. Estimators are introduced to the distinction between direct and indirect costs and the importance of understanding wrap rates and burdening. Specialized estimating techniques and considerations for both manufacturing and software environments are addressed, including make-buy decisions, material concerns, and parametric models.
Part 3: The Proposal Basis of Estimate (BOE) (BOE03)
Steve Glogoza
Jeani Dierkes
Part 3 focuses on developing a compliant and defensible Basis of Estimate (BOE) within the context of government acquisition and contracting. It explains how contractors respond to Requests for Proposals (RFPs) by developing detailed cost estimates aligned to Statements of Work (SOW) and pricing requirements. Key contract types (e.g., Firm Fixed Price, Cost Plus) are discussed in terms of risk allocation and pricing strategies. A strong BOE includes clear scope alignment, rationale, assumptions, data sources, and appropriate estimating methodologies such as analogy or parametric techniques. The training emphasizes BOE structure, including task descriptions, labor/material/ODC breakdowns, time-phased estimates, and use of spread curves. It also addresses common BOE errors like vague assumptions, lack of traceability, and unsupported adjustments. A quality BOE demonstrates credibility, supports audit readiness, and facilitates negotiation. Ultimately, the BOE is both a technical justification and persuasive tool, central to winning proposals and ensuring government pricing compliance.
Part 4: Example: Creation of a Proposal BOE (BOE04)
Brent M. Johnstone
This example-based training illustrates how to develop a compliant Basis of Estimate (BOE) for a proposal. Using a scenario for an example Counter-small Unmanned Aircraft System (C-sUAS) RFP, the walkthrough focuses on estimating a single WBS element. The estimator selects an analogous WBS element from a prior contract, adjusts it using a justified complexity factor, and calculates labor distribution over an 8-month period. The process demonstrates key BOE components: aligning the task description to the SOW, documenting assumptions, supporting data sources, and detailing the rationale and math used in estimating. This example reinforces best practices for creating defendable, audit-ready BOEs aligned with proposal requirements and government expectations.
Certification Program Overview
Kevin Cincotta
This interactive session introduces the ICEAA certifications – Certified Cost Estimator/Analyst (CCEA®), Professional Cost Estimator/Analyst (PCEA®), and Software Cost Estimating Certification (SCEC). It covers eligibility and certification requirements, examination topics, relationships to the Cost Estimating Body of Knowledge (CEBoK®), the online exam format, and recertification requirements. This is a great opportunity to talk with the Certification Principal, Kevin Cincotta, and get your questions answered.
CEB00 – CertificationOverview2025
CEBoK Module 6: Preview the 2025 Updates
Dave Brown
The world of data analysis is constantly changing, with new tools and methods emerging in the field every year. To keep the cost community up to date on these data analysis principles, we have updated CEBoK Module 6, Data Analysis, and will provide a preview of the changes. To modernize the content on data visualization, we have added new visual examples, including waterfall charts, heat maps, and decision trees; developed new tables, images, and examples that are specifically relevant to the data analysis needs of cost estimators; and added new sections covering machine learning algorithms and their applications to cost estimating and analysis. Each section in Module 6 now features guidelines on how to apply the methods discussed, with real-world examples of their use.
CEBoKMod6 Data Analysis 2025 Update
CEBoK-S Training Track
CEBoK-S Lesson 1: Introduction to Software Cost Estimating (CBS01)
Dr. Sanathanan Rajagopal
This lesson explains the importance of software cost and schedule estimating for CEBoK-S audiences; it includes historical context, examples of software program failures and successes, and highlights what students can gain from CEBoK-S. Note that CEBoK-S is intended to supplement the audience’s understanding of the cost estimating tools and techniques covered in CEBoK®, introducing the differences between software and hardware estimating and what is included in systems vs. software cost estimating.
CEBoK-S Lesson 2: Software Development Paradigms and Cost Considerations (CBS02)
Eric van der Vliet
This lesson introduces the two major paradigms involved in custom software development, predictive and Agile, and presents a variety of software development methods and where they differ from the perspective of estimating custom software development efforts. Included are highlights of waterfall, iterative, spiral, Agile, Scrum, SAFe, DAD, and other approaches to give the software cost estimator a basic understanding of the activities and cost drivers involved.
CEBoK-S Lesson 4: Estimating Custom Software Development (CBS04)
Kevin Cincotta
Lesson 4 outlines how relevant data and custom-built software Cost and Schedule Estimating Relationships (CERs and SERs) are used to estimate software development effort, cost, and schedule duration. We will cover an overview of estimating approaches; how to create a software development estimate based on the CEBoK-S five-step estimating process and four estimating approaches; developing a schedule estimate; time-phasing the estimate; cross-checking the estimate; and reviewing estimates prepared by others.
CEBoK-S Lesson 6: Estimating Procured Solutions (CBS06)
Arlene Minkiewicz
This lesson identifies key cost estimating considerations for, and differences between, Commercial Off-the-Shelf (COTS) software and Enterprise Resource Planning (ERP) systems, among other procured software solutions. It will provide an introduction to procured software solutions; describe COTS software and the major types of procured software solutions; and cover the cost drivers and estimating approach(es) for each. We will also consider cost growth and sustainment in procured software solutions.