A Systematic Approach for Empirical Sensitivity Analysis on Monte Carlo Models

Risk Track



A key component of risk analysis is the sensitivity analysis performed on the input variables of Monte Carlo models, with the goal of determining which variables cause the most variation in the final distribution and identifying the best candidates for risk mitigation plans. While the standard technique of calculating the correlation coefficient between the final distribution and each input distribution is appropriate for linear models, it cannot accurately identify the largest uncertainty drivers in non-linear models: a correlation coefficient measures the strength of the linear relationship between variables, so it is not appropriate where the relationships between distributions are non-linear. This paper investigates an alternate empirical method for performing sensitivity analysis on both linear and non-linear models. The alternate method is a systematic approach that reduces the variation of a single input variable, reruns the simulation, and measures the effect on the variation of the final distribution. The variation of that input variable is then restored, and the process is repeated on the next input variable. The paper presents and compares several approaches for reducing the variation of an input. It then explores the effect of correlation on this method of sensitivity analysis, to determine whether the correlation induced between the input variables needs to be adjusted when input variation is reduced. The final section compares the results of this systematic method with the traditional correlation-coefficient method on both linear and non-linear models.
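The systematic approach described above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation: the three-input model, the input distributions, and the shrinkage scheme (scaling each input's deviations about its mean, with a factor of 0 removing all variation) are assumptions chosen for demonstration.

```python
import numpy as np

def model(inputs):
    """Hypothetical non-linear model: combines three input samples."""
    a, b, c = inputs
    return a * b + c**2

def variance_sensitivity(samplers, model, n=100_000, shrink=0.0, seed=0):
    """For each input in turn, scale its deviations about the mean by
    `shrink` (0.0 removes all variation), rerun the simulation, and
    record the fractional drop in output variance. The input's original
    samples are restored before moving to the next input."""
    rng = np.random.default_rng(seed)
    base_draws = [s(rng, n) for s in samplers]
    base_var = np.var(model(base_draws))
    drops = {}
    for i in range(len(samplers)):
        draws = list(base_draws)          # restore all original inputs
        mean = draws[i].mean()
        draws[i] = mean + shrink * (draws[i] - mean)  # reduce this input's variation
        drops[i] = 1.0 - np.var(model(draws)) / base_var
    return base_var, drops

# Hypothetical inputs: two lognormal cost elements and one normal element.
samplers = [
    lambda rng, n: rng.lognormal(2.0, 0.4, n),
    lambda rng, n: rng.lognormal(1.0, 0.2, n),
    lambda rng, n: rng.normal(5.0, 1.0, n),
]
base_var, drops = variance_sensitivity(samplers, model)
for i, d in sorted(drops.items(), key=lambda kv: -kv[1]):
    print(f"input {i}: removing its variation cuts output variance by {d:.1%}")
```

Ranking the inputs by the variance they remove identifies the largest uncertainty drivers directly from the simulation, without assuming a linear relationship between each input and the output.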


Matt Pitlyk
Booz Allen Hamilton
Matt Pitlyk is currently an analyst for Booz Allen Hamilton in the Washington, D.C. metro area. During his two years as a cost estimator, he has taken part in a Joint Confidence Level (JCL) analysis for NASA's James Webb Space Telescope and an Independent Cost Assessment of NASA's next-generation manned space flight shuttle replacement program. He has helped develop a methodology for creating CERs for estimating satellites for the Air Force, and has performed software estimating research. Matt is part of the RealTime Analytics team, which develops several tools using Booz Allen's proprietary RealTime Analytics technology. He has provided portfolio management support to the Army by building a resource optimization model and has supported kill chain analysis for the Navy. Prior to moving to the D.C. area, Matt graduated from St. Louis University with an M.A. in Mathematics and spent a year teaching math courses, including statistics, at SLU. He still enjoys teaching and presents the statistical background portion of RealTime Analytics training. He is reprising his role as instructor for part of the regression module at SCEA. He has also helped with recent updates of the Cost Estimating Body of Knowledge (CEBoK).