2020 Annual Conference Abstracts

Registration Page | Speaker Bios

Abstracts (In order of appearance. Schedule subject to change and more abstracts will be added soon)

Tuesday, April 21

Wednesday, April 22

The Value of Information — Because There Is Nothing Else
Sam L. Savage, ProbabilityManagement.org and author of The Flaw of Averages

The value of information is the ultimate metric of the Information Age, yet few people are aware of this fundamental concept developed by Stanford’s Professor Ronald A. Howard in 1966. I will provide some interactive spreadsheet examples to frame the conference.
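
As a taste of the concept (a hypothetical Python sketch, not one of the session's spreadsheet examples; the payoffs and probability below are invented), the expected value of perfect information is the gap between deciding after the uncertainty is resolved and the best decision made beforehand:

```python
import numpy as np

# Hypothetical two-alternative decision under an uncertain market (illustrative numbers)
p_strong = 0.4                       # probability the market is strong
payoffs = {                          # payoff of each alternative in (strong, weak) markets
    "launch product": (100, -40),
    "do nothing":     (0,   0),
}

def expected(payoff):
    strong, weak = payoff
    return p_strong * strong + (1 - p_strong) * weak

# Best decision made now, before learning the market outcome
best_without_info = max(expected(p) for p in payoffs.values())

# Expected payoff if we could learn the outcome first, then choose
best_with_info = (p_strong * max(p[0] for p in payoffs.values()) +
                  (1 - p_strong) * max(p[1] for p in payoffs.values()))

evpi = best_with_info - best_without_info
print(f"Expected value of perfect information: {evpi:.1f}")
```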

Farming Decisions Under Uncertainty
Keith Shepherd, World Agroforestry (ICRAF), Innovative Solutions for Decision Agriculture (ISDA), and ProbabilityManagement.org

Farming anywhere is typified by high investment risk, due to uncertainties in weather, pests and diseases, labour availability, responses to inputs, market prices, and climate change, to name a few. The risk challenge is extreme for a smallholder farmer in Africa, with a farm income of only a few hundred dollars a year and no insurance or credit, who must weigh an investment in improved seed and fertilizer or in growing a new crop. ICRAF, Innovative Solutions for Decision Agriculture, and ProbabilityManagement.org have launched an initiative to transform the way uncertainty is understood, leveraged, and communicated in agriculture. Building on advances in the Open SIPmath™ Standard, we illustrate through examples how this involves: (i) communicating risk to users when providing advice, including simple, interactive economic simulation tools that farmers and agronomic advisors can use to better understand the risks and returns of investments; (ii) understanding decision uncertainty so that information collection is systematically and iteratively focused on the data most needed to improve decision outcomes; and (iii) explicitly designing agronomic practices that optimize yield and income risk-returns, for example through crop mixtures or variable fertilizer rates.

The Failure of Risk Management — Updates for the New Edition
Doug Hubbard, Hubbard Decision Research 

Doug Hubbard, author of The Failure of Risk Management: Why It's Broken and How to Fix It, will introduce some of the new content in the second edition of the book. The second edition, released this spring, adds several topics on risk management that have emerged since the first edition was written 11 years ago. New content includes an update on the state of risk management drawn from several surveys, including the joint survey by Hubbard Decision Research and KPMG. This edition also adds findings from many researchers on why there is resistance to quantitative methods and how to further improve subjective estimates of probabilities. He will describe some surprisingly simple methods for estimating baseline probabilities when very little data is available. Finally, Doug will present a short example of how one of his clients implemented Enterprise Risk Management one module at a time using Excel spreadsheets.

Stochastic Modeling of Biological Systems
David Cawlfield, Olin Corporation

Models that simulate biological functions are often used to predict toxicology, especially in place of experiments on human subjects. But these models are typically run with single-point estimates of inputs whose real-world values may be highly variable. This talk reviews models of the human toxicology of perchlorate, which Federal and State EPAs are using to establish limits on perchlorate that will protect human health. Perchlorate interferes with iodine absorption into the thyroid and, in theory, could reduce thyroid hormone secretion over time. A stochastic simulation shows that human tolerance of perchlorate exposure substantially improves when variability in the dietary availability of iodine is simulated using FDA dietary data. This simulation could help resolve the apparent contradiction between the predicted effects of environmental perchlorate and the lack of observed perchlorate-related thyroid hormone deficiency in human populations. If decisions by the EPA take this effect into account, regulatory limits on perchlorate in drinking water will be less costly to achieve and may even be unnecessary. This work sheds light on why many living things appear to be “antifragile,” thriving in a chaotic world. The results in this case suggest that most models of biological systems are subject to the Flaw of Averages, and can be improved through stochastic simulation. This is critically important when these models are used to inform decisions designed to improve health.
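
To see how the Flaw of Averages arises in such models, here is a minimal, hypothetical sketch (not the perchlorate model itself; the dose-response curve and intake distribution are invented): a nonlinear response evaluated at the average input differs from the average response over the input's distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical nonlinear "response" to a variable dietary input
# (illustrative only, not the actual perchlorate/iodine model).
def response(intake):
    return intake / (50.0 + intake)   # saturating curve

# Simulate day-to-day variability in intake (lognormal around 150 units)
intake = rng.lognormal(mean=np.log(150), sigma=0.6, size=100_000)

plug_in_average = response(intake.mean())   # model run once at the average input
true_average    = response(intake).mean()   # average of the model over the distribution

print(f"Response at average intake: {plug_in_average:.3f}")
print(f"Average response:           {true_average:.3f}")
# The two disagree because the response is nonlinear: the Flaw of Averages.
```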

Transitioning to the Digital Factory at Lockheed Martin Aeronautics
Bryan Massie and Tony DeMarco, Lockheed Martin

The industrial internet of things (IIoT) is transforming the manufacturing landscape. Lockheed Martin Aeronautics is harnessing this revolution by connecting machine assets to sensors to capture, store, and analyze critical operational detail on the F-35 production line that will enable edge analytics, condition-based maintenance, and predictive modeling. Critical machinery must be kept working synchronously to maximize throughput. Unplanned stoppage can be reduced by forecasting emerging disruption events and arranging maintenance to meet operational requirements. Improvements here can lead to cost savings and efficiencies, on-time delivery, higher quality, and increased production rates. All of this culminates in lower average aircraft unit cost for the F-35. Come learn how Lockheed Martin Aeronautics is securely capturing machine data and streaming it into a data lake. We will also take a deep dive into an LSTM (Long Short-Term Memory) recurrent neural network model that we've implemented for automated anomaly detection in the time-series data. These efforts are key enablers for the transformation to Industry 4.0, i.e., the connected factory.
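
As a rough illustration of this kind of anomaly detection (a minimal sketch only, not Lockheed Martin's production model; the window size, network size, and threshold are assumptions), an LSTM can be trained to predict the next sensor reading, with large prediction errors flagged as anomalies:

```python
import numpy as np
import tensorflow as tf

def make_windows(series, window):
    """Slice a 1-D sensor series into (window -> next value) training pairs."""
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X[..., np.newaxis], y   # add a feature dimension for the LSTM

# Synthetic stand-in for one machine-sensor channel
t = np.arange(5000)
series = np.sin(t / 25.0) + 0.05 * np.random.randn(len(t))

window = 50
X, y = make_windows(series, window)

# Small LSTM that learns to predict the next reading from the last `window` readings
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=3, batch_size=64, verbose=0)

# Flag points whose prediction error is far outside the typical error
errors = np.abs(model.predict(X, verbose=0).ravel() - y)
threshold = errors.mean() + 4 * errors.std()      # assumed rule of thumb
anomalies = np.where(errors > threshold)[0] + window
print(f"{len(anomalies)} candidate anomalies flagged")
```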

Saving for a Rainy Day: Deciding How Much
Brenda Olwin, City of East Palo Alto, and Shayne Kavanagh, Government Finance Officers Association (GFOA)

The true measure of a good model is whether it leads to better decisions. The City of East Palo Alto faced a decision about how much money to keep in its rainy day fund. The rainy day fund helps the City respond quickly and decisively to risks like recessions, floods, and earthquakes. Too little, and the City could find itself unable to respond effectively. Too much, and the City incurs substantial opportunity costs in the form of foregone projects and even excessive taxation of the public. The City worked with GFOA to build a probabilistic model of the risks it faced and to help it arrive at a good reserve strategy. At this session you will hear from GFOA and from Brenda Olwin, CFO of the City of East Palo Alto. She will discuss how the model has been used by the City, what kinds of questions it raised, and where the City will go next.
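
A minimal sketch of this style of reserve analysis (illustrative only; the hazard probabilities, loss distributions, and coverage targets are assumptions, not the City's figures):

```python
import numpy as np

rng = np.random.default_rng(1)
trials = 100_000

# Assumed annual hazard model: probability of occurrence and loss if it occurs
recession  = rng.random(trials) < 0.15
flood      = rng.random(trials) < 0.05
earthquake = rng.random(trials) < 0.02

loss = (recession  * rng.lognormal(np.log(4e6), 0.5, trials) +
        flood      * rng.lognormal(np.log(2e6), 0.7, trials) +
        earthquake * rng.lognormal(np.log(8e6), 0.9, trials))

# Reserve sized to cover the simulated losses in a chosen share of years
for target in (0.80, 0.90, 0.95, 0.99):
    print(f"Reserve covering {target:.0%} of years: ${np.quantile(loss, target):,.0f}")
```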

Everything We Know is Wrong
Patrick Leach

All of the logic, intuition, and metrics humans have for making decisions were developed during a time when resources were abundant, human populations were relatively small, and we could afford to focus on maximizing returns today because tomorrow always held new frontiers filled with limitless riches. We were far too wise to repeat the mistakes of the Anasazi, the Norse Greenlanders, or the Easter Islanders; Adam Smith’s “invisible hand” would ensure that resources were used efficiently and wisely, that self-destructive behaviors would be weeded out and halted.

None of this is true anymore (if, indeed, it ever was). The “invisible hand” never had to deal with a world in which any piece of information – whether true or not – is instantly flashed around the globe. The Tragedy of the Commons looms large in many forms, leaving huge environmental problems for future generations. The primary objective of most modern societies – constant economic growth – is mathematically unsustainable.  And our measures of economic value (NPV, IRR, P/I, etc.) punish exactly the type of behavior which is most closely associated with long-term success.

How do we make good decisions when everything we know is wrong?

How Much is Someone Else’s Help Worth? Exploring Why the Value of Information Can Be a CIO’s Best Friend When Making Big Decisions
William D. Reed, Optiv Security

When should a Chief Information Officer (CIO) bring in an outside consultant for help? Should he or she be willing to pay that consultant $100k for an assessment? How about $250k? Is it worth that much? Is there a way to determine that? Yes, there is: by using the Value of Information (VOI), a fundamental element of decision analysis. When uncertainty is high in a business decision, and internal resources can't reduce that uncertainty themselves, an assessment from outside experts can help reduce it and improve the odds of making the right decision. In this talk, I will walk the audience through a common decision that CIOs face and show how VOI can be used in a decision model to improve the potential outcomes.
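
A hedged sketch of the calculation (invented prior, payoffs, and consultant accuracy, purely to illustrate the logic): the assessment is worth buying when its expected value exceeds its fee.

```python
# Hypothetical numbers: invented prior, payoffs, and accuracy, not figures from the talk.
p_good = 0.3                                # prior probability the initiative will succeed
payoff_good, payoff_bad = 2_000.0, -800.0   # $k outcomes of going ahead; doing nothing pays 0
accuracy = 0.8                              # chance the consultant's verdict matches reality (symmetric)
fee = 100.0                                 # consultant's fee in $k

best_without = max(0.0, p_good * payoff_good + (1 - p_good) * payoff_bad)

# Value of the (imperfect) assessment: decide after hearing the consultant's verdict
value_with = 0.0
for p_verdict_given_good in (accuracy, 1 - accuracy):       # verdict "good", then verdict "bad"
    p_verdict_given_bad = 1 - p_verdict_given_good           # symmetric accuracy assumption
    p_verdict = p_good * p_verdict_given_good + (1 - p_good) * p_verdict_given_bad
    p_good_given_verdict = p_good * p_verdict_given_good / p_verdict
    go_ahead = p_good_given_verdict * payoff_good + (1 - p_good_given_verdict) * payoff_bad
    value_with += p_verdict * max(0.0, go_ahead)

voi = value_with - best_without
verdict = "worth it" if voi > fee else "not worth it"
print(f"Value of the assessment: ${voi:,.0f}k vs. fee ${fee:,.0f}k -> {verdict}")
```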

Metalog Regression and Bayesian Inference
Tom Keelin, Keelin Reeds Partners and ProbabilityManagement.org

The holy grail of Bayesian inference would be to start with an “any shape” prior distribution over a variable of interest, to update its parameters in closed form in light of new data according to Bayes’ theorem, and to end with an “any shape” posterior distribution over the variable of interest that’s of the same family of distributions as the prior. This paper accomplishes this previously unattained goal. We use metalog distributions, which can take on any shape, to model the prior and posterior distributions over the variable of interest. As a special case of the Bayesian linear model, the parameters of the metalog quantile function can be updated in closed form according to Bayes’ theorem by applying the usual likelihood and conjugate prior formulation of that model. The structure of this metalog formulation makes assessment of prior hyperparameters and interpretation of posterior ones particularly natural and convenient. The generality, practicality, and tractability of this approach may make it useful for applications in any field and for teaching Bayesian inference.
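
A rough sketch of the ingredients (an illustrative simplification under assumed prior and noise settings, not the paper's actual formulation): because the metalog quantile function is linear in its coefficients, a normal prior on those coefficients admits the standard conjugate Bayesian linear-model update in closed form.

```python
import numpy as np

def metalog_basis(y, k=4):
    """First k metalog basis terms evaluated at cumulative probabilities y (Keelin 2016)."""
    L = np.log(y / (1 - y))
    cols = [np.ones_like(y), L, (y - 0.5) * L, (y - 0.5)]
    return np.column_stack(cols[:k])

# Quantile "observations": empirical quantiles of some data at chosen probabilities
data = np.random.default_rng(2).gamma(shape=3.0, scale=2.0, size=500)
y = np.linspace(0.05, 0.95, 19)
q = np.quantile(data, y)
X = metalog_basis(y)

# Conjugate Bayesian linear-model update of the metalog coefficients
# (assumed prior and noise variance, illustrative values only)
prior_mean = np.zeros(X.shape[1])
prior_prec = np.eye(X.shape[1]) * 1e-2
noise_var = 0.05

post_prec = prior_prec + X.T @ X / noise_var
post_cov = np.linalg.inv(post_prec)
post_mean = post_cov @ (prior_prec @ prior_mean + X.T @ q / noise_var)

# Posterior-mean metalog quantile function over the variable of interest
print("Posterior-mean metalog coefficients:", np.round(post_mean, 3))
print("Median estimate:", float(metalog_basis(np.array([0.5])) @ post_mean))
```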

Using Value of Information to Decompose Systemic System Integration Schedule Risks
Brian Asti and Dan Harmeyer, Northrop Grumman

Successful systems integration depends upon a tightly choreographed sequence of operations performing their roles correctly. Design, Procure, Build, Test, and Deliver stages must all be completed to schedule. There are multiple, linked, hierarchical dependencies, and any delays can propagate through the downstream stages. Conventional wisdom within the organization held that certain functions were more likely to cause delays than others. Using the concept of a Stochastic Information Packet (SIP) to incorporate uncertainty, our Division team built a simulator for an entire product line moving through integration. We further applied the Value of Information to understand where we needed to reduce uncertainty. Accordingly, we identified new ways to measure performance within stages that had so far escaped scrutiny because they were difficult to measure. These tools enabled us to identify the biggest opportunities for improving performance. As it turned out, our biggest opportunities lay in different places from what conventional wisdom held. We're now working with stakeholders to address those performance gaps. Northrop Grumman is a leading global security company providing innovative systems, products and solutions in autonomous systems, cyber, C4ISR, space, strike, and logistics and modernization to customers worldwide.
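
A minimal sketch of the SIP-based rollup (illustrative only; the stage-duration distributions are invented, not Northrop Grumman data): each stage's uncertainty is stored as a vector of trials, and the program-level schedule is the trial-by-trial combination of the stage SIPs.

```python
import numpy as np

rng = np.random.default_rng(3)
trials = 10_000   # every SIP shares the same number of coherent trials

# Assumed stage-duration SIPs, in days (illustrative triangular distributions)
design  = rng.triangular(20, 30, 60, trials)
procure = rng.triangular(15, 25, 80, trials)
build   = rng.triangular(30, 40, 70, trials)
test    = rng.triangular(10, 15, 45, trials)
deliver = rng.triangular(5,  7,  20, trials)

# Trial-by-trial rollup of the serial stages, itself a SIP of total duration
total = design + procure + build + test + deliver

print(f"Average total duration:       {total.mean():.0f} days")
print(f"Chance of exceeding 180 days: {(total > 180).mean():.0%}")
```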

Information Economics and Stochastic Digital Twins
Steve Roemerman, Lone Star Analysis

Many digital twin applications require information economics. The need can arise from data transport costs, and often from the costs of sensors and sensor installations. This presentation discusses different types of digital twins. As an illustration, it uses the problem of improving the availability of the F-18 aircraft, lowering the cost of sustainment, and improving aircraft safety by using stochastic digital twins to predict the remaining useful life of dangerous energetic components (e.g., ejection seat rockets). Adding sensors, cables, and processing to critical safety items (e.g., ejection seats) is time consuming and expensive. A key aspect of this successful digital twin project was to aggressively apply the principles of information economics. A second aspect was to embrace the uncertainty imposed by data parsimony and employ stochastic methods in the twin design.

How Decision Quality Inoculates Against Uncertainty
Neil Hamlett, Uncertainty Management LLC

Uncertainty is an inescapable aspect of life, and business decision-making is one activity for which this is particularly consequential. Economist John Maynard Keynes spoke of irreducible uncertainty. Probability and statistics practitioners use a related quantity, entropy, to measure the randomness in data that defies explanation. Practiced in combination, decision analysis and data science parse uncertainty, binning it into categories so that decision-makers understand what they are actually facing: what they can reduce and what they are stuck with. We group uncertainty into five distinct bins: statistical, epistemic, model-misspecification, measurement, and behavioral. A case study illustrates how triaging and binning uncertainty can get the decision-maker down to statistical uncertainty, the irreducible part with which she must simply live.

How to Depict and Assess Risk Correctly
Doug Samuelson, InfoLogix, Inc.

We discuss how to assess various methods of deterring terrorist attacks, addressing a long-standing argument in the risk community: risk is best expressed as neither a sum nor a product, but rather as a sum of products -- or, in more general form, a multiple Stieltjes integral. This seemingly daunting depiction greatly clarifies the problem and simplifies readily for most situations. Deterrent effect is then best depicted as the difference between computed risk with and without the proposed measure. Some real-life (sanitized) findings illustrate the method. We then explain why the usual metrics of statistical variation can be wildly misleading when assessing risks involving rare, high-consequence events, often leading to grossly inaccurate estimates of risk. Again, proper depiction of risk avoids the pitfalls. Illustrative examples are drawn from statistics on spills of hazardous substances into water.
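
A small numerical sketch of the sum-of-products idea (hypothetical scenario values, not the talk's sanitized findings): risk is summed over scenarios, each contributing the product of its probabilities and its consequence, and deterrent effect is the drop in that sum when a proposed measure changes the probabilities.

```python
# Each scenario: (P(attempt), P(success | attempt), consequence in $M). Hypothetical values.
scenarios = [
    (0.10, 0.30, 500),
    (0.05, 0.60, 200),
    (0.02, 0.80, 2000),
]

def risk(scenarios):
    """Risk as a sum of products over scenarios."""
    return sum(p_attempt * p_success * consequence
               for p_attempt, p_success, consequence in scenarios)

baseline = risk(scenarios)

# A proposed deterrent assumed to halve success probability in the first two scenarios
with_measure = risk([(0.10, 0.15, 500), (0.05, 0.30, 200), (0.02, 0.80, 2000)])

print(f"Baseline risk:     ${baseline:.1f}M expected loss")
print(f"With measure:      ${with_measure:.1f}M expected loss")
print(f"Deterrent effect:  ${baseline - with_measure:.1f}M")
```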

Applying the Value of Information to IT Project Selection
Ann Dunkin, County of Santa Clara

This presentation will discuss using the Value of Information to assist in the selection of IT projects. We will discuss how the Value of Information can help organizations prioritize technology projects, using the probability and value of each outcome to analyze whether a project is likely to deliver sufficient value for a positive return on investment. We will also review the application of the Value of Information to a specific public sector IT project and its placement in the overall portfolio.

Measuring Model Uncertainty: Applications in Pricing Optimization and Wildfire Risks
Farshad Miraftab, PagerDuty

Farshad Miraftab will provide two examples of how Python machine learning and Bayesian methods can be integrated into SIPs to create a more robust probabilistic modeling workflow.

1. Pricing Optimization - A Bayesian Approach
Identifying the right price for a specific product or service is a common business challenge, and estimating price elasticity is a hard economic one. This presentation will discuss using Bayesian statistical models to account for the uncertainty in the price elasticity estimate, even in cases where little data exists. It will demonstrate not only how to estimate the parameters but also how to optimize the price to maximize average or 95th-percentile profit. The outputs of these models can be turned into SIP libraries that drive native Excel dashboards for management to explore further.
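
A minimal sketch of this kind of analysis (purely illustrative; the demand model, posterior, and cost figures are assumptions, not the presenter's): draw the elasticity from its posterior, compute a profit SIP at each candidate price, then pick the price that maximizes the mean or a chosen percentile of profit.

```python
import numpy as np

rng = np.random.default_rng(4)
trials = 10_000

# Assumed posterior over price elasticity (e.g., from a Bayesian regression)
elasticity = rng.normal(-1.8, 0.4, trials)

base_price, base_demand, unit_cost = 20.0, 1000.0, 8.0   # illustrative constants

def profit_sip(price):
    """Profit across all posterior trials at a given price (a SIP of profit)."""
    demand = base_demand * (price / base_price) ** elasticity
    return (price - unit_cost) * demand

candidate_prices = np.arange(10.0, 80.0, 0.5)
profits = np.array([profit_sip(p) for p in candidate_prices])

best_mean = candidate_prices[profits.mean(axis=1).argmax()]
best_p95  = candidate_prices[np.quantile(profits, 0.95, axis=1).argmax()]

print(f"Price maximizing average profit:         {best_mean:.2f}")
print(f"Price maximizing 95th-percentile profit: {best_p95:.2f}")
```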

2. Predicting Forest Fires using Machine Learning 
The recent tragic California wildfires have resulted in catastrophic safety and reliability impacts.  This presentation will demonstrate how historical data can be used to train machine learning algorithms to predict likelihoods of future fires in specific conditions. The machine learning predictions output probabilities which in turn can be used to simulate uncertainty in fire size.  These simulations, when performed in SIPs, can then be aggregated to reflect total risk across a collection of assets. 
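
A rough sketch of that last step (illustrative only; the ignition probabilities and fire-size distribution are made up, not outputs of the talk's models): each asset's predicted ignition probability drives Bernoulli trials, fire sizes are drawn where fires occur, and the per-asset SIPs are summed trial by trial into a total-risk SIP.

```python
import numpy as np

rng = np.random.default_rng(5)
trials = 10_000

# Assumed per-asset ignition probabilities, as would come from an ML classifier
p_fire = np.array([0.02, 0.07, 0.01, 0.12, 0.04])

# One SIP of loss (in $M) per asset: Bernoulli ignition times a lognormal fire size
ignites = rng.random((trials, len(p_fire))) < p_fire
fire_size = rng.lognormal(mean=np.log(5.0), sigma=1.0, size=(trials, len(p_fire)))
asset_loss = ignites * fire_size

# Trial-by-trial aggregation across assets yields the portfolio-level risk SIP
total_loss = asset_loss.sum(axis=1)

print(f"Expected annual loss:            ${total_loss.mean():.2f}M")
print(f"Chance of exceeding $20M losses: {(total_loss > 20).mean():.1%}")
```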

3. Bonus
How SIPmath could have identified the risk behind the 1986 Challenger explosion (time permitting).

Efficient Calculation of EVI with Monte Carlo
Lonnie Chrisman, Lumina Decision Systems, Inc.

Presentation developed with Max Henrion, Lumina Decision Systems. 

The expected value of information (EVI) has great advantages over other methods of sensitivity analysis for discovering which assumptions and uncertainties really matter -- and for guiding further data gathering and modeling. But it has its challenges. It requires a complete decision analysis, including explicit decision variables, an objective or value function, and probabilistic treatment of uncertainty. Some have argued that it is computationally intractable to compute EVI with Monte Carlo, and/or that existing methods are crude approximations because they don't properly handle the value of incomplete information. We'll demonstrate an approach to calculating EVI that is tractable, with a practical way to model sources of partial information.
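
A bare-bones Monte Carlo sketch of the quantities involved (a simplified illustration under assumed distributions and payoffs, not the presenters' method): EVPI compares deciding after each simulated scenario is revealed with the best decision made in advance, and the value of partial information about one variable can be approximated by conditioning the decision on that variable alone.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 200_000

# Assumed uncertainties: demand and unit margin (illustrative distributions)
demand = rng.lognormal(np.log(1000), 0.5, n)
margin = rng.normal(12.0, 4.0, n)

capacity   = {"small plant": 800.0, "large plant": 2000.0}     # decision alternatives
fixed_cost = {"small plant": 4_000.0, "large plant": 12_000.0}

def payoff(d):
    return np.minimum(demand, capacity[d]) * margin - fixed_cost[d]

payoffs = np.stack([payoff(d) for d in capacity])   # shape (decisions, trials)

best_without_info = payoffs.mean(axis=1).max()       # commit now to one design
best_with_perfect = payoffs.max(axis=0).mean()       # decide after seeing each scenario
evpi = best_with_perfect - best_without_info

# Value of partial information: learn only demand (binned), then pick the best design per bin
bins = np.quantile(demand, np.linspace(0, 1, 21))
bin_id = np.clip(np.digitize(demand, bins) - 1, 0, 19)
cond_best = sum(payoffs[:, bin_id == b].mean(axis=1).max() * (bin_id == b).mean()
                for b in range(20))
evppi_demand = cond_best - best_without_info

print(f"EVPI:                {evpi:,.0f}")
print(f"EVPPI (demand only): {evppi_demand:,.0f}")
```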

Uncovering Value in Valves: Quantifying Monetary Consequence in Valves
Craig Daly, Pure Technologies

Valve programs receive less attention in water utilities, as water mains take the limelight in capital improvement programs (CIPs). Water main breaks are monitored against service metrics, and preventing them takes priority over valve replacement or repair in order to avoid negative level-of-service impacts. Although valves are cheaper to repair or rehabilitate, many utilities don't have cohesive valve programs. A method for valve prioritization is presented using a quantified consequence cost, which estimates the impact of having multiple valves out of service by using distributions of the financial loss incurred when valves are inoperable. This prioritization method is data-driven, resulting in continuous-scale distributions for each asset rather than the single scores typical of factor-based approaches. A monetized consequence model for valves is built by assigning a dollar value to each impact on the level of service. Results reveal that the monetary consequences of valves are significant and comparable to those for pipes. However, because valve repair and rehabilitation costs are lower, valves offer a higher return on investment than pipes and represent a good investment in a water network. Results of a case study will be presented.

Modeling and Measuring Readiness: How Ready are We?
Shaun Doheney, JDSAT and ProbabilityManagement.org

Presentation developed by LtCol Shaun W. Doheney (USMC (ret)), LCDR Connor S. McLemore (USN), and Dr. Sam L. Savage.

In “Measuring Military Readiness” published in the December issue of ORMS Today, we proposed a probabilistic framework allowing planners, commanders and decision-makers to speak the same language when describing “how ready for what” their organizations are. We explain that the platform-agnostic discipline of probability management represents uncertainty as arrays of auditable data called SIPs, and that SIPs based on available system-level readiness data associated with individual assets such as tanks or aircraft can be rolled up through array addition for an improved military readiness representation framework. Specific examples in various military and national security contexts were provided in “Operational Readiness Rollup” published in September’s issue of Phalanx – The Magazine of National Security Analysis.

In this presentation, we discuss the development of a stochastic library that drives a decision dashboard for a hypothetical carrier air wing (all the aircraft on a single aircraft carrier) using the previously proposed readiness framework. We provide working prototypes of models to demonstrate some of the underlying principles of this new readiness representation approach, including our most recent modeling effort, which demonstrates a useful, logically consistent, and actionable method for calculating Carrier Air Wing readiness as described in our article in the December issue of Phalanx.
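
A minimal sketch of the rollup idea (illustrative only; the squadron sizes and availability rates are invented, not from the articles): each aircraft's readiness across simulated trials is a SIP of zeros and ones, and adding the SIPs trial by trial yields the distribution of ready aircraft for the air wing, from which "how ready for what" questions can be answered.

```python
import numpy as np

rng = np.random.default_rng(7)
trials = 10_000

# Assumed aircraft counts and per-aircraft availability for a hypothetical air wing
squadrons = {"strike fighters": (44, 0.75), "early warning": (4, 0.80),
             "electronic attack": (5, 0.70), "helicopters": (19, 0.85)}

# One 0/1 readiness SIP per aircraft; array addition rolls them up to the wing level
ready_total = np.zeros(trials)
for count, availability in squadrons.values():
    ready_total += (rng.random((trials, count)) < availability).sum(axis=1)

# "How ready for what": chance of fielding at least N mission-capable aircraft
for need in (45, 50, 55, 60):
    print(f"P(at least {need} aircraft ready) = {(ready_total >= need).mean():.0%}")
```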

The Universal “Chance of Whatever” Button
Sam Savage, ProbabilityManagement.org

Technology has developed to the point that in theory any manager should be able to estimate the chances of meeting their business objectives. We will demonstrate a JavaScript application that accesses SIP Libraries in the cloud and simulates the chance of whatever.
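
The underlying calculation is simple enough to sketch (the demo itself is a JavaScript application; this is a hypothetical Python stand-in with a fabricated SIP rather than one pulled from a cloud library): take a SIP of the output of interest and count the fraction of trials that meet the objective.

```python
import numpy as np

# In the demo, SIPs come from a cloud SIP library; here we fabricate one for illustration.
rng = np.random.default_rng(8)
net_income_sip = rng.normal(1.2e6, 0.8e6, 10_000)   # assumed SIP of next year's net income

def chance_of_whatever(sip, condition):
    """Fraction of trials in the SIP that satisfy the stated objective."""
    return condition(sip).mean()

print(f"Chance of positive income: {chance_of_whatever(net_income_sip, lambda x: x > 0):.0%}")
print(f"Chance of exceeding $2M:   {chance_of_whatever(net_income_sip, lambda x: x > 2e6):.0%}")
```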
