Uncertainty Quantification/Verification & Validation Seminar Series (Internal at Sandia)

This seminar series is sponsored by the ASC Uncertainty Quantification Methods Development project. The purpose of these seminars is to foster communication between researchers and application teams in the areas of UQ/V&V, make people aware of available tools and resources, provide a forum for creative discussion about ways to approach problems, identify gaps in our current practices, present current work, and discuss how to implement the QMU (quantification of margins and uncertainties) mandate more broadly.

Seminar 12

Title: Impact of Coding Mistakes on Numerical Error and Uncertainty in Solutions to PDEs

Authors: Patrick M. Knupp, Curtis C. Ober*, Ryan B. Bond

We investigated one source of uncertainty, the numerical error (NE), which is the difference between the numerical solution and the exact solution to the PDE. NE arises from four sources of error within a numerical calculation: (1) discretization error (DE), (2) roundoff error (RE), (3) incomplete iterative convergence error (IICE), and (4) implementation correctness error (ICE). ICE arises from the presence of coding mistakes (bugs) that prevent the correct numerical solution from being computed.

The main purpose of this study was to gain insight into the magnitude and effects of coding mistakes (ICE) on the numerical error, sensitivities, and uncertainties in solutions to PDEs. A simple 1D PDE was used in the investigation to circumvent the difficulties of using large, complex applications, to make use of an exact solution, and to better relate the impact of ICE to the quantities of interest. Using simple 'typo'-type mistakes, this study illustrates many common problems caused by coding mistakes and how they affect the numerical error and uncertainty. From solutions that blow up to solutions that converge to an incorrect answer, the simple model problem demonstrates how insidious some coding mistakes can be, raising the concern that the effects may be even worse in our complex-physics applications.
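To make these failure modes concrete, here is a minimal sketch (our illustration, not the authors' code) of the kind of experiment described above: an explicit finite-difference solver for a 1D heat equation with a deliberate 'typo' in the discrete Laplacian. The buggy version neither aborts nor blows up; it converges cleanly to the wrong answer.

    import numpy as np

    def solve_heat(nx, bug=False):
        """Explicit FTCS solve of u_t = u_xx on [0, pi], u(x, 0) = sin(x),
        u = 0 at both ends; exact solution u(x, t) = exp(-t) * sin(x)."""
        x = np.linspace(0.0, np.pi, nx + 1)
        dx = x[1] - x[0]
        dt = 0.25 * dx**2                    # stable explicit time step
        u = np.sin(x)
        t = 0.0
        while t < 0.5:
            denom = dx if bug else dx**2     # the 'typo': dx instead of dx**2
            u[1:-1] += dt * (u[2:] - 2.0 * u[1:-1] + u[:-2]) / denom
            t += dt
        return x, u, t

    for nx in (16, 32, 64, 128):
        x, u, t = solve_heat(nx)             # set bug=True to inject the ICE
        err = np.max(np.abs(u - np.exp(-t) * np.sin(x)))
        print(nx, err)                       # correct code: error drops ~4x per halving

With bug=True the effective equation becomes u_t = dx * u_xx, so grid refinement drives the solution toward the initial condition: the observed error settles at a nonzero value determined by the bug rather than by DE, RE, or IICE.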

Speaker: Curtis Ober, Dept. 1433

Date/Time: Tuesday, September 16, 2:00-3:00 (NM), 1:00-2:00 (CA)

Location: Building 899 (JCEL) room 1811 (Sandia NM), Building 915, Room S101 (CA)

Seminar 11

Title: Predictive Model Validation for System Risk Management

Failures of engineering systems (e.g., vehicles, aircraft) lead to significant reliability and maintenance costs ($200 billion each year in US industry) and to human fatalities, as in the Ford Explorer rollovers (1998-2000) and the explosion of the space shuttle Challenger (1986). One of the greatest challenges in engineering systems design is eliminating the risk of product systems before the design is produced. This talk presents a predictive model validation approach to system risk management. Hierarchical model validation (HMV) is developed for validating predictive system computer models. To make the validation systematic and affordable, HMV is composed of two steps: (1) top-down validation planning and (2) bottom-up validation execution. HMV has been demonstrated using two practical engineering examples: a cellular phone and a tire. Once a computer model for an engineering system is validated, the remaining challenge in system risk management is system reliability analysis. A complementary interaction method (CIM) is proposed to formulate system reliability explicitly. For its numerical solvers, two sensitivity-free methods are proposed for various engineering applications: (1) eigenvector dimension reduction (EDR) and (2) dimension-reduction polynomial chaos expansion (DR-PCE). The DR-PCE method is more desirable for highly nonlinear problems, while EDR is preferable otherwise. Several engineering applications will be used to demonstrate the feasibility of the proposed approaches to system risk management.
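As a point of reference for the "sensitivity-free" solvers mentioned above, the sketch below (a generic baseline assumed for this write-up, not the CIM/EDR/DR-PCE methods of the talk) estimates the failure probability of a simple two-component series system by plain Monte Carlo, the brute-force standard such methods aim to outperform.

    import numpy as np

    # Hypothetical two-component series system: it fails if either limit state
    # is violated. Inputs and limit states are invented for illustration.
    rng = np.random.default_rng(0)
    N = 1_000_000
    x1 = rng.normal(5.0, 1.0, N)       # uncertain load-type input
    x2 = rng.normal(3.0, 0.5, N)       # uncertain capacity-type input

    g1 = 8.0 - x1                      # limit state 1: failure when g1 < 0
    g2 = x2 - 1.5                      # limit state 2: failure when g2 < 0
    fail = (g1 < 0) | (g2 < 0)         # series system: any failure fails the system
    pf = fail.mean()
    se = np.sqrt(pf * (1.0 - pf) / N)  # standard error of the estimate
    print(f"P(failure) ~ {pf:.4f} +/- {2 * se:.4f}")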

Speaker: Byeng Dong Youn, University of Maryland

Date/Time: Thursday, June 26, 10:00-11:00 (NM), 9:00-10:00 (CA)

Location: CSRI Room 90 (Sandia NM), Building 915, Room N153 (CA)

Seminar 10

Title: A Fresh Look at Mesh Refinement

A pressing issue for verification of computational physics and engineering codes is how to estimate discretization error effects when our calculations are not in the asymptotic region of convergence, which is often the case. The talk will present some new possibilities for addressing this issue. The presenter, François Hemez of the Verification Methods Group at LANL, will also briefly mention some other areas of current research and tool development in his group. An open group discussion with François will take place from 1:30-2:30 in Room 1811. The abstract of the talk is given below.

This presentation takes a fresh look at the concept of mesh refinement for spatial or time-varying curves simulated by a computational physics or engineering code. Mesh refinement is used to study the rate at which truncation error converges, where truncation error is the difference between discrete solutions obtained by a code with a given level of mesh or grid refinement, Δx, and the (unknown) solution of the continuous partial differential equations.
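As background (standard verification practice, stated here for orientation rather than taken from the talk): given solutions f_1, f_2, f_3 on fine, medium, and coarse grids related by a constant refinement ratio r, the observed rate of convergence can be estimated without knowing the exact solution as

    p = ln( |f_3 - f_2| / |f_2 - f_1| ) / ln(r)

It is precisely this estimate that becomes unreliable outside the asymptotic region, which motivates the decomposition described next.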

The idea from which our contribution originates is that discrete solution curves computed by a code can be decomposed on a basis of independent empirical functions. Our contention is that these functions define specific “resolution scales” and that these scales converge at different rates as Δx → 0. It may, therefore, make more sense to study the asymptotic convergence of these individual resolution scales. A technique based on principal component decomposition is developed to observe the resolution scales that contribute to discrete solutions. A theorem demonstrates that the asymptotic convergence of an entire curve is equivalent to the convergence of its decomposition. The theorem yields a bounded estimate of the rate of convergence for entire spatial or time-varying curves. These ideas are applied to simulations performed with a finite element code in the case of a Hertz contact problem where the exact solution is unknown.
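A minimal numerical sketch of the idea (a synthetic illustration with made-up error terms, not Hemez's implementation): stack solution curves from several mesh sizes, extract empirical modes with an SVD, and fit a convergence rate to each mode's coefficients separately.

    import numpy as np

    x = np.linspace(0.0, 1.0, 201)               # common evaluation grid
    hs = np.array([0.1, 0.05, 0.025, 0.0125])    # mesh sizes, finest last
    exact = np.sin(2 * np.pi * x)
    # Synthetic "discrete solutions": exact curve plus two error terms that decay
    # at different rates (second and first order) with distinct spatial shapes.
    curves = np.array([exact + h**2 * np.cos(2 * np.pi * x)
                             + 0.1 * h * np.sin(6 * np.pi * x) for h in hs])

    dev = curves[:-1] - curves[-1]               # deviations from the finest curve
    U, S, Vt = np.linalg.svd(dev, full_matrices=False)
    coef = dev @ Vt.T                            # coefficients on each empirical mode

    # The log-log slope of |coefficient| vs. h estimates a per-mode rate, i.e., a
    # rate of convergence for each "resolution scale" separately.
    for k in range(2):
        p = np.polyfit(np.log(hs[:-1]), np.log(np.abs(coef[:, k])), 1)[0]
        print(f"mode {k}: observed rate ~ {p:.2f}")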

Speaker: François Hemez, Los Alamos National Laboratory

Date/Time: Tuesday, May 20, 11:00-12:00 (NM), 10:00-11:00 (CA)

Location: JCEL, Building 899, Room 1811 (Sandia NM), Building 915, Room S145 (CA)

Seminar 9

Title: Verification of the Calore Thermal Analysis Code

Calore is the ASC code developed to model steady and transient thermal diffusion with chemistry and dynamic enclosure radiation. An integral part of the software development process is code verification, which addresses the question “Are we correctly solving the model equations?” This process aids the developers by identifying potential software bugs, and it gives the thermal analyst confidence that a properly prepared input will produce satisfactory output. Grid refinement studies have been performed on problems for which we have analytical solutions. In this talk, the code verification process is reviewed and recent results are presented. Recent verification studies have focused on transient nonlinear heat conduction and on verifying algorithms associated with (tied) contact and adaptive mesh refinement. In addition, an approach to measuring the coverage of the verification test suite relative to intended code applications is discussed.
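As a generic illustration of such a grid refinement study (assumed numbers, not Calore results), the observed order of accuracy follows directly from error norms measured against the analytical solution:

    import numpy as np

    h = np.array([0.1, 0.05, 0.025, 0.0125])        # mesh sizes (assumed)
    e = np.array([4.1e-3, 1.0e-3, 2.6e-4, 6.4e-5])  # error norms vs. exact solution
    p = np.log(e[:-1] / e[1:]) / np.log(h[:-1] / h[1:])
    print(p)   # should approach the formal order of the scheme (here ~2)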

Speaker: Kevin Dowding, Dept. 1544

Date/Time: Thursday, April 24, 2:00-3:00 (NM), 1:00-2:00 (CA)

Location: CSRI Room 90 (Sandia NM), Building 915, Room S107 (CA)

Note: The Computer Science Research Institute (CSRI) is located in the Sandia Research and Technology Park, at 1450 Innovation Pkwy. For more information visit: http://www.cs.sandia.gov/CSRI

Seminar 8

Title: Kriging: The Cadillac of Nonlinear Response-Surface Methodologies

Sandia National Laboratories has recently licensed from General Motors a proprietary software package, called the “Kriging Wizard,” that can create experimental designs for computer experiments, fit Kriging response surfaces to the resulting data, and facilitate design exploration with these response surfaces. Given that Kriging is only one of many methods for fitting an approximation to input-output data, it is quite natural to ask what makes Kriging special and why anyone should bother to learn about this new method and the associated software. Answering this question will be the focus of my talk.

As we will see, the most distinctive feature of Kriging is that it has a statistical foundation that allows one to derive meaningful confidence intervals around predicted values. As I will explain, one can not only put confidence intervals around predictions at a single point, but one can also put confidence intervals around quantities computed over many points, such as a “probability of failure” estimated using a Monte Carlo study on the surface.
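A minimal sketch of that statistical foundation (ordinary zero-mean Gaussian-process regression in NumPy with an assumed Gaussian correlation function; not the Kriging Wizard itself): the same algebra that yields the Kriging predictor also yields a pointwise standard error, and hence a confidence interval.

    import numpy as np

    def kernel(a, b, length=0.2):
        # Gaussian (squared-exponential) correlation with unit process variance
        return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

    rng = np.random.default_rng(0)
    X = np.sort(rng.uniform(0.0, 1.0, 8))        # training runs of the computer model
    y = np.sin(2 * np.pi * X)                    # deterministic code output
    Xs = np.linspace(0.0, 1.0, 5)                # prediction points

    K = kernel(X, X) + 1e-10 * np.eye(len(X))    # tiny jitter for numerical stability
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = kernel(X, Xs)
    mean = Ks.T @ alpha                          # Kriging predictor
    v = np.linalg.solve(L, Ks)
    sd = np.sqrt(np.clip(1.0 - np.sum(v**2, axis=0), 0.0, None))

    for xs, m, s in zip(Xs, mean, sd):
        print(f"x={xs:.2f}: prediction {m:+.3f}, 95% CI +/- {1.96 * s:.3f}")

At the training points the interval collapses to zero, as it should for a deterministic computer experiment; between points it widens to reflect interpolation uncertainty.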

Another advantage of Kriging is that it has properties that facilitate performing functional analysis of variance (“functional ANOVA”). Like Monte Carlo analysis, functional ANOVA can assume distributions for inputs that are “noise factors” and determine how this variation propagates to variation in the output. Unlike Monte Carlo, however, functional ANOVA can also determine how much of this output variation is due to different noise factors, thereby allowing one to determine which noise factor is most important. If the inputs are control parameters that are assumed to be uniform between a lower and upper design limit, then functional ANOVA can identify which control variable is most important to focus on in order to improve the design.
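The main-effect computation can be sketched as follows (plain Monte Carlo with the standard Sobol'/Saltelli estimator on a cheap stand-in function; in practice one would evaluate the fitted Kriging surface instead, and all names here are assumptions of this write-up):

    import numpy as np

    rng = np.random.default_rng(1)
    g = lambda x1, x2: x1 + 0.5 * x2**2      # stand-in for a fitted response surface
    N = 100_000
    A1, A2 = rng.uniform(-1, 1, (2, N))      # two independent "noise factor" samples
    B1, B2 = rng.uniform(-1, 1, (2, N))

    yA = g(A1, A2)
    total_var = yA.var()
    # First-order index of x_i: correlate f(A) with the change from swapping in
    # column i of A (the Saltelli estimator of a Sobol' main-effect index).
    S1 = (yA * (g(A1, B2) - g(B1, B2))).mean() / total_var
    S2 = (yA * (g(B1, A2) - g(B1, B2))).mean() / total_var
    print(S1, S2)                            # fraction of output variance per input

For this test function the indices come out near 15/16 and 1/16, correctly flagging x1 as the dominant noise factor.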

Speaker: Don Jones, General Motors Technical Fellow

Date/Time: Monday, March 17, 9:30-10:30 (NM), 8:30-9:30 (CA)

Location: 823 Breezeway (Sandia NM), Building 915, Room S107 (CA)

Seminar 7

Title: Bootstrap methods for sensitivity analysis of computer models

The understanding of many physical and engineering phenomena of interest involves running complex computational models (computer codes). With any problem of this type it is important to understand the relationships between the input variables (whose values are often imprecisely known) and the output. The goal of sensitivity analysis (SA) is to study this relationship and identify the most significant factors or variables affecting the results of the model.

In this presentation we suggest an improvement on existing methods for SA of complex computer models when the model is too computationally expensive for a standard Monte Carlo analysis. In these situations a meta-model or surrogate model can be used to estimate the necessary sensitivity index for each input. A sensitivity index is a measure of the variance in the response that is due to the input. Existing approaches to this problem either do not work well with a large number of input variables or do not satisfactorily deal with estimation error. Here we propose a new approach to variance index estimation that addresses both drawbacks. The approach uses stepwise regression as well as bootstrap methods to generate confidence intervals on the sensitivity indices. Several nonparametric regression procedures, such as locally weighted polynomial regression (LOESS), additive models (GAMs), projection pursuit, and recursive partitioning, are considered, as well as metamodels such as multivariate adaptive regression splines (MARS), random forests, and gradient boosting. An approach for calculating statistical properties of the bootstrap estimator will also be discussed. Several examples will illustrate the utility of this approach in practice.
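A hedged sketch of the core loop (a plain least-squares surrogate stands in for the stepwise and nonparametric fits discussed in the talk; data and names are invented for illustration):

    import numpy as np

    rng = np.random.default_rng(2)
    n = 200
    X = rng.normal(size=(n, 3))                                # imprecisely known inputs
    y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, n)  # "expensive model" output

    def s1_index(X, y):
        # Variance-based index of x1 from a linear surrogate: the squared
        # standardized regression coefficient, i.e., x1's share of Var(y).
        beta, *_ = np.linalg.lstsq(np.c_[np.ones(len(y)), X], y, rcond=None)
        return (beta[1] * X[:, 0].std()) ** 2 / y.var()

    boot = []
    for _ in range(1000):
        idx = rng.integers(0, n, n)              # resample model runs with replacement
        boot.append(s1_index(X[idx], y[idx]))
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"S1 ~ {s1_index(X, y):.2f}, 95% bootstrap CI [{lo:.2f}, {hi:.2f}]")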

Speaker: Curtis Storlie, UNM Dept. of Mathematics and Statistics

Date/Time: Thursday, March 6, 2:00-3:00 (NM), 1:00-2:00 (CA)

Location: Computer Science Research Institute, room 90 (Sandia NM), Building 915, Room 107 (CA)

Note: The Computer Science Research Institute (CSRI) is located in the Sandia Research and Technology Park, at 1450 Innovation Pkwy. For more information visit: http://www.cs.sandia.gov/CSRI

Seminar 6

Title: Probabilistic Analysis Method for Quantifying Weapon System Safety in Mechanical Environments

Stronglinks, barriers, and weaklinks are safety-critical components and elements that provide assured safety. Stronglinks and barriers protect the system from low-energy insults through their strength. However, at some environmental energy level these components will fail, and weaklinks, which control the process of system failure, are used to ensure safety when stronglink performance is no longer assured. Unfortunately, there are no recognized mechanical weaklinks in any of our stockpile systems, even though there are a number of credible mechanical insults that could cause a loss of assured safety. In this presentation we will discuss a system-level analysis capability to determine the statistical design constraints of a mechanical weaklink. We begin by formulating a theory of safety for mechanical environments based on sets of environments whose insults are protected against by a set of safety-critical components. The goal of this work is to build a safety cover in which the union of all of these sets is the set of all credible environments. Membership in the sets is determined through statistical analysis. Weaklink design criteria will be determined by covering the set of environments not covered by stronglinks. Safety margins will be given. An example using W76-1 dynamics will be presented.

Speaker: Jeffrey Dohner, Dept. 12347

Date/Time: Monday, February 11th, 1:00-2:00 (NM)

Location: Building 836, room 104A

NOTE: This presentation will be limited to Q badges only and will not be videoconferenced to CA.

Seminar 5

Title: (TBD)

Speaker: Scott Ferson, Applied Biomathematics

Date/Time: Wednesday, January 23, 2:00-3:00 (NM), 1:00-2:00 (CA)

Location: CSRI (Computer Science Research Institute-Research Park), Room 90 (Sandia NM), Building 916, Room 101 (CA)

Seminar 4

Title: Uncertainty - A Metrologist’s View

Uncertainty is often (perhaps always) a fundamental aspect of the human world view rather than of the world itself. As such, the meaning of uncertainty is inseparable from the problem at hand. In particular, the views on uncertainty differ between the modeling and simulation community and the metrology community. The metrology perspective is formalized in the ISO “Guide to the Expression of Uncertainty in Measurement”. While this document certainly has its limitations, the method it describes works very well in practice when applied to many measurement uncertainty problems. I would like to discuss measurement, uncertainty, the ISO “Guide”, and beyond.
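For context (the GUM's law of propagation of uncertainty, quoted here as background rather than taken from the abstract): for a measurand y = f(x_1, ..., x_N) with uncorrelated input estimates, the combined standard uncertainty is

    u_c^2(y) = sum_{i=1..N} (df/dx_i)^2 * u^2(x_i)

with the sensitivity coefficients df/dx_i evaluated at the input estimates; correlated inputs contribute additional covariance terms.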

Speaker: Harold Parks, Dept. 2542

Date/Time: Thursday, January 17th, 3:00-4:00 (NM), 2:00-3:00 (CA)

Location: Building 899, 1811 (Sandia NM), Building 915, Room W133 (CA)

Seminar 3

Title: VALMET: A Tool for Computing Validation Metrics

A validation metric is a quantitative measure of agreement between physical reality, as measured by a collection of experiments, and computational predictions. A set of validation metrics based on statistical confidence intervals, and applicable to validation of deterministic models, is described in Oberkampf and Barone (JCP, 217:5-36, 2006, also available as SAND2005-4302). Recently, the VALMET Matlab code was developed to calculate this set of metrics for one-dimensional data sets. This talk will describe VALMET, its features, how its output relates to the described metrics, and how it can be used for practical validation studies. A demonstration of the code on several sample problems will be performed.
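In the spirit of those confidence-interval metrics, a back-of-the-envelope sketch (our reconstruction following Oberkampf and Barone's framework, not VALMET itself): the estimated model error at a single measurement condition, with a t-based interval reflecting the finite number of experimental replicates.

    import numpy as np
    from scipy import stats

    y_model = 3.10                                 # deterministic model prediction
    y_exp = np.array([2.9, 3.3, 3.0, 3.2, 3.1])    # replicate measurements (invented)
    n = len(y_exp)
    err = y_model - y_exp.mean()                   # estimated error in the prediction
    half = stats.t.ppf(0.95, n - 1) * y_exp.std(ddof=1) / np.sqrt(n)
    print(f"error = {err:+.3f}, 90% CI [{err - half:.3f}, {err + half:.3f}]")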

Speaker: Matthew Barone, Dept. 1515

Date/Time: Thursday, December 13th, 10:00-11:00a.m. MST

Location: Building 899 (JCEL), Room 1811 (NM), Building 915, Room S101 (CA)

Seminar 2

Title: Model Validation under Both Aleatory and Epistemic Uncertainty

We consider a general measure of validation assessment that can be used to characterize the disagreement between the quantitative predictions from a model and relevant empirical data when both predictions and data may contain aleatory and epistemic uncertainty. This validation assessment metric can characterize the mismatch between predictions expressed as probability distributions and any number of observations, and it has a variety of properties useful in engineering. This paper extends the metric for use in pooling observations from multiple system response quantities expressed in different units and dimensions, and in accounting for observations that are outside the range considered “possible” by the model. It explores the application of the metric when the predicted quantity is a scalar (real) value. In such cases, the prediction and observation may still have the forms of probability distributions, which represent measurement uncertainty rather than intrinsic variability of the quantity. The metric is also generalized to the case in which predictions or data contain epistemic uncertainty that cannot be well characterized by any single probability distribution. We suggest that when the uncertainties of the prediction and the observations overlap, the validation metric between them can be small or even zero, but this does not mean that the model’s predictive capability will necessarily be high. A model’s predictive capability is a function of the acknowledged imprecision of its predictions, and this imprecision should be appropriately inflated when the model’s performance is found to be poor when assessed by the validation metric. Thus, although acknowledging epistemic uncertainty in either predictions or observations tends to lower the apparent discrepancy between theory and data, and hence to yield a smaller value of the validation assessment metric, that uncertainty propagates through any extrapolation of the model and expresses itself as lower precision in the model’s predictive capability.
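One concrete metric consistent with this description (an illustrative assumption, not necessarily the authors' exact formulation) is the area between the model's predictive CDF and the empirical CDF of the observations, which is zero only when the two distributions agree everywhere:

    import numpy as np
    from scipy import stats

    obs = np.array([1.8, 2.4, 2.1, 2.6, 1.9])            # observations (invented)
    grid = np.linspace(0.0, 5.0, 2001)
    F_model = stats.norm.cdf(grid, loc=2.0, scale=0.3)   # predictive distribution
    F_emp = np.searchsorted(np.sort(obs), grid, side="right") / len(obs)
    area = np.trapz(np.abs(F_model - F_emp), grid)       # area validation metric
    print(f"area metric = {area:.3f}")

When epistemic uncertainty is represented as a band of CDFs (a p-box) and the metric is taken as the distance to the nearest member of the band, widening the band can only shrink the metric, which is exactly the trade-off the abstract warns about: a smaller metric value bought with lower predictive precision.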

Speaker: Dr. William Oberkampf, Dept. 1544

Date/Time: Friday, November 30th, 10:00-11:00a.m.

Location: Building 823/Breezeway (enter through the lobby of 823)


Seminar 1

Title: Stockpile Assessment Study: QMU with electrical modeling and simulation

This talk presents recent results from a 2007 stockpile assessment QMU study based on electrical simulations and comparison to test data.

Speaker: Matthew Kerschen, Dept. 12346

Date/Time: Thursday, November 8th, 3:00-4:00 (NM), 2:00-3:00 (CA)

Location: Building 899, 1811 (Sandia NM), Building 915, Room S145 (CA)

This was an OUO presentation. Contact Laura Swiler or Matt Kerschen to obtain a copy.