Uncertainty Quantification/Verification & Validation Seminar Series (Internal at Sandia)

This seminar series is sponsored by the ASC Uncertainty Quantification Methods Development project. The purpose of these seminars is to foster communication between researchers and application teams in the areas of UQ/V&V, make people aware of available tools and resources, provide a forum for creative discussion about ways to approach problems, identify gaps in our current practices, present current work, and discuss how to implement the QMU (quantification of margins and uncertainties) mandate more broadly.

Previous Seminars

1–6, 7–11, 12–16


Seminar 22

Title: A Rogue’s Gallery of V&V Practice

The purpose of this talk is quite simple: take a snapshot of V&V practice by examining the current state of the archival literature. The snapshot is not a survey, but rather a “random” sample of the sort of verification and validation one might encounter while reading the latest articles in top-notch journals. The methodology is to visit the latest issue (or two) of each journal and examine papers that use a computational approach integrally in their investigation. Each of these journals represents the apex of the research world, and successfully publishing in any of them would be a notable professional accomplishment (or look good at a performance review, a tenure review, to an employer, etc.). As such, these journals represent not just accepted practice, but “the best work available” in the field. Some of the selected journals have an editorial policy that includes standards for the numerical calculations necessary for publication. Other journals have an editorial policy on experimental or observational errors. Finally, many journals have no particular statement with regard to either. The conclusions that may be drawn from this study are stark: the state of V&V practice remains quite crude, and high standards in V&V are far from the norm.

Speaker: Bill Rider, Dept. 1431
Date/Time: Tuesday, Aug. 11, 2:00-3:30 (NM), 1:00-2:30 (CA)
Location: CSRI Room 90 (Sandia NM), Building 915, Room S101 (CA)

Seminar 21

Title: Kalman Filter Tutorial

The Kalman Filter is a classical filtering technique for state estimation of systems with linear dynamics and Gaussian noise (in both the observations and the dynamics). The 'filtering' problem can be viewed as an inverse problem for a random process over a state space evolving in time. Given incomplete, noisy observations of the process at different times, we wish to estimate the current state and/or make predictions of the future state. When the state evolution is linear and the underlying randomness is Gaussian, the Kalman Filter is the optimal (in an L^2 sense) filter; when the randomness is arbitrary, the Kalman Filter is the optimal affine predictor/estimator.

In this tutorial, a theoretical derivation of the Kalman filter will be presented with an emphasis on the use of the innovation sequence and Hilbert space projections. The Kalman filter is an attractive tool for many filtering problems due to its recursive nature and the fact that it produces confidence intervals for its estimates. Several examples will be presented that demonstrate the applicability of the Kalman Filter both in the linear/Gaussian setting and in other situations.
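
For readers who would like a concrete picture of the recursion described above, the sketch below is a minimal scalar (1-D) Kalman filter in Python. It is not drawn from the tutorial materials; the model parameters (a, h, q, r) and the example data are hypothetical and purely illustrative.

    # Minimal sketch of a scalar Kalman filter, assuming the linear/Gaussian model
    #   x_k = a * x_{k-1} + w_k,  w_k ~ N(0, q)   (process model)
    #   y_k = h * x_k + v_k,      v_k ~ N(0, r)   (observation model)
    # Illustrative only; parameter values and data are hypothetical.
    import numpy as np

    def kalman_filter_1d(ys, a=1.0, h=1.0, q=0.01, r=0.25, x0=0.0, p0=1.0):
        """Return filtered state means and variances for the observations ys."""
        x, p = x0, p0
        means, variances = [], []
        for y in ys:
            # Predict step: propagate the state estimate and its variance.
            x_pred = a * x
            p_pred = a * p * a + q
            # Update step: correct using the innovation (observation minus prediction).
            innovation = y - h * x_pred
            s = h * p_pred * h + r          # innovation variance
            k = p_pred * h / s              # Kalman gain
            x = x_pred + k * innovation
            p = (1.0 - k * h) * p_pred
            means.append(x)
            variances.append(p)             # supports +/- 2*sqrt(p) confidence bands
        return np.array(means), np.array(variances)

    # Example: track a slowly varying signal observed with noise.
    rng = np.random.default_rng(0)
    truth = np.cumsum(rng.normal(0.0, 0.1, size=50))
    obs = truth + rng.normal(0.0, 0.5, size=50)
    est, var = kalman_filter_1d(obs, q=0.01, r=0.25)

The recursion uses only the previous estimate and the current observation, which is the source of the method's efficiency, and the running variance p is what yields the confidence intervals mentioned above.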

This is the first part of a two-part tutorial. The next lecture will discuss nonlinear filtering and extensions of the Kalman filter such as ensemble Kalman filters.

NOTE: We video-recorded this tutorial, and the video stream is available at:
http://tiny.sandia.gov/uqvnv

Speaker: Nick West, Stanford University, ICME (Institute for Computational and Mathematical Engineering)
Date/Time: Wednesday, July 22, 10:00-11:30 (NM), 9:00-10:30 (CA)
Location: CSRI Room 90 (Sandia NM), Building 915, Room S145 (CA)

Seminar 20

Title: Sierra UQ: Nonintrusive Application of Sierra Codes to Flat Lapjoint

Recent compelling applications involving heterogeneous volume and surface fields have spurred renewed interest in our function-analytic approach to probabilistic analysis, due largely to its ready applicability to random fields as a basis for computational uncertainty quantification (UQ).

Code development efforts are underway to provide a means of performing UQ analyses in a non-intrusive context under the umbrella of the SIERRA framework. This framework will include a suite of tools specific to the UQ subprocess, as well as changes to some of the SIERRA codes, including SALINAS, PRESTO, ADAGIO, and ARIA.

I will first present some of the fundamental ideas in this functional approach. I will then discuss the basic algorithms for both the non-intrusive and intrusive approaches as applications of these ideas. Finally, I will present preliminary results achieved, in collaboration with Mike Starr (Dept. 1526), using this approach for a recent application involving surface irregularities in interface mechanics. In the UQ analysis, we used Dakota as the UQ strategy engine, one-off tools for generating random fields, and a modified version of ADAGIO for performing the physics calculations.
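
As background for what "non-intrusive" means in this context, the sketch below shows a generic sampling-based UQ loop around a black-box simulation. The function names, input distributions, and response quantity are hypothetical; this does not reproduce the actual Sierra/Dakota workflow described in the talk.

    # Schematic of a generic non-intrusive UQ loop: the simulation is treated as a
    # black box, run repeatedly on sampled realizations of the uncertain inputs,
    # and output statistics are computed afterward. Illustration only.
    import numpy as np

    def simulation(sample):
        """Stand-in for a black-box physics code driven by sampled inputs."""
        stiffness, friction = sample
        return stiffness * np.tanh(friction)   # placeholder response quantity

    rng = np.random.default_rng(42)
    n_samples = 200
    # Sampled realizations of the uncertain inputs (hypothetical distributions).
    samples = np.column_stack([
        rng.lognormal(mean=0.0, sigma=0.1, size=n_samples),   # stiffness
        rng.uniform(0.2, 0.6, size=n_samples),                 # friction coefficient
    ])

    responses = np.array([simulation(s) for s in samples])
    print("mean response:", responses.mean())
    print("std. deviation:", responses.std(ddof=1))

The appeal of the non-intrusive approach is visible here: the physics code is never modified, only wrapped and sampled, which is what allows an external engine such as Dakota to orchestrate the analysis.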

Speaker: John Red-Horse, Dept. 1544
Date/Time: Wednesday, July 8, 2:00-3:30(NM), 1:00-2:30(CA)
Location: Building 898 (WIF), Room 1446 (Sandia NM), Building 915 Room S107 (CA)

Seminar 19

Title: Frequency Domain Approach to Model Validation in the Presence of High Modal Density

High-performance space vehicles with extremely low levels of on-orbit vibration are required for many Air Force mission areas. Finite element models of these high-performance vehicles must be valid to a higher frequency range than is required for most other spacecraft. In the low-frequency range, the aerospace community has traditionally relied on modal-based methods for tasks such as sensor placement, model reduction, model correlation, and model updating. At higher frequencies, noise and errors in the test modes, combined with high modal density, produce coupling sensitivity between test and analysis mode shapes. This leads to the breakdown of modal-based correlation and updating techniques.

This work proposes new techniques for the model validation process that are based directly on the structure's frequency response. Principal component analysis of the analytical frequency response is used to aid in sensor placement and model reduction. Correlation metrics that compare the experimental frequency response directly to the reduced impedance matrix are used as validation criteria, as well as in the model updating cost functions. Use of the frequency response avoids the difficult tasks of selecting target modes, mode pairing, and mode extraction, all of which become more difficult in the presence of high modal density. The goal of this work is to provide an alternative to modal analysis for model validation that can be used beyond the low-frequency range.
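
For orientation only (this is not the speaker's implementation), the sketch below shows a FRAC-style correlation between an experimental and an analytical frequency response function, plus an SVD of the analytical FRF matrix as a simple stand-in for the principal component analysis used for model reduction and sensor placement. All data shapes and values are hypothetical.

    # Illustrative sketch: frequency-response correlation and SVD-based PCA.
    import numpy as np

    def frf_correlation(h_test, h_model):
        """FRAC-like correlation between two complex FRF vectors, in [0, 1]."""
        num = np.abs(np.vdot(h_test, h_model)) ** 2
        den = np.vdot(h_test, h_test).real * np.vdot(h_model, h_model).real
        return num / den

    rng = np.random.default_rng(0)
    # Analytical FRF matrix: rows = candidate sensor locations, columns = frequencies.
    H_model = rng.normal(size=(30, 400)) + 1j * rng.normal(size=(30, 400))

    # Principal components of the analytical response via the SVD; the leading
    # singular vectors indicate which response patterns carry most of the energy,
    # which can guide sensor placement and model reduction.
    U, s, Vh = np.linalg.svd(H_model, full_matrices=False)
    energy_fraction = (s ** 2) / np.sum(s ** 2)

    # Compare a "measured" FRF (here, the analytical one plus noise) to the model.
    h_exp = H_model[0] + 0.1 * (rng.normal(size=400) + 1j * rng.normal(size=400))
    print("FRF correlation at sensor 0:", frf_correlation(h_exp, H_model[0]))

Working directly with the frequency response in this way is what lets the approach bypass target-mode selection, mode pairing, and mode extraction.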

Bio:
Mr. Nimityongskul is currently a Space Scholar at the Air Force Research Laboratory, Space Vehicles Directorate, studying the effects of test and model uncertainty on the validation process. He earned his BS in 2004 and his MS in 2005, both in Engineering Mechanics and Astronautics, from the University of Wisconsin-Madison. He is pursuing his PhD, "A Frequency Domain Approach to Model Validation in the Presence of High Modal Density," at the University of Wisconsin-Madison under the supervision of Professor Daniel Kammer.

Speaker: Aaron Nimityongskul, Air Force Research Laboratory
Date/Time: Monday, June 29, 2:00-3:00(NM), 1:00-2:00(CA)
Location: CSRI, Room 90 (Sandia NM), Building 915 Room S145 (CA)

Seminar 18

Title: A New Interval-Based “Real Space” Approach to Model Validation Involving Aleatory and Epistemic Uncertainty

This talk will describe a pragmatic interval-based approach to model validation where significant aleatory and epistemic sources of uncertainty exist in the experiments and simulations. The validation comparison of experimental and simulation results, and corresponding criteria and procedures for model affirmation or refutation, take place in “real space” as opposed to “difference space” where subtractive differences between experiments and simulations are assessed. The versatile model validation framework handles difficulties associated with representing and aggregating aleatory and epistemic uncertainties from multiple correlated and uncorrelated source types, including:

  • experimental variability from multiple repeat experiments
  • uncertainty of experimental inputs
  • experimental output measurement uncertainties
  • uncertainties that arise in data processing and inference from raw simulation and experiment outputs
  • parameter and model-form uncertainties intrinsic to the model
  • numerical solution uncertainty from model discretization effects

Significantly, the framework and uncertainty processing machinery of the new model validation methodology can serve a dual use for model calibration under uncertainty. The talk will explain how the framework connects sub-scale model validation and calibration activities to hierarchical modeling efforts, such as QMU analysis. Recent applications in the QASPR and fire-modeling programs will be presented, along with several other application examples.
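
As a toy illustration of the "real space" idea (not the methodology from the talk), the snippet below compares experiment and simulation outcome intervals directly in the units of the predicted quantity, rather than forming a subtractive difference. The quantities, values, and uncertainty margins are hypothetical.

    # Toy illustration of an interval comparison in "real space": the experiment
    # interval and the simulation interval are built in the output units and
    # checked for overlap, instead of assessing a difference between them.
    def interval(values, margin=0.0):
        """Bounding interval of a set of outcomes, optionally padded by a margin."""
        lo, hi = min(values), max(values)
        return (lo - margin, hi + margin)

    def overlaps(a, b):
        """True if intervals a and b share any common range."""
        return a[0] <= b[1] and b[0] <= a[1]

    # Hypothetical peak-temperature outcomes (deg C) from repeat experiments and
    # from simulations spanning parameter and numerical-solution uncertainty.
    experiment = interval([612.0, 640.0, 655.0], margin=10.0)   # measurement uncertainty
    simulation = interval([598.0, 633.0], margin=15.0)          # model-form + discretization

    print("experiment interval:", experiment)
    print("simulation interval:", simulation)
    print("intervals overlap (model not refuted)?", overlaps(experiment, simulation))

In an actual analysis the intervals would aggregate the aleatory and epistemic sources itemized above; the snippet only shows the shape of a real-space comparison.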

Speaker: Vicente Romero (Dept. 1544)
Date/Time: Thursday, April 30, 1:00-2:00(NM), 12:00-1:00(CA)
Location: 823 Breezeway (Sandia NM), Building 915, Room S145 (CA)

Seminar 17

Title: A Framework for Analyzing Epistemic Uncertainty in Validation

Our basic goal in model validation is to assess the usefulness of a model or simulation for its intended purpose. But we know from the outset that all modeling is approximation, and thus that all simulations exhibit error. Further complicating our efforts is that, for a variety of reasons, we cannot rely on obtaining an exact description of a given physical scenario through experiment either, nor can we perform a sufficiently exhaustive experiment suite to achieve a full characterization of a given class of physical events. Finally, it is difficult, or impossible, to segregate the various sources of discrepancy between model-based simulations and experiments. All of the above contribute to essential uncertainty in the validation process, both inherent and epistemic.

In this discussion, I will present some ideas on building a mathematical framework for developing validation methods and algorithms that are capable of accommodating error and various forms of uncertainty. I also suggest the framework as a basis on which we might build to allow deeper exploration via the assessment process, one within which we can address issues such as: (1) the relevance of acquired data; (2) cost-benefit trade-offs for ascertaining where best to apportion resources; and (3) the development of better up-front tools for specifying accuracy requirements.

NOTE: Many people expressed interest in obtaining the slides from the Model Validation/UQ tutorial given on January 21st. The slides and video stream are available on the web site.

Speaker: John Red-Horse (Dept. 1544)
Date/Time: Wednesday, March 11, 2:00-3:00 (NM), 1:00-2:00 (CA)
Location: 899 (JCEL) room 1811 (Sandia NM), Building 915, Room S145 (CA)
