Research Papers

Optimal Test Selection for Prediction Uncertainty Reduction

Author and Article Information
Joshua Mullins

V&V, UQ, and Credibility Processes Department,
Sandia National Laboratories,
P.O. Box 5800, Mail Stop 0828,
Albuquerque, NM 87185-0828
e-mail: jmullin@sandia.gov

Sankaran Mahadevan

Department of Civil and
Environmental Engineering,
Vanderbilt University,
VU Station B #351831,
Nashville, TN 37235-1831
e-mail: sankaran.mahadevan@vanderbilt.edu

Angel Urbina

V&V, UQ, and Credibility Processes Department,
Sandia National Laboratories,
P.O. Box 5800, Mail Stop 0828,
Albuquerque, NM 87185-0828

Manuscript received October 6, 2015; final manuscript received October 27, 2016; published online December 2, 2016. Assoc. Editor: Scott Doebling.

J. Verif. Valid. Uncert. 1(4), 041002 (Dec 02, 2016) (10 pages); Paper No: VVUQ-15-1044; doi: 10.1115/1.4035204

Economic factors and experimental limitations often lead to sparse and/or imprecise data for the calibration and validation of computational models. This paper addresses resource allocation for calibration and validation experiments in order to maximize their effectiveness within given resource constraints. When observation data are used for model calibration, the quality of the inferred parameter descriptions is directly affected by the quality and quantity of the data. This paper characterizes parameter uncertainty within a probabilistic framework, which enables the uncertainty to be systematically reduced with additional data. The validation assessment is also uncertain in the presence of sparse and imprecise data; therefore, this paper proposes an approach for quantifying the resulting validation uncertainty. Since both calibration and validation uncertainty affect the prediction of interest, the proposed framework weighs the cost of additional data against its impact on the prediction uncertainty. Calibration and validation tests may often be performed at different input conditions, and this paper shows how the calibration and validation results from different conditions may be integrated into the prediction. A constrained discrete optimization formulation is then proposed to select the number of tests of each type (calibration or validation at given input conditions). The proposed test selection methodology is demonstrated on a microelectromechanical system (MEMS) example.
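
As a concrete illustration of the test selection formulation summarized above, the following sketch poses the resource allocation as a constrained discrete optimization: choose an integer number of replicates for each candidate test type (calibration or validation at a given input condition) so that a proxy for the prediction uncertainty is minimized without exceeding a budget. The test types, costs, and the diminishing-returns uncertainty proxy below are illustrative assumptions for the sketch only; in the paper, the objective would be obtained by propagating the calibrated parameter uncertainty and validation uncertainty through the computational model.

# Hypothetical sketch of the constrained discrete test-selection problem.
# Test types, costs, and the uncertainty proxy are illustrative assumptions,
# not the paper's actual MEMS model or objective.
from itertools import product

TEST_TYPES = [("calibration_cond_A", 2.0),   # (label, unit cost) -- assumed values
              ("calibration_cond_B", 3.0),
              ("validation_cond_A", 1.5)]
BUDGET = 12.0        # total resource available (assumed)
MAX_PER_TYPE = 5     # cap on replicates of any one test type (assumed)

def predicted_uncertainty(counts):
    """Stand-in objective with diminishing returns: each additional test of a
    type shrinks that type's assumed contribution to the prediction uncertainty."""
    base_contribution = [0.6, 0.3, 0.4]      # assumed variance contributions
    return sum(c / (1.0 + n) for c, n in zip(base_contribution, counts))

def total_cost(counts):
    return sum(n * cost for n, (_, cost) in zip(counts, TEST_TYPES))

# Exhaustive enumeration of the small discrete design space.
best = None
for counts in product(range(MAX_PER_TYPE + 1), repeat=len(TEST_TYPES)):
    if total_cost(counts) <= BUDGET:
        u = predicted_uncertainty(counts)
        if best is None or u < best[1]:
            best = (counts, u)

counts, u = best
for (label, _), n in zip(TEST_TYPES, counts):
    print(f"{label}: {n} tests")
print(f"cost = {total_cost(counts):.1f}, proxy uncertainty = {u:.3f}")

Exhaustive enumeration is adequate only because this illustrative design space is tiny; with many test types or large replicate caps, a greedy allocation or an integer programming solver would be substituted.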

Copyright © 2016 by ASME

Figures

Fig. 1: Variance in model reliability for replicate validation data sets at the same input: (a) one observation, (b) 10 observations, (c) 100 observations, (d) 1000 observations, (e) 10,000 observations, and (f) 100,000 observations

Fig. 2: Student's t-distribution of the mean observation for sparse observation sets
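
For reference, the result illustrated by Fig. 2 is presumably the standard sampling-theory statement that, for a sparse set of n observations assumed normally distributed, the uncertain mean follows a Student's t-distribution (the paper's exact notation may differ):

\[
  \frac{\mu - \bar{d}}{s/\sqrt{n}} \;\sim\; t_{n-1},
  \qquad
  \bar{d} = \frac{1}{n}\sum_{i=1}^{n} d_i,
  \qquad
  s^{2} = \frac{1}{n-1}\sum_{i=1}^{n}\left(d_i - \bar{d}\right)^{2},
\]

where \(\mu\) is the uncertain true mean, \(\bar{d}\) and \(s\) are the sample mean and standard deviation of the \(n\) observations, and \(t_{n-1}\) denotes a Student's t-distribution with \(n-1\) degrees of freedom.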

Fig. 4: Known aleatory distributions of h for the six devices

Fig. 5: Parameter uncertainty for a particular data realization: (a) sample calibration of E and (b) distribution of the overall model reliability

Fig. 6: Family of predictions for a particular data realization d: (a) sample PDF family and (b) sample CDF family
