
IN THIS ISSUE

### Research Papers

J. Verif. Valid. Uncert. 2018;3(2):021001-021001-10. doi:10.1115/1.4040803.

This paper presents grid refinement studies for statistically steady, two-dimensional (2D) flows of an incompressible fluid: a flat plate at Reynolds numbers equal to $10^7$, $10^8$, and $10^9$ and the NACA 0012 airfoil at angles of attack of 0, 4, and 10 deg with Re = $6 \times 10^6$. Results are based on the numerical solution of the Reynolds-averaged Navier–Stokes (RANS) equations supplemented by one of three eddy-viscosity turbulence models of choice: the one-equation model of Spalart and Allmaras and the two-equation models k–ω SST and $k-\sqrt{kL}$. Grid refinement studies are performed in sets of geometrically similar structured grids, permitting an unambiguous definition of the typical cell size, using double precision and an iterative convergence criterion that guarantees a numerical error dominated by the discretization error. For each case, different grid sets with the same number of cells but different near-wall spacings are used to generate a data set that allows more than one estimation of the numerical uncertainty for similar grid densities. The selected flow quantities include functional (integral), surface, and local flow quantities, namely, drag/resistance and lift coefficients; skin friction and pressure coefficients at the wall; and mean velocity components and eddy viscosity at specified locations in the boundary-layer region. An extra set of grids, significantly more refined than those proposed for the estimation of the numerical uncertainty, is generated for each test case. Using power law extrapolations, these extra solutions are used to obtain an approximation of the exact solution that allows the assessment of the performance of the numerical uncertainty estimations performed for the basis data set. However, it must be stated that with grids up to 2.5 (plate) and 8.46 (airfoil) million cells in two dimensions, the asymptotic range is not attained for many of the selected flow quantities. All these data are available online to the community.
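The grid-triplet estimation behind such studies can be sketched as below. This is the classical Richardson/power-law extrapolation with a GCI-style safety factor, not the paper's full least-squares procedure; the function name and the 1.25 factor are illustrative assumptions.

```python
import math

def richardson_extrapolation(phi1, phi2, phi3, r):
    """Observed order of accuracy, extrapolated 'exact' value, and a
    GCI-style numerical uncertainty from solutions on three geometrically
    similar grids with constant refinement ratio r (phi1 = finest grid).
    Assumes monotonic convergence: (phi3 - phi2) and (phi2 - phi1) share sign."""
    p = math.log((phi3 - phi2) / (phi2 - phi1)) / math.log(r)
    phi0 = phi1 + (phi1 - phi2) / (r**p - 1.0)   # power-law extrapolation to h -> 0
    u_num = 1.25 * abs(phi1 - phi0)              # safety factor of 1.25 (assumption)
    return p, phi0, u_num
```

For data that follow an exact power law, e.g. $\phi(h) = 1 + 0.5\,h^2$ sampled at $h = 0.25, 0.5, 1$, the sketch recovers $p = 2$ and $\phi_0 = 1$.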

Commentary by Dr. Valentin Fuster
J. Verif. Valid. Uncert. 2018;3(2):021002-021002-10. doi:10.1115/1.4041195.

The Noh verification test problem is extended beyond the commonly studied ideal gamma-law gas to more realistic equations of state (EOSs), including the stiff gas, the Noble-Abel gas, and the Carnahan–Starling EOS for hard-sphere fluids. Self-similarity methods are used to solve the Euler compressible flow equations, which, in combination with the Rankine–Hugoniot jump conditions, provide a tractable general solution. This solution can be applied to fluids whose EOS satisfies certain criteria, such as convexity and the existence of a corresponding bulk modulus. For the planar case, the solution can be applied to shocks of arbitrary strength, but for the cylindrical and spherical geometries, the analysis must be restricted to strong shocks. The exact solutions are used to perform a variety of quantitative code verification studies of the Los Alamos National Laboratory Lagrangian hydrocode free Lagrangian (FLAG).
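For reference, the baseline ideal gamma-law case that the paper generalizes has a closed-form post-shock state from the Rankine–Hugoniot conditions. The sketch below covers only the planar Noh problem with a cold ($p_0 = 0$) ideal gas, not the extended EOSs; the function name is hypothetical.

```python
def noh_planar_post_shock(gamma, rho0, u0):
    """Exact post-shock state for the planar Noh problem: a cold ideal
    gamma-law gas with uniform density rho0 streams toward a wall at
    speed u0. Returns (shock speed, post-shock density, post-shock pressure)."""
    D = 0.5 * (gamma - 1.0) * u0                 # outgoing shock speed
    rho = rho0 * (gamma + 1.0) / (gamma - 1.0)   # strong-shock compression ratio
    p = 0.5 * rho0 * u0**2 * (gamma + 1.0)       # post-shock pressure
    return D, rho, p
```

A useful sanity check is the mass jump condition in the shock frame, $\rho_0 (u_0 + D) = \rho D$, which the returned state satisfies (e.g. for $\gamma = 5/3$: $D = u_0/3$, $\rho = 4\rho_0$, $p = \tfrac{4}{3}\rho_0 u_0^2$).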

J. Verif. Valid. Uncert. 2018;3(2):021003-021003-12. doi:10.1115/1.4041490.

Uncertainty quantification (UQ) is gaining in maturity and importance in engineering analysis. While historical engineering analysis and design methods have relied heavily on safety factors (SF) with built-in conservatism, modern approaches require detailed assessment of reliability to provide optimized and balanced designs. This paper presents methodologies that support the transition toward this type of approach. Fundamental concepts are described for UQ in general engineering analysis. These include consideration of the sources of uncertainty and their categorization. Of particular importance are the categorization of aleatory and epistemic uncertainties and their separate propagation through a UQ analysis. This familiar concept is referred to here as a “two-dimensional” approach, and it provides for the assessment of both the probability of a predicted occurrence and the credibility in that prediction. Unique to the approach presented here is the adaptation of the concept of a bounding probability box to that of a credible probability box. This requires estimates for probability distributions related to all uncertainties, both aleatory and epistemic. The propagation of these distributions through the uncertainty analysis provides for the assessment of probability related to the system response, along with a quantification of credibility in that prediction. Details of a generalized methodology for UQ in this framework are presented, and approaches for interpreting results are described. Illustrative examples are presented.
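The "two-dimensional" (nested-loop) propagation can be sketched as follows. This is a generic toy model, not the paper's methodology: the outer loop samples an epistemic parameter (here, an uncertain mean), the inner loop samples aleatory variability, and the pointwise envelope of the resulting family of CDFs approximates a probability box. All names and distributions are assumptions.

```python
import bisect
import random

def credible_pbox(n_epistemic=100, n_aleatory=200, seed=1):
    """Nested ('two-dimensional') propagation: each outer (epistemic) draw
    yields one empirical CDF from inner (aleatory) sampling; the pointwise
    min/max over the family gives lower/upper probability-box bounds."""
    rng = random.Random(seed)
    grid = [0.1 * k for k in range(-40, 41)]   # response values for CDF evaluation
    lo = [1.0] * len(grid)
    hi = [0.0] * len(grid)
    for _ in range(n_epistemic):
        mu = rng.uniform(-0.5, 0.5)            # epistemic: uncertain mean (toy)
        samples = sorted(rng.gauss(mu, 1.0) for _ in range(n_aleatory))
        for i, x in enumerate(grid):
            c = bisect.bisect_right(samples, x) / n_aleatory   # empirical CDF at x
            lo[i] = min(lo[i], c)
            hi[i] = max(hi[i], c)
    return grid, lo, hi
```

The horizontal gap between the two bounds at a given probability level reflects epistemic uncertainty; the slope of each bound reflects aleatory variability.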

J. Verif. Valid. Uncert. 2018;3(2):021004-021004-18. doi:10.1115/1.4041372.

The objective of this work is to provide and use both experimental fluid dynamics (EFD) data and computational fluid dynamics (CFD) results to validate a regular-wave uncertainty quantification (UQ) model of ship response in irregular waves, based on a set of stochastic regular waves with variable frequency. As a secondary objective, preliminary statistical studies are required to assess EFD and CFD irregular wave errors and uncertainties versus theoretical values and to evaluate EFD and CFD resistance and motions uncertainties and, in the latter case, errors versus EFD values. UQ methods include analysis of the autocovariance matrix and block-bootstrap of time series values (primary variable). Additionally, the height (secondary variable) associated with the mean-crossing period is assessed by the bootstrap method. Errors and confidence intervals of statistical estimators are used to define validation criteria. The application is a two-degrees-of-freedom (heave and pitch) towed Delft catamaran with a length between perpendiculars equal to 3 m (scale factor equal to 33), sailing at a Froude number equal to 0.425 in head waves at scaled sea state 5. Validation variables are x-force, heave and pitch motions, vertical acceleration of the bridge, and vertical velocity of the flight deck. Autocovariance and block-bootstrap methods for primary variables provide consistent and complementary results; the autocovariance is used to assess the uncertainty associated with expected values and standard deviations and is able to identify undesired self-repetition in the irregular wave signal; block-bootstrap methods are used to assess additional statistical estimators such as mode and quantiles. Secondary variables are used for an additional assessment of the quality of experimental and simulation data, as they are generally more difficult to model and predict than primary variables. Finally, the regular wave UQ model provides a good approximation of the desired irregular wave statistics, with average errors smaller than 5% and validation uncertainties close to 10%.
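A block-bootstrap for an autocorrelated time series, as used above for the primary variables, can be sketched like this. The moving-block variant shown (and all parameter defaults) is a generic illustration, not the authors' specific implementation.

```python
import random
import statistics

def block_bootstrap_ci(x, block_len, n_boot=1000, alpha=0.05, seed=7):
    """Moving-block bootstrap confidence interval for the mean of a
    (possibly autocorrelated) time series: resample overlapping blocks of
    length block_len with replacement, concatenate to the original length,
    and take empirical quantiles of the resampled means."""
    rng = random.Random(seed)
    n = len(x)
    starts = n - block_len + 1                  # number of admissible block starts
    means = []
    for _ in range(n_boot):
        resampled = []
        while len(resampled) < n:
            s = rng.randrange(starts)
            resampled.extend(x[s:s + block_len])
        means.append(statistics.fmean(resampled[:n]))
    means.sort()
    lo = means[int((alpha / 2) * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

Resampling whole blocks rather than individual points preserves the short-range serial correlation of the signal, which is why the resulting interval is wider (and more honest) than an i.i.d. bootstrap for wave-induced time series.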


J. Verif. Valid. Uncert. 2018;3(2):021005-021005-10. doi:10.1115/1.4041687.

In this work, a general methodology and an innovative framework are proposed to characterize and quantify the representativeness uncertainty of performance indicator measurements of power generation systems. The representativeness uncertainty refers to the difference between a measurement value of a performance indicator quantity and its reference true value. It arises from the inherent variability of the quantity being measured. The main objectives of the methodology are to characterize and reduce the representativeness uncertainty by adopting numerical simulation in combination with experimental data and to improve the physical description of the measurement. The methodology is applied to an industrial case study for demonstration. The case study involves a computational fluid dynamics (CFD) simulation of an orifice plate-based mass flow rate measurement, using a commercially available package. Using the insight obtained from the CFD simulation, the representativeness uncertainty in the mass flow rate measurement is quantified and the associated random uncertainties are comprehensively accounted for. Both parametric and nonparametric implementations of the methodology are illustrated. The case study also illustrates how the methodology is used to quantitatively test the level of statistical significance of the CFD simulation result after accounting for the relevant uncertainties.
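A significance test of a simulation result against a measurement, given their uncertainties, can be sketched in the usual coverage-factor form. This is a generic quadrature-combination check, not the paper's procedure; the function name and the default coverage factor k = 2 (roughly 95% for normal uncertainties) are assumptions.

```python
def significant_difference(sim, meas, u_sim, u_meas, k=2.0):
    """Returns True when the simulation-measurement difference exceeds
    the expanded combined standard uncertainty: |sim - meas| > k * u_c,
    with u_c the root-sum-square of the two standard uncertainties."""
    u_combined = (u_sim**2 + u_meas**2) ** 0.5
    return abs(sim - meas) > k * u_combined
```

If the difference stays inside the expanded uncertainty band, it cannot be attributed to a real model-measurement discrepancy at that confidence level.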


### Technical Brief

J. Verif. Valid. Uncert. 2018;3(2):024501-024501-6. doi:10.1115/1.4041265.

Model verification and validation (V&V) remain a critical step in the simulation model development process. A model requires verification to ensure that it has been correctly transitioned from a conceptual form to a computerized form. A model also requires validation to substantiate the accurate representation of the system it is meant to simulate. Validation assessments are complex when the system and model both generate high-dimensional functional output. To handle this complexity, this paper reviews several wavelet-based approaches for assessing models of this type and introduces a new concept for highlighting the areas of contrast and congruity between system and model data. This concept identifies individual wavelet coefficients that correspond to the areas of discrepancy between the system and model.
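The coefficient-level discrepancy idea can be sketched with a single-level Haar transform. This is an illustrative stand-in, not the brief's specific wavelet procedure; the function names are hypothetical and real analyses would use a multilevel transform (e.g., via PyWavelets).

```python
def haar_coeffs(x):
    """Single-level Haar transform of an even-length sequence:
    returns (approximation, detail) coefficient lists."""
    half = len(x) // 2
    a = [(x[2 * i] + x[2 * i + 1]) / 2**0.5 for i in range(half)]
    d = [(x[2 * i] - x[2 * i + 1]) / 2**0.5 for i in range(half)]
    return a, d

def discrepant_coeffs(system, model, top=3):
    """Ranks wavelet coefficients by the absolute system-model difference,
    localizing where (and at which scale component) the signals disagree."""
    sa, sd = haar_coeffs(system)
    ma, md = haar_coeffs(model)
    diffs = [(abs(s - m), ('a', i)) for i, (s, m) in enumerate(zip(sa, ma))]
    diffs += [(abs(s - m), ('d', i)) for i, (s, m) in enumerate(zip(sd, md))]
    diffs.sort(reverse=True)
    return [tag for _, tag in diffs[:top]]
```

Because each Haar coefficient is supported on a short, known segment of the signal, the top-ranked coefficients point directly at the regions of contrast between system and model output.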

Topics: Wavelets, Signals