## Abstract

Hierarchical sensitivity analysis (HSA) of multilevel systems assesses the effect of the system's input uncertainties on the variations of the system's performance by integrating the sensitivity indices of subsystems. However, the existing hierarchical sensitivity analysis method based on variance decomposition has difficulty dealing with engineering systems that exhibit complicated correlations among variables across levels. To overcome this limitation, a mapping-based hierarchical sensitivity analysis method is proposed to obtain sensitivity indices of multilevel systems with multidimensional correlations. For subsystems with dependent variables, a mapping-based sensitivity analysis, consisting of vine copula theory, the Rosenblatt transformation, and the polynomial chaos expansion (PCE) technique, is provided for obtaining the marginal sensitivity indices. The marginal sensitivity indices allow us to distinguish between the mutually dependent contribution and the independent contribution of an input to the response variance. Then, extended aggregation formulations for local variables and shared variables are developed to integrate the sensitivity indices of subsystems at each level so as to estimate the global effect of inputs on the response. Finally, this paper presents a computational framework that combines the related techniques step by step. The effectiveness of the proposed mapping-based hierarchical sensitivity analysis (MHSA) method is verified by a mathematical example and a multiscale composite material.

## 1 Introduction

Complex engineering systems often involve many variables and multiple disciplines, and are designed and developed by multiple groups or in multiple steps [1,2]. Analysis of the entire system may be complicated by several factors, including subsystems managed by different groups, different subsystem design tools, subsystem analyses that run on different software, and so on, which results in inadequate integration of subsystems. The decomposition strategy, which decomposes a multilevel system into several subsystems in a hierarchical or concurrent way for studying the global system performance of interest, tends to be an effective remedy. Hierarchical modeling has been widely adopted in many fields, especially those involving the integrated design of material-structure systems [3–5]. With the advancement of hierarchical modeling, the basic theories and application studies of multilevel design in a hierarchical framework have also developed [6–8].

Various uncertainties, widely existing in complex engineering systems owing to the influence of material discreteness, measurement methods, manufacturing processes, simulation models, and operating environments, directly affect the system's performance and thus the reliability, robustness, and security of the products [9–12]. The integration of multilevel systems with uncertainties is critical for the analysis and design of complex engineering systems, and it has been an active research area recently [13–16]. Uncertainty analysis and sensitivity analysis (SA) are two main aspects of uncertainty quantification [17–19]. Amaral et al. proposed a decomposition-based approach to uncertainty analysis of complex feed-forward systems and synthesized the local uncertainty analyses into a system uncertainty analysis. In this method, the dependency among variables was recovered by the importance sampling technique [13]. Subsequently, the method was applied to the environmental impacts of aviation technology and operation, and component-level global SA identified the most important variables [14]. This method is recommended if the computational cost of the multilevel system is relatively low. However, it may struggle with problems with high computational cost, such as the multiscale composite material problem with high-precision simulation models shown in Fig. 1 [11]. Polynomial chaos expansion (PCE) has been proven to be a powerful technique for uncertainty analysis of complex structures [15]. Therefore, the PCE method was adopted in Ref. [16] to acquire the output uncertainties of multilevel systems with relatively few samples.

This paper specifically considers the question of sensitivity analysis of multilevel systems, which is to assess the effect of the system's input uncertainties on the variations of the system's performance based on the sensitivity indices of subsystems. It provides good guidance on parameter selection and dimension reduction for researchers in the field of engineering design under uncertainty, since SA can rank the importance of input factors at all levels [19]. Taking the battery box shown in Fig. 1 as an example, the SA method is expected to identify the important property and geometric parameters at the microscale and mesoscale that affect the performance responses of the battery box at the macroscale. These important parameters are then selected for further study and design to meet performance requirements. All-in-one (AIO) system-level SA is almost impossible because it involves complicated multiscale co-simulation. Hierarchical SA (HSA), an integration technique that assembles component-level SA results into system-level sensitivity indices, has therefore gained attention [20,21]. Although SA methods have been well researched [22], it is difficult to apply them directly to a hierarchical system, because doing so involves the analysis of correlated variables and the integration of sensitivity indices. A hierarchical statistical sensitivity analysis (HSSA) method was developed to aggregate the submodel statistical sensitivity analysis results across intermediate levels by a formulation for obtaining the global statistical sensitivity indices of a multilevel system [20]. However, the HSSA method only deals with hierarchical systems with independent subsystems. The HSSA with shared variables (HSSA-SV) method was then developed to deal with engineering systems with dependent subsystem responses caused by shared variables [21]. In this method, the dependent responses were grouped into a subset, and the covariance of the dependent responses was decomposed into the contributions from individual shared variables.
However, HSSA-SV only considers bivariate correlations among variables and cannot handle cross-scale correlations. As a result, it is difficult for these existing methods to deal with multiscale systems with multidimensional correlations.

To explain the multidimensional correlations caused by shared variables, a hierarchical system with a bilevel structure is introduced, which is depicted in Fig. 2 [13]. The hierarchical system has three lower level subsystems, denoted as $g^{L1}$, $g^{L2}$, and $g^{L3}$, and an upper level subsystem $g^{U}$. The original input vector and final output of the hierarchical system are defined as $\mathbf{X} = (X_1, X_2, X_3, X_4, X_5)$ and $Z$. The information of the system is transferred by the intermediate variable vector $\mathbf{Y} = (Y_1, Y_2, Y_3)$. All variables involved in the system are random. The input variables in the system can be divided into the following three categories:

- Local variables, denoted as $X_L$. Each subsystem may have its own local input variables, e.g., $X_1$, $X_2$, and $X_3$.
- Shared variables at the same level, denoted as $X_{sL}$, e.g., $X_4$.
- Shared variables across levels, denoted as $X_{sU}$, e.g., $X_5$.

The inputs of the upper level subsystem (e.g., $Y_1$, $Y_2$, $Y_3$, and $X_5$) are mutually dependent owing to the existence of shared variables. The dependence caused by the shared variables may be multidimensional and nonlinear, and neglecting the real dependence structure may introduce a bias in the resulting estimates. In addition, the probability density function (PDF) of an intermediate variable is unknown. The Gaussian distribution assumption is often adopted to provide a convenient representation of variable dependencies. However, when the real PDF deviates from this assumption, an estimation bias may also be introduced in the results, especially the system results after multiple propagations [23]. Therefore, it is necessary to develop an efficient and accurate HSA method with accurate dependence modeling to deal with multilevel uncertainty.

Some advanced uncertainty analysis methods take advantage of mutually independent variables. For systems with dependent variables, the commonly used approach is the mapping-based method, where measure transformations, such as the Rosenblatt transformation and the Nataf transformation, are used to map the input vector onto a vector with independent components [24]. The Rosenblatt transformation requires the exact joint PDF, which is typically unknown in actual applications. The Nataf transformation is essentially equivalent to the Gaussian copula function, so it may introduce considerable errors when the joint PDF follows a non-Gaussian distribution. Both methods therefore have considerable limitations when applied to actual engineering problems [23]. Recently, copula models, and vine copula in particular, have become more effective mathematical tools to model dependence [25–29]. Vine copula, which expresses a multivariate copula as a product of a series of bivariate copulas, provides a reliable and effective way to model multidimensional correlations [26]. Uncertainty and sensitivity analysis methods often require mutually independent inputs. Once the joint PDF has been established by vine copula functions, the Rosenblatt transformation can be adopted to map the input vector to a vector whose components are mutually independent.

In order to deal with multilevel systems with multidimensional correlations at lower computational cost, this paper proposes a mapping-based HSA (MHSA) method to obtain the sensitivity indices of any input variable on the multilevel system performances of interest, building on previous research on multiscale uncertainty analysis [16]. The variance-based Sobol' indices are adopted owing to their wide applicability to independent as well as correlated variables. Moreover, these indices combine well with previous studies because Sobol' indices can be calculated directly by postprocessing the PCE coefficients [19]. In the proposed MHSA method, a mapping-based SA, consisting of vine copula theory, the Rosenblatt transformation, and the PCE technique, is provided to obtain the sensitivity indices of subsystems with dependent variables. The intermediate variables of these multilevel systems are mutually dependent because of the existence of shared variables. The vine copula theory is adopted to model input dependencies accurately. Based on the joint PDF, which is acquired through the dependence model and the marginal PDFs, the Rosenblatt transformation maps the dependent variables onto independent variables. The sensitivity indices of the new variables, called marginal sensitivity indices in this work, can be obtained by postprocessing the PCE coefficients. Then, an extended aggregation formulation is developed to integrate the sensitivity indices of subsystems at each level so as to estimate the global effect of inputs on the response. The proposed MHSA has high precision since it guarantees an accurate description of the input dependence. In addition, the method is well suited to engineering problems with high computational costs because fewer samples are required.

The remainder of this paper is organized as follows. Section 2 describes the problem in detail. Section 3 introduces the related technical bases used in the proposed algorithm, including PCE-based SA, vine copula theory, and the Rosenblatt transformation. In Sec. 4, the mapping-based SA and an aggregation formulation are proposed, and a general framework is then given for the implementation of HSA combining the mapping-based SA and the aggregation formulation. Section 5 provides two examples to verify the effectiveness of the proposed method. Conclusions and future work are presented in Sec. 6.

## 2 Problem Statement

A multilevel system maps the system's input vector $\mathbf{x}$ onto the system's output $z$ through an intermediate vector $\mathbf{y}$. Since the input vector is affected by uncertainty, it is represented by a random vector $\mathbf{X}$ with prescribed PDF. As a result, the intermediate variable vector and the system output are also stochastic, denoted by $\mathbf{Y}$ and $Z$, respectively. In a multilevel framework, the input variables of models at higher levels may include variables upscaled from the lower or bottom levels. Hence, the input variables of the system are classified into three categories: local variables, shared variables at the same level, and shared variables across levels. The multilevel system shown in Fig. 2, as a representative system, will be described in detail for demonstration purposes.

A system is a collection of coupled subsystems across levels. Figure 3 shows the collection of subsystems shown in Fig. 2. Each subsystem corresponds to an input/output relationship. The mapping of the $i$th lower level subsystem is denoted as $g^{Li}: \mathbf{X}_i \to Y_i$, while the mapping of the upper level subsystem is denoted as $g^{U}: \mathbf{Y} \to Z$. In this research, all original input variables of the system are mutually independent and modeled probabilistically. For subsystems 1–3, the first-order sensitivity indices $S_X^Y$ can be obtained by SA methods for independent variables. But for subsystem 4, the components of the vector $\mathbf{Y}$ are mutually dependent owing to the existence of $X_4$. Meanwhile, $Y_3$ is correlated with $X_5$. Therefore, we need to obtain the indices $S_Y^Z$ through SA methods for dependent variables. By properly integrating $S_X^Y$ and $S_Y^Z$, we can eventually obtain the system sensitivity indices $S_X^Z$.

The research in this work can be expressed as *how to obtain the system sensitivity indices $S_X^Z$ by a hierarchical integration approach from the subsystem sensitivity indices $S_X^Y$ and $S_Y^Z$, given the PDFs of the input random variables and the input/output data at each level*. Two main challenges lie in this research: the processing of multidimensional correlations for obtaining sensitivity indices, and the integration of the subsystem sensitivity indices.

## 3 Technical Bases

Before presenting the mapping-based SA for dependent variables, some basic techniques are introduced in this section. For systems with mutually independent variables, PCE is used to obtain the Sobol' indices of the inputs by combining the polynomial coefficients. For systems with mutually dependent variables, vine copula models with multivariate inputs first need to be built to capture the complex dependency. The multivariate dependence model is then combined with the Rosenblatt transformation to map the dependent vector onto a vector with independent components.

### 3.1 Polynomial Chaos Expansion-based Sensitivity Analysis for Independent Variables

#### 3.1.1 Polynomial Chaos Expansion.

Consider a computational model $y = g(\mathbf{x})$, $\mathbf{x} = \{x_1, \ldots, x_n\}^T \in \mathbb{R}^n$, $n \ge 1$. Since the input vector is affected by uncertainty, it is represented by a random vector $\mathbf{X}$ with the PDF $f_{\mathbf{X}}(\mathbf{x})$. For mutually independent random variables, a truncated PCE with the truncation set $A = \{\alpha \in \mathbb{N}^n, |\alpha| = \sum_{i=1}^{n} \alpha_i, |\alpha| \le p\}$ for one subsystem can be defined as

$$y \approx \sum_{\alpha \in A} q_\alpha \psi_\alpha(\mathbf{X})$$

where $\alpha_i$ is the power of $X_i$, $q_\alpha$ is the polynomial coefficient, the number of polynomial terms with order less than or equal to $p$ is given by $(p+n)!/(p!\,n!)$, and $\psi_\alpha$ is the multivariate polynomial orthonormal with respect to $f_{\mathbf{X}}(\mathbf{x})$, i.e.,

$$\mathrm{E}[\psi_\alpha(\mathbf{X})\,\psi_\beta(\mathbf{X})] = \delta_{\alpha\beta}$$

where $\delta_{\alpha\beta}$ is the Kronecker delta symbol and $\mathrm{E}(\cdot)$ denotes the operator of mathematical expectation. The PCE coefficients can be estimated by minimizing the mean-square error.
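To make the counting concrete, the truncation set $A$ can be enumerated directly. The following sketch (plain Python, independent of any particular PCE library) builds $A$ for $n = 3$, $p = 4$ and checks it against the $(p+n)!/(p!\,n!)$ formula:

```python
from itertools import product
from math import comb, factorial

def truncated_index_set(n, p):
    """All multi-indices alpha in N^n with |alpha| = sum(alpha_i) <= p."""
    return [a for a in product(range(p + 1), repeat=n) if sum(a) <= p]

n, p = 3, 4
A = truncated_index_set(n, p)
# Cardinality matches the closed form (p + n)! / (p! n!)
assert len(A) == factorial(p + n) // (factorial(p) * factorial(n)) == comb(p + n, n)
print(len(A))  # 35
```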

#### 3.1.2 Sobol’ Indices From Polynomial Chaos Expansion.

The variance-based Sobol' indices are defined from the Sobol' decomposition of the model $y = g(\mathbf{x})$, which is given by

$$g(\mathbf{x}) = g_0 + \sum_{\emptyset \neq \gamma \subseteq \{1,\ldots,n\}} g_\gamma(\mathbf{x}_\gamma)$$

so that the total variance $D$ is

$$D = \mathrm{Var}[y] = \sum_{\emptyset \neq \gamma \subseteq \{1,\ldots,n\}} D_\gamma, \qquad D_\gamma = \mathrm{Var}[g_\gamma(\mathbf{x}_\gamma)]$$

where $g_0$ is a constant and $\mathbf{x}_\gamma$ denotes a sub-vector of $\mathbf{x}$ containing those components whose indices belong to $\gamma$. The Sobol' decomposition terms have the orthogonality property

$$\mathrm{E}[g_\gamma(\mathbf{x}_\gamma)\, g_\eta(\mathbf{x}_\eta)] = \delta_{\gamma\eta}\,\mathrm{E}[g_\gamma^2(\mathbf{x}_\gamma)]$$

where $\delta_{\gamma\eta}$ is the Kronecker delta symbol. Substituting the PCE of Sec. 3.1.1, each term $g_\gamma(\mathbf{x}_\gamma)$ of Eq. (8) can be rewritten as

$$g_\gamma(\mathbf{x}_\gamma) = \sum_{\alpha \in A_\gamma} q_\alpha \psi_\alpha(\mathbf{x})$$

where $A_\gamma$ denotes the set of multivariate polynomials which depend only on the subset $\gamma$ of $\mathbf{x}$, i.e., $A_\gamma = \{\alpha \in A : \alpha_k \neq 0 \text{ if and only if } k \in \gamma\}$. Consequently, the partial variance can be estimated by the combination of the squares of the coefficients

$$D_\gamma = \sum_{\alpha \in A_\gamma} q_\alpha^2, \qquad S_\gamma = \frac{D_\gamma}{D}$$

Once the PCE model is provided, the mean value, variance, and Sobol' indices can be obtained by combining the coefficients. Nevertheless, the abovementioned method can only be applied to problems with independent variables. For models with dependent variables, a mapping-based SA using the vine copula theory will be proposed in Sec. 4. In Sec. 3.2, we introduce the related theories of vine copula dependence modeling and the Rosenblatt transformation.
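The coefficient post-processing described above is simple to sketch. The snippet below is illustrative only (the coefficient values and the helper name `sobol_from_pce` are invented for the demo): it computes first-order Sobol' indices from multi-index/coefficient pairs of an orthonormal PCE by summing squared coefficients:

```python
def sobol_from_pce(coeffs):
    """First-order Sobol' indices from orthonormal PCE coefficients.

    coeffs: dict mapping a multi-index tuple alpha -> coefficient q_alpha.
    The total variance is the sum of squared coefficients over all
    non-constant terms; the partial variance D_{i} sums q_alpha^2 over
    alpha whose support is exactly {i}.
    """
    n = len(next(iter(coeffs)))
    D = sum(q * q for a, q in coeffs.items() if any(a))  # total variance
    S = []
    for i in range(n):
        # Terms depending on x_i only: alpha_i != 0, all other alpha_k = 0
        Di = sum(q * q for a, q in coeffs.items()
                 if a[i] != 0 and all(a[k] == 0 for k in range(n) if k != i))
        S.append(Di / D)
    return S

# Toy expansion: y = 1 + 2*psi_(1,0) + 1*psi_(0,1) + 0.5*psi_(1,1)
coeffs = {(0, 0): 1.0, (1, 0): 2.0, (0, 1): 1.0, (1, 1): 0.5}
S1, S2 = sobol_from_pce(coeffs)
# D = 4 + 1 + 0.25 = 5.25, so S1 = 4/5.25 and S2 = 1/5.25
```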

### 3.2 Independent Transformation of Dependent Variables.

Consider an upper level subsystem $z = g^{U}(\mathbf{y})$, $\mathbf{y} = (y_1, \ldots, y_m)^T \in \mathbb{R}^m$, $m \ge 1$, as shown in Fig. 2. The components of the vector $\mathbf{Y}$ may be mutually dependent owing to the existence of shared variables, so a dependence modeling method needs to be constructed for the uncertainty transformation.

#### 3.2.1 Multivariate Dependence Modeling.

For the random vector $\mathbf{Y}$ with the PDF $f_{\mathbf{Y}}(\mathbf{y})$, its joint CDF $F_{\mathbf{Y}}(\mathbf{y})$ is

$$F_{\mathbf{Y}}(\mathbf{y}) = C\big(F_1(y_1), F_2(y_2), \ldots, F_m(y_m); \boldsymbol{\theta}\big)$$

where $F_i(y_i)$, $i = 1, \ldots, m$ are the marginal CDFs, $C(\cdot)$ is the copula function, which is unique if all marginal CDFs are continuous, and $\boldsymbol{\theta}$ is a vector of copula parameters. Equation (14) can be rewritten as

$$F_{\mathbf{Y}}(\mathbf{y}) = C(\mathbf{u}; \boldsymbol{\theta})$$

where $\mathbf{u} = (u_1, u_2, \ldots, u_m)^T$ and $u_i = F_i(y_i)$.

The copula function $C(\cdot)$ has a corresponding copula density function $c(\cdot)$

$$c(\mathbf{u}) = \frac{\partial^m C(\mathbf{u})}{\partial u_1 \,\partial u_2 \cdots \partial u_m}$$

from which the joint PDF $f_{\mathbf{Y}}(\mathbf{y})$ can be derived

$$f_{\mathbf{Y}}(\mathbf{y}) = c\big(F_1(y_1), \ldots, F_m(y_m); \boldsymbol{\theta}\big) \prod_{i=1}^{m} f_i(y_i)$$

On the other hand, the joint PDF of $\mathbf{Y}$ can be rewritten as below

$$f_{\mathbf{Y}}(\mathbf{y}) = f_1(y_1) \prod_{i=2}^{m} f_{i|1,2,\cdots,i-1}(y_i \,|\, y_1, y_2, \ldots, y_{i-1})$$

where $f_{i|1,2,\cdots,i-1}(y_i | y_1, y_2, \ldots, y_{i-1})$, $i = 2, 3, \ldots, m$, is the conditional PDF, which is expressed in the general form $f_{i|\mathrm{sub}}$ for simplicity. Let $y_j$ be the $j$th component of the conditioning vector $\mathbf{y}_{\mathrm{sub}} \subset \mathbf{y}$, and let $\mathbf{y}_{\mathrm{sub}-j}$ be the vector obtained from $\mathbf{y}_{\mathrm{sub}}$ by removing $y_j$. Then, the marginal conditional PDF can be described as

$$f_{i|\mathrm{sub}}(y_i \,|\, \mathbf{y}_{\mathrm{sub}}) = c_{i,j|\mathrm{sub}-j}\big(F(y_i \,|\, \mathbf{y}_{\mathrm{sub}-j}),\, F(y_j \,|\, \mathbf{y}_{\mathrm{sub}-j})\big)\, f_{i|\mathrm{sub}-j}(y_i \,|\, \mathbf{y}_{\mathrm{sub}-j})$$

The multivariate PDF can be rewritten as a product of bivariate copula density functions with marginal conditional CDFs by substituting Eq. (19) into Eq. (18) recursively. A case with three variables provided in the literature [16] makes the process easy to understand. For an *n*-dimensional joint distribution, there are many different ways to represent $f_{\mathbf{Y}}(\mathbf{y})$ because of the different factorization methods. A more effective graphical model known as the regular vine (R-vine) is provided in Ref. [33] to help categorize the different decompositions. For more information about constructing the multivariate dependence structure, see Ref. [16].
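The recursive pair-copula factorization can be made concrete for the three-variable case mentioned above. The sketch below is illustrative only (standard normal marginals and a D-vine ordering are assumptions of the demo, and the function names are invented): it evaluates a three-dimensional D-vine density $f_1 f_2 f_3 \, c_{12} c_{23} c_{13|2}$, and with independence pair copulas it must collapse to the product of the marginals, which serves as a sanity check:

```python
from statistics import NormalDist

N = NormalDist()  # standard normal marginals, chosen for the illustration

# A pair copula is represented by its density c(u, v) and its h-function
# h(u | v) = dC(u, v)/dv, the conditional CDF used in the vine recursion.
indep = {"c": lambda u, v: 1.0, "h": lambda u, v: u}

def dvine3_density(y1, y2, y3, c12, c23, c13_2):
    """Joint PDF of a 3-dim D-vine: f1*f2*f3 * c12 * c23 * c13|2."""
    u1, u2, u3 = N.cdf(y1), N.cdf(y2), N.cdf(y3)
    # Tree 1: unconditional pairs (1,2) and (2,3)
    t1 = c12["c"](u1, u2) * c23["c"](u2, u3)
    # Tree 2: pair (1,3) conditioned on 2, via the h-functions
    u1_2 = c12["h"](u1, u2)   # F(y1 | y2)
    u3_2 = c23["h"](u3, u2)   # F(y3 | y2)
    t2 = c13_2["c"](u1_2, u3_2)
    return N.pdf(y1) * N.pdf(y2) * N.pdf(y3) * t1 * t2

# With independence pair copulas the vine density collapses to the
# product of the marginals, as expected.
val = dvine3_density(0.3, -0.5, 1.1, indep, indep, indep)
```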

#### 3.2.2 Independent Transformation.

The dependent data $\mathbf{u} = (u_1, u_2, \ldots, u_m)^T$ can be transformed into independent data by the Rosenblatt transformation [34], which is defined as

$$\bar{u}_1 = u_1, \qquad \bar{u}_i = C(u_i \,|\, u_1, u_2, \ldots, u_{i-1}), \quad i = 2, \ldots, m$$

where $C(u_i | u_1, u_2, \ldots, u_{i-1})$ is the conditional copula of $U_i$. An additional transformation, Eq. (21), is performed for uncertainty propagation. The dependent variables in $\mathbf{Y}$ space can then be transformed into variables which are independent in $\bar{\mathbf{Y}}$ space through $u_i = F_i(y_i)$ and the transformations in Eqs. (20) and (21). Figure 4 summarizes the step-by-step procedure of the data transformation. The system with mutually dependent intermediate variables is thereby transformed into the system with independent intermediate variables shown in Fig. 5.
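For a single bivariate Gaussian pair copula, the conditional copula in Eq. (20) has the closed form $C(u_2|u_1) = \Phi\big((\Phi^{-1}(u_2) - \rho\,\Phi^{-1}(u_1))/\sqrt{1-\rho^2}\big)$. The following is a minimal sketch of the resulting Rosenblatt transformation (the Gaussian pair copula is an assumption of this demo, not a restriction of the method):

```python
from statistics import NormalDist
from math import sqrt

N = NormalDist()

def rosenblatt_gaussian(u1, u2, rho):
    """Rosenblatt transformation of (u1, u2) under a bivariate Gaussian
    copula with correlation rho: the first coordinate is kept, the second
    is replaced by the conditional copula C(u2 | u1)."""
    z1, z2 = N.inv_cdf(u1), N.inv_cdf(u2)
    # Conditional CDF of the Gaussian copula:
    # Phi((z2 - rho*z1) / sqrt(1 - rho^2))
    ubar2 = N.cdf((z2 - rho * z1) / sqrt(1.0 - rho * rho))
    return u1, ubar2

# Sanity checks: with rho = 0 the transform is the identity,
u1, u2 = rosenblatt_gaussian(0.3, 0.7, 0.0)
# and the joint median stays at the median for any rho.
m1, m2 = rosenblatt_gaussian(0.5, 0.5, 0.6)
```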

## 4 Mapping-Based Hierarchical Sensitivity Analysis

In order to obtain the sensitivity indices for the dependent variables of the upper level subsystem, a mapping-based method is adopted to perform SA. The marginal sensitivity index of each independent variable, which reflects the effect of its corresponding dependent variable, is obtained by postprocessing the PCE. In addition, aggregation formulations for shared variables and local variables are proposed to integrate all related indices of each subsystem to capture the system's sensitivity indices. This research also provides a general computational framework for the HSA of multilevel systems, including the PCE for SA, dependence modeling by vine copula, the Rosenblatt transformation, and the aggregation formulations.

### 4.1 Mapping-based Sensitivity Analysis for Mutually Dependent Variables.

Polynomial chaos expansion-based SA for independent variables was introduced in Sec. 3. For dependent inputs, a set of variance-based sensitivity indices was proposed in Ref. [35] to perform SA. An independent sample set is first generated from a dependent one, and the PCE is then used to provide accurate estimates of the variance-based sensitivity indices of the uncorrelated variables, which are called marginal sensitivity indices here. There are two reasons for adopting the marginal sensitivity indices in this research: one is that they allow us to distinguish between the mutually dependent contribution and the independent contribution of an input to the model response variance; the other is the ease of integration to obtain the system sensitivity indices.

In this equation, $\bar{S}_i$ is the sensitivity index of $\bar{Y}_i$. However, several independent variable sets can be generated depending on the input ordering in the set, and thus several marginal sensitivity indices can be computed. For the example of the sequence $(1, 2, \ldots, m)$, $\bar{S}_1 = S_1$ is the full marginal contribution of $Y_1$ to the variance, $\bar{S}_2 = S_{2-1}$ is the marginal contribution of $Y_2$ to the variance without its correlative contribution with $Y_1$, and $\bar{S}_k = S_{k-(k-1)\cdots 1}$ $(k = 2, \ldots, m)$ is the marginal contribution of $Y_k$ to the variance without its correlative contribution with $(Y_1, Y_2, \ldots, Y_{k-1})$.

By the transformation in Sec. 3.2.2, $\mathbf{Y}$ has been transformed into $\bar{\mathbf{Y}}$, and the map $g^{U}: \mathbf{Y} \to Z$ has been changed into $\bar{g}^{U}: \bar{\mathbf{Y}} \to Z$. Based on PCE, the model can be rewritten as

$$Z \approx \sum_{\alpha \in A} q_\alpha \psi_\alpha(\bar{\mathbf{Y}})$$

### 4.2 Aggregation Formulation.

The inputs of the upper level subsystem consist of its local variables $X_{LU}$, the intermediate variables $\mathbf{Y}$, and the shared variables across levels $X_{sU}$. The components of $\mathbf{Y}$ are mutually dependent owing to the existence of the shared variables $X_{sL}$, and some components of $\mathbf{Y}$ are correlated with $X_{sU}$. To make the expressions more concise, we define the simplified expression $f_X = f_X(\mathbf{x})$ for a PDF. The joint PDF $f_{\mathbf{Y}+X_{sU}}$ can be constructed by the vine copula theory, and the Rosenblatt transformation is used to obtain the independent variables $\bar{\mathbf{Y}}$ with respect to $X_{sU}$. The upper model is then transformed into $\bar{g}^{U}(X_{LU}, X_{sU}, \bar{\mathbf{Y}})$. When the transformed model is linear with respect to $\bar{\mathbf{Y}}$ and $X_{sU}$, it can be written as a linear combination of these variables, with coefficients $t_i = \int T_i(X_{LU})\, f_{X_{LU}} f_{X_{sU}} f_{\bar{Y}}\, \mathrm{d}X_{LU}\, \mathrm{d}X_{sU}\, \mathrm{d}\bar{Y}$.

In the aggregation formulation for shared variables at the same level, $\Lambda_{sL}$ is the set whose elements are the indices of the lower level models that contain the shared variable $X_{sL}$. $S_{X_{sL}}^{Y_i}$ represents the sensitivity index of the shared variable $X_{sL}$ on the $i$th intermediate variable $Y_i$. $S_{\bar{Y}_i}^{Z}$ represents the marginal sensitivity index of the $i$th intermediate independent variable on the output variable $Z$. $V_{X_{sL}}^{Y_i}$ and $V_{X_{sL}}^{\bar{Y}_i}$ are the contributions of the variable $X_{sL}$ to the variances of the $i$th intermediate variable $Y_i$ and of its corresponding independent variable $\bar{Y}_i$, respectively. $V_{Y_i}$ is the total variance of the $i$th intermediate variable $Y_i$. $V_{\bar{Y}_i}^{Z}$ is the contribution of the $i$th intermediate independent variable $\bar{Y}_i$ to the variance of $Z$.

Similarly, for shared variables across levels, $\Lambda_{sU}$ is the set whose elements are the indices of the lower level models that contain the shared variable $X_{sU}$, $t_{sU}$ is the term corresponding to $X_{sU}$ in the upper level model, $V_{sU}^{Z}$ is the contribution of $X_{sU}$ in the upper level model to the variance of $Z$, and $V_Z$ is the total variance of $Z$.

For local variables, the aggregation formulation collects the contribution transmitted through the $i$th lower level subsystem to the variance of $Z$. The detailed proofs of Eqs. (26)–(28) are given in Appendix A.

When the upper level model is nonlinear, a multivariate linear regression is adopted to approximate the coefficient vector $\tilde{t}$ with respect to $\bar{\mathbf{Y}}$ and $X_{sU}$. The sensitivity indices of a system whose upper level model is nonlinear can then be approximated by substituting the regression coefficients into Eqs. (26)–(28).
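Under the linearity assumption, the flavor of such an aggregation can be illustrated for a local variable: the contribution of an input $X$ transmitted through $Y_i$ scales with $t_i^2 V_X^{Y_i}$. The sketch below is a hypothetical simplification for independent intermediate variables, not the paper's Eqs. (26)–(28), which additionally handle shared variables and the transformed variables $\bar{\mathbf{Y}}$:

```python
def aggregate_local(t, V_X_Y, V_Z):
    """Illustrative aggregation for a local variable under a *linear*
    upper level model Z ~ t0 + sum_i t_i * Y_i with independent Y_i
    (a hypothetical simplification of the paper's Eqs. (26)-(28)).

    t:      regression coefficients t_i of the upper model
    V_X_Y:  contributions V_X^{Y_i} of variable X to Var(Y_i)
    V_Z:    total variance of Z
    """
    return sum(ti * ti * v for ti, v in zip(t, V_X_Y)) / V_Z

# Toy check: Z = 2*Y1 + 1*Y2 with independent Y1, Y2,
# V_X^{Y1} = 0.5, V_X^{Y2} = 0.25, and Var(Y1) = Var(Y2) = 1
V_Z = 2**2 * 1 + 1**2 * 1                          # 5, by independence
S = aggregate_local([2.0, 1.0], [0.5, 0.25], V_Z)  # (4*0.5 + 0.25)/5
```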

### 4.3 Computational Framework.

For a multilevel system with complicated shared variables, multiple computational techniques need to be combined to obtain the indices. As can be seen from Eqs. (26)–(28), these indices contain four parts:

- Indices from the mapping $g^{L}: \mathbf{X} \to \mathbf{Y}$, including the sensitivity index $S_X^Y$, the variance $V_Y$, and the contribution to variance $V_X^Y$.
- Indices from the mapping $\bar{g}^{U}: \bar{\mathbf{Y}} \to Z$, including the marginal sensitivity index $S_{\bar{Y}}^{Z}$ and the contribution to variance $V_{\bar{Y}}^{Z}$.
- Indices from the mapping $\bar{g}^{L}: \mathbf{X} \to \bar{\mathbf{Y}}$, including the contribution to variance $V_X^{\bar{Y}}$.
- The regression coefficient vector $\tilde{t}$ from the mapping $\bar{g}^{U}: \bar{\mathbf{Y}} \to Z$.

The flowchart of the proposed HSA for multilevel systems is shown in Fig. 6.

There are five main steps in the computational framework: the first four obtain the subsystem indices, and the fifth integrates the acquired subsystem indices into the system indices. It should be noted that all the steps involving constructing the PCE and calculating sensitivity indices are processed through UQLab [36]. The detailed steps can be summarized as follows:

*Step 1*: Construct the PCE for each lower level subsystem. The indices $S_X^Y$, $V_Y$, and $V_X^Y$ for each lower level subsystem can then be obtained by postprocessing the polynomial coefficients.

*Step 2*: Considering the correlation among the intermediate variables $\mathbf{Y}$, adopt vine copula theory to model the dependence. A number of stochastic responses of each intermediate variable $Y_i$ are obtained from the established PCE models using Monte Carlo simulation (MCS) with the prescribed PDFs of $\mathbf{X}$. Since the established PCE model is cheap to evaluate, enough samples can be generated for dependence modeling. Kernel density estimation is adopted to capture the marginal PDF of each $Y_i$ from these stochastic responses, and the joint PDF is then acquired from the dependence model and the marginal PDFs. Based on the joint PDF, the Rosenblatt transformation yields transformed independent variables from the dependent ones. Once an independent sample set is generated from the dependent one, SA is executed to obtain the marginal sensitivity index $\bar{S}_{\bar{Y}}^{Z}$ and the variance contribution $V_{\bar{Y}}^{Z}$.

*Step 3*: For the transformed independent variables, construct the PCE to acquire the variance contribution $V_X^{\bar{Y}}$.

*Step 4*: Adopt multivariate linear regression to capture the regression coefficient vector $\tilde{t}$.

*Step 5*: Use the proposed aggregation formulas for local variables across levels, shared variables at the same level, and shared variables across levels to obtain the effect of the input variables on the variations of the system performance, $S_X^Z$, based on the indices of the subsystems acquired in Steps 1–4.

Compared with the previous methods, i.e., HSSA and HSSA-SV, the proposed MHSA method considers two kinds of shared variables and deals with multidimensional correlations based on vine copula theory. Traditional methods such as HSSA can only handle independent variables, while HSSA-SV only considers bivariate correlations between dependent responses caused by shared variables from the lower level subsystems. Hence, the MHSA method has wider applicability and high precision since it guarantees an accurate description of the input dependence. In addition, the traditional method based on importance sampling needs many samples to ensure computational convergence, whereas the MHSA in this paper adopts the PCE method to construct the stochastic model, which requires far fewer samples. Therefore, the proposed method is better suited to problems with high computational cost, such as the multiscale composite material problem and the parts-component-product design problem involving high-precision simulation models.

The errors of the MHSA method come from three sources: the first is the model approximation of the PCE for obtaining the subsystem sensitivity indices; the second is the dependence modeling by the vine copula theory and the Rosenblatt transformation; and the last, which is also the most important, is the linear assumption of the aggregation formulation. The error from the first source can be minimized if the number of samples is large enough to obtain an accurate approximate model. More than 10*n* or 20*n* samples (*n* is the variable dimension) are recommended for constructing the PCE model with independent variables or transformed variables. Of course, the number of samples should be chosen in view of the nonlinearity and computational cost of the problem, and the accuracy of the PCE model can be assessed by validation methods [19]. The error from the dependence modeling can be reduced by selecting the appropriate copula structure and copula family based on enough samples. Since the samples for dependence modeling are generated from the established PCE model and incur essentially no computational cost in theory, sufficient samples are available. A previous study also shows that the accuracy of the PCE method combined with the vine copula theory is high enough for the mean value and variance, which are related to the sensitivity indices [16]. The error from the linear assumption becomes a challenge to an accurate measure if the upper level model is strongly nonlinear with respect to the transformed variables, because strong nonlinearity brings strong interactions between variables, which need to be quantified by high-order sensitivity indices. This paper considers only the first-order sensitivity index because the integration and assessment of higher-order indices would be challenging when combined with the complicated correlations.

## 5 Examples

In this section, a mathematical example is first provided to verify the accuracy and efficiency of the proposed MHSA framework for multilevel systems with two kinds of shared variables. In this example, three other methods are introduced first, namely, AIO, HSSA, and HSSA-SV. The results of the AIO method are used as a benchmark to compare the computational accuracy of HSSA, HSSA-SV, and MHSA. Meanwhile, the example investigates the effects of the nonlinearity and variable interaction of the subsystems on the sensitivity indices. The MHSA method is then applied to a 3D woven orthogonal composite (3DWOC) material system, cited from Ref. [16], to obtain the sensitivity indices of the property parameters and geometry parameters on the macroscopic elastic properties. The computational results are compared with those of AIO, HSSA, and HSSA-SV to illustrate the advantages of the MHSA method for engineering problems.

### 5.1 Example 1.

This example has shared variables at the same level, $X_1$ and $X_2$, which are shared by the subsystems $g^{L1}$, $g^{L2}$, and $g^{L3}$. All the components of $\mathbf{X}$ are assumed to be independent and to follow Gaussian distributions: $X_1 \sim N(1, 0.1)$, $X_2 \sim N(2, 0.1)$, $X_3 \sim N(3, 0.1)$, and $X_4 \sim N(4, 0.1)$. The intermediate variables $(Y_1, Y_2, Y_3)$ are mutually dependent because of the existence of $X_1$ and $X_2$. Meanwhile, the upper level subsystem $g^{U}$ and the lower level subsystem $g^{L1}$ have a shared variable across levels, i.e., $X_3$. Hence, the variable $X_3$ is dependent with the intermediate variable $Y_1$.

#### 5.1.1 Comparison Approaches.

Scenario 1: AIO

The all-in-one method only considers the direct relationship between the inputs, i.e., $(X_1, X_2, X_3, X_4)$, and the output of the entire system, i.e., $Z$, while ignoring the intermediate variables. The sensitivity results of the AIO method are used as a benchmark to verify the computational accuracy of the other methods, i.e., HSSA, HSSA-SV, and MHSA. In the AIO method, MCS is adopted to obtain the Sobol' indices $(S_{X_1}^{Z}, S_{X_2}^{Z}, S_{X_3}^{Z}, S_{X_4}^{Z})$, and the number of samples is $10^5$.
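A pick-freeze Monte Carlo estimator of the kind used for such an AIO benchmark can be sketched as follows (the toy model $Z = X_1 + 2X_2$ and all function names are illustrative, not the example's actual subsystem equations):

```python
import random

def sobol_first_order_mc(g, sample_input, n_mc=20000, seed=0):
    """Pick-freeze Monte Carlo estimator of first-order Sobol' indices
    for an all-in-one model g; sample_input draws one input vector."""
    rng = random.Random(seed)
    A = [sample_input(rng) for _ in range(n_mc)]
    B = [sample_input(rng) for _ in range(n_mc)]
    yA, yB = [g(x) for x in A], [g(x) for x in B]
    mean = sum(yA) / n_mc
    var = sum((y - mean) ** 2 for y in yA) / (n_mc - 1)
    n = len(A[0])
    S = []
    for i in range(n):
        # B with its i-th coordinate "frozen" from A
        yABi = [g([a[k] if k == i else b[k] for k in range(n)])
                for a, b in zip(A, B)]
        # E[g(A) * (g(AB_i) - g(B))] estimates V_i = Var(E[Z | X_i])
        Vi = sum(ya * (yi - yb) for ya, yi, yb in zip(yA, yABi, yB)) / n_mc
        S.append(Vi / var)
    return S

# Toy AIO model Z = X1 + 2*X2 with X_i ~ N(0,1): exact S1 = 0.2, S2 = 0.8
S1, S2 = sobol_first_order_mc(lambda x: x[0] + 2.0 * x[1],
                              lambda rng: [rng.gauss(0, 1), rng.gauss(0, 1)])
```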

Scenario 2: HSSA

The dependence among (*Y*_{1}, *Y*_{2}, *Y*_{3}, *X*_{3}) is neglected by the HSSA method, which aggregates the subsystem indices under an independence assumption. In the HSSA formulation for the input variables, $t~HSSA(i)$ is an element of the linear regression coefficient vector $t~HSSA$ for the responses **Y**, which can be calculated by Eq. (30), wherein $\xi T=[1,Y1,Y2,Y3]$. Within the HSSA approach, importance sampling is used to calculate the sensitivity indices, and 10^{5} Monte Carlo samples are generated to obtain the results.
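The vector $t~HSSA$ is, in effect, an ordinary least-squares fit of the system response on the regressors $\xi T=[1,Y1,Y2,Y3]$. A minimal sketch follows; the intermediate responses and regression coefficients below are hypothetical stand-ins, not the paper's actual subsystem functions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
# Hypothetical inputs and intermediate responses Y1..Y3 (mutually
# dependent through the shared inputs x1 and x2).
x1 = rng.normal(1.0, 0.5, n)
x2 = rng.normal(2.0, 0.5, n)
Y = np.column_stack([x1 + x2, 2.0 * x1 - x2, x1 * x2])

# Hypothetical upper level response with known regression coefficients.
Z = 3.0 + 1.5 * Y[:, 0] - 0.5 * Y[:, 1] + 2.0 * Y[:, 2] + rng.normal(0, 0.01, n)

# Design matrix corresponding to xi^T = [1, Y1, Y2, Y3] in Eq. (30).
Xi = np.column_stack([np.ones(n), Y])
t_hssa, *_ = np.linalg.lstsq(Xi, Z, rcond=None)
```

Because the stand-in response is linear in the regressors, the fit recovers the assumed coefficients (3.0, 1.5, −0.5, 2.0) up to sampling noise.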

Scenario 3: HSSA-SV

The set of dependent variables (*Y*_{1}, *Y*_{2}, *Y*_{3}, *X*_{3}) is regarded as a subset denoted by Λ. In the extended HSSA-SV formulation for the input variables, Ω_{s} is the set whose elements are the indices of the lower level models that contain the variable *X*, and $S\Lambda Z$ is the subset sensitivity. $t~SV(i)$ is an element of the linear regression coefficient vector $t~SV$ for the responses **Y** and the shared variables *X*_{sU}, which can be calculated by Eq. (30), wherein $\xi T=[1,Y1,Y2,Y3,X3]$. $CovXsYiYj$ denotes the first-order covariance contribution from the shared variables; more detailed information can be found in Ref. [21]. As with the HSSA method, importance sampling is used and 10^{5} Monte Carlo samples are generated to obtain the sensitivity indices.

#### 5.1.2 Results.

Following the steps of the proposed computational framework, 50 samples and 200 samples are generated first for constructing the PCE models of the lower and upper level subsystems, respectively. Then 500 stochastic responses of each intermediate variable *Y* are obtained from the established PCE models using MCS with the prescribed PDFs of **X**, and kernel density estimation is used to obtain the marginal PDF of each *Y* from these stochastic responses. The optimal dependence structure of (*Y*_{1}, *Y*_{2}, *Y*_{3}, *X*_{3}) is then constructed by combining the vine copula theory with the marginal PDFs. Figure 8 shows the tree structure, copula family, and copula parameters of each bivariate copula in each tree in the case of *ω* = 2, υ = 2. The selected bivariate copulas in the tree are the *t* copula, Gaussian copula, and Frank copula, respectively. Figure 9 compares the contours of the bivariate PDFs constructed by the vine copula with the original samples in tree 1 in the case of *ω* = 2, υ = 2. The results show that the bivariate PDFs reflect the distribution characteristics of the scatter points very well, which means that the dependence model is constructed with high accuracy.
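A minimal sketch of the mapping step follows, using a bivariate Gaussian copula (one of the families appearing in Fig. 8) and its Rosenblatt transformation to turn dependent samples into independent uniform ones; the correlation parameter is illustrative, not a value fitted in the example.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
n, rho = 10_000, 0.6                       # illustrative copula parameter
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
z = rng.standard_normal((n, 2)) @ L.T      # dependent Gaussian pair

# Rosenblatt transformation for the Gaussian copula:
#   u1 = F(z1),  u2 = F(z2 | z1)  with  z2 | z1 ~ N(rho*z1, 1 - rho^2).
u1 = norm.cdf(z[:, 0])
u2 = norm.cdf((z[:, 1] - rho * z[:, 0]) / np.sqrt(1.0 - rho**2))

# (u1, u2) are independent U(0, 1) samples, ready to feed a PCE built
# in the independent space.
corr_before = np.corrcoef(z[:, 0], z[:, 1])[0, 1]
corr_after = np.corrcoef(u1, u2)[0, 1]
```

For vine copulas the same conditional-CDF idea is applied tree by tree; this two-variable case shows the mechanics in isolation.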

Tables 1 and 2 give the indices of the lower level and upper level subsystems computed by the MHSA method in the case of *ω* = 2, υ = 2. The system’s sensitivity indices can then be calculated by Eqs. (26)–(28). Different values are assigned to *ω* and υ to study the impact of functional nonlinearity and variable interaction on the accuracy of the MHSA. Figure 10 shows the system’s sensitivity indices of the four cases. When *ω* is kept constant and υ increases from 0 to 1, the magnitudes of the sensitivity indices of all inputs change, which can be seen by comparing (a) with (b) or (c) with (d). This is because the nonlinearity and interaction of the lower level subsystem 1 are controlled by υ: υ directly affects the sensitivity indices of *X*_{1}, *X*_{2}, and *X*_{3} for *Y*_{1}, and thus indirectly influences the system’s sensitivity indices. And *ω*, which controls the nonlinearity of the upper level subsystem, may further amplify this effect.

| | | X_{1} | X_{2} | X_{3} | Sum |
|---|---|---|---|---|---|
| $SXY$ | Y_{1} | 0.4627 | 0.0751 | 0.4593 | 0.9971 |
| | Y_{2} | 0.5012 | 0.4988 | – | 1 |
| | Y_{3} | 0.9863 | 0.0137 | – | 1 |
| $VXY$ | Y_{1} | 1.0073 | 0.1698 | 1.0000 | 2.1771 |
| | Y_{2} | 0.0402 | 0.0400 | – | 0.0802 |
| | Y_{3} | 0.0900 | 0.0013 | – | 0.0913 |
| $VXY\xaf$ | $Y\xaf1$ | 0.8505 | 0.1360 | 0.0001 | 0.9866 |
| | $Y\xaf2$ | 0.2118 | 0.0975 | – | 0.3093 |
| | $Y\xaf3$ | 0.1353 | 0.7537 | – | 0.8890 |


| | $Y\xaf1$ | $Y\xaf2$ | $Y\xaf3$ | X_{3} | X_{4} | Sum |
|---|---|---|---|---|---|---|
| $SY\xafZ$ | 0.4801 | 0.0012 | 0.0006 | 0.3590 | 0.1569 | 0.9978 |
| $VY\xafZ$ | 1.9888e5 | 478.9204 | 249.9861 | 1.4872e5 | 6.5002e4 | 4.1333e5 |
| $t~$ | 449.3465 | 19.9364 | 14.2820 | 400.9061 | – | – |


Figure 10 also depicts the comparison of the sensitivity indices obtained from the HSSA, HSSA-SV, and MHSA. Table 3 shows the computational results and relative errors. Because sensitivity indices smaller than 0.1 are themselves very small, even a small absolute difference produces a large relative error; therefore, only the sensitivity indices larger than 0.1 are discussed here. It is easy to see from Fig. 10 that the HSSA method performs worst in all cases. It does not even provide the same ranking as the AIO method in cases (b) and (d). This is because the HSSA method only handles hierarchical systems with independent subsystems. The HSSA-SV approach performs better than the HSSA approach and provides the same ranking in all cases. However, it has large relative errors, 38.279% and 16.961%, in calculating the sensitivity indices of *X*_{1} in case (b) and *X*_{3} in case (c). In terms of overall results, the indices obtained by the proposed MHSA approach are almost the same as those obtained by the AIO method. The maximum relative error of the MHSA method is 13.50%, and almost all the relative errors are less than 10%. The average relative errors of the HSSA, HSSA-SV, and MHSA are 20.930%, 6.894%, and 4.349%, respectively, which indicates that the MHSA method has the highest calculation accuracy for systems with shared variables. In short, the proposed MHSA approach performs best in terms of computational accuracy compared with the HSSA and HSSA-SV methods.
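The relative errors in Table 3 are simply |S_method − S_AIO| / S_AIO. The snippet below reproduces this bookkeeping for case (a) with the tabulated values; small discrepancies with the printed percentages come from rounding in the table.

```python
# First-order indices for case (a), copied from Table 3.
aio  = {"X1": 0.089, "X2": 0.009, "X3": 0.193, "X4": 0.709}
mhsa = {"X1": 0.099, "X2": 0.008, "X3": 0.188, "X4": 0.713}

def rel_err_pct(est, ref):
    """Relative error in percent against the AIO benchmark."""
    return abs(est - ref) / ref * 100.0

# Relative errors are reported only for indices larger than 0.1.
errs = {k: rel_err_pct(mhsa[k], aio[k]) for k in aio if aio[k] > 0.1}

# Both methods rank the inputs identically in this case.
ranking = lambda d: sorted(d, key=d.get, reverse=True)
same_ranking = ranking(aio) == ranking(mhsa)
```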

| Case | Method | X_{1} | X_{2} | X_{3} | X_{4} |
|---|---|---|---|---|---|
| (a) | HSSA | 0.032 | 0.006 | 0.188 (2.603%) | 0.734 (3.607%) |
| | HSSA-SV | 0.090 | 0.010 | 0.191 (0.979%) | 0.722 (1.838%) |
| | MHSA | 0.099 | 0.008 | 0.188 (2.534%) | 0.713 (0.680%) |
| | AIO | 0.089 | 0.009 | 0.193 | 0.709 |
| (b) | HSSA | 0.201 (57.591%) | 0.032 | 0.195 (8.115%) | 0.699 (3.581%) |
| | HSSA-SV | 0.177 (38.279%) | 0.033 | 0.184 (1.961%) | 0.672 (0.488%) |
| | MHSA | 0.138 (8.473%) | 0.023 | 0.191 (5.731%) | 0.675 (0.005%) |
| | AIO | 0.128 | 0.016 | 0.181 | 0.675 |
| (c) | HSSA | 0.041 (69.752%) | 0.011 | 0.582 (3.749%) | 0.262 (3.900%) |
| | HSSA-SV | 0.125 (7.255%) | 0.008 | 0.502 (16.961%) | 0.252 (0.001%) |
| | MHSA | 0.116 (13.500%) | 0.005 | 0.613 (1.402%) | 0.243 (3.394%) |
| | AIO | 0.135 | 0.008 | 0.604 | 0.252 |
| (d) | HSSA | 0.483 (17.644%) | 0.079 | 0.489 (34.006%) | 0.200 (25.684%) |
| | HSSA-SV | 0.394 (3.941%) | 0.063 | 0.372 (2.069%) | 0.162 (2.062%) |
| | MHSA | 0.385 (6.315%) | 0.062 | 0.367 (0.575%) | 0.151 (5.182%) |
| | AIO | 0.411 | 0.063 | 0.365 | 0.159 |


As for computational efficiency, the HSSA and HSSA-SV methods require more samples (10^{5} in this example) because they use importance sampling. In particular, the HSSA-SV method may require even more samples because it needs to calculate the first-order covariance contribution $CovXsYiYj$ accurately. In contrast, the MHSA needs far fewer samples (350 in this example) because it obtains the sensitivity indices from the PCE approach. The samples needed for the dependence model construction based on the vine copula theory are generated from the established PCE model rather than the original subsystem models, so their computational cost is negligible.
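The sample-count advantage stems from the fact that, once a PCE is fitted, Sobol' indices follow analytically from the expansion coefficients with no further model runs. A minimal sketch with an orthonormal Hermite basis and a stand-in two-input model (not one of the paper's subsystems):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2_000
xi = rng.standard_normal((n, 2))                 # standardized inputs

def hermite_basis(xi):
    """Orthonormal Hermite basis up to total order 2 (constant first)."""
    x, y = xi[:, 0], xi[:, 1]
    return np.column_stack([
        np.ones_like(x),
        x, y,
        (x**2 - 1) / np.sqrt(2.0), x * y, (y**2 - 1) / np.sqrt(2.0),
    ])

# Stand-in model; in the MHSA framework this would be a subsystem solver.
f = 1.0 + 2.0 * xi[:, 0] + xi[:, 1] + 0.5 * xi[:, 0] * xi[:, 1]

c, *_ = np.linalg.lstsq(hermite_basis(xi), f, rcond=None)
var = np.sum(c[1:] ** 2)                         # variance from coefficients
S1 = (c[1]**2 + c[3]**2) / var                   # terms in xi_1 only
S2 = (c[2]**2 + c[5]**2) / var                   # terms in xi_2 only
S12 = c[4]**2 / var                              # interaction term
```

Since the basis is orthonormal, partial variances are just sums of squared coefficients, so a few hundred regression samples replace the 10^{5} model evaluations a sampling estimator would need.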

### 5.2 Example 2: A Multiscale Composite Material.

This example presents a multiscale system of 3DWOC cited from Ref. [16], as shown in Fig. 11. The 3DWOC consists of warp yarns, weft yarns, binder yarns, and matrix. The lower scale is that of the carbon fibers within the yarns. The properties of a yarn are computed from the properties of the fibers and matrix together with the fiber volume fraction of the yarn, which is determined by the geometry parameters. In the upper level subsystem, the properties of the unit cell are obtained from the properties of the yarns and the geometry parameters.
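The first upscaling step can be sketched with the classical Voigt rule of mixtures: the fiber volume fraction, set by the geometry parameters, blends the fiber and matrix moduli into a yarn modulus. This is an illustrative homogenization stand-in, not the micromechanics model of Ref. [16], and the volume fraction below is hypothetical.

```python
def yarn_longitudinal_modulus(E11f, Em, vf):
    """Voigt rule of mixtures for the longitudinal modulus of a yarn (GPa)."""
    return vf * E11f + (1.0 - vf) * Em

# Fiber and matrix moduli from Table 4 (GPa); hypothetical fiber volume
# fraction for the warp yarn.
vf_warp = 0.7
C11_warp = yarn_longitudinal_modulus(230.0, 3.0, vf_warp)
```

Because the geometry parameters enter through the volume fraction here and again through the yarn volume fraction at the unit-cell level, they act as shared variables across levels.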

The HSSA, HSSA-SV, MHSA, and AIO methods are applied to the 3DWOC to obtain the impacts of the property parameters and geometry parameters, which are given in Table 4, on the macroscopic elastic properties. Figure 11 gives the system structure of this application. For the first upscaling process, the input geometry parameters for the warp, weft, and binder yarns are (*W*_{warp}, *H*_{warp}), (*W*_{weft}, *H*_{weft}), and (*W*_{bin}, *H*_{bin}), respectively. The input property parameters are *X*_{sL} = (*E*_{11f}, *E*_{22f}, *G*_{12f}, *G*_{23f}, *v*_{12f}, *E*_{m}, *v*_{m}), which are the shared variables at the same level. The longitudinal and transverse moduli of the yarns, **Y** = (*C*_{11warp}, *C*_{22warp}, *C*_{11weft}, *C*_{22weft}, *C*_{11bin}, *C*_{22bin}), are selected as the outputs of the first upscaling process since they have a stronger effect on the properties of the unit cell than the other parameters. For the second upscaling process, the inputs are composed of the geometry parameters and **Y**. Hence, the geometry parameters are also shared variables across levels, denoted as *X*_{sU}, because they determine not only the fiber volume fraction but also the yarn volume fraction. The selected system outputs are (*E*_{x}, *E*_{y}, *G*_{xy}, *v*_{xy}).

| Parameter | Description | Mean | Classification |
|---|---|---|---|
| W_{warp} | Width of warp yarn | 1.20 mm | Geometry parameter |
| H_{warp} | Height of warp yarn | 0.50 mm | |
| W_{weft} | Width of weft yarn | 2.40 mm | |
| H_{weft} | Height of weft yarn | 0.30 mm | |
| W_{bin} | Width of binder yarn | 0.80 mm | |
| H_{bin} | Height of binder yarn | 0.50 mm | |
| E_{11f} | Longitudinal modulus | 230 GPa | Property parameter |
| E_{22f} | Transverse modulus | 15 GPa | |
| G_{12f} | Axial shear modulus | 24 GPa | |
| G_{23f} | Transverse shear modulus | 5.03 GPa | |
| v_{12f} | Longitudinal Poisson’s ratio | 0.20 | |
| E_{m} | Modulus of matrix | 3.00 GPa | |
| v_{m} | Poisson’s ratio of matrix | 0.35 | |


The regression coefficient vector for the responses **Y** in the HSSA method, and the regression coefficient vector for the responses **Y** and the shared variables *X*_{sU} in the HSSA-SV method, are both obtained by Eq. (30). It is important to note that the elements of **Y** are dependent on the elements of *X*_{sU}, and these dependent variables are regarded as a subset denoted by Λ when using the HSSA-SV method. 10^{5} samples are generated in the HSSA, HSSA-SV, and AIO methods. Following the steps of the MHSA framework, 100 samples and 500 samples are generated for constructing the PCE models of the lower and upper level subsystems, respectively. Owing to the existence of the shared variables, the inputs of the upper level subsystem are mutually dependent, and the dependence model needs to be acquired by the vine copula theory. The dependence modeling of the 3DWOC has been discussed in detail in Ref. [16] and will not be repeated in this paper.

The sensitivity indices obtained from the HSSA, HSSA-SV, MHSA, and AIO methods are given in Table 5 in Appendix B and depicted in Fig. 12. Among the outputs, *E*_{x} and *E*_{y} have almost identical sensitivity results. It can be seen from Table 5 that the HSSA method performs worst. The HSSA can provide a good estimate of the mean, but its estimate of the variance is poor in some cases (*G*_{xy}), because the HSSA method neglects the variable correlations. It does not even provide the same ranking as the AIO method. In contrast, the HSSA-SV and MHSA methods provide almost the same ranking, and the MHSA approach works better. The results for some variables obtained with the HSSA-SV method have large errors, such as *E*_{m} in case (c) and *v*_{12f}, *E*_{m}, and *v*_{m} in case (d). In addition, the HSSA-SV method sometimes produces negative results, such as the sensitivity index of *W*_{bin} in case (d), which is −7.52E-04, because the first-order covariance contribution $CovXsYiYj$ in Eq. (33) may be negative in some situations. In conclusion, the proposed MHSA provides the best results compared with the HSSA and HSSA-SV methods, even though this application has complicated coupling relationships among its subsystems. For the moduli *E*_{x} and *E*_{y}, the variables *H*_{warp}, *H*_{weft}, and *E*_{11f} are the most important, especially *E*_{11f}. *H*_{warp}, *W*_{warp}, and *E*_{m} have a great influence on the shear modulus *G*_{xy} of the unit cell. Meanwhile, *E*_{11f}, *E*_{22f}, *v*_{12f}, *E*_{m}, and *v*_{m} play important roles in the Poisson’s ratio *v*_{xy}. Hence, these variables will be considered in future work on optimization and design under uncertainty.

## 6 Conclusions

A mapping-based hierarchical sensitivity analysis framework is proposed in this paper to handle engineering systems with complicated correlations among variables across levels. The input variables of the upper level subsystems are mutually dependent owing to the existence of shared variables. Considering that PCE requires mutually independent variables, a dependence model and uncertainty transformation methods are adopted to map the dependent variables onto independent variables. The multivariate dependence model can be built effectively by the vine copula theory, and the vine copula model combined with the Rosenblatt transformation provides independent data. Then, the sensitivity indices of the subsystems for the transformed variables, called marginal sensitivity indices, can be obtained by the construction and postprocessing of the PCE model. Finally, a mapping-based aggregation formulation is developed to integrate the sensitivity indices of the lower level subsystems and the marginal sensitivity indices of the upper level subsystems to estimate the global effect of inputs at different levels on the response.

The effectiveness of the proposed MHSA method is illustrated by a mathematical example and a multilevel composite material. The proposed computational framework provides sufficiently accurate sensitivity indices compared with the AIO method and performs best in terms of computational accuracy compared with the HSSA and HSSA-SV methods. Meanwhile, the MHSA needs much fewer samples because it adopts the PCE method to construct the stochastic model. Therefore, the proposed method is more suitable for the multilevel systems with high computational cost.

However, some challenging tasks remain for future research, one of which derives from the nonlinearity of the upper level subsystems. Research required to deal with nonlinearity includes the quantitative analysis of the estimation error across multiple levels and the estimation of higher-order sensitivity indices caused by nonlinearity. In addition, multilevel systems with mutually dependent input variables are also a common scenario, and further research is needed to extend the proposed method to this kind of problem. These issues will be addressed to improve the accuracy and applicability of the proposed method in future work.

## Funding Data

The authors would like to acknowledge the support from Key National Natural Science Foundation of China (Grant No. U1864211), National Natural Science Foundation of China (Grant No. 11772191), and National Science Foundation for Young Scientists of China (Grant No. 51705312).

## Data Availability Statement

The datasets generated and supporting the findings of this article are obtainable from the corresponding author upon reasonable request. The authors attest that all data for this study are included in the paper. No data, models, or code were generated or used for this paper.

### Appendix A: Derivation of Eqs. (26)–(28)

*t*_{i} (*i* = 0, 1, …, *m* + *k*) is the mean of the function $Ti(XLU)$. If the upper model function is linear with respect to $Y\xaf$ and *X*_{sU}, the coefficients *B*_{i} (*i* = 0, 1, …, *m* + *k*) in Eq. (30) are equivalent to *t*_{i}. The shared variable *X*_{sUj} can be expressed through the *i*th lower level model function, so the upper model can be rewritten as a function of **X** and its mean taken accordingly. In order to calculate the variance in Eq. (A6), the corresponding function of *X*_{sL} needs to be obtained. The global contribution of *X*_{sL} to the variance of *Z* can then be computed, from which the sensitivity index of *X*_{sL} is derived. Similarly, the contribution of *X*_{sU} can be expressed, the global contribution of *X*_{sU} from both the lower and upper models can be calculated, and the variance of *Z* can be computed, which yields Eqs. (26)–(28).

### Appendix B

| Output | Method | Mean | Var | W_{warp} | H_{warp} | W_{weft} | H_{weft} | W_{bin} | H_{bin} | E_{11f} | E_{22f} | G_{12f} | G_{23f} | v_{12f} | E_{m} | v_{m} |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| E_{x} | HSSA | 50.06 | 8.21 | 2.33 E-01 | 9.88 E-02 | 9.57 E-02 | 4.22 E-02 | 9.57 E-02 | 2.93 E-02 | 1.61 E-01 | 7.95 E-03 | 3.92 E-05 | 1.01 E-06 | 3.82 E-03 | 1.19 E-03 | 2.10 E-04 |
| | HSSA-SV | 50.05 | 7.39 | 2.85 E-03 | 1.25 E-01 | 1.55 E-03 | 8.33 E-02 | 1.78 E-03 | 5.81 E-02 | 7.31 E-01 | 4.42 E-04 | 0 | 0 | 1.81 E-04 | 3.25 E-04 | 0 |
| | MHSA | 49.97 | 7.85 | 5.65 E-04 | 1.28 E-01 | 5.53 E-03 | 8.69 E-02 | 1.06 E-03 | 5.39 E-02 | 7.31 E-01 | 9.37 E-04 | 7.04 E-05 | 1.08 E-06 | 3.07 E-04 | 3.20 E-04 | 1.95 E-04 |
| | AIO | 50.05 | 7.52 | 2.91 E-04 | 1.22 E-01 | 3.48 E-05 | 8.20 E-02 | 2.18 E-05 | 5.72 E-02 | 7.34 E-01 | 1.83 E-03 | 4.25 E-05 | 1.18 E-06 | 5.97 E-04 | 1.41 E-03 | 1.44 E-04 |
| E_{y} | HSSA | 39.22 | 4.71 | 2.32 E-01 | 9.59 E-02 | 1.02 E-01 | 3.92 E-02 | 9.46 E-02 | 2.82 E-02 | 1.59 E-01 | 3.55 E-03 | 2.95 E-05 | 2.51 E-07 | 1.53 E-03 | 9.06 E-04 | 1.51 E-04 |
| | HSSA-SV | 39.22 | 4.35 | 2.68 E-03 | 1.48 E-01 | 2.06 E-03 | 7.56 E-02 | 1.68 E-03 | 6.10 E-02 | 7.05 E-01 | 3.37 E-03 | 0 | 0 | 1.38 E-03 | 1.16 E-03 | 0 |
| | MHSA | 39.16 | 4.65 | 1.49 E-03 | 1.49 E-01 | 2.96 E-03 | 7.92 E-02 | 1.00 E-03 | 5.70 E-02 | 7.10 E-01 | 4.11 E-03 | 6.84 E-05 | 1.12 E-06 | 1.59 E-03 | 9.11 E-04 | 2.59 E-04 |
| | AIO | 39.22 | 4.43 | 1.03 E-04 | 1.44 E-01 | 5.62 E-04 | 7.56 E-02 | 1.09 E-05 | 5.96 E-02 | 7.07 E-01 | 6.80 E-03 | 2.49 E-05 | 1.04 E-07 | 2.07 E-03 | 3.06 E-03 | 3.51 E-04 |
| G_{xy} | HSSA | 2.69 | 6.65 E-01 | 1.61 E-01 | 2.06 E-01 | 3.22 E-02 | 1.91 E-02 | 1.53 E-02 | 1.12 E-02 | 3.22 E-02 | 1.09 E-02 | 7.99 E-06 | 8.52 E-07 | 4.61 E-03 | 3.63 E-03 | 4.61 E-04 |
| | HSSA-SV | 2.59 | 3.71 E-02 | 2.01 E-01 | 4.31 E-01 | 6.84 E-03 | 2.63 E-02 | 1.08 E-02 | 2.67 E-02 | 3.73 E-02 | 1.46 E-02 | 0 | 0 | 6.32 E-03 | 3.47 E-02 | 0 |
| | MHSA | 2.59 | 3.27 E-02 | 1.96 E-01 | 4.35 E-01 | 6.62 E-03 | 3.16 E-02 | 4.04 E-03 | 6.01 E-02 | 4.53 E-03 | 2.97 E-02 | 3.00 E-04 | 7.29 E-05 | 1.45 E-02 | 1.99 E-01 | 4.85 E-03 |
| | AIO | 2.59 | 3.72 E-02 | 1.77 E-01 | 4.45 E-01 | 2.20 E-03 | 2.24 E-02 | 1.98 E-03 | 4.09 E-02 | 5.36 E-05 | 1.65 E-05 | 1.54 E-04 | 1.04 E-02 | 5.77 E-03 | 2.56 E-01 | 1.68 E-02 |
| v_{xy} | HSSA | 4.85 E-02 | 2.37 E-05 | 8.03 E-02 | 1.05 E-01 | 8.44 E-02 | 9.57 E-02 | 5.81 E-02 | 5.77 E-02 | 8.25 E-02 | 9.20 E-02 | 3.01 E-05 | 7.13 E-06 | 3.90 E-02 | 3.72 E-02 | 4.54 E-03 |
| | HSSA-SV | 4.83 E-02 | 2.13 E-05 | 5.35 E-03 | 1.37 E-02 | 5.63 E-03 | 1.26 E-02 | -7.52 E-04 | 1.62 E-03 | 1.59 E-01 | 3.71 E-01 | 0 | 0 | 1.52 E-01 | 1.26 E-01 | 0 |
| | MHSA | 4.84 E-02 | 1.86 E-05 | 3.37 E-03 | 1.08 E-02 | 1.31 E-02 | 1.33 E-02 | 2.10 E-03 | 1.15 E-04 | 1.68 E-01 | 3.73 E-01 | 2.24 E-05 | 1.19 E-05 | 1.96 E-01 | 8.75 E-02 | 5.06 E-02 |
| | AIO | 4.83 E-02 | 2.10 E-05 | 3.88 E-03 | 1.30 E-02 | 6.79 E-03 | 1.33 E-02 | 7.75 E-05 | 1.65 E-03 | 1.71 E-01 | 3.46 E-01 | 5.48 E-02 | 1.54 E-05 | 1.91 E-01 | 6.53 E-02 | 1.29 E-01 |
