Regression mean residual life of a system with three dependent components with normal lifetimes

Accepted Manuscript

Mohsen Madadi, Parisa Khalilzadeh, Ahad Jamalizadeh

PII: S0167-7152(15)00053-X
DOI: http://dx.doi.org/10.1016/j.spl.2015.02.003
Reference: STAPRO 7219

To appear in: Statistics and Probability Letters

Received date: 13 August 2014
Revised date: 4 February 2015
Accepted date: 4 February 2015

Please cite this article as: Madadi, M., Khalilzadeh, P., Jamalizadeh, A., Regression mean residual life of a system with three dependent components with normal lifetimes. Statistics and Probability Letters (2015), http://dx.doi.org/10.1016/j.spl.2015.02.003


Regression Mean Residual Life of a System with Three Dependent Components with Normal Lifetimes

Mohsen Madadi, Parisa Khalilzadeh and Ahad Jamalizadeh
Department of Statistics, Faculty of Mathematics and Computer, Shahid Bahonar University of Kerman, Kerman, Iran

February 4, 2015

Abstract: In this paper, we use the joint and conditional distributions of order statistics from a trivariate normal distribution to obtain closed expressions for reliability measures, such as the mean residual lifetime, for systems with three dependent components with normal lifetimes.

Keywords and Phrases: Mean residual lifetime; mean inactivity time; truncated distribution; unified skew-normal distribution; order statistics

1 Introduction

The behavior of biological or engineering systems can be represented by a random vector $X = (X_1, \ldots, X_n)^T$ whose elements represent the lifetimes of the components of the system. Thus, the lifetime of a series system is $X_{(1)} = \min(X_1, \ldots, X_n)$, that of a parallel system is $X_{(n)} = \max(X_1, \ldots, X_n)$ and, in general, the lifetime of a $k$-out-of-$n$ system is $X_{(n-k+1)}$, where $(X_{(1)}, \ldots, X_{(n)})^T$ is the vector of order statistics arising from $X$. Numerous studies have been carried out on the reliability properties of systems with i.i.d. components (Asadi and Bayramoglu, 2006; Tavangar and Asadi, 2010). In practice, however, the components of a system are often dependent. Recently, some authors have studied systems with dependence structures (e.g., Navarro et al., 2007; Navarro et al., 2010; Navarro et al., 2013; Gupta, 2012). Sometimes the dependencies between the lifetimes of different components in a system do not depend on the labeling of the components. Such dependencies can then be modeled using order statistics from exchangeable random vectors (Loperfido et al., 2007; Zhang, 2010). Olkin and Viana (1995) and Viana (1998) considered applications of order statistics from an exchangeable normal vector in the visual sciences. In reliability theory and survival analysis, the study of the lifetime properties of a component (or any other living organism) is based on several measures, such as the mean residual lifetime (MRL) function and the mean inactivity time (MIT). Let $T$ be a lifetime random variable with mean residual lifetime $MR(t)$ and mean inactivity time $MI(t)$; then $MR(t) = E(T - t \mid T > t)$ and $MI(t) = E(t - T \mid T < t)$. In this paper we consider a system consisting of three dependent components and compute expressions for the MRL and MIT of the components when the joint distribution of the lifetimes is normal.
For this purpose, we first introduce the unified skew-normal (SUN) and truncated unified skew-normal (TSUN) distributions and show that the joint, marginal and conditional distributions of order statistics from a trivariate normal distribution are mixtures of TSUN distributions. Closed expressions for the reliability measures of systems with three dependent components with normal lifetimes are then obtained using the properties of the TSUN and these mixture representations.
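As a quick numerical illustration of the two measures just defined, the MRL and MIT of a three-component system can be approximated by simulation. The sketch below uses illustrative parameter values of our own choosing, not values from this paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (assumed) lifetime parameters; any positive-definite Sigma works.
mu = np.array([10.0, 10.0, 10.0])
Sigma = np.array([[4.0, 1.0, 1.0],
                  [1.0, 4.0, 1.0],
                  [1.0, 1.0, 4.0]])

X = rng.multivariate_normal(mu, Sigma, size=200_000)
X.sort(axis=1)            # each row is now (X_(1), X_(2), X_(3))

def MR(T, t):
    """Monte Carlo estimate of the mean residual life E(T - t | T > t)."""
    return (T[T > t] - t).mean()

def MI(T, t):
    """Monte Carlo estimate of the mean inactivity time E(t - T | T < t)."""
    return (t - T[T < t]).mean()

t = 9.0
print(MR(X[:, 0], t))     # series system, T = X_(1)
print(MI(X[:, 2], t))     # parallel system, T = X_(3)
```

Such a simulation is a useful cross-check on the closed-form expressions derived later, since the two estimators above converge to $MR(t)$ and $MI(t)$ as the sample size grows.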

The remainder of this article is organized as follows. Section 2 contains a brief review of the SUN and TSUN distributions. In Section 3, we derive the joint, marginal and conditional distributions of order statistics from a trivariate normal distribution, in particular when the variables are exchangeable. In Section 4, we compute the MIT and MRL for a system with three dependent components with normal lifetimes. A conclusion is given in Section 5.

2 Preliminaries

In this section we first introduce the multivariate SUN and singular unified skew-normal (SSUN) distributions. We then define TSUN distributions in the univariate and bivariate cases and discuss their properties in two subsections.

Let $U$ and $V$ be random vectors of dimensions $m$ and $n$, respectively, having the joint normal distribution
$$\begin{pmatrix} U \\ V \end{pmatrix} \sim N_{m+n}\left( \begin{pmatrix} \eta \\ \xi \end{pmatrix}, \begin{pmatrix} \Gamma & \Lambda^T \\ \Lambda & \Omega \end{pmatrix} \right). \qquad (1)$$
If $\left(\begin{smallmatrix} \Gamma & \Lambda^T \\ \Lambda & \Omega \end{smallmatrix}\right)$ is positive definite, the $n$-dimensional random vector $X$ is said to have the multivariate SUN distribution with parameter $\theta = (\xi, \eta, \Omega, \Gamma, \Lambda)$, where $\xi \in \mathbb{R}^n$, $\eta \in \mathbb{R}^m$, $\Omega \in \mathbb{R}^{n \times n}$, $\Gamma \in \mathbb{R}^{m \times m}$, $\Lambda \in \mathbb{R}^{n \times m}$, if
$$X \stackrel{d}{=} V \mid (U > 0), \qquad (2)$$
where `$\stackrel{d}{=}$' denotes equality in distribution and it is understood that the inequality must hold for each of the components of $U$. We write $X \sim SUN_{n,m}(\theta)$. The probability density function (pdf) and the moment generating function (mgf) of $X$ are
$$f_X(x; \theta) = \frac{\phi_n(x; \xi, \Omega)\, \Phi_m\left(\eta + \Lambda^T \Omega^{-1}(x - \xi);\, \Gamma - \Lambda^T \Omega^{-1} \Lambda\right)}{\Phi_m(\eta; \Gamma)}, \qquad x \in \mathbb{R}^n, \qquad (3)$$
and
$$M_X(s; \theta) = \frac{\exp\left(\xi^T s + \frac{1}{2} s^T \Omega s\right) \Phi_m\left(\eta + \Lambda^T s; \Gamma\right)}{\Phi_m(\eta; \Gamma)}, \qquad s \in \mathbb{R}^n, \qquad (4)$$
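The selection representation in (2) also gives a direct way to sample from a SUN law: draw $(U, V)$ from the joint normal in (1) and keep $V$ on the event $\{U > 0\}$. The sketch below does this for $n = m = 1$ with parameter values of our own choosing; for $\eta = 0$ the resulting mean has the simple form $\xi + \Lambda\sqrt{2/\pi}$ (a standard special case, used here only as a check).

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative SUN_{1,1} parameters (our choice): theta = (xi, eta, Omega, Gamma, Lambda).
xi, eta = 0.0, 0.0
Omega, Gamma, Lam = 1.0, 1.0, 0.7          # Lam = Cov(V, U)

# Draw (U, V) jointly normal as in (1), then condition on {U > 0} as in (2).
cov = np.array([[Gamma, Lam],
                [Lam, Omega]])
U, V = rng.multivariate_normal([eta, xi], cov, size=200_000).T
X = V[U > 0]                               # X =_d V | (U > 0)

# For eta = 0, E(X) = xi + Lam * sqrt(2/pi); compare with the simulation.
print(X.mean(), xi + Lam * np.sqrt(2 / np.pi))
```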

respectively (see Arellano-Valle and Azzalini, 2006), where $\phi_n(\cdot; \xi, \Omega)$ is the pdf of $N_n(\xi, \Omega)$ (the $n$-variate normal distribution with mean vector $\xi$ and covariance matrix $\Omega$) and $\Phi_m(\cdot; \Gamma)$ is the cumulative distribution function (cdf) of $N_m(0, \Gamma)$.

Arellano-Valle and Azzalini (2006) considered three cases for the singularity of $\left(\begin{smallmatrix} \Gamma & \Lambda^T \\ \Lambda & \Omega \end{smallmatrix}\right)$. All of the results in this paper are based on the following case:
$$\mathrm{Rank}(\Gamma) = m \ \text{ and } \ \mathrm{Rank}(\Omega) = n \ \text{ but } \ \mathrm{Rank}\begin{pmatrix} \Gamma & \Lambda^T \\ \Lambda & \Omega \end{pmatrix} < m + n. \qquad (5)$$
In this case, the density of $X$ is of type (3); the only computational difference is that $|\Gamma - \Lambda^T \Omega^{-1} \Lambda| = 0$ and
$$\Phi_m\left(a; \Gamma - \Lambda^T \Omega^{-1} \Lambda\right) = P(C_0 W \le a) = \int_{\{w : C_0 w \le a\}} \phi_r(w; I_r)\, dw,$$
where $r = \mathrm{Rank}(\Gamma - \Lambda^T \Omega^{-1} \Lambda) < m$, $C_0 C_0^T = \Gamma - \Lambda^T \Omega^{-1} \Lambda$ and $W \sim N_r(0, I_r)$. In this case, the distribution of $X$ in (2) is said to be a SSUN distribution with parameter $\theta$, denoted by
$$X \sim SSUN_{n,m}(\theta). \qquad (6)$$

Let $X \sim SUN_{n,m}(\theta)$; then the $n$-dimensional random vector $Y$ is said to have a TSUN distribution, denoted by $Y \sim TSUN_{n,m}(\theta; C)$, if
$$Y \stackrel{d}{=} X \mid (X \in C), \qquad (7)$$
where $C \subseteq \mathbb{R}^n$. From (2), we have
$$Y \stackrel{d}{=} V \mid (U > 0,\ V \in C). \qquad (8)$$
Therefore, the pdf of $Y$ is, for $y \in C$,
$$f_Y(y; \theta, C) = \frac{\phi_n(y; \xi, \Omega)\, \Phi_m\left(\eta + \Lambda^T \Omega^{-1}(y - \xi);\, \Gamma - \Lambda^T \Omega^{-1} \Lambda\right)}{P(U > 0, V \in C)}. \qquad (9)$$
To obtain the reliability measures mentioned in the introduction, we need to study some univariate and bivariate TSUN distributions. In the following two subsections, these distributions will be studied by using the connection between TSUN and SSUN distributions.

2.1 $TSUN_{1,1}$ and $TSUN_{1,2}$ Distributions

Here, we present a brief review of the $TSUN_{1,1}$ and $TSUN_{1,2}$ distributions. In the next section, we shall see that these distributions play a prominent role in the reliability analysis of systems with three dependent components with jointly normal lifetimes.

2.1.1 $TSUN_{1,1}$ Distribution

Let $X \stackrel{d}{=} V \mid (U > 0)$, where
$$\begin{pmatrix} U \\ V \end{pmatrix} \sim N_2\left( \begin{pmatrix} \eta \\ \xi \end{pmatrix}, \begin{pmatrix} \gamma^2 & \lambda \\ \lambda & \omega \end{pmatrix} \right),$$
$\eta, \xi, \lambda \in \mathbb{R}$, and $\omega, \gamma > 0$. Using the stochastic representation in (2), we have $X \sim SUN_{1,1}(\theta)$, where $\theta = (\xi, \eta, \omega, \gamma^2, \lambda)$. Further, from (7) and (8), for $t \in \mathbb{R}$, we have
$$Y \stackrel{d}{=} X \mid (X > t) \stackrel{d}{=} V \mid (V > t, U > 0) \sim TSUN_{1,1}\left(\theta; (t, +\infty)\right). \qquad (10)$$

In the following lemma, using the stochastic representation in (10), we show that $Y$ has a $SSUN_{1,2}$ distribution. We then present expressions for its mgf and mean.

Lemma 2.1 If $Y \sim TSUN_{1,1}\left(\theta; (t, +\infty)\right)$, then
$$Y \sim SSUN_{1,2}\left( \xi, \begin{pmatrix} \eta \\ \xi - t \end{pmatrix}, \omega, \begin{pmatrix} \gamma^2 & \lambda \\ \lambda & \omega \end{pmatrix}, (\lambda \ \ \omega) \right). \qquad (11)$$
Further, the mgf and mean of $Y$ are, respectively,
$$M_Y(s; \theta) = \frac{\exp\left(\xi s + \frac{\omega s^2}{2}\right) \Phi_2\left( \begin{pmatrix} \eta + \lambda s \\ \xi - t + \omega s \end{pmatrix}; \begin{pmatrix} \gamma^2 & \lambda \\ \lambda & \omega \end{pmatrix} \right)}{\Phi_2\left( \begin{pmatrix} \eta \\ \xi - t \end{pmatrix}; \begin{pmatrix} \gamma^2 & \lambda \\ \lambda & \omega \end{pmatrix} \right)}, \qquad (12)$$
$$E(Y) = \xi + \omega f_Y\left(t; \theta, (t, +\infty)\right) + \frac{\dfrac{\lambda}{\gamma}\, \phi\!\left(\dfrac{\eta}{\gamma}\right) \Phi\!\left( \dfrac{\gamma(\xi - t) - \frac{\lambda \eta}{\gamma}}{\sqrt{\omega\left(\gamma^2 - \frac{\lambda^2}{\omega}\right)}} \right)}{\Phi_2\left( \begin{pmatrix} \eta \\ \xi - t \end{pmatrix}; \begin{pmatrix} \gamma^2 & \lambda \\ \lambda & \omega \end{pmatrix} \right)}. \qquad (13)$$

Proof From (10), we have $Y \stackrel{d}{=} V \mid (U^* > 0)$, where $U^* = (U, V - t)^T$. Hence
$$\begin{pmatrix} U^* \\ V \end{pmatrix} \sim N_3\left( \begin{pmatrix} \eta(t) \\ \xi \end{pmatrix}, \begin{pmatrix} \Gamma^* & \lambda^* \\ \lambda^{*T} & \omega \end{pmatrix} \right),$$
where $\eta(t) = (\eta, \xi - t)^T$, $\Gamma^* = \left(\begin{smallmatrix} \gamma^2 & \lambda \\ \lambda & \omega \end{smallmatrix}\right)$, and $\lambda^* = (\lambda, \omega)^T$. Now, the result in (11) follows from (2) and (5). Using the result in (11) and the mgf of the SUN distribution presented in (4), (12) can be obtained easily. The mean in (13) can be derived from the first derivative of the mgf of $Y$ in (12), which can be obtained by Lemma 2 in Arellano-Valle et al. (2014).

2.1.2 $TSUN_{1,2}$ Distribution

Let
$$\begin{pmatrix} U \\ V \end{pmatrix} \sim N_3\left( \begin{pmatrix} \eta \\ \xi \end{pmatrix}, \begin{pmatrix} \Gamma & \lambda \\ \lambda^T & \omega \end{pmatrix} \right), \quad \text{where } U = \begin{pmatrix} U_1 \\ U_2 \end{pmatrix},\ \eta = \begin{pmatrix} \eta_1 \\ \eta_2 \end{pmatrix},\ \Gamma = \begin{pmatrix} \gamma_{11} & \gamma_{12} \\ \gamma_{12} & \gamma_{22} \end{pmatrix},\ \lambda = \begin{pmatrix} \lambda_1 \\ \lambda_2 \end{pmatrix},$$
such that $\eta, \lambda \in \mathbb{R}^2$ and $\Gamma \in \mathbb{R}^{2 \times 2}$ is a positive definite matrix. Consider the univariate random variable $X$ with the stochastic representation $X \stackrel{d}{=} V \mid (U > 0)$. From (2), $X \sim SUN_{1,2}(\theta)$, where $\theta = (\xi, \eta, \omega, \Gamma, \lambda^T)$. Further, using (7) and (8), for $t \in \mathbb{R}$,
$$Y \stackrel{d}{=} X \mid (X > t) \stackrel{d}{=} V \mid (V > t, U > 0) \sim TSUN_{1,2}\left(\theta; (t, +\infty)\right). \qquad (14)$$

In the next lemma, we show that $Y$ has a $SSUN_{1,3}$ distribution and then present an expression for the mean of $Y$.

Lemma 2.2 If $Y \sim TSUN_{1,2}\left(\theta; (t, +\infty)\right)$, then
$$Y \sim SSUN_{1,3}\left( \xi, \eta^*(t), \omega, \Gamma^*, \lambda^{*T} \right), \qquad (15)$$
where $\eta^*(t) = (\eta^T, \xi - t)^T$, $\Gamma^* = \left(\begin{smallmatrix} \Gamma & \lambda \\ \lambda^T & \omega \end{smallmatrix}\right)$, and $\lambda^* = (\lambda^T, \omega)^T$. Further, the mean of $Y$ is
$$E(Y) = \xi + \frac{1}{\Phi_3\left(\eta^*(t); \Gamma^*\right)} \Bigg\{ \frac{\lambda_1}{\sqrt{\gamma_{11}}}\, \phi\!\left(\frac{\eta_1}{\sqrt{\gamma_{11}}}\right) \Phi_2\!\left( \begin{pmatrix} \eta_2 - \frac{\gamma_{12}}{\gamma_{11}} \eta_1 \\[2pt] \xi - t - \frac{\lambda_1}{\gamma_{11}} \eta_1 \end{pmatrix}; \begin{pmatrix} \gamma_{22} - \frac{\gamma_{12}^2}{\gamma_{11}} & \lambda_2 - \lambda_1 \frac{\gamma_{12}}{\gamma_{11}} \\[2pt] \lambda_2 - \lambda_1 \frac{\gamma_{12}}{\gamma_{11}} & \omega - \frac{\lambda_1^2}{\gamma_{11}} \end{pmatrix} \right)$$
$$+ \frac{\lambda_2}{\sqrt{\gamma_{22}}}\, \phi\!\left(\frac{\eta_2}{\sqrt{\gamma_{22}}}\right) \Phi_2\!\left( \begin{pmatrix} \eta_1 - \frac{\gamma_{12}}{\gamma_{22}} \eta_2 \\[2pt] \xi - t - \frac{\lambda_2}{\gamma_{22}} \eta_2 \end{pmatrix}; \begin{pmatrix} \gamma_{11} - \frac{\gamma_{12}^2}{\gamma_{22}} & \lambda_1 - \lambda_2 \frac{\gamma_{12}}{\gamma_{22}} \\[2pt] \lambda_1 - \lambda_2 \frac{\gamma_{12}}{\gamma_{22}} & \omega - \frac{\lambda_2^2}{\gamma_{22}} \end{pmatrix} \right)$$
$$+ \sqrt{\omega}\, \phi\!\left(\frac{\xi - t}{\sqrt{\omega}}\right) \Phi_2\!\left( \eta - \frac{\xi - t}{\omega}\, \lambda;\ \Gamma - \frac{\lambda \lambda^T}{\omega} \right) \Bigg\}. \qquad (16)$$

Proof Similar to Lemma 2.1, the result in (15) can easily be shown using the stochastic representation in (14). The mean in (16) can be obtained from the first derivative of the mgf of the $SSUN_{1,3}$ distribution (see Lemma 2 in Arellano-Valle et al., 2014), which is, from (4),
$$M_Y(s; \theta) = \exp\left(\xi s + \frac{\omega s^2}{2}\right) \frac{\Phi_3\left(\eta^*(t) + \lambda^* s; \Gamma^*\right)}{\Phi_3\left(\eta^*(t); \Gamma^*\right)}. \qquad (17)$$
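The representation in (14) can be checked numerically: simulating $Y \stackrel{d}{=} V \mid (V > t, U > 0)$ gives a Monte Carlo stand-in for the closed-form mean in (16). The parameter values below are illustrative choices of ours.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative joint law of (U1, U2, V) for the TSUN_{1,2} representation in (14):
# mean = (eta1, eta2, xi); cov blocks play the roles of Gamma, lambda and omega.
mean = np.array([0.2, -0.1, 1.0])
cov = np.array([[1.0, 0.3, 0.4],
                [0.3, 1.0, 0.2],
                [0.4, 0.2, 1.5]])
U1, U2, V = rng.multivariate_normal(mean, cov, size=500_000).T

t = 0.5
Y = V[(U1 > 0) & (U2 > 0) & (V > t)]       # Y =_d V | (V > t, U > 0)
print(Y.mean())                            # Monte Carlo counterpart of E(Y) in (16)
```

Since $Y > t$ almost surely, any correct evaluation of (16) must return a value above the truncation point $t$, which the simulation makes immediately visible.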

2.2 $TSUN_{2,1}$ and a generalized mixture of $TSUN_{2,1}$ Distributions

In what follows we shall see that the joint distribution of order statistics from a trivariate normal distribution can be expressed in terms of bivariate $TSUN_{2,1}$ distributions and their mixtures. In this subsection we briefly study these distributions.

Let
$$\begin{pmatrix} U \\ V \end{pmatrix} \sim N_3\left( \begin{pmatrix} \eta \\ \xi \end{pmatrix}, \begin{pmatrix} \gamma^2 & \lambda^T \\ \lambda & \Omega \end{pmatrix} \right), \quad \text{where } V = \begin{pmatrix} V_1 \\ V_2 \end{pmatrix},\ \xi = \begin{pmatrix} \xi_1 \\ \xi_2 \end{pmatrix},\ \Omega = \begin{pmatrix} \omega_{11} & \omega_{12} \\ \omega_{12} & \omega_{22} \end{pmatrix},\ \lambda = \begin{pmatrix} \lambda_1 \\ \lambda_2 \end{pmatrix},$$
such that $\eta \in \mathbb{R}$, $\xi, \lambda \in \mathbb{R}^2$, $\gamma > 0$ and $\Omega$ is a positive definite matrix. Further, let
$$X = (X_1, X_2)^T \stackrel{d}{=} (V_1, V_2)^T \mid (U > 0).$$
From (2), $X \sim SUN_{2,1}(\theta)$, where $\theta = (\xi, \eta, \Omega, \gamma^2, \lambda)$. Using (7) and (8), we have
$$Y = (Y_1, Y_2)^T \stackrel{d}{=} X \mid \left(X \in C(y_1, y_2)\right) \stackrel{d}{=} V \mid (V_1 < V_2,\ U > 0) \sim TSUN_{2,1}\left(\theta; C(y_1, y_2)\right), \qquad (18)$$
where $C(y_1, y_2) = \{(y_1, y_2)^T \in \mathbb{R}^2 : y_1 < y_2\}$.

In the following lemma, we present several properties of the $TSUN_{2,1}\left(\theta; C(y_1, y_2)\right)$ distribution.

Lemma 2.3 If $Y \sim TSUN_{2,1}\left(\theta; C(y_1, y_2)\right)$, then

(i)
$$Y = (Y_1, Y_2)^T \sim SSUN_{2,2}\left( \xi, \begin{pmatrix} \eta \\ c^T \xi \end{pmatrix}, \Omega, \begin{pmatrix} \gamma^2 & c^T \lambda \\ c^T \lambda & c^T \Omega c \end{pmatrix}, (\lambda \ \ \Omega c) \right); \qquad (19)$$

(ii) the marginal distributions of $Y_1$ and $Y_2$ are
$$Y_1 \sim SUN_{1,2}\left( e^T \xi, \begin{pmatrix} \eta \\ c^T \xi \end{pmatrix}, \omega_{11}, \begin{pmatrix} \gamma^2 & c^T \lambda \\ c^T \lambda & c^T \Omega c \end{pmatrix}, (e^T \lambda \ \ e^T \Omega c) \right), \qquad (20)$$
$$Y_2 \sim SUN_{1,2}\left( e'^T \xi, \begin{pmatrix} \eta \\ c^T \xi \end{pmatrix}, \omega_{22}, \begin{pmatrix} \gamma^2 & c^T \lambda \\ c^T \lambda & c^T \Omega c \end{pmatrix}, (e'^T \lambda \ \ e'^T \Omega c) \right); \qquad (21)$$

(iii) the conditional distribution of $Y_2$ given $Y_1 = y_1$ is
$$Y_2 \mid (Y_1 = y_1) \sim TSUN_{1,1}\left( \left(\xi_{2.1}(y_1), \eta_{2.1}(y_1), \omega_{22.1}, \gamma_{2.1}^2, \lambda_{2.1}\right); (y_1, +\infty) \right),$$
where
$$\xi_{2.1}(y_1) = \xi_2 + \frac{\omega_{21}}{\omega_{11}}(y_1 - \xi_1), \quad \eta_{2.1}(y_1) = \eta + \frac{\lambda_1}{\omega_{11}}(y_1 - \xi_1), \quad \omega_{22.1} = \omega_{22} - \frac{\omega_{21}^2}{\omega_{11}},$$
$$\gamma_{2.1}^2 = \gamma^2 - \frac{\lambda_1^2}{\omega_{11}}, \quad \lambda_{2.1} = \lambda_2 - \frac{\omega_{21} \lambda_1}{\omega_{11}};$$

(iv) for $t \in \mathbb{R}$, the mgfs of $Y_2$ given $Y_1 > t$ and of $Y_1$ given $Y_2 < t$ are, for $s \in \mathbb{R}$, respectively,
$$M_{Y_2 \mid Y_1 > t}(s; \theta) = \exp\left\{ \xi_2 s + \frac{\omega_{22} s^2}{2} \right\} \frac{\Phi_3\left(\eta^*(t) + \lambda^* s; \Gamma^*\right)}{\Phi_3\left(\eta^*(t); \Gamma^*\right)}, \qquad (22)$$
$$M_{Y_1 \mid Y_2 < t}(s; \theta) = \exp\left\{ \xi_1 s + \frac{\omega_{11} s^2}{2} \right\} \frac{\Phi_3\left(\eta^{**}(t) + \lambda^{**} s; \Gamma^{**}\right)}{\Phi_3\left(\eta^{**}(t); \Gamma^{**}\right)}, \qquad (23)$$
where $c = (-1, 1)^T$, $e = (1, 0)^T$, $e' = (0, 1)^T$, $\eta^*(t) = (\eta, c^T \xi, e^T \xi - t)^T$, $\eta^{**}(t) = (\eta, c^T \xi, t - e'^T \xi)^T$, $\lambda^* = (\lambda_2, c^T \Omega e', \omega_{12})^T$, $\lambda^{**} = (\lambda_1, c^T \Omega e, -\omega_{12})^T$, and
$$\Gamma^* = \begin{pmatrix} \gamma^2 & c^T \lambda & e^T \lambda \\ c^T \lambda & c^T \Omega c & c^T \Omega e \\ e^T \lambda & c^T \Omega e & e^T \Omega e \end{pmatrix}, \qquad \Gamma^{**} = \begin{pmatrix} \gamma^2 & c^T \lambda & -e'^T \lambda \\ c^T \lambda & c^T \Omega c & -c^T \Omega e' \\ -e'^T \lambda & -c^T \Omega e' & e'^T \Omega e' \end{pmatrix}.$$

Proof The proof of (i) can be derived from the stochastic representation in (18). The results in the other parts can be obtained from part (i) and the properties of $SSUN_{2,2}$ distributions.

Remark 1 Using Lemma 2 in Arellano-Valle et al. (2014), we may obtain $E(Y_2 \mid Y_1 > t)$ and $E(Y_1 \mid Y_2 < t)$ from the first derivatives of the mgfs in (22) and (23), respectively.

In the rest of this section, we briefly study a generalized mixture of $TSUN_{2,1}$ distributions. For this purpose, consider the random vector
$$Y = (Y_1, Y_2)^T \stackrel{d}{=} (V_1, V_2)^T \mid (V_1 < U < V_2). \qquad (24)$$
In the following lemma, we show that the distribution of $Y$ in (24) is a generalized mixture of $TSUN_{2,1}$ distributions.

Lemma 2.4 Consider the random vector $Y$ with the stochastic representation in (24). Then
$$f_Y(y_1, y_2) = A\, f_{TSUN_{2,1}}\left(y_1, y_2; \theta, C(y_1, y_2)\right) + A'\, f_{TSUN_{2,1}}\left(y_1, y_2; \theta', C(y_1, y_2)\right),$$
where
$$A = \frac{P(U < V_2, V_1 < V_2)}{P(V_1 < U < V_2)}, \qquad A' = -\frac{P(U < V_1, V_1 < V_2)}{P(V_1 < U < V_2)},$$
$$\theta = \left( \xi,\ \xi_2 - \eta,\ \Omega,\ \gamma^2 + \omega_{22} - 2\lambda_2,\ \begin{pmatrix} \omega_{21} - \lambda_1 \\ \omega_{22} - \lambda_2 \end{pmatrix} \right), \qquad \theta' = \left( \xi,\ \xi_1 - \eta,\ \Omega,\ \gamma^2 + \omega_{11} - 2\lambda_1,\ \begin{pmatrix} \omega_{11} - \lambda_1 \\ \omega_{12} - \lambda_2 \end{pmatrix} \right).$$

Proof We note that $P(V_1 < U < V_2) = P(U < V_2, V_1 < V_2) - P(U < V_1, V_1 < V_2)$, so from (24) the pdf of $Y$ is
$$f_Y(y_1, y_2) = \frac{f_V(y_1, y_2)\, P(V_1 < U < V_2 \mid V_1 = y_1, V_2 = y_2)}{P(V_1 < U < V_2)} = A\, \frac{f_V(y_1, y_2)\, P(U < V_2, V_1 < V_2 \mid V_1 = y_1, V_2 = y_2)}{P(U < V_2, V_1 < V_2)} + A'\, \frac{f_V(y_1, y_2)\, P(U < V_1, V_1 < V_2 \mid V_1 = y_1, V_2 = y_2)}{P(U < V_1, V_1 < V_2)}.$$
Now, by the stochastic representation in (18), it is easy to show that
$$\frac{f_V(y_1, y_2)\, P(U < V_2, V_1 < V_2 \mid V_1 = y_1, V_2 = y_2)}{P(U < V_2, V_1 < V_2)} = f_{TSUN_{2,1}}\left(y_1, y_2; \theta, C(y_1, y_2)\right).$$
Similarly,
$$\frac{f_V(y_1, y_2)\, P(U < V_1, V_1 < V_2 \mid V_1 = y_1, V_2 = y_2)}{P(U < V_1, V_1 < V_2)} = f_{TSUN_{2,1}}\left(y_1, y_2; \theta', C(y_1, y_2)\right),$$
which completes the proof.
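The probability decomposition $P(V_1 < U < V_2) = P(U < V_2, V_1 < V_2) - P(U < V_1, V_1 < V_2)$ that drives Lemma 2.4 is in fact an indicator identity, so it holds sample-by-sample and can be checked directly on simulated draws (the trivariate normal parameters below are our own illustrative choices).

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative joint law for (U, V1, V2); any trivariate normal will do.
mean = np.array([0.0, -0.5, 0.5])
cov = np.array([[1.0, 0.3, 0.2],
                [0.3, 1.0, 0.4],
                [0.2, 0.4, 1.0]])
U, V1, V2 = rng.multivariate_normal(mean, cov, size=100_000).T

# Indicator identity behind Lemma 2.4: on {V1 < V2}, the event {U < V1}
# is contained in {U < V2}, so the difference of the two indicators is 1{V1 < U < V2}.
lhs = np.mean((V1 < U) & (U < V2))
rhs = np.mean((U < V2) & (V1 < V2)) - np.mean((U < V1) & (V1 < V2))
print(lhs, rhs)
```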

3 Joint and conditional distributions of order statistics from a trivariate normal distribution

The connection between order statistics and skewed distributions was first noticed by Loperfido (2002), although the result had been obtained independently in Roberts (1966). It was later generalized by Azzalini and Capitanio (2003). Other connections between order statistics and generalizations of the normal distribution were dealt with in Crocetta and Loperfido (2005), Arellano-Valle and Genton (2007, 2008), Jamalizadeh and Balakrishnan (2010) and Arellano-Valle et al. (2014). Jamalizadeh and Balakrishnan (2009) obtained the exact distributions of order statistics from trivariate normal and trivariate Student-$t$ distributions based on generalized skew-normal and skew-$t$ distributions. Arellano-Valle and Genton (2007) and Jamalizadeh and Balakrishnan (2010) discussed distributions of $L$-statistics from multivariate normal and elliptical distributions. However, little is known about the joint distribution of order statistics from these distributions. Recently, Amiri et al. (2014) discussed joint distributions of order statistics from an exchangeable multivariate normal distribution. In this section, we investigate the exact joint, marginal and conditional distributions of order statistics from a general trivariate normal distribution and, in particular, for the exchangeable case. We show that these distributions are related to the SUN and TSUN distributions.


3.1 General case

Let $X \sim N_3(\mu, \Sigma)$, where
$$X = \begin{pmatrix} X_1 \\ X_2 \\ X_3 \end{pmatrix}, \quad \mu = \begin{pmatrix} \mu_1 \\ \mu_2 \\ \mu_3 \end{pmatrix}, \quad \Sigma = \begin{pmatrix} \sigma_{11} & \sigma_{12} & \sigma_{13} \\ \sigma_{12} & \sigma_{22} & \sigma_{23} \\ \sigma_{13} & \sigma_{23} & \sigma_{33} \end{pmatrix},$$
$\mu \in \mathbb{R}^3$ and $\Sigma \in \mathbb{R}^{3 \times 3}$ is a positive definite matrix ($\mathrm{Rank}(\Sigma) = 3$). Also, $X_{(1)}$, $X_{(2)}$ and $X_{(3)}$ will denote the first, second and third order statistics, respectively. In what follows, $E_{ij} = (e_{3,i}, e_{3,j}) \in \mathbb{R}^{3 \times 2}$, $i \ne j$, where $e_{3,1}, e_{3,2}, e_{3,3}$ are, in order, the columns of the identity matrix $I_3$.

The following theorem shows that the joint distribution of consecutive order statistics from a trivariate normal distribution is a mixture of $TSUN_{2,1}$ distributions, while the joint distribution of $X_{(1)}$ and $X_{(3)}$ is a generalized mixture of $TSUN_{2,1}$ distributions. The proof of the theorem is in the Appendix.

Theorem 3.1 If $X \sim N_3(\mu, \Sigma)$, then we have the following, for $j = 1, 2$:

(a)
$$F_{X_{(j)}, X_{(j+1)}}(y_j, y_{j+1}; \mu, \Sigma) = \sum_{i=1}^{6} p_{j,j+1,i}\, F_{TSUN_{2,1}}\left(y_j, y_{j+1}; \theta_{j,j+1,i}, C(y_j, y_{j+1})\right), \qquad (25)$$
where, for $i = 1, 2, \ldots, 6$, $\theta_{j,j+1,i} = (\xi_i, \eta_{j,j+1,i}, \Omega_i, \gamma_{j,j+1,i}^2, \lambda_{j,j+1,i})$,
$$p_{j,j+1,i} = \Phi_2\left( \begin{pmatrix} \eta_{j,j+1,i} \\ c^T \xi_i \end{pmatrix}; \begin{pmatrix} \gamma_{j,j+1,i}^2 & c^T \lambda_{j,j+1,i} \\ c^T \lambda_{j,j+1,i} & c^T \Omega_i c \end{pmatrix} \right),$$
$$\xi_i = A_i^T \mu, \quad \eta_{1,2,i} = c_i^T \mu, \quad \Omega_i = A_i^T \Sigma A_i, \quad \gamma_{1,2,i}^2 = c_i^T \Sigma c_i, \quad \lambda_{1,2,i} = A_i^T \Sigma c_i,$$
$$\eta_{2,3,i} = c_i'^T \mu, \quad \gamma_{2,3,i}^2 = c_i'^T \Sigma c_i', \quad \lambda_{2,3,i} = A_i^T \Sigma c_i',$$
and
$$c_1 = -c_2 = (0, -1, 1)^T, \quad c_3 = -c_4 = (-1, 0, 1)^T, \quad c_5 = -c_6 = (-1, 1, 0)^T,$$
$$A_1 = E_{12}, \quad A_2 = E_{13}, \quad A_3 = E_{21}, \quad A_4 = E_{23}, \quad A_5 = E_{31}, \quad A_6 = E_{32},$$
$$c_1' = -c_6' = c_4, \quad c_2' = -c_4' = c_6, \quad c_3' = -c_5' = c_2.$$

(b)
$$F_{X_{(1)}, X_{(3)}}(y_1, y_3; \mu, \Sigma) = \sum_{i=1}^{6} \left[ p_{1,3,i}\, F_{TSUN_{2,1}}\left(y_1, y_3; \theta_{1,3,i}, C(y_1, y_3)\right) + p_{1,3,i}'\, F_{TSUN_{2,1}}\left(y_1, y_3; \theta_{1,3,i}', C(y_1, y_3)\right) \right], \qquad (26)$$
where, for $i = 1, 2, \ldots, 6$, $\theta_{1,3,i} = (\xi_i, \eta_{1,3,i}, \Omega_i, \gamma_{1,3,i}^2, \lambda_{1,3,i})$, $\theta_{1,3,i}' = (\xi_i, \eta_{1,3,i}', \Omega_i, \gamma_{1,3,i}'^2, \lambda_{1,3,i}')$,
$$p_{1,3,i} = \Phi_2\left( \begin{pmatrix} -c_i^T \mu \\ c_i''^T \mu \end{pmatrix}; \begin{pmatrix} c_i^T \Sigma c_i & -c_i^T \Sigma c_i'' \\ -c_i^T \Sigma c_i'' & c_i''^T \Sigma c_i'' \end{pmatrix} \right), \qquad p_{1,3,i}' = -\Phi_2\left( \begin{pmatrix} c_i'^T \mu \\ c_i''^T \mu \end{pmatrix}; \begin{pmatrix} c_i'^T \Sigma c_i' & c_i'^T \Sigma c_i'' \\ c_i'^T \Sigma c_i'' & c_i''^T \Sigma c_i'' \end{pmatrix} \right),$$
$$\eta_{1,3,i} = -c_i^T \mu, \quad \gamma_{1,3,i}^2 = d_i^T \Sigma d_i + e'^T \Omega_i e' - 2 e'^T \lambda^{(i)}, \quad \lambda_{1,3,i} = \begin{pmatrix} e^T \Omega_i e' - e^T \lambda^{(i)} \\ e'^T \Omega_i e' - e'^T \lambda^{(i)} \end{pmatrix},$$
$$\eta_{1,3,i}' = c_i'^T \mu, \quad \gamma_{1,3,i}'^2 = d_i^T \Sigma d_i + e^T \Omega_i e - 2 e^T \lambda^{(i)}, \quad \lambda_{1,3,i}' = \begin{pmatrix} e^T \Omega_i e - e^T \lambda^{(i)} \\ e'^T \Omega_i e - e'^T \lambda^{(i)} \end{pmatrix},$$
$$\lambda^{(i)} = A_i^T \Sigma d_i,$$
and
$$c_1'' = -c_3'' = c_5, \quad c_2'' = -c_5'' = c_3, \quad c_4'' = -c_6'' = c_1, \quad d_1 = d_3 = e_{3,3}, \quad d_2 = d_5 = e_{3,2}, \quad d_4 = d_6 = e_{3,1}.$$
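The mixture weights in (25) are bivariate normal orthant probabilities; for instance, for $i = 1$ and $j = 1$ the weight is the probability of the ordering $X_1 < X_2 < X_3$. The sketch below computes this one weight with an off-the-shelf bivariate normal cdf and compares it with the empirical ordering frequency; the parameter values are our own illustrative choices.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(4)

# Illustrative trivariate normal (parameters are ours).
mu = np.array([0.0, 0.2, 0.4])
Sigma = np.array([[1.0, 0.3, 0.1],
                  [0.3, 1.0, 0.2],
                  [0.1, 0.2, 1.0]])

# P(X1 < X2 < X3) = P(X2 - X1 > 0, X3 - X2 > 0), an orthant probability of
# the bivariate normal with mean D mu and covariance D Sigma D^T.
D = np.array([[-1.0, 1.0, 0.0],    # X2 - X1
              [0.0, -1.0, 1.0]])   # X3 - X2
p_theory = multivariate_normal(mean=-(D @ mu), cov=D @ Sigma @ D.T).cdf(np.zeros(2))

X = rng.multivariate_normal(mu, Sigma, size=400_000)
p_emp = np.mean((X[:, 0] < X[:, 1]) & (X[:, 1] < X[:, 2]))
print(p_theory, p_emp)
```

The remaining five weights are obtained the same way from the other orderings, and since the orderings partition the sample space (ties have probability zero), the six weights sum to one.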

Now we derive the marginal distributions. The following theorem is easily derived from Theorem 3.1 and Lemma 2.3.

Theorem 3.2 Under the assumptions of the previous theorem, for $j = 1, 2, 3$, the cdf of $X_{(j)}$ is
$$F_{X_{(j)}}(y_j; \mu, \Sigma) = \sum_{i=1}^{6} p_{j,j+1,i}\, F_{SUN_{1,2}}(y_j; \theta_{j,i}), \qquad (27)$$
where, for $i = 1, 2, \ldots, 6$, $p_{3,4,i} = p_{2,3,i}$ and, for $j = 1, 2$,
$$\theta_{j,i} = \left( e^T \xi_i, \begin{pmatrix} \eta_{j,j+1,i} \\ c^T \xi_i \end{pmatrix}, e^T \Omega_i e, \begin{pmatrix} \gamma_{j,j+1,i}^2 & c^T \lambda_{j,j+1,i} \\ c^T \lambda_{j,j+1,i} & c^T \Omega_i c \end{pmatrix}, (e^T \lambda_{j,j+1,i} \ \ e^T \Omega_i c) \right),$$
$$\theta_{3,i} = \left( e'^T \xi_i, \begin{pmatrix} \eta_{2,3,i} \\ c^T \xi_i \end{pmatrix}, e'^T \Omega_i e', \begin{pmatrix} \gamma_{2,3,i}^2 & c^T \lambda_{2,3,i} \\ c^T \lambda_{2,3,i} & c^T \Omega_i c \end{pmatrix}, (e'^T \lambda_{2,3,i} \ \ e'^T \Omega_i c) \right).$$

The conditional density functions are derived below. We see that the conditional distribution of consecutive order statistics is a mixture of $TSUN_{1,1}$ distributions, while the conditional distribution of $X_{(3)}$ given $X_{(1)}$ is a generalized mixture of $TSUN_{1,1}$ distributions.

Theorem 3.3 For $j = 1, 2$:

(i)
$$F_{X_{(j+1)} \mid X_{(j)} = y_j}(y_{j+1}; \mu, \Sigma) = \sum_{i=1}^{6} p_{j+1.j,i}(y_j)\, F_{TSUN_{1,1}}\left(y_{j+1}; \theta_{j+1.j,i}(y_j), (y_j, +\infty)\right), \qquad (28)$$
where, for $i = 1, 2, \ldots, 6$, $\theta_{j+1.j,i}(y_j) = \left(\xi_i(y_j), \eta_{j+1.j,i}(y_j), \omega_i, \gamma_{j+1.j,i}^2, \lambda_{j+1.j,i}\right)$,
$$\xi_i(y_j) = e'^T \xi_i + \frac{e'^T \Omega_i e}{e^T \Omega_i e}(y_j - e^T \xi_i), \qquad \eta_{j+1.j,i}(y_j) = \eta_{j,j+1,i} + \frac{e^T \lambda_{j,j+1,i}}{e^T \Omega_i e}(y_j - e^T \xi_i),$$
$$\omega_i = e'^T \Omega_i e' - \frac{(e^T \Omega_i e')^2}{e^T \Omega_i e}, \qquad \gamma_{j+1.j,i}^2 = \gamma_{j,j+1,i}^2 - \frac{(e^T \lambda_{j,j+1,i})^2}{e^T \Omega_i e},$$
$$\lambda_{j+1.j,i} = e'^T \lambda_{j,j+1,i} - \frac{(e^T \lambda_{j,j+1,i})(e'^T \Omega_i e)}{e^T \Omega_i e}, \qquad p_{j+1.j,i}(y_j) = p_{j,j+1,i}\, \frac{f_{SUN_{1,2}}(y_j; \theta_{j,i})}{f_{X_{(j)}}(y_j)},$$
with $\theta_{j,i}$ as defined in (27).

(ii)
$$F_{X_{(3)} \mid X_{(1)} = y_1}(y_3; \mu, \Sigma) = \sum_{i=1}^{6} \left[ p_{3.1,i}(y_1)\, F_{TSUN_{1,1}}\left(y_3; \theta_{3.1,i}(y_1), (y_1, +\infty)\right) + p_{3.1,i}'(y_1)\, F_{TSUN_{1,1}}\left(y_3; \theta_{3.1,i}'(y_1), (y_1, +\infty)\right) \right], \qquad (29)$$
where, for $i = 1, 2, \ldots, 6$,
$$\theta_{3.1,i}(y_1) = \left(\xi_i(y_1), \eta_{3.1,i}(y_1), \omega_i, \gamma_{3.1,i}^2, \lambda_{3.1,i}\right), \qquad \theta_{3.1,i}'(y_1) = \left(\xi_i(y_1), \eta_{3.1,i}'(y_1), \omega_i, \gamma_{3.1,i}'^2, \lambda_{3.1,i}'\right),$$
$$\eta_{3.1,i}(y_1) = \eta_{1,3,i} + \frac{e^T \lambda_{1,3,i}}{e^T \Omega_i e}(y_1 - e^T \xi_i), \qquad \eta_{3.1,i}'(y_1) = \eta_{1,3,i}' + \frac{e^T \lambda_{1,3,i}'}{e^T \Omega_i e}(y_1 - e^T \xi_i),$$
$$\gamma_{3.1,i}^2 = \gamma_{1,3,i}^2 - \frac{(e^T \lambda_{1,3,i})^2}{e^T \Omega_i e}, \qquad \gamma_{3.1,i}'^2 = \gamma_{1,3,i}'^2 - \frac{(e^T \lambda_{1,3,i}')^2}{e^T \Omega_i e},$$
$$\lambda_{3.1,i} = e'^T \lambda_{1,3,i} - \frac{(e^T \lambda_{1,3,i})(e'^T \Omega_i e)}{e^T \Omega_i e}, \qquad \lambda_{3.1,i}' = e'^T \lambda_{1,3,i}' - \frac{(e^T \lambda_{1,3,i}')(e'^T \Omega_i e)}{e^T \Omega_i e},$$
$$p_{3.1,i}(y_1) = p_{1,3,i}\, \frac{f_{SUN_{1,2}}(y_1; \theta_i)}{f_{X_{(1)}}(y_1)}, \qquad p_{3.1,i}'(y_1) = p_{1,3,i}'\, \frac{f_{SUN_{1,2}}(y_1; \theta_i')}{f_{X_{(1)}}(y_1)},$$
and
$$\theta_i = \left( e^T \xi_i, \begin{pmatrix} \eta_{1,3,i} \\ c^T \xi_i \end{pmatrix}, e^T \Omega_i e, \begin{pmatrix} \gamma_{1,3,i}^2 & c^T \lambda_{1,3,i} \\ c^T \lambda_{1,3,i} & c^T \Omega_i c \end{pmatrix}, (e^T \lambda_{1,3,i} \ \ e^T \Omega_i c) \right),$$
$$\theta_i' = \left( e^T \xi_i, \begin{pmatrix} \eta_{1,3,i}' \\ c^T \xi_i \end{pmatrix}, e^T \Omega_i e, \begin{pmatrix} \gamma_{1,3,i}'^2 & c^T \lambda_{1,3,i}' \\ c^T \lambda_{1,3,i}' & c^T \Omega_i c \end{pmatrix}, (e^T \lambda_{1,3,i}' \ \ e^T \Omega_i c) \right).$$

3.2 The exchangeable case

In the rest of this section, we consider the special case $X \sim N_3(\mu, \Sigma)$ with $\mu = \mu 1_3$ and $\Sigma = \sigma^2\left((1 - \rho) I_3 + \rho 1_3 1_3^T\right)$, where $\mu \in \mathbb{R}$, $\sigma > 0$, $-\frac{1}{2} < \rho < 1$, and $1_3$ and $I_3$, respectively, denote the vector of 1's and the identity matrix of dimension 3. The following corollaries follow from the theorems in Subsection 3.1.

Corollary 3.4 For $j = 1, 2$:

(i)
$$F_{X_{(j)}, X_{(j+1)}}(y_j, y_{j+1}; \mu, \sigma^2, \rho) = F_{TSUN_{2,1}}\left(y_j, y_{j+1}; \theta_{j,j+1}, C(y_j, y_{j+1})\right),$$
where
$$\theta_{1,2} = \left( \mu \begin{pmatrix} 1 \\ 1 \end{pmatrix}, 0, \sigma^2 \begin{pmatrix} 1 & \rho \\ \rho & 1 \end{pmatrix}, 2\sigma^2(1 - \rho), -\sigma^2(1 - \rho) \begin{pmatrix} 0 \\ 1 \end{pmatrix} \right),$$
$$\theta_{2,3} = \left( \mu \begin{pmatrix} 1 \\ 1 \end{pmatrix}, 0, \sigma^2 \begin{pmatrix} 1 & \rho \\ \rho & 1 \end{pmatrix}, 2\sigma^2(1 - \rho), \sigma^2(1 - \rho) \begin{pmatrix} 1 \\ 0 \end{pmatrix} \right);$$

(ii)
$$F_{X_{(1)}, X_{(3)}}(y_1, y_3; \mu, \sigma^2, \rho) = 2 F_{TSUN_{2,1}}\left(y_1, y_3; \theta_{1,3}, C(y_1, y_3)\right) - F_{TSUN_{2,1}}\left(y_1, y_3; \theta_{1,3}', C(y_1, y_3)\right),$$
where $\theta_{1,3}' = \theta_{2,3}$ and
$$\theta_{1,3} = \left( \mu \begin{pmatrix} 1 \\ 1 \end{pmatrix}, 0, \sigma^2 \begin{pmatrix} 1 & \rho \\ \rho & 1 \end{pmatrix}, 2\sigma^2(1 - \rho), \sigma^2(1 - \rho) \begin{pmatrix} 0 \\ 1 \end{pmatrix} \right).$$

Corollary 3.5 For $j = 1, 2, 3$, $F_{X_{(j)}}(y_j; \mu, \sigma^2, \rho) = F_{SUN_{1,2}}(y_j; \theta_j)$, where $\theta_1 = \theta(-1, 0, -1)$, $\theta_2 = \theta(-1, -1, 1)$, $\theta_3 = \theta(-1, 0, 1)$ and
$$\theta(a, b, c) = \left( \mu, \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \sigma^2, \sigma^2(1 - \rho) \begin{pmatrix} 2 & a \\ a & 2 \end{pmatrix}, \sigma^2(1 - \rho)\, (b \ \ c) \right).$$

Corollary 3.6 For $j = 1, 2$:

(i)
$$F_{X_{(j+1)} \mid X_{(j)} = y_j}(y_{j+1}; \mu, \sigma^2, \rho) = F_{TSUN_{1,1}}\left(y_{j+1}; \theta_{j+1.j}(y_j), (y_j, +\infty)\right),$$
where
$$\theta_{2.1}(y_1) = \left( \mu + \rho(y_1 - \mu),\ 0,\ \sigma^2(1 - \rho^2),\ 2\sigma^2(1 - \rho),\ -\sigma^2(1 - \rho) \right),$$
$$\theta_{3.2}(y_2) = \left( \mu + \rho(y_2 - \mu),\ (1 - \rho)(y_2 - \mu),\ \sigma^2(1 - \rho^2),\ \sigma^2(1 - \rho^2),\ -\sigma^2 \rho(1 - \rho) \right);$$

(ii)
$$F_{X_{(3)} \mid X_{(1)} = y_1}(y_3; \mu, \sigma^2, \rho) = \frac{1}{f_{SUN_{1,2}}(y_1; \theta_1)} \Big[ 2 f_{SUN_{1,2}}(y_1; \theta)\, F_{TSUN_{1,1}}\left(y_3; \theta_{3.1}(y_1), (y_1, +\infty)\right) - f_{SUN_{1,2}}(y_1; \theta')\, F_{TSUN_{1,1}}\left(y_3; \theta_{3.1}'(y_1), (y_1, +\infty)\right) \Big],$$
where $\theta_{3.1}'(y_1) = \theta_{3.2}(y_1)$, $\theta = \theta(1, 0, -1)$, $\theta' = \theta(-1, 1, -1)$ and
$$\theta_{3.1}(y_1) = \left( \mu + \rho(y_1 - \mu),\ 0,\ \sigma^2(1 - \rho^2),\ 2\sigma^2(1 - \rho),\ \sigma^2(1 - \rho) \right).$$
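In the exchangeable case the marginal cdf of the maximum admits a simple cross-check: $F_{X_{(3)}}(y) = P(X_1 \le y, X_2 \le y, X_3 \le y)$, which can be evaluated with a standard trivariate normal cdf routine and compared against simulation. The parameter values below are our own.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(5)

# Illustrative exchangeable parameters (ours): mu, sigma, rho with -1/2 < rho < 1.
mu_, sigma, rho = 0.0, 1.0, 0.5
Sigma = sigma**2 * ((1 - rho) * np.eye(3) + rho * np.ones((3, 3)))

X = rng.multivariate_normal(mu_ * np.ones(3), Sigma, size=400_000)

# F_{X_(3)}(y) = P(all three components <= y): simulation vs. the mvn cdf.
y = 0.8
emp = np.mean(X.max(axis=1) <= y)
cdf = multivariate_normal(mean=mu_ * np.ones(3), cov=Sigma).cdf(np.full(3, y))
print(emp, cdf)
```

The same kind of check, with $y_1 < y_3$, applies to the joint cdf of $(X_{(1)}, X_{(3)})$ in Corollary 3.4(ii).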

4 Reliability measures

Consider a system consisting of three dependent components with lifetimes $X_1$, $X_2$ and $X_3$. Hence the lifetimes of the series, parallel and 2-out-of-3 systems are $T_1 = X_{(1)}$, $T_3 = X_{(3)}$ and $T_2 = X_{(2)}$, respectively. For $i \ge j$ and $i, j = 1, 2, 3$,
$$MR_{i,j}(t) = E(T_i - t \mid T_j > t), \qquad (30)$$
and, for $i > j$ and $i, j = 1, 2, 3$,
$$mr_{i,j}(t) = E(T_i - t \mid T_j = t), \qquad (31)$$
are MRL functions at the system level. $MR_{i,j}(t)$ is the MRL of an $i$-out-of-3 system when at least $i - j + 1$ of its components are still working at time $t$ or, in other words, when the $j$-th failure has not occurred up to time $t$. In reliability theory, $mr_{i,j}(t)$ is known as the regression mean residual life. It is the average residual lifetime of an $i$-out-of-3 system if the $j$-th failure occurs at time $t$. For $i < j$ and $i, j = 1, 2, 3$,
$$MI_{i,j}(t) = E(t - T_i \mid T_j < t), \qquad (32)$$
are MIT functions. $MI_{i,j}(t)$ is the mean time elapsed since $T_i$ given that $T_j < t$.

Let $X = (X_1, X_2, X_3)^T \sim N_3(\mu, \Sigma)$ be the vector of component lifetimes. Theorem 3.3 shows that the conditional distribution of $X_{(i)}$ given $X_{(j)}$, for $i > j$, is a mixture of $TSUN_{1,1}$ distributions, so from (12) we can obtain $mr_{i,j}(t) = E(T_i \mid T_j = t) - t$. Similarly, it follows from Theorem 3.2 and (16) that, for $j = 1, 2, 3$, $MR_{j,j}(t) = E(T_j \mid T_j > t) - t$ can be obtained. The corollary below follows from Theorem 3.1 and Lemma 2.3.

Corollary 3.7 Let $X \sim N_3(\mu, \Sigma)$. Then, for $j = 1, 2$,
$$MR_{j+1,j}(t) = \frac{1}{\bar F_{X_{(j)}}(t)} \sum_{i=1}^{6} p_{j,j+1,i}\, \bar F_{SUN_{1,2}}(t; \theta_{j,i})\, E(Y_{j+1,i} \mid Y_{j,i} > t) - t,$$
$$MI_{j,j+1}(t) = t - \frac{1}{F_{X_{(j+1)}}(t)} \sum_{i=1}^{6} p_{j,j+1,i}\, F_{SUN_{1,2}}(t; \theta_{j+1,i})\, E(Y_{j,i} \mid Y_{j+1,i} < t),$$
where, for $i = 1, 2, \ldots, 6$, $(Y_{j,i}, Y_{j+1,i}) \sim TSUN_{2,1}\left(\theta_{j,j+1,i}, C(y_j, y_{j+1})\right)$; and
$$MR_{3,1}(t) = \frac{1}{\bar F_{X_{(1)}}(t)} \sum_{i=1}^{6} \left[ p_{1,3,i}\, \bar F_{SUN_{1,2}}(t; \theta_i)\, E(Y_{3,i} \mid Y_{1,i} > t) + p_{1,3,i}'\, \bar F_{SUN_{1,2}}(t; \theta_i')\, E(Y_{3,i}' \mid Y_{1,i}' > t) \right] - t,$$
$$MI_{1,3}(t) = t - \frac{1}{F_{X_{(3)}}(t)} \sum_{i=1}^{6} \left[ p_{1,3,i}\, F_{Y_{3,i}}(t)\, E(Y_{1,i} \mid Y_{3,i} < t) + p_{1,3,i}'\, F_{Y_{3,i}'}(t)\, E(Y_{1,i}' \mid Y_{3,i}' < t) \right],$$
where, for $i = 1, 2, \ldots, 6$, $(Y_{1,i}, Y_{3,i}) \sim TSUN_{2,1}\left(\theta_{1,3,i}, C(y_1, y_3)\right)$ and $(Y_{1,i}', Y_{3,i}') \sim TSUN_{2,1}\left(\theta_{1,3,i}', C(y_1, y_3)\right)$.

The reliability measures in the exchangeable case can also be expressed in closed form. For example, we have the following corollary.

Corollary 3.8 Let $X \sim N_3\left(\mu 1_3, \sigma^2((1 - \rho) I_3 + \rho 1_3 1_3^T)\right)$. Then
$$mr_{3,1}(t) = \frac{1}{f_{SUN_{1,2}}(t; \theta_1)} \left[ 2 f_{SUN_{1,2}}(t; \theta)\, g_1(t) - f_{SUN_{1,2}}(t; \theta')\left( mr_{3,2}(t) + t \right) \right] - t,$$
where
$$g_1(t) = \mu + \rho(t - \mu) + \sigma^2(1 - \rho^2)\, f_{TSUN_{1,1}}\left(t; \theta_{3.1}(t), (t, +\infty)\right) + \sigma \sqrt{\frac{1 - \rho}{4\pi}}\, \frac{\Phi\left( -\sqrt{\dfrac{2(1 - \rho)}{1 + 2\rho}}\, \dfrac{t - \mu}{\sigma} \right)}{g(t, 0, \rho - 1, 2, 1, 1 - \rho)},$$
and
$$g(t, a, b, c, d, e) = \Phi_2\left( (\mu - t) \begin{pmatrix} a \\ b \end{pmatrix};\ \sigma^2 \begin{pmatrix} (1 - \rho) c & (1 - \rho) d \\ (1 - \rho) d & (1 + \rho) e \end{pmatrix} \right);$$
and
$$MR_{3,3}(t) = \mu - t + \frac{1}{\Phi_3\left(\eta(t), \Gamma(-1, 0, 1)\right)} \left[ \frac{\sigma}{2} \sqrt{1 - \rho}\; g\!\left(t, 0, 1, \tfrac{3}{2}, -\tfrac{1}{2}, \tfrac{1}{2}\right) + \sigma\, \phi\!\left(\frac{\mu - t}{\sigma}\right) g(t, 0, \rho - 1, 2, -1, 1 - \rho) \right],$$
where
$$\eta(t) = (\mu - t) e_{3,3}, \qquad \Gamma(a, b, c) = \sigma^2(1 - \rho) \begin{pmatrix} 2 & a & b \\ a & 2 & c \\ b & c & \frac{1}{1 - \rho} \end{pmatrix}.$$

5 Conclusion

In this paper, we have studied the joint and conditional distributions of order statistics from a trivariate normal distribution and used them to obtain reliability measures. Although we have presented the results for the normal case, we conjecture that they can be extended to trivariate elliptical distributions. More work is needed in this direction.

Acknowledgments The authors are grateful to the Editor-in-Chief, the Associate Editor and an anonymous referee for their many helpful comments and suggestions on an earlier version of this paper which led to this improved version.


Appendix: Proof of Theorem 3.1

We shall prove (b); the proof of (a) is similar. To prove (26), we note that
$$F_{X_{(1)}, X_{(3)}}(y_1, y_3; \mu, \Sigma) = P(X_{(1)} \le y_1, X_{(3)} \le y_3)$$
$$= P(X_1 < X_3 < X_2)\, P(X_1 \le y_1, X_2 \le y_3 \mid X_1 < X_3 < X_2)$$
$$+ P(X_1 < X_2 < X_3)\, P(X_1 \le y_1, X_3 \le y_3 \mid X_1 < X_2 < X_3)$$
$$+ P(X_2 < X_3 < X_1)\, P(X_2 \le y_1, X_1 \le y_3 \mid X_2 < X_3 < X_1)$$
$$+ P(X_2 < X_1 < X_3)\, P(X_2 \le y_1, X_3 \le y_3 \mid X_2 < X_1 < X_3)$$
$$+ P(X_3 < X_2 < X_1)\, P(X_3 \le y_1, X_1 \le y_3 \mid X_3 < X_2 < X_1)$$
$$+ P(X_3 < X_1 < X_2)\, P(X_3 \le y_1, X_2 \le y_3 \mid X_3 < X_1 < X_2). \qquad (33)$$

We will only compute the first term on the right-hand side of (33); the other terms can be derived in a similar manner. Let $U = X_3$, $V_1 = X_1$ and $V_2 = X_2$; then $U = d_1^T X$ and $V = (V_1, V_2)^T = A_1^T X$, hence
$$\begin{pmatrix} U \\ V \end{pmatrix} \sim N_3\left( \begin{pmatrix} d_1^T \mu \\ A_1^T \mu \end{pmatrix}, \begin{pmatrix} d_1^T \Sigma d_1 & d_1^T \Sigma A_1 \\ A_1^T \Sigma d_1 & A_1^T \Sigma A_1 \end{pmatrix} \right).$$
So, from (24) and Lemma 2.4, the first term is
$$P(U < V_2, V_1 < V_2)\, F_{TSUN_{2,1}}\left(y_1, y_3; \theta_{1,3,1}, C(y_1, y_3)\right) - P(U < V_1, V_1 < V_2)\, F_{TSUN_{2,1}}\left(y_1, y_3; \theta_{1,3,1}', C(y_1, y_3)\right).$$
On the other hand,
$$P(U < V_2, V_1 < V_2) = P\left( \begin{pmatrix} X_2 - X_3 \\ X_2 - X_1 \end{pmatrix} > 0 \right) = P\left( \begin{pmatrix} -c_1^T X \\ c_1''^T X \end{pmatrix} > 0 \right) = \Phi_2\left( \begin{pmatrix} -c_1^T \mu \\ c_1''^T \mu \end{pmatrix}; \begin{pmatrix} c_1^T \Sigma c_1 & -c_1^T \Sigma c_1'' \\ -c_1^T \Sigma c_1'' & c_1''^T \Sigma c_1'' \end{pmatrix} \right) = p_{1,3,1}.$$
Similarly,
$$-P(U < V_1, V_1 < V_2) = p_{1,3,1}'.$$
Therefore, the first term is equal to
$$p_{1,3,1}\, F_{TSUN_{2,1}}\left(y_1, y_3; \theta_{1,3,1}, C(y_1, y_3)\right) + p_{1,3,1}'\, F_{TSUN_{2,1}}\left(y_1, y_3; \theta_{1,3,1}', C(y_1, y_3)\right).$$
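The six-way decomposition in (33) splits the event $\{X_{(1)} \le y_1, X_{(3)} \le y_3\}$ over the almost-surely unique ordering of $(X_1, X_2, X_3)$, so it is a partition identity that holds sample-by-sample and can be verified directly on simulated draws (parameters below are ours).

```python
import numpy as np
from itertools import permutations

rng = np.random.default_rng(7)

# Illustrative trivariate normal; any positive-definite Sigma will do.
mu = np.array([0.0, 0.3, -0.2])
Sigma = np.array([[1.0, 0.4, 0.1],
                  [0.4, 1.0, 0.3],
                  [0.1, 0.3, 1.0]])
X = rng.multivariate_normal(mu, Sigma, size=200_000)

y1, y3 = 0.0, 1.0
lhs = np.mean((X.min(axis=1) <= y1) & (X.max(axis=1) <= y3))

# Right-hand side of (33): on the ordering event X_a < X_b < X_c, the minimum
# is X_a and the maximum is X_c, so each term only constrains those two.
rhs = 0.0
for a, b, c in permutations(range(3)):
    order = (X[:, a] < X[:, b]) & (X[:, b] < X[:, c])
    rhs += np.mean(order & (X[:, a] <= y1) & (X[:, c] <= y3))
print(lhs, rhs)
```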


References

[1] Amiri, M., Jamalizadeh, A., Towhidi, M. (2014). Some multivariate singular unified skew-normal distributions and their application. Commun. Statist.-Theory Methods, accepted.
[2] Arellano-Valle, R. B., Azzalini, A. (2006). On the unification of skew-normal distributions. Scand. J. Statist. 33, 561-574.
[3] Arellano-Valle, R. B., Genton, M. G. (2007). On the exact distribution of linear combinations of order statistics from dependent random variables. J. Multiv. Anal. 98, 1876-1894.
[4] Arellano-Valle, R. B., Genton, M. G. (2008). On the exact distribution of the maximum of absolutely continuous dependent random variables. Statist. Probab. Lett. 78, 27-35.
[5] Arellano-Valle, R. B., Jamalizadeh, A., Mahmoodian, H., Balakrishnan, N. (2014). L-statistics from multivariate unified skew-elliptical distributions. Metrika 77, 559-583.
[6] Asadi, M., Bayramoglu, I. (2006). The mean residual life function of a k-out-of-n structure at the system level. IEEE Trans. Reliab. 55, 314-318.
[7] Azzalini, A., Capitanio, A. (2003). Distributions generated by perturbation of symmetry with emphasis on a multivariate skew-t distribution. J. R. Statist. Soc. Ser. B 65, 367-389.
[8] Crocetta, C., Loperfido, N. (2005). The exact sampling distribution of L-statistics. Metron 63, 213-223.
[9] Gupta, R. C. (2012). Regression mean residual life of a parallel system of dependent components. J. Statist. Plann. Inference 142, 1063-1072.
[10] Jamalizadeh, A., Balakrishnan, N. (2009). Order statistics from trivariate normal and t-distributions in terms of generalized skew-normal and skew-t distributions. J. Statist. Plann. Inference 139, 3799-3819.
[11] Jamalizadeh, A., Balakrishnan, N. (2010). Distributions of order statistics and linear combinations of order statistics from an elliptical distribution as mixtures of unified skew-elliptical distributions. J. Multiv. Anal. 101, 1412-1427.
[12] Loperfido, N. (2002). Statistical implications of selectively reported inferential results. Statist. Probab. Lett. 56, 13-22.
[13] Loperfido, N., Navarro, J., Ruiz, J. M., Sandoval, C. J. (2007). Some relationships between skew-normal distributions and order statistics from exchangeable normal random vectors. Commun. Statist.-Theory Methods 36, 1719-1733.
[14] Navarro, J., Ruiz, J. M., Sandoval, C. J. (2007). Properties of coherent systems with dependent components. Commun. Statist.-Theory Methods 36, 175-191.
[15] Navarro, J., del Aguila, Y., Sordo, M. A., Suarez-Llorens, A. (2013). Stochastic ordering properties for systems with dependent identically distributed components. Appl. Stoch. Models Bus. Ind. 29, 264-278.
[16] Navarro, J., Samaniego, F. J., Balakrishnan, N. (2010). The joint signature of coherent systems with shared components. J. Appl. Probab. 47, 235-253.
[17] Olkin, I., Viana, M. (1995). Correlation analysis of extreme observations from a multivariate normal distribution. J. Amer. Statist. Assoc. 90, 1373-1379.
[18] Roberts, C. (1966). A correlation model useful in the study of twins. J. Amer. Statist. Assoc. 61, 1184-1190.
[19] Tavangar, M., Asadi, M. (2010). A study on the mean past lifetime of the components of an (n − k + 1)-out-of-n system at the system level. Metrika 72, 59-73.
[20] Viana, M. (1998). Linear combinations of ordered symmetric observations with application to visual acuity. In: Balakrishnan, N., Rao, C. R., eds. Handbook of Statistics 17, Order Statistics: Applications. Amsterdam: North-Holland, pp. 513-524.
[21] Zhang, Z. (2010). Ordering conditional general coherent systems with exchangeable components. J. Statist. Plann. Inference 140, 454-460.
