P. R. Krishnaiah, ed., Handbook of Statistics, Vol. 1 ©North-Holland Publishing Company (1980) 513-570
Likelihood Ratio Tests for Mean Vectors and Covariance Matrices

P. R. Krishnaiah* and Jack C. Lee**

1. Introduction
Likelihood ratio tests play an important role in testing various hypotheses under ANOVA and MANOVA models. A comprehensive review of the literature until 1957 on the likelihood ratio procedures for testing certain hypotheses on mean vectors and covariance matrices was given by Anderson (1958). In this chapter, we describe various likelihood ratio tests on mean vectors and covariance matrices and discuss computation of the critical values associated with the above tests. Throughout this chapter, we assume that the underlying distribution is multivariate normal. In Section 2, we discuss the likelihood ratio test for the equality of mean vectors of several multivariate normal populations. Also, the test specifying the mean vector is discussed. The approximation of Rao (1951) for the distribution of the determinant of the multivariate beta matrix is also discussed in this section. The likelihood ratio test for multiple independence of several sets of variables is discussed in Section 3. In Section 4, we discuss the likelihood ratio tests for sphericity and for specifying the covariance matrix, as well as the test for the equality of the covariance matrices. The likelihood ratio test for the multiple homogeneity of the covariance matrices is also discussed in this section. In Section 5, we discuss the likelihood ratio procedure specifying the mean vector and covariance matrix simultaneously. Also, the likelihood ratio test for the equality of the mean vectors and the equality of covariance matrices simultaneously is

*The work of this author is supported in part by the Air Force Office of Scientific Research under Contract F49620-79-C-0161. Reproduction in whole or in part is permitted for any purpose of the United States Government.
**The work of this author is supported in part by the National Science Foundation under Grant MCS79-02024. Part of the work of this author was also done at the Air Force Flight Dynamics Laboratory under Contract F33615-76-C-3145.
discussed in this section. The likelihood ratio test for the equality of the means, the equality of the variances and the equality of the covariances simultaneously is discussed in Section 6. In Section 7, we discuss the likelihood ratio test for compound symmetry, whereas Section 8 is devoted to the likelihood ratio tests for certain linear structures on the covariance matrices. Applications of the tests on linear structures in the area of components of variance are also discussed in Section 8. Tables which are useful in the application of most of the tests discussed in this chapter are given at the end of the chapter (before the References). Most of these tables were constructed by Chang, Krishnaiah and Lee by approximating certain powers of the likelihood ratio test statistics with Pearson's type I distribution. In discussing the distributions associated with the likelihood ratio tests, we emphasize approximations, partly because the exact expressions are complicated and partly because the accuracy of the above approximations is good. For a detailed discussion of some of the developments on exact distributions, the reader is referred to Mathai and Saxena (1973).
2. Tests on mean vectors
In a number of situations, it is of interest to test hypotheses on the mean vectors of multivariate normal populations. For example, an experimenter may be interested in testing whether the means of various variables are equal to specified values. Or, he may be interested in finding out whether there is a significant difference between the populations when the covariance matrices are equal. In this section, we discuss the likelihood ratio tests on mean vectors. We will first discuss certain distributions useful in the application of these procedures.
2.1. Hotelling's T² statistic and determinant of the multivariate beta matrix
Let y be distributed as a p-variate normal with mean vector μ and covariance matrix Σ. Also, let A be distributed independently of y as the central Wishart distribution with n degrees of freedom and E(A) = nΣ. Then T² = n y′A⁻¹y is known as Hotelling's T² statistic. When μ = 0, Hotelling (1931) showed that T²(n − p + 1)/pn is distributed as the central F distribution with (p, n − p + 1) degrees of freedom. In general, it is known (e.g., see Wijsman, 1957; Bowker, 1961) that T²(n − p + 1)/pn is distributed as the noncentral F distribution with (p, n − p + 1) degrees of freedom and with μ′Σ⁻¹μ as the noncentrality parameter.
Next, let

 B_{p,q,n} = |A₁| / |A₁ + A₂|   (2.1)
where A₁: p × p and A₂: p × p are independently distributed as central Wishart matrices with n and q degrees of freedom respectively and E(A₁/n) = E(A₂/q) = Σ. The matrix A₁(A₁ + A₂)⁻¹ is sometimes referred to in the literature as a multivariate beta matrix. Schatzoff (1966), Pillai and Gupta (1969), Mathai (1971) and Lee (1972) considered the evaluation of the exact distribution of B_{p,q,n} and computed percentage points for some values of the parameters. We will now discuss approximations to the distribution of B_{p,q,n}. Rao (1951) approximated the distribution of
 ((ms + 2λ)/2r) · (1 − B_{p,q,n}^{1/s}) / B_{p,q,n}^{1/s}

with the F distribution with (2r, ms + 2λ) degrees of freedom, where λ = −¼(pq − 2), r = ½pq, m = n − ½(p − q + 1) and s = [(p²q² − 4)/(p² + q² − 5)]^{1/2}. Here, we note that n(1 − B_{1,q,n})/qB_{1,q,n} has the F distribution with (q, n) degrees of freedom. Also, it is known (see Wilks, 1935) that

 ((n − 1)/q) · (1 − B_{2,q,n}^{1/2}) / B_{2,q,n}^{1/2}

is distributed as the F distribution with 2q and 2(n − 1) degrees of freedom. Table 1 gives a comparison of the exact values with the values obtained by using the approximation suggested by Rao. The exact values are taken from Lee (1972).
Table 1
Comparison of Rao's approximation with the exact distribution of B_{p,q,n} for α = 0.05, p = 5, q = 7

 n     Exact        Rao
 5     0.06920      0.053394
 8     0.0014212    0.0014755
 9     0.0032991    0.0033610
 10    0.0062212    0.0062627
 12    0.0153692    0.0152890
When the F approximation is used, it is important to keep in mind that the denominator degrees of freedom (that is, ms + 2λ) can be noninteger, and hence careful interpolation is in order. The entries under "Exact" give the exact values of c for α = 0.05 where

 P[B_{p,q,n} < c] = 1 − α.   (2.2)
The entries in the last column are approximate values of c obtained by using the approximation of Rao. Here we note that a number of likelihood ratio test statistics, including B_{p,q,n}, can be expressed as a product of beta variates. Tukey and Wilks (1946) approximated the distribution of the product of beta variables with a single beta variate. Using the method of Rao (1951), Roy (1951) obtained an asymptotic expansion for the product of beta variates; the first term in this series is the beta distribution. Using the first four moments, Lee et al. (1977) approximated the distribution of B_{p,q,n}^{1/b} with the Pearson Type I distribution, where b is a properly chosen constant. They chose b to be equal to 8 or 2 according as M < 5 or M > 30, where M = n − p + 1; for the remaining values of M, they chose b to be equal to 4. Using the approximation described above, Lee et al. (1977) computed the values of c₁ for α = 0.05, 0.01, M = 1(1)20, 24, 30, 60, 120 and certain values of (p, q), where

 P[C₁ < c₁] = 1 − α,  C₁ = −{n − ½(p − q + 1)} log B_{p,q,n} / χ²_{pq}(α),   (2.3)
where log a denotes the natural logarithm of a. The above values are given in Table 7 at the end of this chapter. Also given in Table 7 are some values of c₁ computed by Schatzoff (1966) for p and q less than 11. For tables for some other values of p, q and n, the reader is referred to Pillai and Gupta

Table 2
Comparison of LKC approximation with exact values for α = 0.05

        p = 3, q = 3        p = 5, q = 7
 M      c₁      Exact       c₁      Exact
 1      1.354   1.359       1.530   1.535
 5      1.034   1.035       1.090   1.090
 10     1.010   1.011       1.035   1.035
 16     1.005   1.005       1.017   1.017
 20     1.003   1.003       1.012   1.012
 40     1.001   1.001       1.003   1.003
(1969), Mathai (1971) and Lee (1972). The above authors computed the tables by using methods different from the one used by Lee, Krishnaiah and Chang (LKC). Here we note that the distribution of B_{p,q,n} is the same as the distribution of B_{q,p,n+q−p}. Table 2 gives a comparison of the exact values of c₁ for α = 0.05 with the corresponding values obtained using the LKC approximation. The above table is taken from Lee et al. (1977). We now discuss the likelihood ratio tests on mean vectors.

2.2. Test specifying the mean vector
Let x₁, ..., x_N be distributed independently as multivariate normal with mean vector μ and covariance matrix Σ. Also, let H₁: μ = μ₀, where μ₀ is known. In addition, let

 λ₁ = N(N − 1)(x̄. − μ₀)′ A⁻¹ (x̄. − μ₀)   (2.4)

where N x̄. = Σ_{j=1}^N x_j and A = Σ_{j=1}^N (x_j − x̄.)(x_j − x̄.)′. Then, it is known that the likelihood ratio test for H₁ involves accepting or rejecting H₁ according as λ₁ ≷ F_{α,1}, where

 P[λ₁ < F_{α,1} | H₁] = 1 − α.   (2.5)
The statistic λ₁ is Hotelling's T² statistic, and (n − p + 1)λ₁/np is distributed as the central F distribution with (p, n − p + 1) degrees of freedom when H₁ is true, where n = N − 1. As an illustration, we use Example 4.1 of Morrison* (1967, p. 122) where p = 2, N = 101, x̄. = (55.24, 34.97)′, μ₀ = (60, 50)′ and

 A = 100 ( 210.54  126.99
           126.99  119.68 ).

The test statistic in this case is

 λ₁ = 101 (55.24 − 60, 34.97 − 50) ( 210.54  126.99 )⁻¹ ( 55.24 − 60 )
                                   ( 126.99  119.68 )   ( 34.97 − 50 )
    = 357.43.

*From Multivariate Statistical Methods by D. F. Morrison. Copyright © 1967 by McGraw-Hill, Inc. Used with permission of McGraw-Hill Company, Inc.
Suppose α = 0.01. Then from the F table, F_{2,99}(0.01) = 4.8, where F_{a,b}(α) denotes the upper 100α% value of the F distribution with (a, b) degrees of freedom. Hence F_{0.01,1} = 9.7. Since λ₁ > 9.7, the null hypothesis that μ′ = (60, 50) is rejected.
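The computation above is easy to reproduce numerically; the following sketch (Python with NumPy/SciPy; the function name is ours) evaluates λ₁ and the critical value for the Morrison example.

```python
import numpy as np
from scipy.stats import f

def lambda1(xbar, mu0, A, N):
    """lambda_1 = N(N-1)(xbar - mu0)' A^{-1} (xbar - mu0), i.e. Hotelling's T^2."""
    d = np.asarray(xbar, float) - np.asarray(mu0, float)
    return N * (N - 1) * d @ np.linalg.solve(A, d)

N, p = 101, 2
n = N - 1
A = 100 * np.array([[210.54, 126.99],
                    [126.99, 119.68]])
T2 = lambda1([55.24, 34.97], [60.0, 50.0], A, N)        # about 357.4
# H1 is rejected when T2 exceeds n*p/(n - p + 1) times the upper F point
crit = n * p / (n - p + 1) * f.ppf(0.99, p, n - p + 1)  # about 9.7
```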
2.3. Test for the equality of mean vectors
For i = 1, 2, ..., q, let x_{i1}, ..., x_{iN_i} be distributed independently as multivariate normal with mean vector μ_i and covariance matrix Σ. Also, let H₂: μ₁ = ⋯ = μ_q. Then, the likelihood ratio test statistic (Wilks, 1932) for testing H₂ is given by

 λ₂ = |A_a| / |A_o|   (2.6)

where N = Σ_{i=1}^q N_i, N x̄.. = Σ_{i=1}^q Σ_{t=1}^{N_i} x_it, N_i x̄_i. = Σ_{t=1}^{N_i} x_it, n = N − q,

 A_a = Σ_{i=1}^q Σ_{t=1}^{N_i} (x_it − x̄_i.)(x_it − x̄_i.)′,  A_o = Σ_{i=1}^q Σ_{t=1}^{N_i} (x_it − x̄..)(x_it − x̄..)′.
When H₂ is true, the statistic λ₂ is distributed as B_{p,q−1,n}, where B_{p,q,r} was defined by (2.1). Now, let C₂ = −(n − ½(p − q + 2)) log λ₂ / χ²_{p(q−1)}(α). Then, we accept or reject H₂ according as C₂ ≷ c₂, where

 P[C₂ < c₂ | H₂] = 1 − α.   (2.7)
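The statistic C₂ is straightforward to evaluate; the sketch below (Python with SciPy; the function name is ours) uses the values of the artificial example discussed next in the text (λ₂ = 0.79, p = 3, q = 4, n = 16).

```python
import math
from scipy.stats import chi2

def C2_statistic(lam2, p, q, n, alpha=0.05):
    """C2 = -(n - (p - q + 2)/2) * log(lambda_2) / chi^2_{p(q-1)}(alpha)."""
    chi_upper = chi2.ppf(1 - alpha, p * (q - 1))   # e.g. 16.919 for 9 d.f.
    return -(n - 0.5 * (p - q + 2)) * math.log(lam2) / chi_upper

C2 = C2_statistic(0.79, p=3, q=4, n=16)   # about 0.216
```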
As an illustration we give an artificial example. Suppose there are four three-dimensional populations with samples of sizes N_i = 5, i = 1, 2, 3, 4, and the null hypothesis H₂ is to be tested at the 0.05 level. If |A_a| = 2.45 and |A_o| = 3.12, then λ₂ = 0.79. In our case N = 20, p = 3, q = 4, n = N − q = 20 − 4 = 16. In order to complete the test, we compute C₂ = −{16 − ½(3 − 4 + 2)} log 0.79/16.9190 = 0.2160. Now M = 16 − 3 + 1 = 14, and we have c₂ = 1.006 from the table in Lee (1972). Thus the null hypothesis that the population mean vectors are equal is accepted.

We will now consider testing the hypothesis H₂ when q = 2. In this case, let

 F = {N₁N₂(N − 2)/N} (x̄₁. − x̄₂.)′ A⁻¹ (x̄₁. − x̄₂.)   (2.8)
where N₁ x̄₁. = Σ_{j=1}^{N₁} x_1j, N₂ x̄₂. = Σ_{j=1}^{N₂} x_2j and

 A = Σ_{i=1}^2 Σ_{t=1}^{N_i} (x_it − x̄_i.)(x_it − x̄_i.)′.   (2.9)
The hypothesis H₂ is accepted or rejected according as F ≷ F_{α,2}, where

 P[F < F_{α,2} | H₂] = 1 − α.   (2.10)
When H₂ is true, (N − p − 1)F/p(N − 2) is distributed as central F with (p, N − p − 1) degrees of freedom. The above test procedure is due to Hotelling and it is equivalent to the likelihood ratio test. As an illustration, we use Example 4.2 of Morrison* (1967, p. 126) where p = 4, N₁ = 37, N₂ = 12, x̄₁. = (12.57, 9.57, 11.49, 7.97)′, x̄₂. = (8.75, 5.33, 8.50, 4.75)′ and

 A = 47 ( 11.2553   9.4042   7.1489   3.3830
           9.4042  13.5318   7.3833   2.5532
           7.1489   7.3833  11.5744   2.6170
           3.3830   2.5532   2.6170   5.8085 ).
From the data, we have F = 22.05. Suppose α = 0.01. Then from the F table, F_{0.01} ≈ 3.8. Since {(N − 2)p F_{0.01}/(N − p − 1)} ≈ 15.6, which is less than 22.05, the null hypothesis that the two mean vectors μ₁ and μ₂ are equal is rejected at the α = 0.01 level.
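A minimal sketch of the two-sample computation (Python with NumPy/SciPy; the function name is ours), taking the pooled sums-of-squares matrix as A = (N₁ + N₂ − 2) times the tabulated matrix:

```python
import numpy as np
from scipy.stats import f

def two_sample_F(x1bar, x2bar, A, N1, N2):
    """F = (N1*N2*(N - 2)/N) d' A^{-1} d with d = x1bar - x2bar, N = N1 + N2."""
    N = N1 + N2
    d = np.asarray(x1bar, float) - np.asarray(x2bar, float)
    return N1 * N2 * (N - 2) / N * d @ np.linalg.solve(A, d)

N1, N2, p = 37, 12, 4
N = N1 + N2
A = (N - 2) * np.array([[11.2553,  9.4042,  7.1489, 3.3830],
                        [ 9.4042, 13.5318,  7.3833, 2.5532],
                        [ 7.1489,  7.3833, 11.5744, 2.6170],
                        [ 3.3830,  2.5532,  2.6170, 5.8085]])
F_stat = two_sample_F([12.57, 9.57, 11.49, 7.97],
                      [8.75, 5.33, 8.50, 4.75], A, N1, N2)   # about 22.05
# critical value: (N-2)*p/(N-p-1) times the upper 1% point of F(p, N-p-1)
crit = (N - 2) * p / (N - p - 1) * f.ppf(0.99, p, N - p - 1)
```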
3. Tests on independence of sets of variates

3.1. The likelihood ratio test for independence
Let x_j′ = (x′_1j, ..., x′_qj) for j = 1, 2, ..., N. We assume that x₁, ..., x_N are distributed independently as multivariate normal with mean vector μ and
*From Multivariate Statistical Methods by D. F. Morrison. Copyright © 1967 by McGraw-Hill, Inc. Used with the permission of McGraw-Hill Company, Inc.
covariance matrix Σ. Also, let E(x_i) = μ_i and E((x_i − μ_i)(x_j − μ_j)′) = Σ_ij, where x_i is of order p_i × 1 and s = p₁ + ⋯ + p_q. In this section, we wish to test the hypothesis H₃ where

 H₃: Σ_ij = 0   (3.1)

for i ≠ j = 1, ..., q. The problem of testing H₃ is of interest in testing certain hypotheses on mean vectors. When p₁ = ⋯ = p_q = p and H₃ is true, the test procedure in Section 2.3 can be used to test the hypothesis μ₁ = ⋯ = μ_q. Now, let
 A = ( A₁₁  ⋯  A_1q
       A₂₁  ⋯  A_2q
        ⋮         ⋮
       A_q1  ⋯  A_qq )

where

 A_im = Σ_{j=1}^N (x_ij − x̄_i.)(x_mj − x̄_m.)′,  N x̄_i. = Σ_{j=1}^N x_ij,  n = N − 1.
Wilks (1935) derived the likelihood ratio test for testing H₃. According to this test, we accept or reject H₃ according as

 −2 log λ₃ ≷ c₃   (3.2)

where

 P[−2 log λ₃ < c₃ | H₃] = 1 − α   (3.3)

and

 λ₃ = |A| / ∏_{i=1}^q |A_ii|.   (3.4)

When q = 2, the distribution of λ₃ under H₃ is the same as the distribution of B_{p₁,p₂,n−p₂}. Box (1949) obtained an asymptotic expression for a class of likelihood ratio test statistics as a linear combination of chi-square variables. This class includes test statistics for the equality of mean vectors, multiple
independence of sets of variables, sphericity, homogeneity of the covariance matrices, etc. Here we note that Wald and Brookner (1941) and Rao (1948) approximated the distribution function of the determinant of the multivariate beta matrix with a linear combination of the distribution functions of chi-square variables. The number of terms given by Box for the asymptotic expression is not sufficient to get the accuracy desired in several practical situations. So, Lee et al. (1976) obtained higher order terms. Consul (1969) and Mathai and Rathie (1971) obtained exact expressions for the distribution of λ₃. Lee et al. (1977) approximated the distribution of λ₃^{1/4} with Pearson's Type I distribution. The following table gives a comparison of the values obtained by the approximation of Lee, Chang and Krishnaiah (LCK) with the corresponding values obtained using Box's expansion by taking terms up to order n⁻¹³. The entries in Table 3 are taken from Lee et al. (1977). In Table 3, α₁ and α₂ are the values of α if we use the LCK approximation and Box's asymptotic expression respectively, where P[V₃ ⩽ c₃] = 1 − α and V₃ = −2 log λ₃. For the F approximation we compute

 G_k = s^k − Σ_{j=1}^q p_j^k,  f₁ = ½G₂,

 D₁ = (2G₃ + 3G₂)/(12Nf₁),  D₂ = (G₄ + 2G₃ − G₂)/(12N²f₁).
We then check the sign of D₂ − D₁². If D₂ − D₁² is positive, then F = V₃/b is distributed approximately as an F distribution with f₁ and f₂ degrees of freedom, where f₂ = (f₁ + 2)/(D₂ − D₁²) and b = f₁/(1 − D₁ − (f₁/f₂)). On the other hand, if D₂ − D₁² is negative, then F = f₂V₃/{f₁(b − V₃)} is distributed approximately as an F distribution with f₁ and f₂ degrees of freedom, where f₂ = (f₁ + 2)/(D₁² − D₂) and b = f₂/(1 − D₁ + (2/f₂)). The null hypothesis is rejected at level α if F > F_{f₁,f₂}(α), where F_{f₁,f₂}(α) is the upper 100α% value of the F distribution with f₁ and f₂ degrees of freedom. Recently, Mathai and Katiyar (1977) computed the percentage points of λ₃ by using an exact expression.

Table 3
Comparison of LCK approximation with asymptotic expression of order n⁻¹³

        q = 3                    q = 5
 N      c₃      α₁     α₂        c₃      α₁     α₂
 10     1.913   0.05   0.0499    4.978   0.05   0.0488
 15     1.187   0.05   0.0500    2.947   0.05   0.0497
 20     0.860   0.05   0.0500    2.099   0.05   0.0499
 30     0.555   0.05   0.0500    1.333   0.05   0.0500

To illustrate the likelihood ratio test for mutual independence of variables, we use the data of Abruzzi (1950) on an industrial time study with a sample of size 33 instead of 76. Data was collected on the length of time taken by several operators on the following variables: (i) pick up and position garment, (ii) press and repress short dart, (iii) reposition garment on ironing board, (iv) press three quarters of length of long dart, (v) press balance of long dart, (vi) hang garment on rack. The following summary of the data on sample mean vector and covariance matrix, based upon the first three variables, is taken from Anderson* (1958, pp. 240-241).

 x̄ = (  9.47        s = ( 2.57   0.85  1.56
       25.56              0.85  37.00  3.34
       13.25 ),           1.56   3.34  8.44 ).
In this problem we have s = 3, p_i = 1, q = 3, λ₃ = 0.8555 and hence −2 log λ₃ = 0.31214. From Table 8 with α = 0.05 we find that c₃ = 0.518 as M = N − s − 4 = 33 − 3 − 4 = 26. Thus the null hypothesis of mutual independence is accepted at level of significance α = 0.05.
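The computation just illustrated can be sketched as follows (Python with NumPy; the function name is ours). Note that λ₃ is unchanged if A is replaced by the sample covariance matrix, since the common scale factor cancels between numerator and denominator.

```python
import math
import numpy as np

def lambda3(A, block_sizes):
    """lambda_3 = |A| / prod_i |A_ii| for the partition given by block_sizes."""
    A = np.asarray(A, float)
    cuts = np.cumsum([0] + list(block_sizes))
    prod_diag = 1.0
    for a, b in zip(cuts[:-1], cuts[1:]):
        prod_diag *= np.linalg.det(A[a:b, a:b])
    return np.linalg.det(A) / prod_diag

S = np.array([[2.57,  0.85, 1.56],
              [0.85, 37.00, 3.34],
              [1.56,  3.34, 8.44]])
lam3 = lambda3(S, [1, 1, 1])      # about 0.8555
V3 = -2 * math.log(lam3)          # about 0.3121, compared with c3 = 0.518
```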
4. Tests on covariance matrices
The test discussed in Section 2.3 for the equality of mean vectors is valid when the covariance matrices of the populations are equal. So, it is of interest to test for the homogeneity of the covariance matrices. It is also of interest to test for the structure of the covariance matrix in the single sample case, since we can take advantage of the structure in testing the hypothesis on the mean vector. In this section we shall discuss tests concerning the covariance matrix of a single population as well as the covariance matrices of q independent populations. For the single population case we shall discuss the sphericity test, where the covariance matrix is proportional to a given matrix, and shall also discuss the test of the hypothesis that the

*From An Introduction to Multivariate Statistical Analysis by T. W. Anderson. Copyright © 1958 by John Wiley and Sons, Inc. Used with permission of the publisher.
covariance matrix is equal to a given matrix. For the case of several populations, we shall discuss the homogeneity of covariance matrices across the populations, as well as the multiple homogeneity of covariance matrices, where several subgroups of homogeneous covariance matrices are established.

4.1. The sphericity test
The likelihood ratio test of the null hypothesis

 H₄: Σ = σ²Σ₀  (σ² is unknown, Σ₀ known)   (4.1)

on the basis of the random sample x₁, ..., x_N from a p-variate normal distribution with mean vector μ and covariance matrix Σ is based on the statistic

 λ₄ = |AΣ₀⁻¹| / (tr AΣ₀⁻¹/p)^p   (4.2)

where A = Σ_{t=1}^N (x_t − x̄.)(x_t − x̄.)′, N x̄. = Σ_{t=1}^N x_t, and tr B denotes the trace of the square matrix B. The hypothesis H₄ is accepted or rejected according as λ₄ ≷ c₄, where P[λ₄ ⩾ c₄ | H₄] = 1 − α. The above statistic λ₄ for testing H₄ was derived by Mauchly (1940). The null hypothesis (4.1) is equivalent to the canonical form H₄: Σ = σ²I (I being an identity matrix), as we can transform x₁, ..., x_N to y₁, ..., y_N by y_j = Gx_j, where G is a matrix such that GΣ₀G′ = I. Thus, the null hypothesis H₄ is equivalent to the hypothesis that we have a set of p independent random variables with a common variance σ². Consul (1969), Mathai and Rathie (1970) and Nagarsenker and Pillai (1973a) obtained expressions for the exact distribution of λ₄. Table 9 gives the values of c₄ computed by Nagarsenker and Pillai for p = 4(1)10 and α = 0.01, 0.05, where c₄ is defined by

 P[λ₄ ⩾ c₄ | H₄] = 1 − α.   (4.3)
Lee et al. (1977) approximated the distribution of λ₄^{1/4} with Pearson's Type I distribution. Table 4 gives a comparison of the values of c₄ obtained by using the LCK approximation with the corresponding values obtained by Nagarsenker and Pillai using the exact expression. For the parameter sets not listed in Table 4, we can also use the following asymptotic expression of
Table 4
Comparison of LCK approximation with exact values for α = 0.05

        p = 4               p = 5               p = 7
 n      LCK     Exact       LCK     Exact       LCK     Exact
 6      0.0169  0.0169      0.0013  0.0013      —       —
 10     0.1297  0.1297      0.0492  0.0492      0.0029  0.0030
 15     0.2812  0.2812      0.1608  0.1608      0.0368  0.0368
 21     0.4173  0.4173      0.2877  0.2876      0.1111  0.1111
 33     0.5833  0.5833      0.4663  0.4663      0.2665  0.2665
 41     0.6507  0.6508      0.5453  0.5453      0.3515  0.3515
Box (1949):

 P[−nρ log λ₄ ⩽ z] = P[χ²_f ⩽ z] + ω₂(P[χ²_{f+4} ⩽ z] − P[χ²_f ⩽ z]) + O(n⁻³)   (4.4)

where χ²_f denotes the χ² random variable with f degrees of freedom,

 f = ½p(p + 1) − 1,  n = N − 1,

 ρ = 1 − (2p² + p + 2)/(6pn),   (4.5)

 ω₂ = (p + 2)(p − 1)(p − 2)(2p³ + 6p² + 3p + 2) / (288p²n²ρ²).   (4.6)
The null hypothesis is rejected at level α if the computed value of −nρ log λ₄ (used as z in (4.4)) yields a probability greater than 1 − α. Of course, a rougher approximation is to treat χ² = −nρ log λ₄ as a χ² variable with f degrees of freedom, with the error of the remaining terms of order n⁻². The null hypothesis is then rejected at level α if χ² > χ²_f(α), where χ²_f(α) is the upper 100α% value of the χ² distribution with f degrees of freedom.

For illustration we use the data described in Section 3.2 by taking all six variables. In this case, x̄. = (9.47, 25.56, 13.25, 31.44, 27.29, 8.80)′,

 A = 75 ( 2.57   0.85  1.56   1.79   1.33  0.42
          0.85  37.00  3.34  13.47   7.59  0.52
          1.56   3.34  8.44   5.77   2.00  0.50
          1.79  13.47  5.77  34.01  10.50  1.77
          1.33   7.59  2.00  10.50  23.01  3.43
          0.42   0.52  0.50   1.77   3.43  4.59 )

and we want to test the null hypothesis that the population covariance matrix Σ has the form Σ = σ²I. Substituting into (4.2) with Σ₀ = I we have λ₄ = 0.0366, which is much smaller than 0.6641 (the critical value for N = 80), and hence the null hypothesis that Σ = σ²I is rejected at level of significance α = 0.05.
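A sketch of the computation of (4.2) in Python (the function name is ours), using the covariance entries tabulated above. Since λ₄ is invariant under rescaling of A, using A or the sample covariance matrix makes no difference; the text reports λ₄ = 0.0366 for these data.

```python
import numpy as np

def lambda4(A, Sigma0=None):
    """lambda_4 = |A Sigma0^{-1}| / (tr(A Sigma0^{-1})/p)^p; Sigma0 defaults to I."""
    A = np.asarray(A, float)
    p = A.shape[0]
    M = A if Sigma0 is None else A @ np.linalg.inv(Sigma0)
    return np.linalg.det(M) / (np.trace(M) / p) ** p

S = np.array([[2.57,  0.85, 1.56,  1.79,  1.33, 0.42],
              [0.85, 37.00, 3.34, 13.47,  7.59, 0.52],
              [1.56,  3.34, 8.44,  5.77,  2.00, 0.50],
              [1.79, 13.47, 5.77, 34.01, 10.50, 1.77],
              [1.33,  7.59, 2.00, 10.50, 23.01, 3.43],
              [0.42,  0.52, 0.50,  1.77,  3.43, 4.59]])
lam4 = lambda4(S)   # scale invariance: lambda4(75 * S) gives the same value
```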
4.2. Test specifying the covariance matrix
Consider the problem of testing the null hypothesis

 H₅: Σ = Σ₀  (Σ₀ is known)   (4.7)

on the basis of a random sample x₁, ..., x_N from a p-variate normal distribution with mean vector μ and covariance matrix Σ. The likelihood ratio test for testing H₅ was derived by Anderson (1958). The modified likelihood ratio test statistic (obtained by changing N to n = N − 1 in the likelihood ratio statistic) is

 λ₅ = (e/n)^{pn/2} |AΣ₀⁻¹|^{n/2} etr(−½AΣ₀⁻¹)   (4.8)

where n = N − 1 and A = Σ_{t=1}^N (x_t − x̄.)(x_t − x̄.)′. Since the null hypothesis (4.7) is a special case of (4.1) with σ² = 1, we see that (4.7) is equivalent to the hypothesis that we have a set of p independent random variables each with unit variance. According to the likelihood ratio test, we accept or reject H₅ according as −2 log λ₅ ≷ c₅, where

 P[−2 log λ₅ < c₅ | H₅] = 1 − α.   (4.9)
Korin (1968) and Nagarsenker and Pillai (1973b) gave values of c₅ for p = 2(1)10 and α = 0.01, 0.05. Lee et al. (1977) approximated the distribution of λ₅^{1/3} with Pearson's Type I distribution. Table 5, taken from the

Table 5
Comparison of LCK approximation with exact expression for α = 0.05

     p = 4               p = 6                p = 10
 n    LCK     Exact      n    LCK    Exact    n    LCK     Exact
 6    25.76   25.76      8    49.24  49.25    12   119.08  119.07
 7    24.06   24.06      9    45.82  45.83    13   111.15  111.15
 10   21.75   21.75      10   43.62  43.63    14   105.76  105.76
 11   21.35   21.35      15   38.71  38.71    15   101.83  101.82
 13   20.77   20.77      20   36.87  36.86    20   91.28   91.28
                         25   35.89  35.88    25   86.51   86.52
above paper, gives a comparison of the exact values with the values obtained by using the LCK approximation for α = 0.05. Table 10 gives the values of c₅ for α = 0.01, 0.05; these values are taken from Nagarsenker and Pillai (1973b). When n exceeds the values given in the table, either the χ² or the F approximation can be used. These approximations are given in Korin (1968).

(a) χ² approximation. The statistic χ² = −2(1 − D₁) log λ₅ is approximated as a χ² distribution with f₁ = ½p(p + 1) degrees of freedom, where

 D₁ = {2p + 1 − 2/(p + 1)}/6n.

(b) F approximation. The statistic F = −2 log λ₅/b is approximated as an F distribution with f₁ and f₂ degrees of freedom, where

 f₂ = (f₁ + 2)/(D₂ − D₁²),  D₂ = (p − 1)(p + 2)/6n²,  b = f₁/(1 − D₁ − (f₁/f₂)).

Again, in either approximation the null hypothesis is rejected if the computed value is too large, i.e., χ² > χ²_{f₁}(α) or F > F_{f₁,f₂}(α), where χ²_{f₁}(α) is the upper 100α% value of the χ² distribution with f₁ degrees of freedom and F_{f₁,f₂}(α) is the upper 100α% value of the F distribution with f₁ and f₂ degrees of freedom.
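The statistic −2 log λ₅ of (4.8) can be computed as sketched below (Python with NumPy; the function name is ours, and Σ₀ is an arbitrary illustrative matrix). A useful sanity check: when AΣ₀⁻¹ = nI, so that the sample second moments match H₅ exactly, λ₅ = 1 and the statistic vanishes.

```python
import numpy as np

def minus2log_lambda5(A, Sigma0, n):
    """-2 log lambda_5, lambda_5 = (e/n)^{pn/2} |A S0^{-1}|^{n/2} etr(-A S0^{-1}/2)."""
    A = np.asarray(A, float)
    p = A.shape[0]
    M = A @ np.linalg.inv(Sigma0)
    _, logdet = np.linalg.slogdet(M)
    log_lam = 0.5 * p * n * (1 - np.log(n)) + 0.5 * n * logdet - 0.5 * np.trace(M)
    return -2 * log_lam

Sigma0 = np.array([[2.0, 0.3],
                   [0.3, 1.0]])   # hypothetical Sigma_0
n = 27
stat_null = minus2log_lambda5(n * Sigma0, Sigma0, n)     # 0 up to rounding
stat_off = minus2log_lambda5(0.5 * n * Sigma0, Sigma0, n)  # positive
```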
4.3. Test for the homogeneity of covariance matrices
As in Section 2.3, let x_{i1}, ..., x_{iN_i} (i = 1, 2, ..., q) be distributed independently as multivariate normal with mean vector μ_i and covariance matrix Σ_i. We are interested in testing the hypothesis H₆ where

 H₆: Σ₁ = ⋯ = Σ_q.   (4.10)
Wilks (1932) derived the likelihood ratio test for H₆. The modified likelihood ratio statistic (obtained by interchanging N_i with n_i = N_i − 1 in the likelihood ratio statistic) is

 λ₆ = [∏_{i=1}^q |A_ii|^{n_i/2}] n^{pn/2} / { |Σ_{i=1}^q A_ii|^{n/2} ∏_{i=1}^q n_i^{pn_i/2} }   (4.11)
where

 n = Σ_{i=1}^q n_i,  N_i x̄_i. = Σ_{j=1}^{N_i} x_ij  and  A_ii = Σ_{j=1}^{N_i} (x_ij − x̄_i.)(x_ij − x̄_i.)′.
Lee et al. (1977) gave values of c₆ for n_i = n₀ = (p + 1)(1)20(5)30, p = 2(1)6, q = 2(1)10 and α = 0.01, 0.05, 0.10, where c₆ is defined by

 P[−2 log λ₆ ⩽ c₆ | H₆] = 1 − α.   (4.12)

Values of c₆ for some values of the parameters are given in Table 11. We reject the null hypothesis at level α when the computed value of −2 log λ₆ is larger than c₆. For the parameter sets not given in the table, either the χ² or the F approximation due to Box (1949) can be used. The F approximation has been adopted by SPSS in Option Statistic 7 of the Subprogram Discriminant (Chapter 23, Discriminant Analysis).

(a) χ² approximation. The statistic χ² = (−2 log λ₆)(1 − D₁) is approximately distributed as χ² with f₁ = ½p(p + 1)(q − 1) degrees of freedom, where

 D₁ = (Σ_{j=1}^q 1/n_j − 1/n)(2p² + 3p − 1)/{6(p + 1)(q − 1)}
    = (2p² + 3p − 1)(q + 1)/{6(p + 1)qn₀}  if n_j = n₀ (j = 1, ..., q).
(b) F approximation. In order to use this approximation, we compute

 D₂ = {(p − 1)(p + 2)/6(q − 1)} (Σ_{j=1}^q 1/n_j² − 1/n²)
    = (p − 1)(p + 2)(q² + q + 1)/(6q²n₀²)  if n_j = n₀ (j = 1, ..., q).
We then check the sign of D₂ − D₁², where D₁ was given in (a). Case 1: D₂ − D₁² > 0. F = (−2 log λ₆)/b is distributed approximately as an F distribution with f₁ and f₂ degrees of freedom, where f₂ = (f₁ + 2)/(D₂ − D₁²) and b = f₁/(1 − D₁ − (f₁/f₂)). Case 2: D₂ − D₁² < 0. The statistic F = (−2f₂ log λ₆)/{f₁(b + 2 log λ₆)} is approximately distributed as the F distribution with f₁ and f₂ degrees of freedom, where f₂ = (f₁ + 2)/(D₁² − D₂) and b = f₂/(1 − D₁ + 2/f₂).
As an illustration we use Example 4.11 in Morrison* (1967, p. 153), where p = 2, q = 2, n₁ = n₂ = 31; from the two sample sums of squares and cross products matrices A₁ and A₂ given there, −2 log λ₆ = 2.82. Since c₆ is about 8, which is much larger than 2.82, the null hypothesis that the covariance matrices are homogeneous is accepted at the level of significance α = 0.05.
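A sketch of −2 log λ₆ from (4.11) in Python (the function name and matrices are ours; the matrices are hypothetical). As an algebraic sanity check, when every A_i/n_i is the same matrix the statistic is exactly zero.

```python
import numpy as np

def minus2log_lambda6(A_list, n_list):
    """-2 log lambda_6 from (4.11), the modified LR statistic for H6."""
    n = sum(n_list)
    p = A_list[0].shape[0]
    _, logdet_pooled = np.linalg.slogdet(sum(A_list))
    log_lam = 0.5 * p * n * np.log(n) - 0.5 * n * logdet_pooled
    for Ai, ni in zip(A_list, n_list):
        _, logdet = np.linalg.slogdet(np.asarray(Ai, float))
        log_lam += 0.5 * ni * logdet - 0.5 * p * ni * np.log(ni)
    return -2 * log_lam

V = np.array([[2.0, 0.5],
              [0.5, 1.5]])        # hypothetical common covariance
W = np.array([[4.0, 0.0],
              [0.0, 0.5]])        # a clearly different covariance
stat_equal = minus2log_lambda6([31 * V, 31 * V], [31, 31])   # 0 up to rounding
stat_diff = minus2log_lambda6([31 * V, 31 * W], [31, 31])    # positive
```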
4.4. Test for the multiple homogeneity of covariance matrices
When the null hypothesis of the homogeneity of q covariance matrices is rejected, the multiple homogeneity of covariance matrices considered in this section should be the next hypothesis to test concerning the covariance matrices. This hypothesis is also of interest in studying certain linear structures on the covariance matrices (see Krishnaiah and Lee, 1976). Here we will use the notation of Section 4.3. The null hypothesis to be tested is

 H₇: Σ₁ = ⋯ = Σ_{q₁*},  Σ_{q₁*+1} = ⋯ = Σ_{q₂*},  ...,  Σ_{q*_{k−1}+1} = ⋯ = Σ_{q_k*}   (4.13)

where q₀* = 0, q_i* = Σ_{j=1}^i q_j and q_k* = q. If the null hypothesis is true, then among the covariance matrices of the q independent populations there are k groups of homogeneous covariance matrices, and q_i denotes the number of populations in the ith group. The modified likelihood ratio statistic is
 λ₇ = ∏_{i=1}^q |A_ii/n_i|^{n_i/2} / ∏_{j=1}^k | Σ_{i=q*_{j−1}+1}^{q_j*} A_ii / n_j* |^{n_j*/2}   (4.14)

where n_i and A_ii were defined in Section 4.3, and n_j* = Σ_{i=q*_{j−1}+1}^{q_j*} n_i.
*From Multivariate Statistical Methods by D. F. Morrison. Copyright © 1967 by McGraw-Hill, Inc. Used with permission of the publisher.
Lee et al. (1977) gave values of c₇ for n_i = n₀; M = n₀ − p = 1(1)20(5)30; p = 1, 2, 3, 4; q = dk; k = 2, 3; and α = 0.05, where c₇ is defined by

 P[−2 log λ₇ ⩽ c₇ | H₇] = 1 − α.   (4.15)

These values are given in Table 12 for α = 0.05. We reject the null hypothesis at level α when the computed value of −2 log λ₇ is larger than c₇. Because of the restriction q = dk, with k and q specified, d is automatically specified. Thus, in the table, for example, if k = 2 and q = 6, then d = 3. This means that there are two groups of covariance matrices, each with three homogeneous covariance matrices. For the parameter sets not given in the table, the simple χ² approximation can be used. We approximate −2 log λ₇* as a χ² distribution with f = ½p(p + 1)(q − k) degrees of freedom, where λ₇* is obtained from λ₇ by replacing n_i by N_i and n_j* by N_j* = Σ_{i=q*_{j−1}+1}^{q_j*} N_i.
As an illustration we give an artificial example where p = 2, n₀ = n₁ = n₂ = n₃ = n₄ = 31, k = 2, d = 2, q = 4, A₁ and A₂ are as given in the previous section, and

 A₃ = 31 ( 2.11   3.12        A₄ = 31 ( 4.01   2.44
           3.12  18.88 ),              2.44  10.55 ).

Substituting into (4.14) we have −2 log λ₇ = 3.724. Since M = 31 − 2 = 29, we have c₇ = 13.05 for α = 0.05. Thus, the null hypothesis that Σ₁ = Σ₂, Σ₃ = Σ₄ is accepted.
5. Tests on mean vectors and covariance matrices simultaneously

In this section we shall cover tests concerning the mean vector and covariance matrix of a single population, as well as those of q independent populations.

5.1. Test specifying the mean vector and covariance matrix
Let x₁, ..., x_N be a random sample from a p-variate normal distribution with mean vector μ and covariance matrix Σ. The null hypothesis to be tested is

 H₈: μ = μ₀,  Σ = Σ₀   (5.1)

where μ₀ and Σ₀ are known. When the null hypothesis is rejected, then either μ ≠ μ₀, or Σ ≠ Σ₀, or μ ≠ μ₀ and Σ ≠ Σ₀.
The null hypothesis is equivalent to the canonical form H₈: μ = 0, Σ = I, as we can transform x₁, ..., x_N to y₁, ..., y_N by y_j = G(x_j − μ₀), where G is a matrix such that GΣ₀G′ = I. Thus, the null hypothesis (5.1) is equivalent to the hypothesis that we have a set of p independent random variables each with zero mean and unit variance. We, therefore, have complete knowledge of the population under consideration. The likelihood ratio statistic for testing the null hypothesis H₈ is known (see Anderson (1958)) to be

 −2 log λ₈ = N log|Σ₀| + N tr Σ₀⁻¹[S + (x̄ − μ₀)(x̄ − μ₀)′] − N log|S| − Np   (5.2)

where

 NS = Σ_{j=1}^N (x_j − x̄.)(x_j − x̄.)′,  N x̄. = Σ_{j=1}^N x_j.
Nagarsenker and Pillai (1974) gave values of c₈ for p = 2(1)6, α = 0.01, 0.05, where c₈ is defined by

 P[−2 log λ₈ ⩽ c₈ | H₈] = 1 − α.   (5.3)
These values are given in Table 13. Chang et al. (1977) approximated the distribution of λ₈^{1/b} with Pearson's Type I distribution, where b is a properly chosen constant. Table 6 gives an idea about the accuracy of the above approximation. The values under the column CKL are the values obtained by using the approximation of Chang, Krishnaiah and Lee, whereas the values under the column Exact are the values given by Nagarsenker and Pillai (1974). The following approximations were suggested by Korin and Stevens (1973).

Table 6
Comparison of CKL approximation with exact expression for α = 0.05

     p = 2                p = 3                p = 6
 N−1  CKL    Exact       N−1  CKL    Exact    N−1  CKL    Exact
 6    13.69  13.69       8    20.91  20.92    12   51.70  51.70
 7    13.26  13.27       9    20.38  20.38    13   50.48  50.48
 10   12.55  12.55       10   19.98  19.98    14   49.50  49.50
 11   12.40  12.40       15   18.85  18.85    15   48.69  48.69
 13   12.19  12.19       21   18.26  18.26    21   45.79  45.79
 21   11.75  11.75                            25   44.75  44.75
(a) χ² approximation. The statistic χ² = −2(1 − D_1) log λ_8 is approximated by a χ² distribution with f_1 = ½p(p + 3) degrees of freedom, where

D_1 = (2p² + 9p + 11)/(6N(p + 3)).

(b) F approximation. The statistic F = −2 log λ_8 / b is approximated by an F distribution with f_1 and f_2 degrees of freedom, where

f_2 = (f_1 + 2)/(D_2 − D_1²),   D_2 = (p³ + 6p² + 11p + 4)/(6N²(p + 3)),

and b = f_1/(1 − D_1 − (f_1/f_2)). In either approximation the null hypothesis is rejected if the computed value is too large, i.e., χ² > χ²_{f_1}(α) or F > F_{f_1,f_2}(α). As an illustration, we give an artificial example where N = 28, p = 2, x̄ = (0.5, 1.2)′ and
S = [ 1.5  0.5
      0.5  0.9 ].

Suppose we want to test the null hypothesis

H_8: μ = 0,  Σ = I

at level of significance α = 0.05. Substituting into (5.2) we obtain −2 log λ_8 = 55.85, which is much larger than c_8 = 11.591, and hence the null hypothesis that μ = 0 and Σ = I is rejected at level of significance α = 0.05.
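As a check on this illustration, (5.2) is easy to evaluate numerically. A minimal sketch in Python/NumPy (the variable names are ours, not the chapter's):

```python
import numpy as np

# Artificial example of Section 5.1: N = 28, p = 2, H_8: mu = 0, Sigma = I
N, p = 28, 2
xbar = np.array([0.5, 1.2])
S = np.array([[1.5, 0.5],
              [0.5, 0.9]])
mu0 = np.zeros(p)
Sigma0 = np.eye(p)

# (5.2): -2 log lambda_8 = N log|Sigma_0|
#        + N tr Sigma_0^{-1}[S + (xbar - mu_0)(xbar - mu_0)'] - N log|S| - Np
d = xbar - mu0
stat = (N * np.log(np.linalg.det(Sigma0))
        + N * np.trace(np.linalg.solve(Sigma0, S + np.outer(d, d)))
        - N * np.log(np.linalg.det(S))
        - N * p)
print(round(stat, 2))  # 55.85, well above c_8 = 11.591, so H_8 is rejected
```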
5.2. Test for the homogeneity of populations

In this section, we discuss the likelihood ratio test for H_9 where

H_9: μ_1 = ⋯ = μ_q,  Σ_1 = ⋯ = Σ_q,   (5.4)

and μ_i and Σ_i are the mean vector and covariance matrix of the ith population. In this section, we use the same notation as in Section 4.3. The likelihood ratio test statistic for testing H_9 was derived by Wilks (1932). The modified likelihood ratio test statistic (see Anderson, 1958) is given by

λ_9 = [ n^{pn/2} ∏_{i=1}^q |A_ii|^{n_i/2} ] / [ ∏_{i=1}^q n_i^{pn_i/2} | Σ_{i=1}^q A_ii + Σ_{i=1}^q N_i (x̄_i. − x̄..)(x̄_i. − x̄..)′ |^{n/2} ],   (5.5)

where N x̄.. = Σ_i Σ_j x_ij.
Chang et al. (1977) gave values of c_9 for n_i = n_0, M = n_0 − p = 1(1)20, 25, 30; p = 1(1)4; q = 2(1)8 and α = 0.05, where c_9 is defined by

P[−2 log λ_9 ≤ c_9 | H_9] = 1 − α.   (5.6)

We reject the null hypothesis at level α when the computed value of −2 log λ_9 is larger than c_9. Table 14 gives the values of c_9 computed by Chang et al. (1977). For the parameter sets not given in the table, we can use Box's asymptotic expansion

P[−2ρ log λ_9 ≤ z] = P[χ²_f ≤ z] + ω_2 {P[χ²_{f+4} ≤ z] − P[χ²_f ≤ z]} + O(n^{−3}),   (5.7)

where χ²_f denotes the χ² random variable with f degrees of freedom, f = ½(q − 1)p(p + 3),

ρ = 1 − ( Σ_{g=1}^q 1/n_g − 1/n ) (2p² + 9p + 11) / (6(q − 1)(p + 3)),   (5.8)

ω_2 = p(p + 3)/(48ρ²) [ (p + 1)(p + 2)( Σ_{g=1}^q 1/n_g² − 1/n² ) − 6(q − 1)(1 − ρ)² ].   (5.9)

The null hypothesis is rejected at the 100α% level if the computed value of −2ρ log λ_9 (used as z in (5.7)) yields a probability greater than 1 − α. Of course, a rougher approximation is to treat −2ρ log λ_9 as a χ² variable with f degrees of freedom, with the error of the remaining terms of order n^{−2}. The null hypothesis is then rejected if −2ρ log λ_9 > χ²_f(α).
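The degrees of freedom and the rough χ² cut-off are easy to compute; a small sketch (SciPy assumed) for the case p = 2, q = 2 used in the illustration of this subsection:

```python
from scipy.stats import chi2

p, q = 2, 2
f = (q - 1) * p * (p + 3) / 2      # (1/2)(q - 1) p (p + 3) = 5 degrees of freedom
crit = chi2.ppf(0.95, f)           # upper 5% point of chi-square with 5 d.f.
print(f, round(crit, 2))           # 5.0 11.07
```

Note that the exact small-sample point c_9 = 12.01 from Table 14 is noticeably larger than χ²_5(0.05) ≈ 11.07, so for small n_0 the bare χ² cut-off rejects somewhat too often.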
For illustration purposes we give an artificial example where p = 2, q = 2, N_1 = N_2 = 11, x̄_1. = (0.5, 1.2)′, x̄_2. = (0.7, 1.4)′,

(N_1 − 1)^{−1} A_11 = [ 1.5  0.5      (N_2 − 1)^{−1} A_22 = [ 1.7  0.8
                        0.5  0.9 ],                           0.8  1.5 ].

Suppose we want to test the null hypothesis

H_9: μ_1 = μ_2,  Σ_1 = Σ_2

at level of significance α = 0.05. Substituting the values of x̄_1., x̄_2. and A_ii into (5.5) we obtain −2 log λ_9 = 0.8711, which is much smaller than c_9 = 12.01, and hence the null hypothesis that μ_1 = μ_2 and Σ_1 = Σ_2 is accepted at level of significance α = 0.05.
6. Test for equality of means, variances and covariances

Let μ′ = (μ_1, ..., μ_p) and Σ = (σ_ij) respectively denote the mean vector and covariance matrix of a p-variate normal population. Also, let

H_1′: μ_1 = ⋯ = μ_p,
H_2′: σ_11 = ⋯ = σ_pp,
H_3′: σ_ij = σ_12   (i ≠ j).
Likelihood ratio test criteria for H_2′ ∩ H_3′ and H_1′ ∩ H_2′ ∩ H_3′ were derived by Wilks (1946). The likelihood ratio test statistics for H_2′ ∩ H_3′ and H_1′ ∩ H_2′ ∩ H_3′ are given by λ_10 and λ_11 respectively, where

λ_10 = |S| / { [s²(1 − r)]^{p−1} s²[1 + (p − 1)r] },   (6.1)

λ_11 = (p − 1)^{p−1} |S| / { [(p − 1)s²(1 − r) + Σ_{i=1}^p (x̄_i. − x̄..)²]^{p−1} s²[1 + (p − 1)r] },   (6.2)

NS = A, and A was defined earlier in Section 2.2. Also, ps² = tr S, p x̄.. = Σ_{i=1}^p x̄_i., x̄ = (x̄_1., ..., x̄_p.)′ and p(p − 1)rs² = Σ_{i≠j} s_ij. Wilks (1946) obtained the exact distribution of λ_10 and λ_11 for p = 2, 3. Varma (1951) obtained the exact distribution of λ_10 and λ_11 for some other values of p and also computed a few tables. Using methods similar to those used by Box
(1949) and Nair (1938), Nagarsenker (1975a, b) derived the exact distributions of λ_10 and λ_11. He also constructed percentage points of λ_10 and λ_11. Tables 15 and 16 give the values of c_10 and c_11 respectively, where

P[λ_10 ≥ c_10 | H_2′ ∩ H_3′] = 1 − α,   P[λ_11 ≥ c_11 | H_1′ ∩ H_2′ ∩ H_3′] = 1 − α.
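The criterion λ_10 in (6.1) depends on S only through |S|, s² and r; a minimal sketch (the helper name is ours):

```python
import numpy as np

def lambda_10(S):
    """Wilks' criterion (6.1): |S| / {[s^2(1-r)]^(p-1) s^2 [1+(p-1)r]}."""
    p = S.shape[0]
    s2 = np.trace(S) / p                               # p s^2 = tr S
    r = (S.sum() - np.trace(S)) / (p * (p - 1) * s2)   # p(p-1) r s^2 = sum_{i!=j} s_ij
    return np.linalg.det(S) / ((s2 * (1 - r)) ** (p - 1) * s2 * (1 + (p - 1) * r))

# Sanity check: if S already has equal variances and equal covariances,
# the criterion attains its maximum value 1.
p = 4
S = 2.0 * ((1 - 0.3) * np.eye(p) + 0.3 * np.ones((p, p)))
print(round(lambda_10(S), 6))  # 1.0
```

Small values of λ_10 are evidence against H_2′ ∩ H_3′, which is why the percentage points c_10 are lower-tail points.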
7. Test for compound symmetry
Let x′ = (x′_1, ..., x′_{b+2}) be distributed as a t-variate normal with mean vector μ′ and covariance matrix Σ. Also, let E(x_i) = μ_i, E{(x_i − μ_i)(x_j − μ_j)′} = Σ_ij. In addition, let x_i be of order 1 × 1 for i ≤ b and let x_{b+a} be of order p_a × 1 for a = 1, 2. Let H_12 denote the hypothesis that the variances are equal and the covariances are equal within any set of variables, and that the covariances of any two variables belonging to any two distinct sets are equal. Votaw (1948) derived the likelihood ratio test for H_12 and it is given by λ_12 where

λ_12 = |V| / |Ṽ|.   (7.1)

In the above equation, V = (v_ij), v_ij = Σ_{α=1}^N (x_iα − x̄_i.)(x_jα − x̄_j.), where (x′_1α, ..., x′_{b+2,α}), (α = 1, ..., N), are independent observations on x′. Also N x̄_i. = Σ_{α=1}^N x_iα, and Ṽ = (ṽ_ij), where

ṽ_ss′ = v_ss′,   ṽ_{s i_a} = p_a^{−1} Σ_{j_a} v_{s j_a},
ṽ_{i_a i_a} = p_a^{−1} Σ_{j_a} v_{j_a j_a},
ṽ_{i_a l_a} = [p_a(p_a − 1)]^{−1} Σ_{j_a ≠ m_a} v_{j_a m_a}   (i_a ≠ l_a),
ṽ_{i_a m_{a′}} = (p_a p_{a′})^{−1} Σ_{j_a} Σ_{m_{a′}} v_{j_a m_{a′}},

for s, s′ = 1, ..., b; a, a′ = 1, 2, a ≠ a′. Also, i_a, l_a, j_a, m_a = b + p̄_a + 1, ..., b + p̄_a + p_a, where p̄_a = p_1 + ⋯ + p_{a−1} and p̄_1 = 0. Now, let

P[λ_12 ≥ c_12 | H_12] = 1 − α.   (7.2)

Lee et al. (1976) approximated the distribution of λ_12^{1/b} with Pearson's Type I distribution, where b is a properly chosen constant. The accuracy of this approximation is good. Using it, they computed the values of c_12 for some values of the parameters; these are given at the end of this chapter in Table 17.
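The block-averaging that produces Ṽ can be written out mechanically. A sketch for b single variates followed by two sets of sizes p_1, p_2 (function and test data are ours, for illustration only):

```python
import numpy as np

def v_tilde(V, b, sizes):
    """Average V into the pattern of H_12: common variance, common
    within-set covariance, and common between-set covariance."""
    sets, start = [], b
    for pa in sizes:
        sets.append(list(range(start, start + pa)))
        start += pa
    W = V.astype(float).copy()
    for a, Ja in enumerate(sets):
        pa = len(Ja)
        for s in range(b):                      # single variate vs. set a
            cm = np.mean([V[s, j] for j in Ja])
            for j in Ja:
                W[s, j] = W[j, s] = cm
        dm = np.mean([V[j, j] for j in Ja])     # common variance
        om = (np.mean([V[j, m] for j in Ja for m in Ja if j != m])
              if pa > 1 else 0.0)               # common within-set covariance
        for j in Ja:
            for m in Ja:
                W[j, m] = dm if j == m else om
        for Jb in sets[a + 1:]:                 # common between-set covariance
            bm = np.mean([V[j, m] for j in Ja for m in Jb])
            for j in Ja:
                for m in Jb:
                    W[j, m] = W[m, j] = bm
    return W

# If V already has the symmetric pattern, V~ = V and lambda_12 = |V|/|V~| = 1.
V = np.array([[3.0, 1.0, 1.0, 0.8, 0.8],
              [1.0, 2.0, 0.5, 0.3, 0.3],
              [1.0, 0.5, 2.0, 0.3, 0.3],
              [0.8, 0.3, 0.3, 2.5, 0.7],
              [0.8, 0.3, 0.3, 0.7, 2.5]])
lam12 = np.linalg.det(V) / np.linalg.det(v_tilde(V, b=1, sizes=(2, 2)))
print(round(lam12, 6))  # 1.0
```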
8. Tests on linear structures of covariance matrices
In the preceding sections, we discussed likelihood ratio tests on some special structures of the covariance matrices. In this section, we discuss tests of hypotheses on more general structures. Let x′ = (x_1, ..., x_p) be distributed as a multivariate normal with mean vector μ and covariance matrix Σ. It is of interest to test the hypothesis H_13 where

H_13: Σ = σ_1²G_1 + ⋯ + σ_k²G_k.   (8.1)

In Eq. (8.1), σ_1², ..., σ_k² are unknown and G_1, ..., G_k are known matrices of order p × p which are at least positive semidefinite. Bock and Bargmann (1966) and Srivastava (1966) considered the problem of testing H_13 when G_1, ..., G_k commute with each other. Anderson (1969) derived the likelihood ratio test for H_13 when G_1, ..., G_k do not necessarily commute with each other. We will now consider more general models. Let x′ = (x′_1, ..., x′_p) be distributed as a multivariate normal with mean vector μ′ = (μ′_1, ..., μ′_p) and covariance matrix Σ = (Σ_ij). Here x_i is of order q × 1, E(x_i) = μ_i and E[(x_i − μ_i)(x_j − μ_j)′] = Σ_ij. We wish to test the hypothesis H_14 where

H_14: Σ = G_1⊗Σ_1 + ⋯ + G_k⊗Σ_k.   (8.2)
In the above equation ⊗ denotes the Kronecker product, G_1, ..., G_k are known matrices of order p × p, and Σ_1, ..., Σ_k are unknown matrices of order q × q. We will now discuss the procedures proposed by Krishnaiah and Lee (1976) for testing H_14. We first assume that G_1, ..., G_k commute with each other. Then there exists an orthogonal matrix Γ: p × p such that ΓG_iΓ′ = diag(λ_i1, ..., λ_ip). Now, let y = (Γ⊗I_q)x. Then y is distributed as multivariate normal with mean vector μ* = (Γ⊗I_q)μ and covariance matrix Σ* = (Σ*_ij), where

Σ* = (Γ⊗I_q)Σ(Γ′⊗I_q).   (8.3)

When H_14 is true, Σ* = diag(Σ*_11, ..., Σ*_pp), where Σ*_jj = λ_1jΣ_1 + ⋯ + λ_kjΣ_k. If k = p, the hypothesis H_14 is equivalent to

H_15: Σ*_ij = 0   (i ≠ j = 1, 2, ..., p).

This is equivalent to the problem of testing for multiple independence of
several sets of variables when their joint distribution is multivariate normal. So, we can test the hypothesis H_14 by using the test procedure in Section 3. Next, consider the problem of testing the hypothesis H_14 when k < p. Arrange the coefficients λ_ij in the pq × kq matrix

A = [ λ_11 I_q   λ_21 I_q   ⋯   λ_k1 I_q
      λ_12 I_q   λ_22 I_q   ⋯   λ_k2 I_q
         ⋮           ⋮                ⋮
      λ_1p I_q   λ_2p I_q   ⋯   λ_kp I_q ].   (8.4)

Then

[ Σ*_11 ]        [ Σ_1 ]
[ Σ*_22 ]  =  A  [ Σ_2 ]
[   ⋮   ]        [  ⋮  ]
[ Σ*_pp ]        [ Σ_k ].   (8.5)

Let E: (p − k)q × pq be of rank (p − k)q with EA = 0. Then the hypothesis H_14, under the above assumption, is equivalent to H_15 ∩ H_16, where H_15 was defined earlier and

         [ Σ*_11 ]
H_16:  E [   ⋮   ]  =  0.
         [ Σ*_pp ]
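The transformation argument used here is easy to see numerically. A sketch for the commuting case of (8.2) with G_1 = I and G_2 = J − I (the block intraclass structure treated later in this section); the matrices Σ_1, Σ_2 are arbitrary illustrative choices of ours:

```python
import numpy as np

p, q = 3, 2
G1 = np.eye(p)
G2 = np.ones((p, p)) - np.eye(p)               # J - I commutes with I

Sigma1 = np.array([[2.0, 0.3], [0.3, 1.0]])    # illustrative Sigma_i
Sigma2 = np.array([[0.5, 0.1], [0.1, 0.4]])
Sigma = np.kron(G1, Sigma1) + np.kron(G2, Sigma2)   # H_14 holds by construction

# Gamma G_i Gamma' is diagonal for both i: take the eigenvectors of G2
_, V = np.linalg.eigh(G2)
Gamma = V.T
Sigma_star = np.kron(Gamma, np.eye(q)) @ Sigma @ np.kron(Gamma.T, np.eye(q))

# Under H_14 every off-diagonal q x q block of Sigma* vanishes
off = max(np.abs(Sigma_star[i*q:(i+1)*q, j*q:(j+1)*q]).max()
          for i in range(p) for j in range(p) if i != j)
print(off < 1e-10)  # True
```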
In particular, if

λ_i1 = ⋯ = λ_{ip_1},   λ_{i,p_1+1} = ⋯ = λ_{i,p_1+p_2},   ...,   (8.6)

for i = 1, 2, ..., k, then

H_16: Σ*_11 = ⋯ = Σ*_{p_1 p_1};   Σ*_{p_1+1, p_1+1} = ⋯ = Σ*_{p_1+p_2, p_1+p_2};   ....
The problem of testing H_16 is then equivalent to testing for multiple homogeneity of the covariance matrices of several multivariate normal populations, so the procedure discussed in Section 4.4 can be used to test this hypothesis. We will now consider some special cases of the structure H_14. Let

Σ = [ Σ_1  Σ_2  ⋯  Σ_2
      Σ_2  Σ_1  ⋯  Σ_2
       ⋮    ⋮        ⋮
      Σ_2  Σ_2  ⋯  Σ_1 ].   (8.7)

If the covariance matrix is of the above structure, it is called the block version of the intraclass correlation model. The structure is seen to be a special case of (8.2) if we let G_1 = I and G_2 = J − I, where J is a matrix with all its elements equal to unity. When q = 1, Σ is a covariance matrix with all its diagonal elements equal and all its off-diagonal elements equal. Next, let Σ have the structure of the form
Σ = [ Σ_1  Σ_2  Σ_3  ⋯  Σ_p
      Σ_p  Σ_1  Σ_2  ⋯  Σ_{p−1}
       ⋮                      ⋮
      Σ_2  Σ_3  Σ_4  ⋯  Σ_1 ],   (8.8)

where Σ_j = Σ_{p−j+2}. Also, let W_0 = I_p and

W_j = [ 0    I_{p−j}
        I_j  0      ]   for j = 1, ..., p − 1.

The matrix (8.8) can be expressed as

Σ = W_0⊗Σ_1 + (W_1 + W_1′)⊗Σ_2 + ⋯.   (8.9)

Since the matrices W_0, (W_1 + W_1′), ... commute with each other, the structure given by (8.8) is of the same form as (8.2). The structure (8.8) is known as the block version of circular symmetry. Olkin (1973) considered the problem of testing the hypotheses that Σ is of the structure (8.7) and of the structure (8.8). Now, let (x_1j, ..., x_pj), (j = 1, ..., N), be a random sample of size N drawn from a multivariate normal population with mean vector μ and covariance matrix Σ. We wish to test the hypothesis H_14 where G_1, ..., G_k do not
necessarily commute with each other. The likelihood ratio test for H_14 is given by

λ = |C|^{N/2} / | Σ_{i=1}^k G_i⊗Σ̂_i |^{N/2},   (8.10)

where the maximum likelihood estimates Σ̂_i of Σ_i are given by

tr[ (Σ_{i=1}^k G_i⊗Σ̂_i)^{−1} (G_j⊗Ω_ab) ] = tr[ (Σ_{i=1}^k G_i⊗Σ̂_i)^{−1} C (Σ_{i=1}^k G_i⊗Σ̂_i)^{−1} (G_j⊗Ω_ab) ]   (8.11)

for a, b = 1, ..., q and j = 1, 2, ..., k, where Ω_ab is a q × q matrix with 1 in the (a, b)th position and zeros elsewhere. Also, NC = Σ_{j=1}^N (x_j − x̄.)(x_j − x̄.)′ and N x̄. = Σ_{j=1}^N x_j. The statistic −2 log λ is distributed asymptotically as χ² with ν degrees of freedom, where ν = ½pq(pq + 1) − ½kq(q + 1). We will now discuss the Newton–Raphson method of solving Eq. (8.11) iteratively. Let S_1, ..., S_k be the initial approximations to Σ_1, ..., Σ_k and let Σ̂_i = S_i + R_i for i = 1, 2, ..., k. Then
( Σ_{i=1}^k G_i⊗Σ̂_i )^{−1} = ( Σ_{i=1}^k G_i⊗S_i + Σ_{i=1}^k G_i⊗R_i )^{−1}
  = [ I + (Σ_{i=1}^k G_i⊗S_i)^{−1}(Σ_{i=1}^k G_i⊗R_i) ]^{−1} (Σ_{i=1}^k G_i⊗S_i)^{−1}
  = (Σ_{i=1}^k G_i⊗S_i)^{−1} − (Σ_{i=1}^k G_i⊗S_i)^{−1}(Σ_{i=1}^k G_i⊗R_i)(Σ_{i=1}^k G_i⊗S_i)^{−1} + ⋯.   (8.12)
Using Eq. (8.12) in Eq. (8.11) and expanding the left side of Eq. (8.11) into terms linear in R_1, ..., R_k, the following equations are obtained:

tr{ Σ̂_0^{−1}(Σ_{i=1}^k G_i⊗R_i)Σ̂_0^{−1} [ CΣ̂_0^{−1}(G_j⊗Ω_ab) + (G_j⊗Ω_ab)Σ̂_0^{−1}C − (G_j⊗Ω_ab) ] } = tr{ Σ̂_0^{−1}[ CΣ̂_0^{−1} − I ](G_j⊗Ω_ab) }   (8.13)

for j = 1, ..., k, where Σ̂_0 = Σ_{i=1}^k G_i⊗S_i. The above equation can be used to solve for R_1, ..., R_k. Let R_11, ..., R_k1 be the values of R_1, ..., R_k obtained by solving Eq. (8.13). Then S_i1 = S_i + R_i1 is a second approximation to the maximum likelihood estimate of Σ_i, and Σ̂_01 = Σ_{i=1}^k G_i⊗S_i1 is a second approximation to the maximum likelihood estimate of Σ. Let R_i2 be the value of R_i obtained by solving Eq. (8.13) after replacing Σ̂_0 with Σ̂_01. Then S_i2 = S_i1 + R_i2 is the third approximation to Σ_i. We can continue this procedure to the desired degree of accuracy. For the initial approximations of Σ_1, ..., Σ_k, we can use their unbiased estimates. The unbiased estimates can be obtained from

E{ tr C(G_g⊗Ω_ab) } = ((N − 1)/N) tr[ (Σ_{i=1}^k G_i⊗Σ_i)(G_g⊗Ω_ab) ],   g = 1, ..., k;  a, b = 1, ..., q.   (8.14)
When q = 1, the above procedure for testing H_14 when G_1, ..., G_k are not necessarily commutative is equivalent to the procedure of Anderson (1969). Next, consider the problem of testing the hypothesis H_17 where

H_17: Σ = U_1Σ_1U_1′ + ⋯ + U_kΣ_kU_k′   (8.15)

and the U_i (i = 1, ..., k) are known matrices of order p × s_i. Also, p(p + 1) ≥ Σ_{i=1}^k s_i(s_i + 1) and Σ_1, ..., Σ_k are unknown matrices. The likelihood ratio test statistic for H_17 is known (see Krishnaiah and Lee (1976)) to be

λ* = |C|^{N/2} / | Σ_{i=1}^k U_iΣ̂_iU_i′ |^{N/2},   (8.16)

where C was defined earlier and the Σ̂_i are the maximum likelihood estimates of the Σ_i given by

U_g′( Σ_{i=1}^k U_iΣ̂_iU_i′ )^{−1}U_g = U_g′( Σ_{i=1}^k U_iΣ̂_iU_i′ )^{−1} C ( Σ_{i=1}^k U_iΣ̂_iU_i′ )^{−1}U_g,   g = 1, ..., k.   (8.17)
The statistic −2 log λ* is distributed asymptotically as χ² with ½p(p + 1) − ½Σ_{i=1}^k s_i(s_i + 1) degrees of freedom. When s_i = 1, the above test statistic λ* was given in Anderson (1969). Consider the following multivariate components of variance model:

y = U_1α_1 + U_2α_2 + ⋯ + U_{k−1}α_{k−1} + α_k,   (8.18)

where U_i: p × s_i (i = 1, ..., k − 1) are known. Also, α_1, ..., α_k are distributed independently as multivariate normal with mean vector 0 and covariance matrices given by E(α_iα_i′) = Σ_i. Then, the covariance matrix Σ of y is of the structure given by (8.15). The problem of testing the hypothesis Σ_1 = ⋯ = Σ_{k−1} = 0 is equivalent to the problem of testing the hypothesis that Σ is of the structure U_1Σ_1U_1′ + ⋯ + U_{k−1}Σ_{k−1}U_{k−1}′ under the model (8.18). Rao (1971) and Sinha and Wieand (1977) considered the problem of estimating Σ_g using the MINQUE procedure. For various details of the MINQUE procedure, the reader is referred to the chapter by Rao in this volume. Next consider the model

x = V_1⊗β_1 + ⋯ + V_{k−1}⊗β_{k−1} + I_p⊗β_k,

where V_1, ..., V_{k−1} are known matrices of order p × p. Also, we assume that β_1, ..., β_k are distributed independently as q-variate normal with mean vector 0 and unknown covariance matrices E(β_iβ_i′) = Ω_i (i = 1, ..., k − 1). The covariance matrix of x in this case is of the structure (8.2).
Table 7*
Percentage points of the distribution of the determinant of the multivariate beta matrix

Entries in each row correspond to M = 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 14, 16, 18, 20, 24, 30, 40, 60, 120, ∞, followed by χ²_pq(α).

p = 3, q = 4
  α = 0.050: 1.422 1.174 1.099 1.065 1.046 1.035 1.027 1.022 1.018 1.015 1.011 1.008 1.007 1.005 1.004 1.003 1.002 1.001 1.001 1.000 1.000 | 21.0261
  α = 0.010: 1.514 1.207 1.116 1.076 1.054 1.040 1.031 1.025 1.021 1.017 1.012 1.010 1.007 1.006 1.005 1.004 1.002 1.001 1.001 1.000 1.000 | 26.2170
p = 4, q = 4
  α = 0.050: 1.451 1.194 1.114 1.076 1.055 1.042 1.033 1.027 1.022 1.018 1.014 1.010 1.008 1.007 1.006 1.004 1.003 1.002 1.001 1.000 1.000 | 26.2962
  α = 0.010: 1.550 1.229 1.132 1.088 1.063 1.048 1.037 1.030 1.025 1.021 1.015 1.012 1.009 1.008 1.006 1.004 1.003 1.002 1.001 1.000 1.000 | 31.9999
p = 5, q = 4
  α = 0.050: 1.483 1.216 1.130 1.089 1.065 1.050 1.040 1.032 1.027 1.023 1.017 1.013 1.010 1.008 1.007 1.005 1.003 1.002 1.001 1.000 1.000 | 31.4104
  α = 0.010: 1.589 1.253 1.150 1.101 1.074 1.056 1.044 1.036 1.030 1.025 1.019 1.014 1.012 1.009 1.008 1.006 1.004 1.002 1.001 1.000 1.000 | 37.5662
p = 6, q = 4
  α = 0.050: 1.517 1.240 1.148 1.102 1.076 1.059 1.047 1.038 1.032 1.027 1.020 1.016 1.013 1.010 1.009 1.006 1.004 1.002 1.001 1.000 1.000 | 36.4151
  α = 0.010: 1.628 1.279 1.168 1.115 1.085 1.066 1.052 1.043 1.036 1.030 1.023 1.018 1.014 1.012 1.010 1.007 1.005 1.003 1.001 1.000 1.000 | 42.9798
Table 7 (continued)
Entries in each row correspond to M = 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 14, 16, 18, 20, 24, 30, 40, 60, 120, ∞, followed by χ²_pq(α).

p = 7, q = 4
  α = 0.050: 1.550 1.263 1.165 1.116 1.087 1.068 1.055 1.045 1.038 1.032 1.024 1.019 1.015 1.013 1.011 1.008 1.005 1.003 1.001 1.000 1.000 | 41.3372
  α = 0.010: 1.667 1.305 1.188 1.130 1.097 1.076 1.061 1.050 1.042 1.036 1.027 1.021 1.017 1.014 1.012 1.008 1.006 1.003 1.002 1.000 1.000 | 48.2782
p = 8, q = 4
  α = 0.050: 1.583 1.286 1.183 1.130 1.099 1.078 1.063 1.052 1.044 1.038 1.029 1.023 1.018 1.015 1.013 1.009 1.006 1.004 1.002 1.000 1.000 | 46.1943
  α = 0.010: 1.704 1.330 1.207 1.146 1.109 1.086 1.070 1.058 1.048 1.041 1.031 1.025 1.020 1.016 1.014 1.010 1.007 1.004 1.002 1.001 1.000 | 53.4858
p = 9, q = 4
  α = 0.050: 1.614 1.309 1.201 1.144 1.110 1.088 1.071 1.060 1.050 1.043 1.033 1.026 1.021 1.018 1.015 1.011 1.007 1.004 1.002 1.001 1.000 | 50.9985
  α = 0.010: 1.740 1.355 1.226 1.161 1.122 1.096 1.078 1.065 1.055 1.047 1.036 1.029 1.023 1.019 1.016 1.012 1.008 1.005 1.002 1.001 1.000 | 58.6192
p = 10, q = 4
  α = 0.050: 1.644 1.331 1.218 1.159 1.122 1.097 1.080 1.067 1.057 1.049 1.038 1.030 1.024 1.020 1.017 1.013 1.009 1.005 1.003 1.001 1.000 | 55.7585
  α = 0.010: 1.774 1.379 1.244 1.176 1.134 1.107 1.088 1.073 1.062 1.054 1.041 1.033 1.026 1.022 1.019 1.014 1.009 1.006 1.003 1.001 1.000 | 63.6907
Table 7 (continued)
Entries in each row correspond to M = 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 14, 16, 18, 20, 24, 30, 40, 60, 120, ∞, followed by χ²_pq(α).

p = 3, q = 6
  α = 0.050: 1.535 1.241 1.145 1.099 1.072 1.056 1.044 1.036 1.030 1.025 1.019 1.014 1.012 1.009 1.008 1.006 1.004 1.002 1.001 1.000 1.000 | 28.8693
  α = 0.010: 1.649 1.282 1.167 1.113 1.082 1.063 1.050 1.041 1.034 1.028 1.021 1.016 1.013 1.011 1.009 1.006 1.004 1.002 1.001 1.000 1.000 | 34.8053
p = 5, q = 6
  α = 0.050: 1.514 1.245 1.154 1.108 1.081 1.063 1.051 1.042 1.035 1.030 1.023 1.018 1.014 1.012 1.010 1.007 1.005 1.003 1.001 1.000 1.000 | 43.7730
  α = 0.010: 1.625 1.284 1.175 1.121 1.090 1.070 1.056 1.046 1.039 1.033 1.025 1.019 1.016 1.013 1.011 1.008 1.005 1.003 1.001 1.000 1.000 | 50.8922
p = 6, q = 6
  α = 0.050: 1.520 1.255 1.163 1.116 1.088 1.069 1.056 1.046 1.039 1.034 1.025 1.020 1.016 1.013 1.011 1.008 1.006 1.003 1.002 1.000 1.000 | 50.9985
  α = 0.010: 1.631 1.294 1.183 1.129 1.097 1.076 1.061 1.051 1.043 1.037 1.028 1.022 1.018 1.014 1.012 1.009 1.006 1.004 1.002 1.000 1.000 | 58.6192
p = 7, q = 6
  α = 0.050: 1.530 1.266 1.173 1.124 1.095 1.075 1.062 1.051 1.043 1.037 1.029 1.023 1.018 1.015 1.013 1.009 1.006 1.004 1.002 1.000 1.000 | 58.1240
  α = 0.010: 1.642 1.306 1.194 1.138 1.105 1.083 1.067 1.056 1.047 1.041 1.031 1.024 1.020 1.016 1.014 1.010 1.007 1.004 1.002 1.001 1.000 | 66.2062
Table 7 (continued)
Entries in each row correspond to M = 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 14, 16, 18, 20, 24, 30, 40, 60, 120, ∞, followed by χ²_pq(α).

p = 8, q = 6
  α = 0.050: 1.543 1.279 1.184 1.134 1.103 1.082 1.068 1.057 1.048 1.042 1.032 1.025 1.021 1.017 1.014 1.011 1.007 1.004 1.002 1.001 1.000 | 65.1708
  α = 0.010: 1.656 1.319 1.205 1.148 1.113 1.090 1.074 1.062 1.052 1.045 1.035 1.027 1.022 1.018 1.015 1.011 1.008 1.005 1.002 1.001 1.000 | 73.6826
p = 9, q = 6
  α = 0.050: 1.558 1.293 1.196 1.144 1.112 1.090 1.074 1.062 1.053 1.046 1.035 1.028 1.023 1.019 1.016 1.012 1.008 1.005 1.002 1.001 1.000 | 72.1532
  α = 0.010: 1.671 1.333 1.218 1.158 1.122 1.098 1.080 1.067 1.058 1.050 1.037 1.031 1.024 1.020 1.017 1.013 1.009 1.005 1.003 1.001 1.000 | 81.0688
p = 10, q = 6
  α = 0.050: 1.573 1.307 1.208 1.154 1.120 1.097 1.081 1.068 1.059 1.051 1.040 1.032 1.026 1.021 1.018 1.014 1.010 1.006 1.003 1.001 1.000 | 79.0819
  α = 0.010: 1.687 1.348 1.230 1.169 1.131 1.106 1.087 1.074 1.063 1.055 1.042 1.034 1.027 1.023 1.019 1.015 1.010 1.006 1.003 1.001 1.000 | 88.3794
p = 3, q = 8
  α = 0.050: 1.632 1.302 1.190 1.133 1.100 1.078 1.063 1.052 1.043 1.037 1.028 1.022 1.018 1.014 1.012 1.009 1.006 1.004 1.002 1.000 1.000 | 36.4151
  α = 0.010: 1.763 1.350 1.216 1.150 1.112 1.087 1.070 1.058 1.048 1.041 1.031 1.024 1.019 1.016 1.013 1.010 1.007 1.004 1.002 1.000 1.000 | 42.9798
Table 7 (continued)
Entries in each row correspond to M = 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 14, 16, 18, 20, 24, 30, 40, 60, 120, ∞, followed by χ²_pq(α).

p = 5, q = 8
  α = 0.050: 1.556 1.280 1.182 1.131 1.100 1.079 1.065 1.054 1.046 1.039 1.030 1.024 1.019 1.016 1.013 1.010 1.007 1.004 1.002 1.001 1.000 | 55.7585
  α = 0.010: 1.672 1.321 1.200 1.145 1.110 1.087 1.071 1.059 1.050 1.043 1.033 1.026 1.021 1.017 1.015 1.011 1.007 1.004 1.002 1.001 1.000 | 63.6907
p = 7, q = 8
  α = 0.050: 1.538 1.282 1.189 1.139 1.108 1.086 1.071 1.060 1.051 1.044 1.034 1.027 1.022 1.018 1.016 1.012 1.008 1.005 1.002 1.001 1.000 | 74.4683
  α = 0.010: 1.648 1.321 1.210 1.152 1.117 1.094 1.077 1.065 1.055 1.048 1.037 1.028 1.023 1.020 1.017 1.012 1.009 1.005 1.003 1.001 1.000 | 83.5134
p = 8, q = 8
  α = 0.050: 1.538 1.288 1.195 1.144 1.113 1.091 1.076 1.064 1.055 1.048 1.038 1.030 1.025 1.021 1.017 1.013 1.009 1.005 1.003 1.001 1.000 | 83.6753
  α = 0.010: 1.646 1.326 1.215 1.158 1.123 1.099 1.082 1.069 1.059 1.051 1.040 1.031 1.026 1.022 1.018 1.014 1.009 1.006 1.003 1.001 1.000 | 93.2169
Table 7 (continued)
Entries in each row correspond to M = 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 14, 16, 18, 20, 24, 30, 40, 60, 120, ∞, followed by χ²_pq(α).

p = 3, q = 10
  α = 0.050: 1.716 1.359 1.232 1.167 1.127 1.101 1.082 1.068 1.058 1.050 1.038 1.030 1.024 1.020 1.017 1.012 1.009 1.005 1.002 1.001 1.000 | 43.7730
  α = 0.010: 1.862 1.413 1.262 1.187 1.141 1.112 1.091 1.075 1.064 1.055 1.042 1.033 1.027 1.022 1.019 1.014 1.009 1.006 1.003 1.001 1.000 | 50.8922
p = 5, q = 10
  α = 0.050: 1.600 1.315 1.211 1.155 1.120 1.097 1.080 1.067 1.057 1.050 1.038 1.031 1.025 1.021 1.018 1.013 1.009 1.005 1.003 1.001 1.000 | 67.5048
  α = 0.010: 1.721 1.359 1.235 1.171 1.131 1.105 1.087 1.073 1.062 1.054 1.041 1.033 1.027 1.022 1.019 1.014 1.010 1.006 1.003 1.002 1.000 | 76.1539
p = 7, q = 10
  α = 0.050: 1.557 1.303 1.208 1.155 1.122 1.099 1.083 1.070 1.060 1.053 1.042 1.034 1.028 1.023 1.019 1.014 1.010 1.006 1.003 1.001 1.000 | 90.5312
  α = 0.010: 1.666 1.342 1.229 1.169 1.132 1.107 1.089 1.075 1.065 1.056 1.044 1.036 1.029 1.024 1.020 1.015 1.011 1.007 1.003 1.001 1.000 | 100.4250
Table 7 (continued)
Entries in each row correspond to M = 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 14, 16, 18, 20, 24, 30, 40, 60, 120, ∞, followed by χ²_pq(α).

p = 9, q = 11
  α = 0.050: 1.542 1.311 1.219 1.168 1.134 1.111 1.093 1.080 1.070 1.061 1.048 1.039 1.033 1.028 1.024 1.018 1.013 1.008 1.004 1.001 1.000 | 123.2250
  α = 0.010: 1.650 1.348 1.240 1.182 1.144 1.118 1.100 1.085 1.074 1.065 1.051 1.042 1.035 1.029 1.025 1.019 1.013 1.008 1.004 1.001 1.000 | 134.6420
p = 9, q = 13
  α = 0.050: 1.554 1.326 1.234 1.182 1.147 1.122 1.104 1.090 1.078 1.069 1.055 1.045 1.038 1.032 1.028 1.021 1.015 1.009 1.005 1.001 1.000 | 143.2460
  α = 0.010: 1.659 1.363 1.255 1.195 1.157 1.130 1.110 1.095 1.082 1.073 1.058 1.047 1.040 1.034 1.029 1.022 1.016 1.010 1.005 1.001 1.000 | 155.4960
p = 9, q = 15
  α = 0.050: 1.568 1.343 1.250 1.196 1.160 1.134 1.115 1.099 1.087 1.077 1.062 1.051 1.043 1.037 1.032 1.024 1.017 1.011 1.006 1.002 1.000 | 163.1160
  α = 0.010: 1.673 1.380 1.271 1.210 1.170 1.142 1.121 1.105 1.092 1.081 1.065 1.054 1.045 1.038 1.033 1.026 1.018 1.012 1.006 1.002 1.000 | 176.1380
Table 7 (continued)
Entries in each row correspond to M = 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 14, 16, 18, 20, 24, 30, 40, 60, 120, ∞, followed by χ²_pq(α).

p = 9, q = 17
  α = 0.050: 1.585 1.360 1.266 1.210 1.173 1.146 1.126 1.110 1.096 1.086 1.070 1.058 1.049 1.042 1.036 1.028 1.020 1.013 1.007 1.002 1.000 | 182.8650
  α = 0.010: 1.690 1.398 1.287 1.225 1.184 1.154 1.132 1.115 1.102 1.090 1.073 1.060 1.051 1.043 1.038 1.029 1.021 1.013 1.007 1.002 1.000 | 196.6090
p = 11, q = 11
  α = 0.050: 1.536 1.316 1.226 1.175 1.142 1.118 1.100 1.086 1.075 1.067 1.053 1.044 1.036 1.031 1.027 1.020 1.014 1.009 1.005 1.001 1.000 | 147.6740
  α = 0.010: 1.638 1.351 1.246 1.189 1.152 1.125 1.106 1.092 1.080 1.070 1.056 1.046 1.038 1.031 1.028 1.021 1.015 1.010 1.005 1.001 1.000 | 160.1000
p = 11, q = 13
  α = 0.050: 1.538 1.324 1.236 1.185 1.151 1.127 1.108 1.094 1.082 1.073 1.059 1.048 1.041 1.035 1.030 1.023 1.017 1.011 1.005 1.002 1.000 | 171.9070
  α = 0.010: 1.637 1.359 1.256 1.198 1.161 1.134 1.114 1.099 1.087 1.077 1.061 1.051 1.042 1.036 1.031 1.024 1.017 1.011 1.005 1.002 1.000 | 185.2560

*The entries in this table are the values of c_1 where P[C_1 ≤ c_1] = 1 − α, M = n − p + 1 and C_1 = −{n − ½(p − q + 1)} log B_{p,q,n}/χ²_pq(α). For an explanation of the notations, the reader is referred to Section 2.1. The entries when both p and q are less than 11 are reproduced from Schatzoff (1966) with the kind permission of the Biometrika Trustees, whereas the remaining entries are reproduced from Lee et al. (1977) with the permission of the editor of the South African Statistical Journal.
Table 8*
Percentage points of the likelihood ratio statistic for multiple independence

Entries in each row correspond to M = 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 22, 24, 26, 28, 30.

α = 0.05
(1,3):  3.023  2.534  2.180  1.913  1.704  1.537  1.400  1.284  1.187  1.103  1.030  0.967  0.910  0.860  0.815  0.775  0.739  0.705  0.675  0.647  0.597  0.555  0.518  0.486  0.458
(2,3):  6.371  5.509  4.857  4.345  3.933  3.593  3.307  3.064  2.854  2.672  2.511  2.369  2.242  2.128  2.025  1.932  1.847  1.769  1.697  1.631  1.513  1.412  1.323  1.244  1.175
(3,3): 10.037  8.857  7.937  7.198  6.587  6.075  5.639  5.262  4.933  4.643  4.386  4.157  3.950  3.763  3.593  3.438  3.296  3.165  3.044  2.932  2.732  2.557  2.403  2.267  2.145
(1,4):  4.376  3.721  3.238  2.867  2.572  2.333  2.134  1.967  1.825  1.701  1.593  1.498  1.414  1.338  1.271  1.210  1.154  1.104  1.057  1.015  0.939  0.874  0.817  0.767  0.723
(2,4):  9.417  8.285  7.406  6.701  6.122  5.637  5.225  4.870  4.560  4.289  4.048  3.833  3.639  3.465  3.306  3.162  3.030  2.908  2.796  2.692  2.506  2.344  2.202  2.076  1.964
(3,4): 14.932 13.408 12.190 11.189 10.347  9.627  9.004  8.460  7.980  7.552  7.169  6.823  6.510  6.225  5.964  5.725  5.503  5.299  5.110  4.933  4.615  4.336  4.088  3.868  3.671
(1,5):  5.784  4.978  4.371  3.900  3.520  3.208  2.949  2.727  2.537  2.372  2.227  2.099  1.985  1.882  1.790  1.706  1.631  1.561  1.497  1.438  1.333  1.243  1.163  1.094  1.032
(2,5): 12.600 11.228 10.146  9.261  8.525  7.900  7.364  6.897  6.487  6.125  5.801  5.510  5.247  5.009  4.791  4.592  4.409  4.240  4.084  3.938  3.677  3.448  3.247  3.067  2.906
(3,5): 20.006 18.192 16.716 15.477 14.425 13.515 12.717 12.011 11.386 10.821 10.313  9.852  9.432  9.047  8.691  8.364  8.061  7.779  7.518  7.272  6.828  6.435  6.086  5.772  5.491

α = 0.01
(1,3):  4.390  3.678  3.165  2.778  2.475  2.231  2.031  1.864  1.723  1.602  1.496  1.403  1.322  1.249  1.183  1.125  1.072  1.024  0.979  0.939  0.867  0.805  0.752  0.705  0.664
(2,3):  7.991  6.899  6.076  5.432  4.914  4.487  4.129  3.825  3.563  3.335  3.134  2.956  2.798  2.655  2.527  2.410  2.304  2.207  2.117  2.035  1.888  1.761  1.650  1.552  1.465
(3,3): 11.838 10.427  9.334  8.457  7.735  7.130  6.615  6.171  5.784  5.443  5.142  4.871  4.629  4.409  4.210  4.028  3.861  3.707  3.566  3.434  3.199  2.995  2.814  2.655  2.512
(1,4):  5.861  4.980  4.330  3.832  3.438  3.118  2.852  2.628  2.437  2.272  2.128  2.001  1.888  1.788  1.697  1.616  1.541  1.474  1.412  1.355  1.254  1.167  1.091  1.025  0.966
(2,4): 11.195  9.831  8.777  7.936  7.245  6.669  6.179  5.757  5.391  5.068  4.783  4.528  4.299  4.093  3.905  3.734  3.578  3.434  3.302  3.179  2.959  2.768  2.600  2.451  2.318
(3,4): 16.905 15.155 13.762 12.618 11.660 10.846 10.140  9.524  8.981  8.498  8.066  7.676  7.323  7.001  6.708  6.437  6.188  5.958  5.745  5.546  5.188  4.874  4.596  4.348  4.125
(1,5):  7.368  6.333  5.557  4.954  4.470  4.073  3.742  3.461  3.219  3.009  2.826  2.663  2.518  2.388  2.270  2.165  2.068  1.980  1.898  1.823  1.690  1.576  1.475  1.387  1.309
(2,5): 14.500 12.899 11.639 10.616  9.767  9.047  8.430  7.893  7.423  7.006  6.635  6.301  6.000  5.727  5.477  5.249  5.039  4.846  4.667  4.501  4.202  3.940  3.709  3.504  3.321
(3,5): 22.106 20.067 18.421 17.043 15.876 14.864 13.981 13.204 12.513 11.890 11.332 10.822 10.359  9.935  9.545  9.185  8.850  8.541  8.252  7.983  7.495  7.063  6.679  6.335  6.025

*The entries in this table are the values of c_3 when M = N − s − 4 and P[−2 log λ_3 ≤ c_3 | H_3] = 1 − α.
Table 9*
Percentage points of the likelihood ratio test statistic for sphericity

Entries in each row correspond to n = 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 22, 24, 26, 28, 30, 34, 42, 50, 60, 80, 100, 140, 200, 300; the column for a given p starts at n = p + 1.

α = 0.05
p = 4:  0.0⁴9528 0.0²3866 0.01687 0.03866 0.06640 0.09739 0.1297 0.1621 0.1938 0.2244 0.2535 0.2812 0.3074 0.3321 0.3533 0.3772 0.4173 0.4530 0.4848 0.5134 0.5390 0.5833 0.6508 0.6998 0.7447 0.8037 0.8406 0.8842 0.9179 0.9447
p = 5:  0.0⁴2578 0.0²1262 0.0²6400 0.01650 0.03110 0.04919 0.06970 0.09174 0.1146 0.1378 0.1608 0.1835 0.2058 0.2273 0.2482 0.2876 0.3240 0.3575 0.3882 0.4164 0.4663 0.5453 0.6046 0.6603 0.7354 0.7835 0.8413 0.8868 0.9234
p = 6:  0.0⁵7479 0.0³4267 0.0²2553 0.0²7004 0.01435 0.02433 0.03653 0.05051 0.06583 0.08210 0.09900 0.1163 0.1337 0.1511 0.1854 0.2185 0.2501 0.2800 0.3081 0.3594 0.4442 0.5106 0.5749 0.6641 0.7228 0.7948 0.8525 0.8996
p = 7:  0.0⁵2284 0.0⁴1473 0.0³9434 0.0²2950 0.0²6524 0.01179 0.01870 0.02712 0.03682 0.04761 0.05927 0.07161 0.08446 0.1111 0.1383 0.1654 0.1920 0.2178 0.2665 0.3515 0.4211 0.4910 0.5916 0.6597 0.7453 0.8153 0.8734
p = 8:  0.0⁶7219 0.0⁴5149 0.0³3631 0.0²1233 0.0²2924 0.0²5613 0.0²9379 0.01423 0.02011 0.02693 0.03460 0.04299 0.06154 0.08178 0.1030 0.1248 0.1467 0.1898 0.2697 0.3389 0.4112 0.5196 0.5955 0.6935 0.7757 0.8452
p = 9:  0.0⁶2326 0.0⁴1817 0.0³1397 0.0³5114 0.0²1295 0.0²2629 0.0²4616 0.0²7314 0.01074 0.01489 0.01973 0.03129 0.04494 0.06022 0.07667 0.09392 0.1296 0.2006 0.26 0.3376 0.4499 0.5317 0.6405 0.7342 0.8151
p = 10: 0.0⁷7722 0.0⁵6455 0.0⁴5370 0.0³2107 0.0³5667 0.0²1214 0.0²2235 0.0²3692 0.0²5630 0.0²8071 0.01448 0.02282 0.03287 0.04435 0.05698 0.08468 0.1444 0.2035 0.2715 0.3840 0.4694 0.5870 0.6913 0.7833

α = 0.01
p = 4:  0.0⁵3665 0.0³6904 0.0²5031 0.01503 0.03046 0.05010 0.07258 0.09679 0.1218 0.1471 0.1721 0.1966 0.2204 0.2434 0.2655 0.2867 0.3264 0.3626 0.3956 0.4257 0.4531 0.5013 0.5769 0.6331 0.6856 0.7558 0.8006 0.8541 0.8961 0.9297
p = 5:  0.0⁶9837 0.0⁴2184 0.0³1828 0.0³6123 0.01361 0.02416 0.03730 0.05248 0.06915 0.08685 0.10518 0.1239 0.1426 0.1613 0.1797 0.2156 0.2497 0.2819 0.3120 0.3402 0.3910 0.4741 0.5383 0.6001 0.6852 0.7407 0.8085 0.8626 0.9066
p = 6:  0.0⁶2970 0.0⁴7187 0.0³6758 0.0²2498 0.0²6033 0.01148 0.01880 0.02782 0.03830 0.04998 0.06261 0.07595 0.08982 0.1040 0.1330 0.1620 0.1904 0.2180 0.2445 0.2940 0.3789 0.4475 0.5157 0.6129 0.6782 0.7598 0.8262 0.8811
p = 7:  0.0⁷8604 0.0⁴2424 0.0³2520 0.0²1017 0.0²2646 0.0²5369 0.0²9296 0.01444 0.02073 0.02807 0.03635 0.04541 0.05514 0.07612 0.09845 0.1215 0.1447 0.1677 0.2125 0.2939 0.3632 0.4348 0.5408 0.6144 0.7088 0.7874 0.8535
p = 8:  0.0⁷2760 0.0⁵8306 0.0⁴9438 0.0³4120 0.0²1149 0.0²2476 0.0²4516 0.0²7343 0.01098 0.01542 0.02062 0.02652 0.04017 0.05580 0.07287 0.09092 0.1096 0.1475 0.2211 0.2876 0.3594 0.4705 0.5505 0.6562 0.7465 0.8239
p = 9:  0.0⁸9216 0.0⁵2879 0.0⁴3544 0.0³1663 0.0³4943 0.0²1126 0.0²2160 0.0²3669 0.0²5705 0.0²8300 0.01146 0.01940 0.02933 0.04095 0.05392 0.06795 0.09805 0.1611 0.2221 0.2912 0.4034 0.4879 0.6028 0.7040 0.7927
p = 10: 0.0⁸3573 0.0⁵1004 0.0⁴1332 0.0⁴6681 0.0³2108 0.0³5065 0.0²1018 0.0²1804 0.0²2914 0.0²4386 0.0²8498 0.01421 0.02145 0.03007 0.03989 0.06231 0.1136 0.1671 0.2311 0.3411 0.4275 0.5495 0.6605 0.7600

*The entries in this table give the values of c_4 where P[λ_4 ≥ c_4 | H_4] = 1 − α; an entry 0.0ᵏdddd stands for the decimal fraction with k zeros following the decimal point (e.g., 0.0⁴9528 = 0.00009528). For an explanation of the notations the reader is referred to Section 4.1. The entries in this table are reproduced from Nagarsenker and Pillai (1973a) with the kind permission of the Academic Press, Inc.
552
P. R. Krishnaiah and Jack C. Lee
Table 10* Percentage points of the likelihood ratio test statistic specifying the covariance matrix
p
n
α = 0.05
α = 0.01
p
n
4
4 5 6 5 6 7 8
38.184 29.013 25.763 53.820 41.223 36.563 34.056
51.207 37.658 33.082 70.113 51.780 45.382 42.048
8
9 10 11 12 13 14 15
90.127 80.323 74.809 71.151 68.517 66.516 64.939
107.164 93.830 86.901 82.402 79.204 76.797 74.912
6 7 8 9 10 11 7 8 9 10 11 12 13 14 15 16 17
71.781 55.455 49.247 45.829 43.627 42.076 91.376 71.701 63.829 59.407 56.513 54.450 52.898 51.683 50.704 49.898 49.221
91.068 67.949 59.600 55.158 52.351 50.400 112.490 86.150 75.745 70.104 66.486 63.942 62.044 60.568 59.384 58.413 57.601
16 17 18 19 20 22 10 11 12 13 14 15 16 17 18
63.661 62.603 61.712 60.949 60.290 59.206 110.217 98.734 92.051 87.563 84.298 81.800 79.818 78.203 76.859
73.391 72.137 71.084 70.186 69.410 68.138 128.599 113.850 105.564 100.116 96.205 93.239 90.901 89.005 87.434
5
α = 0.05
α = 0.01
p
10
9
n
α = 0.05
α = 0.01
19 20 22 24 26 28
75.722 74.747 73.158 71.918 70.922 70.104
86.108 84.973 83.131 81.697 80.548 79.605
11 12 13 14 15 16 17 18 19 20 22 24 26 28 30 32
132.403 119.070 111.146 105.764 101.816 98.772 96.343 94.354 92.691 91.279 89.005 87.251 85.855 84.716 83.768 82.967
152.510 135.813 126.102 119.643 114.966 111.392 108.558 106.249 104.326 102.697 100.084 98.075 96.480 95.181 94.103 93.192
*The entries in this table give the values of c5 where P[−2 log λ5 ≤ c5 | H5] = (1 − α). For an explanation of the notations, the reader is referred to Section 4.2. The entries in this table are reproduced from Nagarsenker and Pillai (1973b) with the kind permission of Biometrika Trustees.
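The statistic of Table 10 can be computed from a sample. One common form of the criterion for H5: Σ = Σ0 (mean unspecified; see, e.g., Anderson, 1958) is −2 log λ = n[tr(Σ0⁻¹S) − log|Σ0⁻¹S| − p], with S the maximum-likelihood covariance estimate; the sketch below (NumPy, names ours) computes it under that assumption. Whether n denotes the sample size or an adjusted degrees of freedom depends on the convention of the table in use.

```python
import numpy as np

def neg2_log_lambda_cov(s, sigma0, n):
    """-2 log likelihood ratio for H: Sigma = Sigma0 (mean unspecified).

    Uses the form n * (tr(Sigma0^-1 S) - log|Sigma0^-1 S| - p), which is
    zero when S equals Sigma0 and positive otherwise.
    """
    p = s.shape[0]
    m = np.linalg.solve(sigma0, s)       # Sigma0^-1 S
    _, logdet = np.linalg.slogdet(m)
    return n * (np.trace(m) - logdet - p)

# Zero at the null value, positive away from it:
at_null = neg2_log_lambda_cov(np.eye(3), np.eye(3), n=20)
away = neg2_log_lambda_cov(2.0 * np.eye(3), np.eye(3), n=20)
```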
Table 11* Percentage points of the likelihood ratio test statistic for homogeneity of the covariance matrices
n\k 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 25 30
2
3
4
12.18 10.70 9.97 9.53 9.24 9.04 8.88 8.76 8.67 8.59 8.52 8.47 8.42 8.38 8.35 8.32 8.28 8.26 8.17 8.11
18.70 16.65 15.63 15.02 14.62 14.33 14.11 13.94 13.81 13.70 13.60 13.53 13.46 13.40 13.35 13.30 13.26 13.23 13.10 13.01
24.55 22.00 20.73 19.97 19.46 19.10 18.83 18.61 18.44 18.30 18.19 18.10 18.01 17.94 17.87 17.82 17.77 17.72 17.55 17.44
35.00 30.52 28.24 26.84 25.90 25.22 24.71 24.31 23.99 23.73 23.50 23.32 23.16 23.02 22.89 22.78 22.69 22.33 22.10
46.58 40.95 38.06 36.29 35.10 34.24 33.59 33.08 32.67 32.33 32.05 31.81 31.60 31.43 31.26 31.13 31.01 30.55 30.25
4 22.41 5 19.19 6 17.57 7 16.59 8 15.93 9 15.46 10 15.11 11 14.83 12 14.61 13 14.43 14 14.28 15 14.15 16 14.04 17 13.94 18 13.86 19 13.79 20 13.72 25 13.48 30 13.32
5
6
7
8
9
10
α = 0.05, p = 2 30.09 35.45 27.07 31.97 25.57 30.23 24.66 29.19 24.05 28.49 23.62 27.99 23.30 27.62 23.05 27.33 22.85 27.10 22.68 26.90 22.54 26.75 22.42 26.61 22.33 26.50 22.24 26.40 22.17 26.31 22.10 26.23 22.04 26.16 21.98 26.10 21.79 25.87 21.65 25.72
40.68 36.75 34.79 33.61 32.83 32.26 31.84 31.51 31.25 31.03 30.85 30.70 30.57 30.45 30.35 30.27 30.19 30.12 29.86 29.69
45.81 41.45 39.26 37.95 37.08 36.44 35.98 35.61 35.32 35.08 34.87 34.71 34.57 34.43 34.32 34.23 34.14 34.07 33.78 33.59
50.87 46.07 43.67 42.22 41.26 40.57 40.05 39.65 39.33 39.07 38.84 38.66 38.50 38.36 38.24 38.13 38.04 37.95 37.63 37.42
55.86 50.64 48.02 46.45 45.40 44.64 44.08 43.64 43.29 43.00 42.76 42.56 42.38 42.23 42.10 41.99 41.88 41.79 41.44 41.21
α = 0.05, p = 3 57.68 68.50 50.95 60.69 47.49 56.67 45.37 54.20 43.93 52.54 42.90 51.33 42.11 50.42 41.50 49.71 41.00 49.13 40.60 48.65 40.26 48.26 39.97 47.92 39.72 47.63 39.50 47.38 39.31 47.16 39.15 46.96 39.00 46.79 38.44 46.15 38.09 45.73
79.11 70.26 65.69 62.89 60.99 59.62 58.57 57.76 57.11 56.56 56.11 55.73 55.40 55.11 54.86 54.64 54.44 53.70 53.22
89.60 79.69 74.58 71.44 69.32 67.78 66.62 65.71 64.97 64.36 63.86 63.43 63.06 62.73 62.45 62.21 61.98 61.16 60.62
99.94 110.21 89.03 98.27 83.39 92.09 79.90 88.30 77.57 85.73 75.86 83.87 74.58 82.46 73.57 81.36 72.75 80.45 72.09 79.72 71.53 79.11 71.05 78.60 70.64 78.14 70.27 77.76 69.97 77.41 69.69 77.11 69.45 76.84 68.54 75.84 67.94 75.18
Table 11 (continued) 2
3
4
5
6
7
5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 25 30
35.39 30.06 27.31 25.61 24.45 23.62 22.98 22.48 22.08 21.75 21.47 21.24 21.03 20.86 20.70 20.56 20.06 19.74
56.10 48.62 44.69 42.24 40.57 39.34 38.41 37.67 37.08 36.59 36.17 35.82 35.52 35.26 35.02 34.82 34.06 33.59
75.36 65.90 60.89 57.77 55.62 54.04 52.84 51.90 51.13 50.50 49.97 49.51 49.12 48.78 48.47 48.21 47.23 46.61
6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 25 30
51.11 43.40 39.29 36.71 34.93 33.62 32.62 31.83 31.19 30.66 30.22 29.83 29.51 29.22 28.97 28.05 27.48
α = 0.05, p = 5 81.99 110.92 138.98 166.54 193.71 220.66 247.37 273.88 71.06 97.03 122.22 146.95 171.34 195.49 219.47 243.30 65.15 89.45 113.03 136.18 159.04 181.65 204.14 226.48 61.39 84.62 107.17 129.30 151.17 172.80 194.27 215.64 58.78 81.25 103.06 124.48 145.64 166.56 187.37 208.02 56.85 78.75 100.02 120.92 141.54 161.98 182.24 202.37 55.37 76.83 97.68 118.15 138.38 158.38 178.23 198.03 54.19 75.30 95.82 115.96 135.86 155.54 175.10 194.51 53.23 74.05 94.29 114.16 133.80 153.21 172.49 191.68 52.44 73.01 93.02 112.66 132.07 151.29 170.36 189.38 51.76 72.14 91.94 111.41 130.61 149.66 166.53 187.32 51.19 71.39 91.03 110.34 129.38 148.25 166.99 185.61 50.69 70.74 90.23 109.39 128.29 147.03 165.65 184.10 50.26 70.17 89.54 108.57 127.36 145.97 164.45 182.81 49.88 69.67 88.93 107.85 126.52 145.02 163.38 181.65 48.48 67.86 86.70 105.21 123.51 141.62 159.60 177.49 47.61 66.71 85.29 103.56 121.60 139.47 157.22 174.87
α = 0.05, p = 4 93.97 112.17 130.11 82.60 98.93 115.03 76.56 91.88 106.98 72.77 87.46 101.94 70.17 84.42 98.46 68.26 82.19 95.90 66.81 80.48 93.95 65.66 79.14 92.41 64.73 78.04 91.15 63.95 77.13 90.12 63.30 76.37 89.26 62.76 75.73 88.51 62.28 75.16 87.87 61.86 74.68 87.31 61.50 74.25 86.82 61.17 73.87 86.38 59.98 72.47 84.78 59.21 71.58 83.74
8
9
10
147.81 130.94 121.90 116.23 112.32 109.46 107.27 105.54 104.12 102.97 101.99 101.14 100.42 99.80 99.25 98.75 96.95 95.79
165.39 146.69 136.71 130.43 126.08 122.91 120.46 118.55 116.98 115.69 114.59 113.67 112.87 112.17 111.56 111.02 109.01 107.71
182.80 162.34 151.39 144.50 139.74 136.24 133.57 131.45 129.74 128.32 127.14 126.10 125.22 124.46 123.79 123.18 120.99 119.57
*The entries in this table are the values of c6 where P[−2 log λ6 ≤ c6 | H6] = (1 − α). For an explanation of the notations, the reader is referred to Section 4.2. The entries in this table are reproduced from Lee, Chang and Krishnaiah (1977) with the kind permission of North-Holland Publishing Company.
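The homogeneity statistic of Table 11 has the familiar form n log|S| − Σᵢ nᵢ log|Sᵢ|, with S the pooled estimate (cf. Box, 1949). The sketch below computes it with NumPy under the assumption that the Sᵢ are per-group maximum-likelihood estimates and the nᵢ group counts; conventions based on degrees of freedom nᵢ − 1 also appear in the literature, so treat the weights as an assumption.

```python
import numpy as np

def neg2_log_lambda_homogeneity(covs, ns):
    """-2 log likelihood ratio for H: Sigma_1 = ... = Sigma_k.

    covs : list of p x p ML covariance estimates S_i
    ns   : list of group counts n_i
    Computes n*log|S| - sum_i n_i*log|S_i| with S the n_i-weighted pooled
    covariance; the result is 0 when all S_i coincide, positive otherwise
    (by concavity of log-determinant).
    """
    n = sum(ns)
    pooled = sum(ni * si for ni, si in zip(ns, covs)) / n
    ld = lambda a: np.linalg.slogdet(a)[1]
    return n * ld(pooled) - sum(ni * ld(si) for ni, si in zip(ns, covs))

equal = neg2_log_lambda_homogeneity([np.eye(2), np.eye(2)], [5, 7])
unequal = neg2_log_lambda_homogeneity([np.eye(2), 4.0 * np.eye(2)], [5, 5])
```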
Table 13* Percentage points of the likelihood ratio test statistic for Σ = Σ0 and μ = μ0
2
3
4
5
6
α = 0.05 4 5 6 7 8
17.381 15.352 14.318 13.689 13.265
27.706 24.431 22.713 21.646
39.990 35.307 32.787
54.261 48.039
70.475
9 10 11 12 13
12.960 12.729 12.549 12.404 12.285
20.915 20.382 19.975 19.655 19.396
31.190 30.080 29.261 28.631 28.131
44.610 42.400 40.843 39.683 38.782
62.660 58.222 55.321 53.254 51.698
14 15 16 17 18
12.186 12.101 12.029 11.966 11.911
19.181 19.002 18.848 18.716 18.601
27.723 27.384 27.098 26.854 26.642
38.061 37.470 36.977 36.559 36.200
50.480 49.499 48.691 48.013 47.436
19 20 22 24 26
11.862 11.819 11.745 11.684 11.633
18.499 18.410 18.258 18.134 18.031
26.457 26.294 26.019 25.797 25.614
35.888 35.614 35.157 34.790 34.489
46.938 46.504 45.785 45.212 44.745
28 30 32 34 36
11.591 11.554 11.522 11.494 11.469
17.944 17.870 17.806 17.750 17.701
25.460 25.329 25.215 25.117 25.030
34.237 34.023 33.840 33.681 33.541
44.357 44.029 43.748 43.505 43.292
38 40 45 50 55
11.447 11.427 11.386 11.353 11.327
17.657 17.618 17.536 17.471 17.419
24.954 24.885 24.742 24.630 24.539
33.417 33.307 33.079 32.900 32.755
43.105 42.938 42.594 42.324 42.107
60 65 70 75 80
11.305 11.286 11.271 11.257 11.245
17.375 17.339 17.308 17.281 17.258
24.465 24.402 24.348 24.302 24.262
32.636 32.537 32.452 32.379 32.316
41.929 41.780 41.654 41.546 41.451
85 90 95 100
11.235 11.225 11.217 11.210
17.237 17.219 17.203 17.188
24.227 24.196 24.168 24.143
32.261 32.211
Table 13 (continued)
M
2
3
4
5
6
α = 0.01
4 5 6 7 8
24.087 21.114 19.625 18.729 18.129
36.308 31.682 29.318 27.871
50.512 44.073 40.713
66.728 58.348
84.937
9 10 11 12 13
17.700 17.377 17.125 16.923 16.758
26.890 26.180 25.642 25.219 24.878
38.621 37.184 36.133 35.328 34.692
53.885 51.063 49.100 47.650 46.531
74.530 68.874 65.244 62.690 60.784
14 15 16 17 18
16.620 16.503 16.403 16.316 16.239
24.597 24.361 24.161 23.988 23.838
34.176 33.748 33.388 33.080 32.814
45.639 44.911 44.305 43.793 43.353
59.302 58.114 57.139 56.324 55.631
19 20 22 24 26
16.172 16.112 16.010 15.927 15.857
23.706 23.589 23.392 23.231 23.098
32.582 32.378 32.035 31.758 31.529
43.973 42.639 42.083 41.637 41.272
55.035 54.517 53.658 52.977 52.422
28 30 32 34 36
15.798 15.747 15.703 15.665 15.631
22.986 22.890 22.807 22.734 22.671
31.337 31.174 31.033 30.911 30.803
40.967 40.708 40.486 40.294 40.125
51.961 51.573 51.240 50.953 50.701
38 40 45 50 55
15.601 15.574 15.517 15.473 15.436
22.614 22.564 22.458 22.375 22.307
30.708 30.623 30.447 30.308 30.196
39.976 39.844 39.568 39.353 39.179
50.480 50.284 49.877 49.559 49.303
60 65 70 75 80
15.406 15.381 15.359 15.341 15.324
22.252 22.205 22.165 22.131 22.101
30.103 30.025 29.959 29.903 29.853
39.036 38.916 38.815 38.727 38.651
49.094 48.919 48.770 48.643 48.532
85 90 95 100
15.310 15.297 15.286 15.276
22.074 22.051 22.030 22.011
29.810 29.771 29.737 29.706
38.585 38.526
*The entries in this table are the values of c8 where P[−2 log λ8 ≤ c8] = (1 − α). For an explanation of the notations, the reader is referred to Section 5.1. The entries in the table are reproduced from Nagarsenker and Pillai (1974) with the kind permission of the Academic Press, Inc.
Table 14* Percentage points of the likelihood ratio test statistic for the homogeneity of multivariate normal populations
M
k=2
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 25 30
8.12 8.03 7.97 7.94 7.92 7.91 7.89 7.89 7.88 7.87 7.87 7.86 7.86 7.86 7.85 7.85 7.85 7.85 7.85 7.85 7.84 7.84
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 25 30
19.87 18.41 17.68 17.25 16.96 16.75 16.60 16.47 16.38 16.30 16.23 16.18 16.13 16.09 16.05 16.02 15.99 15.97 15.94 15.92 15.84 15.79
k=3
k=4
k=5
p = 1, α = 0.05 12.47 12.49 12.51 12.52 12.53 12.53 12.54 12.55 12.55 12.56 12.56 12.56 12.56 12.56 12.57 12.56 12.57 12.57 12.57 12.57 12.57 12.58
16.36 16.50 16.59 16.64 16.68 16.71 16.73 16.75 16.77 16.78 16.79 16.80 16.81 16.81 16.82 16.83 16.83 16.84 16.84 16.85 16.86 16.87
20.06 20.30 20.45 20.55 20.62 20.67 20.71 20.74 20.77 20.79 20.81 20.82 20.84 20.85 20.86 20.87 20.88 20.89 20.89 20.90 20.92 20.94
42.59 40.42 39.37 38.75 38.35 38.06 37.85 37.69 37.55 37.44 37.36 37.28 37.22 37.16 37.12 37.07 37.04 37.01 36.97 36.95 36.85 36.78
53.10 50.62 49.43 48.74 48.29 47.97 47.74 47.56 47.41 47.30 47.21 47.12 47.05 47.00 46.94 46.90 46.86 46.82 46.79 46.76 46.65 46.57
p = 2, α = 0.05 31.66 29.83 28.92 28.39 28.03 27.78 27.60 27.45 27.33 27.24 27.16 27.09 27.04 26.99 26.94 26.90 26.87 26.84 26.81 26.79 26.69 26.63
Table 14 (continued) M
k=2
k=3
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 25 30
36.46 32.90 31.04 29.90 29.13 28.56 28.13 27.79 27.52 27.30 27.11 26.96 26.82 26.70 26.60 26.51 26.42 26.35 26.28 26.22 25.99 25.83
p = 3, α = 0.05 59.39 54.56 52.04 50.49 49.42 48.65 48.06 47.61 47.24 46.94 46.68 46.46 46.27 46.11 45.97 45.84 45.73 45.63 45.54 45.45 45.14 44.92
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 25 30
58.06 51.79 48.40 46.26 44.77 43.68 42.83 42.17 41.63 41.19 40.81 40.50 40.22 39.97 39.76 39.57 39.41 39.25 39.12 38.99 38.51 38.18
p = 4, α = 0.05 95.91 87.16 82.37 79.35 77.23 75.67 74.46 73.51 72.74 72.10 71.56 71.09 70.70 70.35 70.03 69.77 69.52 69.30 69.10 68.92 68.22 67.74
k=4
k=5
80.89 74.91 71.80 69.88 68.57 67.62 66.90 66.34 65.89 65.52 65.19 64.94 64.71 64.50 64.33 64.17 64.04 63.92 63.80 63.70 63.31 63.05
101.75 94.65 90.97 88.70 87.16 86.05 85.21 84.55 84.02 83.57 83.21 82.90 82.63 82.40 82.20 82.01 81.85 81.71 81.58 81.46 81.01 80.70
131.68 120.63 114.58 110.73 108.05 106.08 104.57 103.36 102.39 101.57 100.89 100.29 99.80 99.36 98.97 98.62 98.31 98.04 97.79 97.55 96.67 96.07
166.49 153.23 145.98 141.37 138.13 135.78 133.95 132.51 131.33 130.37 129.54 128.84 128.22 127.70 127.23 126.83 126.45 126.13 125.82 125.54 124.50 123.76
*The entries in this table give the values of c9 where P[−2 log λ9 ≤ c9 | H9] = (1 − α). For an explanation of the notations, the reader is referred to Section 5.2. The entries in this table are reproduced from Lee et al. (1977) with the kind permission of North-Holland Publishing Company.
"Fable 15" Percentage points of the ~ e l i h o o d ratio test statistic for the equality of variances and eovariances 4
5
6
7
8
9
10
.0684400 .0458442 .0540746 .0213701 .0232229 .0261419 .010200 .015389 .021645 .028869 .036947 .045759 .097007 .15287 .20751 .25851 .30511 .34728 .41965 .47875 .52752 .56828 .60275
.0628066 .0*20340 .0515487 .0356237 .0314138 .0328543 .0249853 .0378624 .011501 .015886 .020978 .055015 .097785 .14349 .18879 .23202 .27245 .34437 .40520 .45665 .50044 .53802
.0792027 .0571491 .0458951 .0322970 .0561430 .0213094 .0223991 .0339479 .0359982 .0285717 .028998 .059290 .095122 .13316 .17129 .20831 .27688 .33718 .38962 .43517 .47489
α = 0.05 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 25 30 35 40 45 50 60 70 80 90 100
.0312635 .0350303 .021109 .046971 .078826 .11353 .14898 .18392 .21763 .24974 .28009 .30864 .33543 .36054 .38406 .40610 .49765 .56580 .61812 .65939 .69273 .72019 .76271 .79406 .81813 .83717 .85261
.0432727 .0215517 .0376651 .019335 .035879 .055972 .078419 .10223 .12667 .15120 .17545 .19917 .22221 .24447 .26589 .36028 .43574 .49643 .54584 .58674 .62105 .67527 .71607 .74784 .77325 .79404
.0592238 .0350592 .0228538 .0280234 .016233 .027219 .040508 .055581 .071959 .089226 .10704 .12514 .14331 .16138 .24711 .32180 .38524 .43897 .48468 .52388 .58729 .63615 .67484 .70619 .73208
.0527274 .0517023 .0210751 .0233219 .0272716 .013029 .020517 .029553 .039910 .051350 .063648 .076600 .090029 .15981 .22717 .28824 .34233 .38990 .43173 .50129 .55635 .60081 .63735 .66788
Table 15 (continued)
n 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 25 30 35 40 45 50 60 70 80 90 100
4
.0543589 .0290698 .0263672 .018484 .036601 .059079 .084300 .11098 .13820 .16531 .19191 .21772 .24258 .26643 .28921 .31094 .40477 .47807 .53614 .58300 .62148 .65359 .70402 .74174 .77100 .79433 .81336
5
6
7
8
9
10
α = 0.01
.0512499 .0527002 .0222060 .0272407 .015834 .027717 .04230 .058928 .076996 .095996 .11552 .13526 .15497 .17448 .19367 .28233 .35739 .42005 .47248 .51672 .55441 .61496 .66130 .69782 .72731 .75159
.0637056 .0485568 .0379055 .0228787 .0268646 .012922 .020971 .030792 .042102 .054610 .068043 .082159 .096750 .11164 .18647 .25618 .31803 .37205 .41907 .46011 .52778 .58095 .62363 .65857 .68767
.0610191 .0528099 .0328834 .0211499 .0229634 .0259613 .010244 .015805 .022566 .030407 .039190 .048769 .059006 .11600 .17543 .23208 .28406 .33097 .37305 .44456 .50244 .54989 .58936 .62263
.0731276 .0594513 .0310624 .0345954 .0212711 .0227191 .0249297 .0279719 .011865 .016588 .022094 .028321 .067633 .11443 .16289 .20995 .25421 .29519 .36725 .42755 .47818 .52105 .55769
.0710254 .0532295 .0439383 .0318340 .0254147 .0212266 .0223392 .0239560 .0261272 .0288778 .012211 .036777 .070887 .10974 .15005 .18970 .22777 .29736 .35782 .40995 .45497 .49404
.0835966 .0511195 .0414656 .0473027 .0322907 .0354763 .0210956 .0219339 .0231126 .0246685 .018549 .041578 .070837 .10347 .13738 .17124 .23587 .29440 .34639 .39228 .43277
*The entries in this table give the values of c10 where P[λ10 ≥ c10 | H2 ∩ H3] = (1 − α). For an explanation of the notations, the reader is referred to Section 6. The entries in this table are reproduced from Nagarsenker (1975a) with the kind permission of Marcel Dekker, Inc.
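The criterion of Table 15 (equality of all variances and all covariances) can be written, following Wilks (1946), as the ratio of |S| to the determinant of the compound-symmetric matrix fitted to S. The sketch below (NumPy; names, and the power convention relating the ratio to the tabulated statistic, are our assumptions) computes that ratio, using the identity |fitted| = v^p (1 − r)^(p−1) (1 + (p − 1)r) with v the average variance and r the average correlation.

```python
import numpy as np

def vc_criterion(s):
    """Criterion for H: all variances equal, all covariances equal.

    v = mean diagonal element of S, r = mean off-diagonal element / v.
    Returns |S| / (v**p * (1-r)**(p-1) * (1+(p-1)*r)), i.e. |S| divided by
    the determinant of the fitted compound-symmetric matrix; equals 1 when
    S is itself compound symmetric, and is below 1 otherwise. Tables may
    index a power of this ratio, depending on the convention used.
    """
    p = s.shape[0]
    v = np.trace(s) / p
    r = (s.sum() - np.trace(s)) / (p * (p - 1) * v)
    denom = v ** p * (1 - r) ** (p - 1) * (1 + (p - 1) * r)
    return np.linalg.det(s) / denom

cs = 0.5 * np.eye(3) + 0.5 * np.ones((3, 3))         # compound symmetric
ratio_cs = vc_criterion(cs)                          # 1.0
ratio_diag = vc_criterion(np.diag([1.0, 2.0, 3.0]))  # below 1
```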
Table 16" Percentage points of the, like~ood ratio test statistic for the equality of means, variances and covariances 5
6
7
8
9
10
0.0635519 0.0425869 0.0319296 0.0368819 0.0117036 0.0133936 0.0158588 0.0191461 0.013259 0.018167 0.023820 0.030151 0.069462 0.11572 0.16346 0.20980 0.25343 0.29386 0.36513 0.42492 0.47525 0.51794 0.55448
0.0611201 0.0388559 0.0471825 0.0327575 0.0372798 0.0115342 0.0127833 0.0145400 0.0168439 0.0197117 0.013141 0.037823 0.071549 0.10978 0.14935 0.18838 0.22587 0.29456 0.35441 0.40616 0.45095 0.48989
0.0738902 0.0530708 0.0426858 0.0311033 0.0330923 0.0368711 0.0213065 0.0122223 0.0134784 0.0151063 0.019146 0.041939 0.070689 0.10269 0.13596 0.16921 0.23284 0.29065 0.34215 0.38771 0.42802
α = 0.05 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 25 30 35 40 45 50 60 70 80 90 100
0.0458284 0.0124987 0.011531 0.027632 0.049172 0.074247 0.10129 0.12916 0.15709 0.18455 0.21124 0.23695 0.26160 0.28513 0.30754 0.32886 0.42051 0.49181 0.54823 0.59373 0.63110 0.66229 0.71130 0.74799 0.77645 0.79917 0.81771
0.0414294 0.0174096 0.0139892 0.010803 0.021204 0.034680 0.050554 0.068162 0.086935 0.10641 0.12622 0.14610 0.16585 0.18530 0.20436 0.29193 0.36573 0.42724 0.47872 0.52218 0.55921 0.61878 0.66443 0.70045 0.72955 0.75354
0.0538976 0.0323417 0.0114293 0.0142929 0.0191786 0.016125 0.024971 0.035458 0.047293 0.060194 0.073901 0.088190 0.10287 0.11778 0.19208 0.26089 0.32185 0.37509 0.42147 0.46197 0.52888 0.58153 0.62387 0.65858 0.68751
0.0511422 0.0476872 0.0352215 0.0117171 0.0139638 0.0174335 0.012176 0.018151 0.025263 0.033385 0.042377 0.052099 0.062420 0.11921 0.17794 0.23380 0.28505 0.33134 0.37292 0.44370 0.50112 0.54829 0.58758 0.62075
Table 16 (continued) 4
5
6
7
8
9
10
0.0715137 0.0541262 0.0449354 0.0322589 0.0265687 0.0214684 0.0227676 0.0246324 0.0271093 0.010217 0.013949 0.018285 0.047583 0.085302 0.12658 0.16833 0.20885 0.24730 0.31684 0.37670 0.42801 0.47215 0.51034
0.0848639 0.0513883 0.0417941 0.0488132 0.0227296 0.0364526 0.0212781 0.0222361 0.0235705 0.0253171 0.0274973 0.024844 0.051074 0.082819 0.11722 0.15233 0.18693 0.25218 0.31065 0.36223 0.40756 0.44744
0.0820433 0.0647449 0.0565673 0.0434424 0.0211304 0.0228160 0.0358459 0.0110668 0.0217694 0.0227272 0.012035 0.028960 0.051924 0.078816 0.10784 0.13769 0.19664 0.25187 0.30217 0.34741 0.38795
α = 0.01 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 25 30 35 40 45 50 60 70 80 90 100
0.0523634 0.0244158 0.0233851 0.010557 0.022158 0.037515 0.055700 0.075827 0.097158 0.11912 0.14128 0.16333 0.18505 0.20628 0.22692 0.24692 0.33650 0.40980 0.46977 0.51931 0.56074 0.59579 0.65172 0.69426 0.72762 0.75448 0.77654
0.0656984 0.0312661 0.0211201 0.0239329 0.0290997 0.016703 0.026539 0.038270 0.051519 0.065929 0.081180 0.097003 0.11317 0.12951 0.14587 0.22492 0.29563 0.35695 0.40972 0.45521 0.49462 0.55914 0.60949 0.64973 0.68257 0.70985
0.0615302 0.0438951 0.0238711 0.0115020 0.0237819 0.0274584 0.012601 0.019158 0.027006 0.035985 0.045921 0.056643 0.067993 0.079828 0.14245 0.20460 0.26223 0.31421 0.36060 0.40188 0.47147 0.52737 0.57299 0.61080 0.64260
0.0745507 0.0412502 0.0313716 0.0358070 0.0115767 0.0233190 0.0259339 0.0294795 0.013958 0.019329 0.025529 0.032475 0.040080 0.085022 0.13534 0.18575 0.23371 0.27824 0.31908 0.39024 0.44931 0.49864 0.54026 0.57573
*The entries in this table give the values of c11 where P[λ11 ≥ c11 | H1 ∩ H2 ∩ H3] = 1 − α. For an explanation of the notations, the reader is referred to Section 6. The entries in the table are reproduced from Nagarsenker (1975b) with the kind permission of the Gordon and Breach Science Publishers, Ltd.
Table 17" Percentage points of the distribution of the likelihood ratio test statistic for compound symmetry M~2)
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 22 24 26 28 30 35 40 45 50 60
(2, 2)
(2, 3)
.1397 .1886 .2350 .2782 .3178 .3540 .3871 .4173 .4449 .4702 .4935 .5148 .5346 .5528 .5698 .5855 .6002 .6138 .6265 .6385 .6604 .6799 .6972 .7126 .7267 .7565 .7805 .8002 .8168 .8428
.0569 .0847 .1142 .1443 .1741 .2030 .2310 .2577 .2831 .3072 .3301 .3518 .3723 .3917 .4100 .4274 .4439 .4595 .4743 .4884 .5145 .5383 .5599 .5977 .5799 .6368 .6691 .6962 .7194 .7564
(2, 4)
(2, 5)
b = 0, α = 0.05 .0240 .0102 .0388 .0178 .0561 .0274 .0751 .0386 .0951 .0513 .1158 .0649 .1366 .0794 .1574 .0944 .1780 .1097 .1981 .1252 .2178 .1408 .2369 .1563 .2554 .1717 .2733 .1868 .2906 .2018 .3072 .2164 .3233 .2308 .3387 .2448 .3536 .2584 .3679 .2718 .3949 .2975 .4200 .3217 .4432 .3447 .4648 .3664 .4848 .3868 .5293 .4332 .5669 .4735 .5992 .5088 .6271 .5399 .6728 .5920
(3, 3)
(3, 4)
(4,4)
.0239 .0388 .0560 .0750 .0950 .1157 .1365 .1573 .1779 .1980 .2177 .2368 .2553 .2732 .2905 .3071 .3232 .3386 .3535 .3678 .3949 .4199 .4431 .4647 .4847 .5292 .5669 .5992 .6270 .6728
.0102 .0178 .0273 .0386 .0512 .0649 .0793 .0943 .1096 .1251 .1407 .1562 .1716 .1868 .2017 .2164 .2307 .2447 .2584 .2718 .2974 .3217 .3446 .3663 .3868 .4331 .4735 .5088 .5399 .5919
.0044 .0081 .0132 .0196 .0271 .0357 .0452 .0554 .0662 .0775 .0890 .1009 .1129 .1250 .1371 .1492 .1613 .1733 .1851 .1968 .2197 .2419 .2633 .2838 .3034 ,3490 .3898 .4262 .4589 .5149
Table 17 (continued)
"X'~pt,p2 )
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 22 24 26 28 30 35 40 45 50 60
(2, 2)
(2, 3)
.0973 .1365 .1755 .2130 .2487 .2822 .3135 .3427 .3697 .3950 .4185 .4404 .4608 .4799 .4977 .5145 .5302 .5449 .5588 .5719 .5959 .6175 .6368 .6545 .6705 .7046 .7325 .7555 .7750 .8059
.0372 .0577 .0804 .1044 .1290 .1536 .1779 .2016 .2246 .2468 .2682 .2886 .3083 .3270 .3450 .3622 .3786 .3943 .4093 .4236 .4504 .4750 .4977 .5185 .5378 .5801 .6155 .6455 .6712 .7129
(2, 4)
(2, 5)
(3, 3)
(3, 4)
(4, 4)
b = 1, α = 0.05 .0149 .0061 .0251 .0110 .0375 .0175 .0517 .0255 .0671 .0347 .0835 .0450 .1005 .0561 .1178 .0680 .1353 .0803 .1527 .0930 .1700 .1060 .1870 .1192 .2038 .1324 .2201 .1456 .2361 .1588 .2517 .1719 .2668 .1848 .2815 .1975 .2958 .2101 .3096 .2224 .3360 .2464 .3607 .2694 .3840 .2914 .4057 .3125 .4261 .3325 .4719 .3785 .5114 .4194 .5456 .4556 .5755 .4878 .6252 .5425
.0149 .0251 .0375 .0516 .0671 .0835 .1005 .1178 .1352 .1527 .1699 .1870 .2037 .2201 .2361 .2516 .2667 .2814 .2957 .3095 .3359 .3607 .3839 .4056 .4261 .4719 .5114 .5456 .5755 .6251
.0061 .0110 .0175 .0255 .0347 .0449 .0561 .0679 .0803 .0930 .1060 .1191 .1323 .1456 .1587 .1718 .1847 .1975 .2100 .2224 .2463 .2694 .2914 .3124 .3324 .3785 .4193 .4555 .4878 .5425
.0025 .0048 .0081 .0124 .0177 ,0239 .0309 .0385 .0468 .0557 .0649 .0745 .0844 .0945 .1048 .1152 .1256 .1361 .1465 .1569 .1775 .1978 .2175 .2367 .2554 .2992 .3391 .3755 .4084 .4657
Table 17 (continued)
(p1, p2)
M 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 22 24 26 28 30 35 40 45 50 60
(2, 2)
(2, 3)
.0720 .1039 .1366 .1692 .2009 .2314 .2603 .2878 .3136 .3381 .3611 .3827 .4030 .4223 .4404 .4575 .4736 .4890 .5034 .5170 .5425 .5653 .5860 .6050 .6223 .6596 .6905 .7162 .7379 .7729
.0258 .0412 .0590 .0783 .0986 .1194 .1404 .1612 .1818 .2019 .2215 .2406 .2590 .2768 .2940 .3105 .3265 .3418 .3566 .3708 .3977 .4225 .4456 .4671 .4869 .5310 .5684 .6004 .6283 .6736
(2, 4)
(2, 5)
b = 2, α = 0.05 .0098 .0038 .0170 .0071 .0261 .0117 .0369 .0174 .0489 .0242 .0620 .0320 .0758 .0407 .0902 .0501 .1050 .0600 .1200 .0705 .1351 .0813 .1501 .0924 .1651 .1037 .1798 .1151 .1944 .1266 .2087 .1382 .2227 .1497 .2365 .1612 .2499 .1725 .2630 .1838 .2883 .2060 .3123 .2275 .3350 .2483 .3565 .2683 .3767 .2876 .4229 .3325 .4633 .3730 .4987 .4094 .5300 .4421 .5824 .4985
(3, 3)
(3, 4)
(4, 4)
.0098 .0170 .0261 .0368 .0489 .0620 .0758 .0902 .1050 .1200 .1350 .1501 .1650 .1798 .1943 .2087 .2227 .2364 .2499 .2630 .2883 .3122 .3350 .3564 .3768 .4229 .4634 .4987 .5298 .5824
.0038 .0071 .0117 .0174 .0242 .0320 .0407 .0500 .0600 .0704 .0813 .0923 .1037 .1151 .1266 .1382 .1497 .1611 .1725 .1838 .2059 .2274 .2482 .2682 .2875 .3325 .3730 .4093 .4422 .4985
.0015 .0030 .0052 .0082 .0119 .0164 .0216 .0274 .0339 .0408 .0482 .0560 .0641 .0726 .0812 .0900 .0990 .1081 .1172 .1264 .1448 .1631 .1812 .1989 .2163 .2578 .2964 .3319 .3647 .4223
Table 17 (continued)
M\(p1, p2)
(2, 2)
(2, 3)
.0555 .0818 .1097 .1380 .1662 .1936 .2202 .2458 .2702 .2934 .3155 .3365 .3564 .3754 .3933 .4104 .4267 .4421 .4568 .4708 .4967 .5205 .5422 .5621 .5802 .6200 .6529 .6807 .7046 .7429
.0187 .0305 .0446 .0603 .0772 .0949 .1130 .1313 .1496 .1677 .1855 .2031 .2202 .2368 .2531 .2688 .2841 .2989 .3132 .3271 .3534 .3781 .4011 .4228 .4429 .4882 .5268 .5604 .5896 .6381
(2, 4)
(2, 5)
(3, 3)
(3, 4)
(4, 4)
b = 3, α = 0.05 .0067 .0025 .0119 .0048 .0188 .0080 .0270 .0122 .0365 .0173 .0470 .0233 .0584 .0301 .0704 .0376 .0829 .0457 .0958 .0543 .1088 .0633 .1221 .0726 .1353 .0823 .1486 .0921 .1618 .1022 .1748 .1123 .1877 .1225 .2005 .1328 .2129 .1430 .2253 .1532 .2492 .1735 .2721 .1933 .2941 .2128 .3150 .2317 .3349 .2501 .3808 .2934 .4213 .3330 .4573 .3692 .4894 .4020 .5438 .4592
.0067 .0119 .0188 .0270 .0365 .0470 .0584 .0704 .0829 .0957 .1088 .1220 .1353 .1485 .1617 .1747 .1877 .2004 .2130 .2252 .2492 .2720 .2940 .3149 .3349 .3806 .4212 .4573 .4892 .5438
.0025 .0048 .0080 .0122 .0173 .0233 .0301 .0376 .0457 .0542 .0633 .0726 .0823 .0921 .1021 .1123 .1225 .1327 .1430 .1532 .1734 .1933 .2128 .2316 .2500 .2934 .3330 .3691 .4019 .4592
.0009 .0019 .0034 .0055 .0082 .0115 .0154 .0199 .0249 .0304 .0364 .0427 .0494 .0564 .0637 .0712 .0788 .0867 .0946 .1027 .1190 .1354 .1518 .1681 .1842 .2232 .2601 .2945 .3266 .3839
M 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 22 24 26 28 30 35 40 45 50 60
*The entries in this table give the values c12 where P[λ12 > c12 | H12] = (1 − α). For an explanation of the notations, the reader is referred to Section 7. The entries in this table are reproduced from Lee, Krishnaiah and Chang (1976) with the kind permission of the editor of the South African Statistical Journal.
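The tables above list their degrees-of-freedom-like parameters (n, M) only on a grid (unit steps at first, then coarser). For intermediate arguments, percentage points of this kind are commonly interpolated linearly in the reciprocal of the parameter; the helper below is a sketch of that practice (the grid and values in the usage line are illustrative stand-ins, not entries taken from any table here).

```python
import bisect

def interp_critical(m, grid, values):
    """Interpolate a tabulated critical value linearly in 1/m.

    grid   : increasing tabulated parameter values
    values : the corresponding critical values
    Arguments outside the grid are clamped to the end values; at a grid
    point the tabulated value itself is returned.
    """
    if m <= grid[0]:
        return values[0]
    if m >= grid[-1]:
        return values[-1]
    j = bisect.bisect_left(grid, m)
    x0, x1 = 1.0 / grid[j - 1], 1.0 / grid[j]
    t = (1.0 / m - x0) / (x1 - x0)
    return values[j - 1] + t * (values[j] - values[j - 1])

# Illustrative grid only: exact at tabulated points, monotone in between.
c_mid = interp_critical(25, [20, 30, 40], [0.30, 0.40, 0.45])
```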
References
Abruzzi, A. (1950). Experimental procedures and criteria for estimating and evaluating industrial productivity. Ph.D. Thesis, Columbia University.
Anderson, T. W. (1958). An Introduction to Multivariate Statistical Analysis. Wiley, New York.
Anderson, T. W. (1969). Statistical inference for covariance matrices with linear structure. In: P. R. Krishnaiah, ed., Multivariate Analysis--II. Academic Press, New York, pp. 55-66.
Bock, R. D. and Bargmann, R. E. (1966). Analysis of covariance structures. Psychometrika 31, 507-534.
Bowker, A. H. (1961). A representation of Hotelling's T² and Anderson's classification statistic W in terms of simple statistics. In: H. Solomon, ed., Studies in Item Analysis and Prediction, pp. 285-292.
Box, G. E. P. (1949). A general distribution theory for a class of likelihood criteria. Biometrika 36, 317-346.
Chang, T. C., Krishnaiah, P. R. and Lee, J. C. (1977). Approximations to the distributions of the likelihood ratio statistics for testing the hypotheses on covariance matrices and mean vectors simultaneously. In: P. R. Krishnaiah, ed., Applications of Statistics. North-Holland Publishing Company, Amsterdam, pp. 97-103.
Consul, P. C. (1969). The exact distributions of likelihood criteria for different hypotheses. In: P. R. Krishnaiah, ed., Multivariate Analysis--II. Academic Press, New York, pp. 171-181.
Davis, A. W. (1971). Percentile approximations for a class of likelihood ratio criteria. Biometrika 58, 349-356.
Davis, A. W. and Field, J. B. F. (1971). Tables of some multivariate test criteria. Tech. Report No. 32, Division of Mathematical Statistics, CSIRO, Australia.
Dixon, W. J., ed. (1974). BMD Biomedical Computer Programs. University of California Press, Berkeley, CA.
Hotelling, H. (1931). The generalization of Student's ratio. Ann. Math. Statist. 2, 360-378.
Johnson, N. L., Nixon, E., Amos, D. E. and Pearson, E. S. (1963). Table of percentage points of Pearson curves, for given √β1 and β2, expressed in standard measure. Biometrika 50, 459-497.
Kendall, M. G. and Stuart, A. (1947). The Advanced Theory of Statistics (third edition). Hafner Publishing Company, New York.
Korin, B. P. (1968). On the distribution of a statistic used for testing a covariance matrix. Biometrika 55, 171-178.
Korin, B. P. (1969). On testing of equality of k covariance matrices. Biometrika 56, 216-217.
Korin, B. P. and Stevens, E. H. (1973). Some approximations for the distribution of a multivariate likelihood-ratio criterion. J. Roy. Statist. Soc. Ser. B 29, 24-27.
Krishnaiah, P. R. (1978). Some recent developments on real multivariate distributions. Developments in Statistics 1, 135-169. Academic Press, New York.
Krishnaiah, P. R. and Lee, J. C. (1976). On covariance structures. Sankhya Ser. A 38, 357-371.
Lee, J. C., Chang, T. C. and Krishnaiah, P. R. (1977). Approximations to the distributions of the likelihood ratio statistics for testing certain structures on the covariance matrices of real multivariate normal populations. In: P. R. Krishnaiah, ed., Multivariate Analysis--IV. Academic Press, New York, pp. 105-118.
Lee, J. C., Krishnaiah, P. R. and Chang, T. C. (1976). On the distribution of the likelihood ratio test statistic for compound symmetry. S. African Statist. J. 10, 49-62.
Lee, J. C., Krishnaiah, P. R. and Chang, T. C. (1977). Approximations to the distributions of the determinants of real and complex multivariate beta matrices. S. African Statist. J. 11, 13-26.
Lee, Y. S. (1972). Some results on the distribution of Wilks' likelihood ratio criteria. Biometrika 59, 649-664.
Mathai, A. M. (1971). On the distribution of the likelihood ratio criterion for testing linear hypotheses on regression coefficients. Ann. Inst. Statist. Math. 23, 181-197.
Mathai, A. M. and Katiyar, R. S. (1977). The exact percentage points for the problem of testing independence. Unpublished.
Mathai, A. M. and Rathie, P. N. (1970). The exact distribution for the sphericity test. J. Statist. Res. (Dacca) 4, 140-159.
Mathai, A. M. and Rathie, P. N. (1971). The problem of testing independence. Statistica 31, 673-688.
Mathai, A. M. and Saxena, R. K. (1973). Generalized Hypergeometric Functions with Applications in Statistics and Physical Sciences. Springer-Verlag, Heidelberg and New York.
Mauchly, J. W. (1940). Significance test for sphericity of a normal n-variate distribution. Ann. Math. Statist. 11, 204-209.
Morrison, D. F. (1967). Multivariate Statistical Methods (second edition published in 1976). McGraw-Hill Book Company, New York.
Nagarsenker, B. N. (1975a). Percentage points of Wilks' Lvc criterion. Comm. Statist. 4, 629-641.
Nagarsenker, B. N. (1975b). The exact distribution of the likelihood ratio criterion for testing equality of means, variances and covariances. J. Statist. Comp. and Simulation 4, 225-233.
Nagarsenker, B. N. and Pillai, K. C. S. (1973a). The distribution of the sphericity test criterion. J. Multivariate Anal. 3, 226-235.
Nagarsenker, B. N. and Pillai, K. C. S. (1973b). Distribution of the likelihood ratio criterion for testing a hypothesis specifying a covariance matrix. Biometrika 60, 359-364.
Nagarsenker, B. N. and Pillai, K. C. S. (1974). Distribution of the likelihood ratio criterion for testing Σ = Σ0, μ = μ0. J. Multivariate Anal. 4, 114-122.
Nair, U. S. (1938). The application of the moment function in the study of distribution laws in statistics. Biometrika 30, 274-294.
Nie, N. H. et al. (1975). Statistical Package for the Social Sciences (second edition).
Olkin, I. (1973). Testing and estimation for structures which are circularly symmetric in blocks. In: D. G. Kabe and R. P. Gupta, eds., Multivariate Statistical Inference. North-Holland Publishing Company, Amsterdam, pp. 183-195.
Pillai, K. C. S. and Gupta, A. K. (1969). On the exact distribution of Wilks' criterion. Biometrika 56, 109-118.
Rao, C. R. (1948). Tests of significance in multivariate analysis. Biometrika 35, 58-79.
Rao, C. R. (1951). An asymptotic expansion of the distribution of Wilks' Λ criterion. Bull. Inst. Internat. Statist. 33, Part II, 177-180.
Rao, C. R. (1971). Estimation of variance and covariance components--MINQUE theory. J. Multivariate Anal. 1, 257-275.
Roy, J. (1951). The distribution of certain likelihood criteria useful in multivariate analysis. Bull. Inst. Internat. Statist. 33, 219-230.
Schatzoff, M. (1966). Exact distributions of Wilks's likelihood ratio criterion. Biometrika 53, 347-358.
Sinha, B. K. and Wieand, H. S. (1977). MINQUE's of variance and covariance components of certain covariance structures. Indian Statistical Institute, Tech. Rep. 28/77.
Srivastava, J. N. (1966). On testing hypotheses regarding a class of covariance structures. Psychometrika 31, 147-164.
Tukey, J. W. and Wilks, S. S. (1946). Approximation of the distribution of the product of beta variables by a single beta variable. Ann. Math. Statist. 17, 318-324.
Varma, K. B. (1951). On the exact distribution of Lmvc and Lvc criteria. Bull. Inst. Internat. Statist. 33, 181-214.
Votaw, D. F., Jr. (1948). Testing compound symmetry in a normal multivariate distribution. Ann. Math. Statist. 19, 447-473.
Wald, A. and Brookner, R. J. (1941). On the distribution of Wilks' statistic for testing independence of several groups of variates. Ann. Math. Statist. 12, 137-152.
Wijsman, R. A. (1957). Random orthogonal transformations and their use in some classical distribution problems in multivariate analysis. Ann. Math. Statist. 28, 415-423.
Wilks, S. S. (1932). Certain generalizations in the analysis of variance. Biometrika 24, 471-494.
Wilks, S. S. (1935). On the independence of k sets of normally distributed statistical variables. Econometrica 3, 309-325.
Wilks, S. S. (1946). Sample criteria for testing equality of means, equality of variances, and equality of covariances in a normal multivariate distribution. Ann. Math. Statist. 17, 257-281.