Linear Algebra and its Applications 511 (2016) 365–377

The distance matrix of a tree with weights on its arcs ✩

Hui Zhou a,∗, Qi Ding b,c

a School of Mathematical Sciences, Peking University, Beijing, 100871, PR China
b School of Mathematics and Statistics, Lanzhou University, Lanzhou, Gansu, 730000, PR China
c Yonyou Network Technology Co., Ltd., Yonyou Software Park, No. 68 Beiqing Road, Haidian District, Beijing, 100094, PR China

Article info

Article history: Received 19 July 2016; Accepted 19 September 2016; Available online 21 September 2016. Submitted by R. Brualdi.
MSC: 05C05, 05C12, 15A09, 15A15

Abstract

An ordered pair of two adjacent vertices in a graph is called an arc. In this paper, we give the determinant and the inverse, whenever it exists, of the distance matrix of an arc-weighted tree whose arc weights lie in a ring. These results generalize known results. We also give examples that the known results cannot handle but to which our results apply. © 2016 Elsevier Inc. All rights reserved.

Keywords: Tree; Ring; Arc weights; Distance matrix; Determinant; Inverse



Supported by China Scholarship Council (CSC) under file number 201506010016.

* Corresponding author.
E-mail addresses: [email protected], [email protected] (H. Zhou), [email protected], [email protected] (Q. Ding).
http://dx.doi.org/10.1016/j.laa.2016.09.028
0024-3795/© 2016 Elsevier Inc. All rights reserved.


1. Introduction

Let $n \geq 1$ be an integer, and let $[n] = \{1, 2, \ldots, n\}$. Let $T$ be a tree with vertex set $[n]$. Then for any two vertices $i, j \in [n]$, there is a unique path $P_{i,j}$ between $i$ and $j$. The length (number of edges) of $P_{i,j}$ is called the distance between $i$ and $j$ and is denoted by $d_{i,j}$. For each $i \in [n]$, let $d_{i,i}$ be zero. Then the matrix $D = (d_{i,j})_{i,j \in [n]}$ is the distance matrix of $T$. Graham and Pollak [7] showed a very beautiful formula, $\det(D) = (-1)^{n-1} 2^{n-2} (n-1)$. Thus the determinant of $D$ depends only on the number of vertices of the tree $T$, not on its structure. Yan and Yeh [8] gave a simple proof of Graham and Pollak's formula.

For each $i \in [n]$, let $d_i$ be the degree of the vertex $i$ in $T$. Let $\delta$ be the $n \times 1$ column vector whose $i$-th entry is $\delta_i = 2 - d_i$. For any matrix $M$ and any (row or column) vector $\beta$, we use $M^T$ and $\beta^T$ to denote their transposes. When $n \geq 2$, $D$ is invertible, and Graham and Lovász [6] gave a formula for its inverse,
$$D^{-1} = -\frac{1}{2} L + \frac{\delta \delta^T}{2(n-1)},$$
where $L$ is the Laplacian matrix of $T$.

As an analogue, the distance matrix of an edge-weighted tree was considered. Let $e_1, e_2, \ldots, e_{n-1}$ be the edges of $T$, and for each $i \in [n-1]$ let the edge $e_i$ have a positive real weight $\alpha_i$. For any two vertices $i, j \in [n]$, define $e_{i,j}$ to be the sum of the weights of the edges on the path $P_{i,j}$, and let $e_{i,i} = 0$ for each $i \in [n]$. The distance matrix of the edge-weighted tree $T$ is $E = (e_{i,j})_{i,j \in [n]}$. Bapat, Kirkland and Neumann [3] gave the formula
$$\det(E) = (-1)^{n-1} 2^{n-2} \left( \prod_{i=1}^{n-1} \alpha_i \right) \left( \sum_{i=1}^{n-1} \alpha_i \right),$$
and the formula for the inverse,
$$E^{-1} = -\frac{1}{2} \tilde{L} + \frac{\delta \delta^T}{2 \sum_{i=1}^{n-1} \alpha_i},$$
where $\tilde{L}$ is the Laplacian matrix of the weighting of $T$ that arises by replacing each edge weight by its reciprocal.

An ordered pair of two adjacent vertices is called an arc, so each edge corresponds to two arcs. Let each arc $(i,j)$ of $T$ have a positive real weight $v_{i,j}$. Let $i, j \in [n]$ be two vertices, and let $P_{i,j} = u_0, u_1, u_2, \ldots, u_d$ be the unique path from $i$ to $j$ in $T$, where $d \geq 1$, $u_0 = i$ and $u_d = j$. Define the distance from $i$ to $j$ to be the sum $s_{i,j} = \sum_{k=1}^{d} v_{u_{k-1}, u_k}$, and let $s_{i,i} = 0$ for each $i \in [n]$. The distance matrix of the arc-weighted tree $T$ is $A = (s_{i,j})_{i,j \in [n]}$. Bapat, Lal and Pati [4] showed the formula
$$\det(A) = (-1)^{n-1} \left( \prod_{\{i,j\} \in E(T)} (v_{i,j} + v_{j,i}) \right) \left( \sum_{\{i,j\} \in E(T)} \frac{v_{i,j} v_{j,i}}{v_{i,j} + v_{j,i}} \right),$$
and the formula for the inverse,
$$A^{-1} = -\check{L} + \frac{(-1)^{n-1}}{\det(A)} \cdot \frac{z_1 z_2^T}{\prod_{\{i,j\} \in E(T)} (v_{i,j} + v_{j,i})},$$
where $\check{L}$ is the Laplacian matrix of the arc-weighted tree $T$ that arises by replacing each weight $v_{i,j}$ by $\frac{1}{v_{i,j} + v_{j,i}}$, and $z_1$ and $z_2$ are column vectors as defined in [4].

Let $p \geq 1$ and $s \geq 1$ be integers, and let $S$ be a commutative ring with identity. Let $\mathbb{R}$ be the ring of real numbers, let $\mathbb{Z}_p$ be the ring of integers modulo $p$, and let $M_s(S)$ be the ring of $s \times s$ matrices over $S$. Then $\mathbb{R}$ and $\mathbb{Z}_p$ are commutative, while $M_s(S)$ is noncommutative. All the weights above are positive real numbers; however, noncommutative weights coming from $M_s(S)$ were also considered. Let the edge $e_j$ have an $s \times s$ matrix weight $W_j$ for each $j = 1, 2, \ldots, n-1$. Let $i, j \in [n]$ with $i \neq j$, and let $e_{k_1}, e_{k_2}, \ldots, e_{k_d}$ be the edges on the unique path $P_{i,j}$ from $i$ to $j$ in $T$, where $d \geq 1$. Define $M_{i,j} = \sum_{l=1}^{d} W_{k_l}$, and for each $i \in [n]$ define $M_{i,i}$ to be the $s \times s$ null matrix. The $ns \times ns$ partitioned matrix $M$, whose $(i,j)$-block is $M_{i,j}$, is the distance matrix of the tree $T$ with matrix weights on its edges. Bapat [2] showed the formula
$$\det(M) = (-1)^{(n-1)s} 2^{(n-2)s} \det\left( \sum_{i=1}^{n-1} W_i \right) \prod_{i=1}^{n-1} \det(W_i).$$
When the edge weights $W_1, W_2, \ldots, W_{n-1}$ are real positive definite matrices, $R = \sum_{i=1}^{n-1} W_i$ is also positive definite, and Balaji and Bapat [1] gave the formula for the inverse,
$$M^{-1} = -\frac{1}{2} \hat{L} + \frac{1}{2} \Delta R^{-1} \Delta^T,$$
where $\hat{L}$ is the Laplacian matrix as defined in [1] and $\Delta$ is the tensor product of $\delta$ and the $s \times s$ identity matrix.

In this paper, we consider the distance matrix of a tree with weights on its arcs, where the weights are in a ring. Let $T$ be a tree with vertex set $[n]$. The order of the vertices of $T$ is called a left unique order if for each $j > 1$ there exists a unique vertex $i$, denoted by $r_j$, such that $i < j$ and $\{i, j\} \in E(T)$. The left unique order was first defined by the authors in [9]. It is obvious that $r_2 = 1$. For example, the depth-first search order, the breadth-first search order and the rooted tree order are left unique orders of the vertices of the tree $T$. We suppose the order of the vertices of $T$ is a left unique order.

Let $R$ be a (commutative or noncommutative) ring with identity, and let each arc $(i,j)$ of $T$ have a weight $w_{i,j} \in R$. Let $i, j \in [n]$ be two vertices of $T$, and let $P_{i,j} = u_0, u_1, u_2, \ldots, u_d$ be the unique path from $i$ to $j$ in $T$, where $d \geq 1$, $u_0 = i$ and $u_d = j$. Define the distance from $i$ to $j$ to be the sum $D_{i,j} = \sum_{k=1}^{d} w_{u_{k-1}, u_k} \in R$, and for each $i \in [n]$ let $D_{i,i}$ be the zero element of $R$. The matrix $D = (D_{i,j})_{i,j \in [n]}$ is the distance matrix of the tree $T$ with weights on its arcs. We assume that $w_{i,j} + w_{j,i}$ is invertible for each edge $\{i,j\} \in E(T)$.

When the ring $R$ is commutative or $R = M_s(S)$, we obtain a formula for $\det(D)$, which shows that the determinant of the distance matrix $D$ is again independent of the structure of the tree $T$ (see Theorem 4 and Theorem 7). When the element $\sum_{j=2}^{n} w_{r_j,j} (w_{r_j,j} + w_{j,r_j})^{-1} w_{j,r_j}$ is also invertible, then $D$ is invertible and its inverse has an explicit expression (see Theorem 10). These results generalize the known results for the distance matrix of an arc-weighted tree when the weights are positive numbers, and for the distance matrix of an edge-weighted tree when the weights are $s \times s$ positive definite matrices. We also give examples of our results, considering trees with arc weights from $\mathbb{R}$, $\mathbb{Z}_p$ or $M_2(\mathbb{R})$. Notice that $\mathbb{R}$ properly contains the positive numbers and $M_2(\mathbb{R})$ properly contains the $2 \times 2$ positive definite matrices over $\mathbb{R}$, so the known results cannot be used to deal with these examples.

2. The distance matrix

We first give a formula about the distances.

Lemma 1. Let $i, j \in [n] \setminus \{1\}$ be two different vertices of the tree $T$. Then
$$D_{i,j} + D_{r_i,r_j} = D_{i,r_j} + D_{r_i,j}. \tag{1}$$


Proof. Since $i \neq j$, we have $i > j$ or $i < j$. The proofs of the two cases are similar, so we only give the proof of the case $i < j$.

Suppose that $i < j$. Then $r_j$ is on the unique path $P_{i,j}$ between $i$ and $j$; if not, the union of the paths $P_{1,i}$, $P_{1,j}$ and $P_{i,j}$ contains a cycle, a contradiction. If $r_i$ is on $P_{i,j}$, then $D_{i,j} = D_{i,r_i} + D_{r_i,r_j} + D_{r_j,j}$, so
$$D_{i,j} + D_{r_i,r_j} = (D_{i,r_i} + D_{r_i,r_j}) + (D_{r_i,r_j} + D_{r_j,j}) = D_{i,r_j} + D_{r_i,j}.$$
If $r_i$ is not on $P_{i,j}$, then $D_{r_i,r_j} = D_{r_i,i} + D_{i,r_j}$, $D_{r_i,j} = D_{r_i,r_j} + D_{r_j,j}$ and $D_{i,j} = D_{i,r_j} + D_{r_j,j}$, so
$$D_{i,j} + D_{r_i,r_j} = D_{i,r_j} + (D_{r_i,i} + D_{i,r_j} + D_{r_j,j}) = D_{i,r_j} + D_{r_i,j}. \qquad \Box$$

For a matrix $M$, we use $M_{i,j}$ to denote the $(i,j)$-entry of $M$. We use $0$ and $1$ to denote the zero element and the identity element of $R$, respectively. Let $L$, $U$ be the $n \times n$ matrices defined as follows:
$$L_{i,j} = \begin{cases} 1, & \text{if } j = i; \\ -1, & \text{if } j = r_i; \\ 0, & \text{otherwise}; \end{cases} \qquad U_{i,j} = \begin{cases} 1, & \text{if } i = j; \\ -1, & \text{if } i = r_j; \\ 0, & \text{otherwise}. \end{cases}$$
It is obvious that $L$ is a lower triangular matrix whose diagonal entries are identities, and $U$ is an upper triangular matrix whose diagonal entries are identities.

For each $j = 2, 3, \ldots, n$, let $x_j = w_{j,r_j} + w_{r_j,j}$. Let $V$ be the following $n \times n$ matrix:
$$V = \begin{pmatrix} 0 & w_{r_2,2} & w_{r_3,3} & \cdots & w_{r_n,n} \\ w_{2,r_2} & -x_2 & 0 & \cdots & 0 \\ w_{3,r_3} & 0 & -x_3 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ w_{n,r_n} & 0 & 0 & \cdots & -x_n \end{pmatrix} = \begin{pmatrix} 0 & H \\ K & -Q \end{pmatrix},$$
where $H = (w_{r_2,2}, w_{r_3,3}, \ldots, w_{r_n,n})$, $K = (w_{2,r_2}, w_{3,r_3}, \ldots, w_{n,r_n})^T$ and
$$Q = \begin{pmatrix} x_2 & 0 & \cdots & 0 \\ 0 & x_3 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & x_n \end{pmatrix}.$$

Then we have the following lemma.

Lemma 2. With notations as above, $LDU = V$.

Proof. Let $i, j \in [n] \setminus \{1\}$, and let $F = LDU$. Then
$$F_{1,1} = \sum_{k,h \in [n]} L_{1,k} D_{k,h} U_{h,1} = \sum_{h \in [n]} D_{1,h} U_{h,1} = D_{1,1} = 0 = V_{1,1},$$
$$F_{1,j} = \sum_{k,h \in [n]} L_{1,k} D_{k,h} U_{h,j} = \sum_{h \in [n]} D_{1,h} U_{h,j} = D_{1,j} - D_{1,r_j} = D_{r_j,j} = w_{r_j,j} = V_{1,j},$$
$$F_{i,1} = \sum_{k,h \in [n]} L_{i,k} D_{k,h} U_{h,1} = \sum_{k \in [n]} L_{i,k} D_{k,1} = D_{i,1} - D_{r_i,1} = D_{i,r_i} = w_{i,r_i} = V_{i,1}.$$
If $i = j$, then
$$F_{i,i} = \sum_{k,h \in [n]} L_{i,k} D_{k,h} U_{h,i} = \sum_{h \in [n]} D_{i,h} U_{h,i} - \sum_{h \in [n]} D_{r_i,h} U_{h,i} = D_{i,i} - D_{i,r_i} - D_{r_i,i} + D_{r_i,r_i} = -w_{i,r_i} - w_{r_i,i} = -x_i = V_{i,i}.$$
Now suppose $i \neq j$. Then by Equation (1) in Lemma 1, we have
$$F_{i,j} = \sum_{k,h \in [n]} L_{i,k} D_{k,h} U_{h,j} = \sum_{h \in [n]} D_{i,h} U_{h,j} - \sum_{h \in [n]} D_{r_i,h} U_{h,j} = D_{i,j} - D_{i,r_j} - D_{r_i,j} + D_{r_i,r_j} = 0 = V_{i,j}.$$
So $F = LDU = V$. $\Box$

3. Determinant of D

For each $j = 2, 3, \ldots, n$, let $x_j = w_{j,r_j} + w_{r_j,j}$. Suppose $x_2, x_3, \ldots, x_n$ are invertible, and let $X$, $Y$ be the $n \times n$ matrices defined as follows:
$$X_{i,j} = \begin{cases} 1, & \text{if } j = i; \\ w_{r_j,j} x_j^{-1}, & \text{if } j > 1,\ i = 1; \\ 0, & \text{otherwise}; \end{cases} \qquad Y_{i,j} = \begin{cases} 1, & \text{if } i = j; \\ x_i^{-1} w_{i,r_i}, & \text{if } i > 1,\ j = 1; \\ 0, & \text{otherwise}. \end{cases}$$
It is obvious that $X$ is an upper triangular matrix whose diagonal entries are identities, and $Y$ is a lower triangular matrix whose diagonal entries are identities. Let $x_1 = \sum_{j=2}^{n} w_{r_j,j} (w_{r_j,j} + w_{j,r_j})^{-1} w_{j,r_j}$, and let $Z$ be the following $n \times n$ matrix:
$$Z = \begin{pmatrix} x_1 & 0 & 0 & \cdots & 0 \\ 0 & -x_2 & 0 & \cdots & 0 \\ 0 & 0 & -x_3 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & -x_n \end{pmatrix} = \begin{pmatrix} x_1 & 0 \\ 0 & -Q \end{pmatrix}.$$
Then the following lemma follows by direct calculation.

Lemma 3. With notations as above, and suppose $x_j$ is invertible for each $j = 2, 3, \ldots, n$, then $XVY = Z$.
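Lemmas 2 and 3 are easy to check numerically when the arc weights are real numbers. The sketch below is not from the paper: it builds $D$, $L$, $U$, $V$, $X$, $Y$, $Z$ from the definitions for a small assumed tree (given by a parent array in a left unique order) with random weights, and verifies both factorizations.

```python
import numpy as np

rng = np.random.default_rng(3)
r = {2: 1, 3: 2, 4: 2, 5: 1}                 # an assumed tree in a left unique order
n = len(r) + 1
w = {}
for j, i in r.items():                       # arc weights for both arcs of each edge
    w[(i, j)], w[(j, i)] = rng.uniform(-2, 2, size=2)

def path_to_root(v):
    p = [v]
    while v != 1:
        v = r[v]
        p.append(v)
    return p

def dist(i, j):
    """Sum of the arc weights on the directed path from i to j."""
    pi, pj = path_to_root(i), path_to_root(j)
    lca = max(set(pi) & set(pj), key=lambda v: len(path_to_root(v)))
    up, down = pi[:pi.index(lca)], pj[:pj.index(lca)]
    return sum(w[(v, r[v])] for v in up) + sum(w[(r[v], v)] for v in reversed(down))

D = np.array([[0.0 if i == j else dist(i, j) for j in range(1, n + 1)]
              for i in range(1, n + 1)])

x = {j: w[(j, r[j])] + w[(r[j], j)] for j in r}   # x_j = w_{j,r_j} + w_{r_j,j}

L = np.eye(n)
for j in r:
    L[j - 1, r[j] - 1] = -1                  # L_{j, r_j} = -1
U = L.T                                      # U_{r_j, j} = -1, as in the definition

V = np.zeros((n, n))
for j in r:
    V[0, j - 1] = w[(r[j], j)]               # first row:    H
    V[j - 1, 0] = w[(j, r[j])]               # first column: K
    V[j - 1, j - 1] = -x[j]                  # diagonal:    -Q

X, Y = np.eye(n), np.eye(n)
for j in r:
    X[0, j - 1] = w[(r[j], j)] / x[j]
    Y[j - 1, 0] = w[(j, r[j])] / x[j]

x1 = sum(w[(r[j], j)] * w[(j, r[j])] / x[j] for j in r)
Z = np.diag([x1] + [-x[j] for j in sorted(r)])

print(np.allclose(L @ D @ U, V))             # Lemma 2
print(np.allclose(X @ V @ Y, Z))             # Lemma 3; both expected: True
```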


When the ring $R$ is commutative, the determinants of matrices over $R$ are well defined and $\det(AB) = \det(A)\det(B)$ for any $n \times n$ matrices $A$, $B$ over $R$. The above lemmas then imply the following Theorem 4 and Theorem 7 about the determinant of the distance matrix $D$.

Theorem 4. With notations as above, and suppose the ring $R$ is commutative and $x_j$ is invertible for each $j = 2, 3, \ldots, n$, then
$$\det(D) = \det(Z) = (-1)^{n-1} \prod_{j=1}^{n} x_j = (-1)^{n-1} \left( \prod_{j=2}^{n} (w_{r_j,j} + w_{j,r_j}) \right) \left( \sum_{j=2}^{n} w_{r_j,j} (w_{r_j,j} + w_{j,r_j})^{-1} w_{j,r_j} \right) = (-1)^{n-1} \sum_{i=2}^{n} w_{r_i,i} w_{i,r_i} \prod_{j=2,\, j \neq i}^{n} (w_{r_j,j} + w_{j,r_j}). \tag{2}$$

Remark 5. Consider $V = \begin{pmatrix} 0 & H \\ K & -Q \end{pmatrix}$; then by Fact 2.14.2 in Bernstein's book [5], Equation (2) in Theorem 4 still holds without assuming that $x_2, x_3, \ldots, x_n$ are invertible. So, when we take $R = \mathbb{Z}_2$, $D$ is invertible if and only if there exists a unique $j \in [n] \setminus \{1\}$ such that $x_j = 0$ and $w_{j,r_j} = w_{r_j,j} = 1$. We leave the proof to the reader.

If the weights are taken to be positive numbers, then $(-1)^{n-1} \det(D)$ is also positive, and part of Theorem 3.1 of [4], Corollary 2.5 of [3] and Graham and Pollak's formula [7] are implied by Theorem 4.

Example 6 (Weights from $\mathbb{R}$). Let the ring $R = \mathbb{R}$ and let $0 \neq \alpha \in \mathbb{R}$. Suppose the weights are $w_{r_j,j} = j$ and $w_{j,r_j} = \alpha - j$ for each $j = 2, 3, \ldots, n$. Then $w_{j,r_j} \leq 0$ if $j \geq \alpha$, and $x_j = w_{r_j,j} + w_{j,r_j} = \alpha$ for each $j = 2, 3, \ldots, n$. So
$$x_1 = \sum_{j=2}^{n} w_{r_j,j} (w_{r_j,j} + w_{j,r_j})^{-1} w_{j,r_j} = \frac{1}{\alpha} \sum_{j=2}^{n} j(\alpha - j) = -\frac{1}{\alpha} \left[ \frac{1}{3} n(n+1) \left( n - \frac{3\alpha - 1}{2} \right) + (\alpha - 1) \right].$$
Then by Theorem 4, we have
$$\det(D) = (-\alpha)^{n-2} \left[ \frac{1}{3} n(n+1) \left( n - \frac{3\alpha - 1}{2} \right) + (\alpha - 1) \right].$$

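A quick numerical check of this example (not from the paper): by Theorem 4 the determinant is the same for every tree on $[n]$ with these weights, so the sketch below uses the star with centre 1 (an assumed choice, so $r_j = 1$ for all $j$) and compares a brute-force determinant with the closed form above.

```python
import numpy as np

n, a = 7, 1.0                                       # assumed test values (alpha = 1)
w_out = {j: float(j) for j in range(2, n + 1)}      # w_{r_j, j} = j   (here r_j = 1)
w_in = {j: a - j for j in range(2, n + 1)}          # w_{j, r_j} = alpha - j

# Distance matrix of the star: the path from i to j (i, j > 1) is i -> 1 -> j.
D = np.zeros((n, n))
for i in range(2, n + 1):
    D[0, i - 1], D[i - 1, 0] = w_out[i], w_in[i]
    for j in range(2, n + 1):
        if j != i:
            D[i - 1, j - 1] = w_in[i] + w_out[j]

closed_form = (-a) ** (n - 2) * (n * (n + 1) * (n - (3 * a - 1) / 2) / 3 + (a - 1))
print(np.isclose(np.linalg.det(D), closed_form))    # expected: True
```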
Let $R = M_s(S)$. Then the weights and $x_1, x_2, \ldots, x_n$ are $s \times s$ matrices.

Theorem 7. With notations as above, and suppose the ring $R = M_s(S)$, where $S$ is a commutative ring with identity, and $x_j$ is an invertible $s \times s$ matrix for each $j = 2, 3, \ldots, n$, then
$$\det(D) = \det(Z) = \det(x_1) \prod_{j=2}^{n} \det(-x_j) = (-1)^{(n-1)s} \left( \prod_{j=2}^{n} \det(w_{r_j,j} + w_{j,r_j}) \right) \det\left( \sum_{j=2}^{n} w_{r_j,j} (w_{r_j,j} + w_{j,r_j})^{-1} w_{j,r_j} \right).$$

If for each edge $e = \{i,j\} \in E(T)$ the weights satisfy $w_{i,j} = w_{j,i} = w_e$, then Theorem 2 of [2] is implied by Theorem 7.

Remark 8. Suppose the ring $R = M_s(S)$, where $S$ is a commutative ring with identity, and $w_{r_j,j} = w_{j,r_j}$ for each $j = 2, 3, \ldots, n$. When $n = 2$, then $\det(D) = (-1)^s \det(w_{1,2}) \det(w_{2,1})$ without assuming $x_2$ is invertible. But for the general case, when $n \geq 3$ and $s \geq 2$, a formula for $\det(D)$ that does not assume $x_2, x_3, \ldots, x_n$ are invertible is still open.

Example 9 (Weights from $M_2(\mathbb{R})$). Let $R = M_2(\mathbb{R})$ and let $W = \begin{pmatrix} 2 & 0 \\ 0 & -2 \end{pmatrix} \in R$. Suppose the weights are $w_{r_j,j} = W$ and $w_{j,r_j} = \begin{pmatrix} -1 & 0 \\ 0 & 3 \end{pmatrix}$ for each $j = 2, 3, \ldots, n$. Then $w_{r_j,j}$ and $w_{j,r_j}$ are not positive definite, and $x_j = w_{r_j,j} + w_{j,r_j} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$ for each $j = 2, 3, \ldots, n$. So $x_1 = \begin{pmatrix} -2(n-1) & 0 \\ 0 & -6(n-1) \end{pmatrix}$. By Theorem 7, we have $\det(D) = 12(n-1)^2$.

As applications, Theorem 7 gives the following determinant identities. Let $m \geq 3$, and let $A$, $B$, $A_1$, $A_2$, $A_3$, $B_1$, $B_2$ and $B_3$ be $s \times s$ matrices such that $A + B$, $A_1 + B_1$, $A_2 + B_2$ and $A_3 + B_3$ are invertible. We consider the partitioned matrices
$$X = \begin{pmatrix} 0 & A & 2A & \cdots & (m-1)A \\ B & 0 & A & \cdots & (m-2)A \\ 2B & B & 0 & \cdots & (m-3)A \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ (m-1)B & (m-2)B & (m-3)B & \cdots & 0 \end{pmatrix}, \qquad Y = \begin{pmatrix} 0 & A_1 & A_1 + A_2 & A_1 + A_2 + A_3 \\ B_1 & 0 & A_2 & A_2 + A_3 \\ B_1 + B_2 & B_2 & 0 & A_3 \\ B_1 + B_2 + B_3 & B_2 + B_3 & B_3 & 0 \end{pmatrix}.$$
Let $P_m$ be the path on the vertex set $[m] = \{1, 2, \ldots, m\}$, where two vertices $i, j \in [m]$ are adjacent in $P_m$ if and only if $i$ and $j$ are consecutive integers. Let each arc $(i,j)$ have the weight $A$ if $i < j$ and $B$ if $i > j$. With this weighting, the matrix $X$ is the distance matrix of the path $P_m$. Let $G = P_4$ with each arc $(i,j)$ having weight $A_i$ if $i < j$ and $B_j$ if $i > j$. Then the matrix $Y$ is the distance matrix of the graph $G$. It follows from Theorem 7 that
$$\det(X) = (-1)^{(m-1)s} (m-1)^s \det(A) \det(B) \left( \det(A+B) \right)^{m-2},$$
$$\det(Y) = (-1)^{3s} \left( \prod_{i=1}^{3} \det(A_i + B_i) \right) \det\left( \sum_{i=1}^{3} A_i (A_i + B_i)^{-1} B_i \right).$$
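The first identity is easy to test numerically. The following sketch (not from the paper) assembles $X$ for the path $P_m$ with randomly chosen $2 \times 2$ matrix weights $A$ and $B$ and compares both sides; the choices of $m$, $s$ and the random weights are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
m, s = 5, 2
A = rng.standard_normal((s, s))
B = rng.standard_normal((s, s))

# Distance matrix of the path P_m: block (i, j) is (j - i)*A for i < j and (i - j)*B for i > j.
X = np.zeros((m * s, m * s))
for i in range(m):
    for j in range(m):
        if i < j:
            X[i*s:(i+1)*s, j*s:(j+1)*s] = (j - i) * A
        elif i > j:
            X[i*s:(i+1)*s, j*s:(j+1)*s] = (i - j) * B

lhs = np.linalg.det(X)
rhs = ((-1) ** ((m - 1) * s) * (m - 1) ** s
       * np.linalg.det(A) * np.linalg.det(B) * np.linalg.det(A + B) ** (m - 2))
print(np.isclose(lhs, rhs))   # expected: True
```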

4. Inverse of D

Let $s \geq 1$ be an integer, and let $I_s$ be the $s \times s$ identity matrix. Let $a$, $b$ be positive integers, and let $A$, $B$, $C$, $D$ be $a \times a$, $a \times b$, $b \times a$ and $b \times b$ matrices, respectively. Suppose $D$ and $A - BD^{-1}C$ are invertible, and let $E = (A - BD^{-1}C)^{-1}$. By Proposition 2.8.7 of Bernstein's book [5], the inverse of the partitioned matrix $\begin{pmatrix} A & B \\ C & D \end{pmatrix}$ is
$$\begin{pmatrix} A & B \\ C & D \end{pmatrix}^{-1} = \begin{pmatrix} E & -EBD^{-1} \\ -D^{-1}CE & D^{-1} + D^{-1}CEBD^{-1} \end{pmatrix} = \begin{pmatrix} I_a \\ -D^{-1}C \end{pmatrix} E \begin{pmatrix} I_a & -BD^{-1} \end{pmatrix} + \begin{pmatrix} 0 & 0 \\ 0 & D^{-1} \end{pmatrix}. \tag{3}$$

Suppose $x_j = w_{j,r_j} + w_{r_j,j}$ is invertible for each $j = 2, 3, \ldots, n$. Now we define the $n \times n$ ring Laplacian matrix $L^r$ of the tree $T$ and $n \times 1$ column vectors $\alpha$ and $\beta$ as follows:
$$L^r_{i,j} = \begin{cases} \sum_{\{i,k\} \in E(T)} (w_{i,k} + w_{k,i})^{-1}, & \text{if } j = i; \\ -(w_{i,j} + w_{j,i})^{-1}, & \text{if } \{i,j\} \in E(T); \\ 0, & \text{otherwise}; \end{cases}$$
$$\alpha_i = 1 - \sum_{\{i,j\} \in E(T)} (w_{i,j} + w_{j,i})^{-1} w_{j,i}; \tag{4}$$
$$\beta_i = 1 - \sum_{\{i,j\} \in E(T)} w_{i,j} (w_{i,j} + w_{j,i})^{-1}. \tag{5}$$

Theorem 10. With notations as above, and suppose the element $x_i$ is invertible for each $i \in [n]$, then $D$ is invertible and
$$D^{-1} = -L^r + \alpha x_1^{-1} \beta^T.$$
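Before the proof, here is a numerical sketch of Theorem 10 (not from the paper) for real arc weights: it builds $L^r$, $\alpha$, $\beta$ and $x_1$ for a small tree given by an assumed parent array and checks that the formula reproduces the inverse of $D$.

```python
import numpy as np

rng = np.random.default_rng(2)
r = {2: 1, 3: 1, 4: 3, 5: 3}                 # an assumed tree in a left unique order
n = len(r) + 1
w = {}
for j, i in r.items():
    w[(i, j)], w[(j, i)] = rng.uniform(-2, 2, size=2)

def path_to_root(v):
    p = [v]
    while v != 1:
        v = r[v]
        p.append(v)
    return p

def dist(i, j):
    pi, pj = path_to_root(i), path_to_root(j)
    lca = max(set(pi) & set(pj), key=lambda v: len(path_to_root(v)))
    up, down = pi[:pi.index(lca)], pj[:pj.index(lca)]
    return sum(w[(v, r[v])] for v in up) + sum(w[(r[v], v)] for v in reversed(down))

D = np.array([[0.0 if i == j else dist(i, j) for j in range(1, n + 1)]
              for i in range(1, n + 1)])

edges = [(i, j) for j, i in r.items()]
Lr = np.zeros((n, n))
alpha = np.ones(n)
beta = np.ones(n)
for (i, j) in edges:
    xinv = 1.0 / (w[(i, j)] + w[(j, i)])
    for u, v in ((i, j), (j, i)):            # both endpoints of the edge {i, j}
        Lr[u-1, u-1] += xinv                 # diagonal of the ring Laplacian L^r
        Lr[u-1, v-1] = -xinv                 # off-diagonal entry of L^r
        alpha[u-1] -= xinv * w[(v, u)]       # equation (4)
        beta[u-1] -= w[(u, v)] * xinv        # equation (5)
x1 = sum(w[(i, j)] * w[(j, i)] / (w[(i, j)] + w[(j, i)]) for (i, j) in edges)

Dinv = -Lr + np.outer(alpha, beta) / x1      # Theorem 10
print(np.allclose(D @ Dinv, np.eye(n)))      # expected: True
```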


Proof. Suppose $x_i$ is invertible for each $i \in [n]$. Then $Q$ is invertible and $x_1 = \sum_{j=2}^{n} w_{r_j,j} x_j^{-1} w_{j,r_j} = HQ^{-1}K$. By Equation (3), the inverse of the matrix $V$ is
$$V^{-1} = \begin{pmatrix} 0 & H \\ K & -Q \end{pmatrix}^{-1} = \begin{pmatrix} 1 \\ Q^{-1}K \end{pmatrix} x_1^{-1} \begin{pmatrix} 1 & HQ^{-1} \end{pmatrix} - \begin{pmatrix} 0 & 0 \\ 0 & Q^{-1} \end{pmatrix}.$$
By Lemma 2, $D$ is invertible and
$$D^{-1} = U V^{-1} L = U \begin{pmatrix} 1 \\ Q^{-1}K \end{pmatrix} x_1^{-1} \begin{pmatrix} 1 & HQ^{-1} \end{pmatrix} L - U \begin{pmatrix} 0 & 0 \\ 0 & Q^{-1} \end{pmatrix} L.$$
Let $\varphi = U \begin{pmatrix} 1 \\ Q^{-1}K \end{pmatrix}$ and let $i \in [n] \setminus \{1\}$. Then
$$\varphi_1 = U_{1,1} + \sum_{j>1} U_{1,j} x_j^{-1} w_{j,r_j} = 1 - \sum_{1 = r_j} x_j^{-1} w_{j,r_j} = \alpha_1,$$
$$\varphi_i = U_{i,i} x_i^{-1} w_{i,r_i} + \sum_{j>i} U_{i,j} x_j^{-1} w_{j,r_j} = x_i^{-1} w_{i,r_i} - \sum_{i = r_j} x_j^{-1} w_{j,r_j} = 1 - x_i^{-1} w_{r_i,i} - \sum_{i = r_j} x_j^{-1} w_{j,i} = \alpha_i.$$
So $\alpha = \varphi$. Similarly, we get $\beta^T = \begin{pmatrix} 1 & HQ^{-1} \end{pmatrix} L$.

Let $\Omega = U \begin{pmatrix} 0 & 0 \\ 0 & Q^{-1} \end{pmatrix} L$ and let $i, j \in [n]$. Since $L = U^T$, the matrix $\Omega$ is symmetric, and $\Omega_{i,j} = \sum_{k>1} U_{i,k} x_k^{-1} L_{k,j}$. Suppose $i > 1$. Then
$$\Omega_{1,1} = \sum_{k>1} U_{1,k} x_k^{-1} L_{k,1} = \sum_{1 = r_k} x_k^{-1} = L^r_{1,1}, \qquad \Omega_{i,i} = \sum_{k>1} U_{i,k} x_k^{-1} L_{k,i} = x_i^{-1} + \sum_{i = r_k} x_k^{-1} = L^r_{i,i}.$$
If $\{1, i\} \in E(T)$, then $1 = r_i$ and
$$\Omega_{i,1} = \Omega_{1,i} = \sum_{k>1} U_{1,k} x_k^{-1} L_{k,i} = -\sum_{1 = r_k} x_k^{-1} L_{k,i} = -x_i^{-1} = L^r_{1,i} = L^r_{i,1}.$$
Let $j > 1$ and suppose $\{i, j\} \in E(T)$. Without loss of generality, we may assume $j > i$, i.e. $i = r_j$. Then $L_{i,j} = U_{j,i} = 0$ and
$$\Omega_{j,i} = \Omega_{i,j} = \sum_{k>1} U_{i,k} x_k^{-1} L_{k,j} = -\sum_{i = r_k} x_k^{-1} L_{k,j} = -x_j^{-1} = L^r_{i,j} = L^r_{j,i}.$$
Notice that, for any $r, t \in [n]$, if $r \neq t$ and $\{r, t\} \notin E(T)$, then at least one of $U_{r,k}$ and $L_{k,t}$ is zero for every $k \in [n]$. Now suppose $i \neq j$ and $\{i, j\} \notin E(T)$. Then $\Omega_{i,j} = 0 = L^r_{i,j}$. So $L^r = \Omega$. Hence $D^{-1} = \alpha x_1^{-1} \beta^T - L^r$. $\Box$

When the weights are positive real numbers, Bapat, Lal and Pati [4] gave formulas for the determinant and the inverse of the distance matrix $D$. An orientation $T^{or}$ of the tree $T$ is a directed graph obtained from $T$ by replacing each edge by one of the two arcs corresponding to this edge. Hence $T$ has $2^{n-1}$ orientations. Let $T^{or}$ be an orientation of $T$, and denote the indegree and the outdegree of the vertex $v$ in $T^{or}$ by $\mathrm{In}_{T^{or}}(v)$ and $\mathrm{Out}_{T^{or}}(v)$, respectively. Let $w(T^{or})$ be the product of the weights of the arcs of $T^{or}$. Define $w(T)$ to be
$$w(T) = \sum_{T^{or}} w(T^{or}) = \prod_{\substack{i < j \\ \{i,j\} \in E(T)}} (w_{i,j} + w_{j,i}) = \prod_{j=2}^{n} (w_{r_j,j} + w_{j,r_j}).$$
Then $\det(D) = (-1)^{n-1} w(T) x_1$, where $x_1 = \sum_{j=2}^{n} \frac{w_{r_j,j} w_{j,r_j}}{w_{r_j,j} + w_{j,r_j}}$. The $n \times 1$ column vectors $z_1$, $z_2$ are defined by
$$z_1(i) = (-1)^n \sum_{T^{or}} \left( \mathrm{In}_{T^{or}}(i) - 1 \right) w(T^{or}), \qquad z_2(i) = (-1)^n \sum_{T^{or}} \left( \mathrm{Out}_{T^{or}}(i) - 1 \right) w(T^{or}).$$
Then Theorem 10 implies the following result.

Corollary 11 (Theorem 3.1 of [4]). With notations as above, and suppose the arc weights are positive real numbers, then $D$ is invertible and
$$D^{-1} = -L^r + \frac{z_1 z_2^T}{(-1)^{n-1} \det(D)\, w(T)} = -L^r + \frac{(-1)^{n-1} z_1}{w(T)} \cdot \frac{1}{x_1} \cdot \frac{\left( (-1)^{n-1} z_2 \right)^T}{w(T)}.$$

Proof. We only need to show that $\alpha = \frac{(-1)^{n-1} z_1}{w(T)}$ and $\beta = \frac{(-1)^{n-1} z_2}{w(T)}$. The proofs are analogous, so we only give the proof of $\alpha = \frac{(-1)^{n-1} z_1}{w(T)}$.

For each $i \in [n]$, by counting in two ways and Equation (4), we have
$$(-1)^{n-1} z_1(i) = \sum_{T^{or}} \left( 1 - \mathrm{In}_{T^{or}}(i) \right) w(T^{or}) = \sum_{T^{or}} w(T^{or}) - \sum_{T^{or}} \mathrm{In}_{T^{or}}(i)\, w(T^{or})$$
$$= w(T) - \sum_{\{k,i\} \in E(T)} \left( \sum_{T^{or},\ (k,i) \in T^{or}} w(T^{or}) \right) = w(T) - \sum_{\{k,i\} \in E(T)} \frac{w_{k,i}}{w_{k,i} + w_{i,k}}\, w(T)$$
$$= w(T) \left( 1 - \sum_{\{k,i\} \in E(T)} \frac{w_{k,i}}{w_{k,i} + w_{i,k}} \right) = w(T)\, \alpha_i.$$
Then $(-1)^{n-1} z_1 = w(T) \alpha$, which completes the proof. $\Box$

Suppose that for each edge $e = \{i,j\} \in E(T)$ the weights satisfy $w_{i,j} = w_{j,i} = w_e$. Then for each $i \in [n]$, $\alpha_i = \beta_i = \frac{2 - d_i}{2} 1_R = \frac{\delta_i}{2} 1_R$, where $1_R$ is the identity of the ring $R$. Let $s \geq 1$ be an integer, let $S$ be a commutative ring with identity, and let $R = M_s(S)$. Then $1_R$ is the $s \times s$ identity matrix, and the distance matrix $D$ is an $ns \times ns$ partitioned matrix. Theorem 7 gives the determinant of $D$. When all the matrix weights are positive definite matrices, then $\det(D) \neq 0$ and $D$ is invertible. Let $\Lambda = \sum_{j=2}^{n} w_{j,r_j}$, and let $\Delta$ be the tensor product of $\delta$ (defined in the first paragraph of Section 1) and the $s \times s$ identity matrix. Then $\Lambda$ is positive definite, $\Lambda = 2x_1$, and $\Delta = 2\alpha = 2\beta$. Then Theorem 10 implies the following result.

Corollary 12 (Theorem 3.7 of [1]). With notations as above, and suppose that $R = M_s(S)$ and all the edge weights are positive definite matrices in $R$, then $D$ is invertible and
$$D^{-1} = -L^r + \frac{1}{2} \Delta \Lambda^{-1} \Delta^T.$$

5. More examples

At the end of this paper, we give examples which are applications of our results but cannot be dealt with by the known results. For simplicity of calculation, we take the tree $T$ to be the path $P_3$ (defined at the end of Section 3) with arc weights from $\mathbb{R}$, $\mathbb{Z}_p$ or $M_2(\mathbb{R})$.
376

H. Zhou, Q. Ding / Linear Algebra and its Applications 511 (2016) 365–377

Example 13 (Weights from R). We use notations as in Example 6. If α  1 and n  3α−1 matrix D is invertible. Suppose α = 1 and n = 3. Then ⎞ ⎛2 , then the⎞distance ⎛ 0 2 5 1 −1 0 ⎟ ⎟ ⎜ ⎜ D = ⎝ −1 0 3 ⎠, Lr = ⎝ −1 2 −1 ⎠, α = (2, 1, −2)T , β = (−1, −1, 2)T and −3 −2 0 0 −1 1 det(D) = x1 = −8. Then by Theorem 10, we have ⎛

D−1

⎞ 6 −10 6 1⎜ ⎟ T = −Lr + αx−1 15 −5 ⎠ . 1 β = − ⎝ −9 8 2 −6 2

Example 14 (Weights from Zp ). Let T = P3 with weights in Z5 . Suppose the ⎛ arc weights ⎞ 0 2 0 ⎜ ⎟ are w1,2 = 2, w2,3 = 3, w3,2 = 3 and w2,1 = 4. The distance matrix is D = ⎝ 4 0 3 ⎠. 2 3 0 ⎞ ⎛ 1 4 0 ⎟ ⎜ r Then x2 = x3 = 1, L = ⎝ 4 2 4 ⎠, α = (2, 1, 3)T , β = (4, 4, 2)T , det(D) = x1 = 2 0 4 1 −1 and x1 = 3. Then by Theorem 10, we have ⎛
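A quick check of these numbers (not from the paper) with numpy:

```python
import numpy as np

D = np.array([[0, 2, 5], [-1, 0, 3], [-3, -2, 0]], dtype=float)
Lr = np.array([[1, -1, 0], [-1, 2, -1], [0, -1, 1]], dtype=float)
alpha = np.array([2, 1, -2], dtype=float)
beta = np.array([-1, -1, 3], dtype=float)
x1 = -8.0

Dinv = -Lr + np.outer(alpha, beta) / x1      # Theorem 10
print(np.isclose(np.linalg.det(D), x1))      # det(D) = x_1 = -8
print(np.allclose(D @ Dinv, np.eye(3)))      # both expected: True
```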

D−1

3 ⎜ T = −Lr + αx−1 β = 3 ⎝ 1 1

0 0 2

⎞ 3 ⎟ 0⎠. 1

Example 15 (Weights from M2 (R)). We use notations as in Example 9. If n  2, the distance matrix D is invertible. Let T = P3 . The distance matrix is ⎛
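The arithmetic over $\mathbb{Z}_5$ can be checked mechanically; the sketch below (not from the paper) evaluates the Theorem 10 formula modulo 5 and verifies that the result is a two-sided inverse of $D$.

```python
import numpy as np

p = 5
D = np.array([[0, 2, 0], [4, 0, 3], [2, 3, 0]])
Lr = np.array([[1, 4, 0], [4, 2, 4], [0, 4, 1]])
alpha = np.array([2, 1, 3])
beta = np.array([4, 4, 3])
x1 = 2
x1_inv = pow(x1, -1, p)                       # modular inverse of x_1 in Z_5

Dinv = (-Lr + x1_inv * np.outer(alpha, beta)) % p
print(Dinv)                                   # [[3 0 3], [3 0 0], [1 2 1]]
print(np.array_equal(D @ Dinv % p, np.eye(3, dtype=int)),
      np.array_equal(Dinv @ D % p, np.eye(3, dtype=int)))   # True True
```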



0 W ⎜ D = ⎝ I2 − W 0 2(I2 − W ) I2 − W

0 0 2 ⎞ ⎜ 0 0 0 ⎜ 2W ⎜ ⎟ ⎜ −1 0 0 W ⎠=⎜ ⎜ 0 3 0 ⎜ 0 ⎝ −2 0 −1 0 6 0

⎞ 0 4 0 −2 0 −4 ⎟ ⎟ 0 2 0 ⎟ ⎟ ⎟. 0 0 −2 ⎟ ⎟ 0 0 0 ⎠ 3 0 0

⎞ ⎞ ⎛ ⎛ ⎞ I2 −I2 W I2 − W 0 ⎟ ⎟ ⎜ ⎜ ⎜ ⎟ Then Lr = ⎝ −I2 2I2 −I2 ⎠, α = ⎝ 0 0 ⎠, β = ⎝ ⎠, x1 = 2W (I2 −W ). 0 −I2 I2 I2 − W W The matrices W and I2 − W commute with each other. Then by Theorem 10, we have ⎛



D−1

I2 − 12 I2 ⎜ −1 T r = −L + αx1 β = ⎝ I2 −I2 1 −1 (I2 − W ) I2 2W

⎞ − W )−1 ⎟ I2 ⎠ 1 2 I2

1 2 W (I2

H. Zhou, Q. Ding / Linear Algebra and its Applications 511 (2016) 365–377



− 12 ⎜ 0 ⎜ ⎜ 1 ⎜ =⎜ ⎜ 0 ⎜ 1 ⎝ −4 0

0 − 12 0 1 0 − 34

1 0 0 1 −2 0 0 −2 1 0 0 1

−1 0 1 0 − 12 0

0 − 13 0 1 0 − 12

377

⎞ ⎟ ⎟ ⎟ ⎟ ⎟. ⎟ ⎟ ⎠
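A numerical check of this block computation (not from the paper), assembling the $6 \times 6$ matrices with numpy:

```python
import numpy as np

I2 = np.eye(2)
W = np.array([[2.0, 0.0], [0.0, -2.0]])
V = I2 - W                                    # the other arc weight from Example 9

# Distance matrix of P_3 with arc weights W (forward) and I_2 - W (backward).
D = np.block([[np.zeros((2, 2)), W, 2 * W],
              [V, np.zeros((2, 2)), W],
              [2 * V, V, np.zeros((2, 2))]])

Lr = np.block([[I2, -I2, np.zeros((2, 2))],
               [-I2, 2 * I2, -I2],
               [np.zeros((2, 2)), -I2, I2]])
alpha = np.vstack([W, np.zeros((2, 2)), V])   # block column vectors (6 x 2)
beta = np.vstack([V, np.zeros((2, 2)), W])
x1 = 2 * W @ V

Dinv = -Lr + alpha @ np.linalg.inv(x1) @ beta.T   # Theorem 10
print(np.allclose(D @ Dinv, np.eye(6)))           # expected: True
```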

References [1] R. Balaji, R.B. Bapat, Block distance matrices, Electron. J. Linear Algebra 16 (2007) 435–443. [2] R.B. Bapat, Determinant of the distance matrix of a tree with matrix weights, Linear Algebra Appl. 416 (2006) 2–7. [3] R.B. Bapat, S.J. Kirkland, M. Neumann, On distance matrices and Laplacians, Linear Algebra Appl. 401 (2005) 193–209. [4] R.B. Bapat, A.K. Lal, S. Pati, The distance matrix of a bidirected tree, Electron. J. Linear Algebra 17 (2009) 233–245. [5] D.S. Bernstein, Matrix Mathematics: Theory, Facts, and Formulas, second edition, Princeton University Press, Princeton, NJ, 2009. [6] R.L. Graham, L. Lovasz, Distance matrix polynomials of trees, Adv. Math. 29 (1978) 60–88. [7] R.L. Graham, H.O. Pollak, On the addressing problem for loop switching, Bell Syst. Tech. J. 50 (1971) 2495–2519. [8] W. Yan, Y. Yeh, Note: a simple proof of Graham and Pollak’s theorem, J. Combin. Theory Ser. A 113 (2006) 892–893. [9] H. Zhou, Q. Ding, The product distance matrix of a tree with matrix weights on its arcs, Linear Algebra Appl. 499 (2016) 90–98.