Partial orders on B(H)


Linear Algebra and its Applications 481 (2015) 115–130


Dragana S. Cvetković-Ilić a,∗, Dijana Mosić a, Yimin Wei b

a Faculty of Sciences and Mathematics, University of Niš, P.O. Box 224, 18000 Niš, Serbia
b School of Mathematical Sciences and Key Laboratory of Mathematics for Nonlinear Sciences, Fudan University, Shanghai, 200433, PR China


Article history: Received 17 December 2014; Accepted 20 April 2015; Available online 14 May 2015. Submitted by P. Šemrl.

Abstract. In this paper we characterize the sets of all B ∈ B(H) such that AρB and the sets of all B ∈ B(H) such that BρA, where A ∈ B(H) is given and ρ ∈ {≤−, ≤∗, ≤#, ≤⊕}. © 2015 Elsevier Inc. All rights reserved.

MSC: 15A09; 47A05; 47A99
Keywords: Minus partial order; Star partial order; Sharp partial order; Core partial order

✩ This work is supported by the bilateral project between Serbia and China, “Generalized inverses and applications” No. 2-15. The first and the second authors are also supported by Grant No. 174007 of the Ministry of Science, Technology and Development, Republic of Serbia. The third author is supported by the National Natural Science Foundation of China under grant 11271084. * Corresponding author. E-mail addresses: [email protected] (D.S. Cvetković-Ilić), [email protected] (D. Mosić), [email protected], [email protected] (Y. Wei).

http://dx.doi.org/10.1016/j.laa.2015.04.025 0024-3795/© 2015 Elsevier Inc. All rights reserved.


1. Introduction

Throughout the paper H and K will denote Hilbert spaces. By B(H, K) we denote the set of all bounded linear operators from H to K. Set B(H) = B(H, H). For an operator A ∈ B(H, K), the symbols N(A) and R(A), respectively, will denote the null space and the range of A.

If A ∈ B(H, K) and there exists some C ∈ B(K, H) such that ACA = A, we say that C is an inner generalized inverse of A and call the operator A relatively regular. An operator A ∈ B(H, K) is relatively regular if and only if R(A) is closed in K. By Breg(H, K) we denote the set of all relatively regular operators from B(H, K).

The Moore–Penrose inverse of A ∈ B(H, K) (if it exists) is the unique operator X ∈ B(K, H) satisfying the following:

(1) AXA = A,  (2) XAX = X,  (3) (AX)∗ = AX,  (4) (XA)∗ = XA.    (1)

It is denoted by A†. It is well-known that A† exists for a given A if and only if A is a relatively regular operator. For any A ∈ B(H, K), let A{i, j, . . . , k} denote the set of all X ∈ B(K, H) which satisfy equations (i), (j), . . . , (k) of (1). Operators in A{i, j, . . . , k} are known as {i, j, . . . , k}-inverses of A and it is customary to denote a generic element of this set by A(i,j,...,k). Evidently, A{1, 2, 3, 4} = {A†}.

For A ∈ B(H), the group inverse of A (if it exists) is the unique operator A# ∈ B(H) such that AA#A = A, A#AA# = A#, AA# = A#A. The operator A ∈ B(H) has a group inverse if and only if the Drazin index ind(A) ≤ 1; in that case we say that A is group invertible. If A ∈ B(H) is such that R(A) is closed and AA† = A†A, then we say that A is an EP operator. Equivalently, a closed range operator A is EP if and only if R(A) = R(A∗), and in that case A† = A#. For more details concerning generalized inverses see [2,7].

Let A ∈ B(H, K). If there exist a Hilbert space M and operators P ∈ B(M, K), Q ∈ B(H, M) such that P is left and Q is right invertible and A = PQ, we say that (P, Q) is a full-rank decomposition of A. Recall that an operator A ∈ B(H, K) has a full-rank decomposition if and only if A is relatively regular.

An operator P ∈ B(H) is a projection if P² = P. If M and N are closed subspaces of H such that H = M ⊕ N, then by PM,N we denote the projection onto the subspace M parallel to N. Furthermore, if M and N are orthogonal, then PM,N is an orthogonal projection (Hermitian projection), which we usually denote by PM. By C(M) we will denote the set of all closed complements of M in H.
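The following NumPy sketch is only a finite-dimensional illustration of these notions (it is not part of the paper): it checks the four Penrose equations for A† computed by pinv, computes a group inverse via the standard matrix representation A# = A(A³)†A for matrices of index at most one (a formula assumed here, not stated in the paper), and tests the EP condition AA† = A†A.

```python
# Minimal NumPy sketch (illustration only, not part of the paper):
# Moore-Penrose inverse, group inverse of an index-one matrix, EP check.
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 3.0]])

# Moore-Penrose inverse and the four Penrose equations (1).
Amp = np.linalg.pinv(A)
assert np.allclose(A @ Amp @ A, A)               # (1) AXA = A
assert np.allclose(Amp @ A @ Amp, Amp)           # (2) XAX = X
assert np.allclose((A @ Amp).conj().T, A @ Amp)  # (3) (AX)* = AX
assert np.allclose((Amp @ A).conj().T, Amp @ A)  # (4) (XA)* = XA

# Group inverse, computed via the (assumed, standard) matrix formula
# A^# = A (A^3)^dagger A, valid when ind(A) <= 1.
Ag = A @ np.linalg.pinv(A @ A @ A) @ A
assert np.allclose(A @ Ag @ A, A)
assert np.allclose(Ag @ A @ Ag, Ag)
assert np.allclose(A @ Ag, Ag @ A)

# EP test: A is EP iff A A^dagger = A^dagger A (equivalently R(A) = R(A*)).
print("A is EP:", np.allclose(A @ Amp, Amp @ A))   # False for this A
```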


2. Various definitions of the minus, star, sharp and core orders on Cn×n and B(H)

There are many partial orders defined on Cn×n. One such order is the star partial order, which was defined by Drazin [3] as

A ≤∗ B ⇔ A∗A = A∗B and AA∗ = BA∗,    (2)

where A, B belong to Cn×n or B(H). If A ∈ Cn×n or A ∈ B(H) has a closed range, then the star order may be characterized as

A ≤∗ B ⇔ A†A = A†B and AA† = BA†.    (3)

Hartwig [5] showed that if the conjugate transpose A∗ in (2) or the Moore–Penrose inverse in (3) is replaced by some A− ∈ A{1}, we still obtain a partial order on Cn×n, which he called the minus partial order:

A ≤− B ⇔ A−A = A−B and AA− = BA−,    (4)

for some A− ∈ A{1}. In the same paper, he showed that the minus partial order is equivalent to the rank subtractivity order

A ≤− B ⇔ r(B − A) = r(B) − r(A),

where r(A) is the rank of A. Also, for A, B ∈ Cn×n,

A ≤− B ⇔ R(B) = R(A) ⊕ R(B − A) ⇔ B{1} ⊆ A{1}.

Evidently, there are many equivalent definitions of the minus partial order on Cn×n and it is not clear which one of them can be used for generalizing this order to the algebra of all bounded linear operators on a Hilbert space. The definition given by (4) can only be generalized to operators with closed ranges. The second one is in terms of ranks of matrices. As for the third one, it would seem more natural to take closures of the ranges in question instead of the ranges themselves, but it remains unclear whether the closure of the sum should be taken as well. This question was considered in detail and very systematically in the paper of Šemrl [12], where the minus partial order was generalized from Cn×n to B(H) without any additional conditions regarding the closedness of the ranges of the operators:

Definition 2.1 (See [12]). For A, B ∈ B(H), we write A ≤− B if there exist projections P, Q ∈ B(H) such that

(i) R(P) = R(A),
(ii) N(Q) = N(A),
(iii) PA = PB,
(iv) AQ = BQ.

Obviously the conditions PA = PB and AQ = BQ in Šemrl's definition can be replaced by A = PB and A = BQ, respectively. In the case when A ∈ Breg(H), using [12, Theorem 2], we can check that Šemrl's definition is equivalent to the definition (4) given by Hartwig.

Inspired by Šemrl's definition of the minus partial order, Dolinar et al. [4] introduced an equivalent definition of the star partial order on the algebra of all bounded linear operators on a Hilbert space, using orthogonal projections:

Definition 2.2 (See [4]). For A, B ∈ B(H), we write A ≤∗ B if there exist orthogonal projections P, Q ∈ B(H) such that

(i) R(P) = R(A),
(ii) N(Q) = N(A),
(iii) PA = PB,
(iv) AQ = BQ.
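As a purely illustrative finite-dimensional check (not taken from the paper), the next NumPy sketch verifies Drazin's condition (2) and Hartwig's rank-subtractivity characterization for a pair of block matrices of the kind that will reappear in Section 3.

```python
# Illustrative check (not from the paper) of the star order (2) and the
# rank-subtractivity form of the minus order for block matrices
# A = diag(A1, 0) and B = diag(A1, B1).
import numpy as np

A1 = np.array([[1.0, 2.0],
               [0.0, 3.0]])
B1 = np.array([[5.0]])

A = np.block([[A1, np.zeros((2, 1))],
              [np.zeros((1, 2)), np.zeros((1, 1))]])
B = np.block([[A1, np.zeros((2, 1))],
              [np.zeros((1, 2)), B1]])

# Star order (2): A*A = A*B and AA* = BA*.
star = np.allclose(A.T @ A, A.T @ B) and np.allclose(A @ A.T, B @ A.T)

# Minus order via rank subtractivity: r(B - A) = r(B) - r(A).
rank = np.linalg.matrix_rank
minus = rank(B - A) == rank(B) - rank(A)

print("A <=* B:", star)    # True
print("A <=- B:", minus)   # True (ranks: 1 = 3 - 2)
```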

The order ≤∗ given by Definition 2.2 is a partial order on the algebra B(H), and Definition 2.2 is equivalent to the definition (2) given by Drazin.

Similar to the matrix case, we can define the sharp order in the case when A, B ∈ B(H) and ind(A) ≤ 1:

A ≤# B ⇔ A#B = BA# = A#A.

The sharp order is a partial order on the set {A ∈ B(H) : ind(A) ≤ 1}. The sharp order was considered in [9] for the elements of Rickart rings, where in Theorem 14 one direction of the following lemma, which gives a relation between the minus and sharp partial orders, was proved.

Lemma 2.1. Let A, B ∈ B(H) be such that ind(A) ≤ 1. Then A ≤# B if and only if A ≤− B and AB = BA.

Proof. (⇒:) Suppose that A ≤# B. Since A is relatively regular and A# ∈ A{1}, it follows that A ≤− B. Also, since A = (A#)# and A# lies in the double commutant of A, it follows that AB = BA.

(⇐:) Assume that AB = BA and that there exists A− ∈ A{1} such that AA− = BA− and A−A = A−B. Since A# lies in the double commutant of A, we have that A#B = BA#. Also,


AA# = AA−AA# = AA−BA# = AA−A#B = AA−A(A#)²B = A#B.

Thus, AA# = A#B = BA#, i.e. A ≤# B. □
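For illustration only (again not part of the paper), Lemma 2.1 can be sanity-checked numerically; the group inverse below is computed via the assumed standard matrix formula A# = A(A³)†A, valid when ind(A) ≤ 1.

```python
# Illustrative check of Lemma 2.1 (not from the paper): here A <=# B,
# and indeed A <=- B (rank subtractivity) together with AB = BA.
import numpy as np

A = np.diag([2.0, 0.0, 0.0])
B = np.diag([2.0, 0.0, 7.0])

# Group inverse via the (assumed, standard) formula A^# = A (A^3)^dagger A.
Ag = A @ np.linalg.pinv(A @ A @ A) @ A

sharp = np.allclose(Ag @ B, B @ Ag) and np.allclose(Ag @ B, Ag @ A)

rank = np.linalg.matrix_rank
minus_and_commute = (rank(B - A) == rank(B) - rank(A)) and np.allclose(A @ B, B @ A)

print("A <=# B:", sharp)                           # True
print("A <=- B and AB = BA:", minus_and_commute)   # True, as Lemma 2.1 predicts
```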

The core inverse on the set of all matrices of index one and the core partial order were recently introduced by Baksalary and Trenkler [1]:

Definition 2.3. A matrix A⊕ ∈ Cn×n is a core inverse of A ∈ Cn×n if AA⊕ = PR(A),N(A∗) and R(A⊕) ⊆ R(A).

A generalization of this inverse to the algebra of bounded linear operators on a Hilbert space is given in [11]:

Definition 2.4. For A ∈ B(H), an operator A⊕ ∈ B(H) is a core inverse of A if AA⊕A = A, R(A⊕) = R(A) and N(A⊕) = N(A∗).

The core inverse of A exists if and only if ind(A) ≤ 1. The core inverse is, in a way, somewhere in between the group and the Moore–Penrose inverse, just as the core partial order is somewhere in between the sharp and the star partial order. Using the core inverse, the core partial order is defined in [1] for matrices, and in [11] analogously for Hilbert space operators: if A, B ∈ B(H) are such that ind(A) ≤ 1, then

A ≤⊕ B ⇔ A⊕A = A⊕B and AA⊕ = BA⊕.    (5)
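The following is again only a finite-dimensional sketch (not from the paper): it computes the core inverse via the identity A⊕ = A#AA† (an identity used later in the proof of Theorem 3.7), checks the defining conditions of Definition 2.3, and tests the core order (5) for a concrete pair of matrices.

```python
# Illustrative sketch (not from the paper): core inverse via A^core = A^# A A^dagger
# and the core order (5) for a concrete pair of 3x3 matrices.
import numpy as np

pinv = np.linalg.pinv

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])          # ind(A) = 1
B = np.array([[1.0, 0.0, 1.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 0.0]])

Ag = A @ pinv(A @ A @ A) @ A             # group inverse of A
Ac = Ag @ A @ pinv(A)                    # core inverse, computed as A^# A A^dagger

# Definition 2.3: A A^core = P_{R(A),N(A*)} (here the orthogonal projection
# onto R(A), i.e. A A^dagger), and R(A^core) is contained in R(A).
assert np.allclose(A @ Ac, A @ pinv(A))
assert np.linalg.matrix_rank(np.hstack((Ac, A))) == np.linalg.matrix_rank(A)

# Core order (5): A^core A = A^core B and A A^core = B A^core.
print("A <=core B:", np.allclose(Ac @ A, Ac @ B) and np.allclose(A @ Ac, B @ Ac))
```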

The core order is a partial order on the set {A ∈ B(H) : ind(A) ≤ 1}.

In this paper, we will generalize the core partial order to the algebra B(H) without any additional conditions on the existence of the core inverse. For a given operator A ∈ B(H), we characterize the sets SAρ of all B ∈ B(H) such that AρB and the sets SρA of all B ∈ B(H) such that BρA, where ρ ∈ {≤−, ≤∗, ≤#, ≤⊕}. We extend some results for complex matrices proved in [6] to Hilbert space operators.

3. The sets SAρ and SρA

In [12], it is proved that for every pair of operators A, B ∈ B(H) we have that A ≤− B if and only if there exist decompositions H = H1 ⊕ H2 and H = H3 ⊕ H4 (Hi, i = 1, . . . , 4, are closed subspaces) such that

A = \begin{bmatrix} A1 & 0 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} H1 \\ H2 \end{bmatrix} → \begin{bmatrix} H3 \\ H4 \end{bmatrix}    (6)

and

B = \begin{bmatrix} A1 & 0 \\ 0 & B1 \end{bmatrix} : \begin{bmatrix} H1 \\ H2 \end{bmatrix} → \begin{bmatrix} H3 \\ H4 \end{bmatrix},    (7)

where A1 is injective with dense range. From this result, we can conclude two facts:

1. If we look more carefully at the representations given above, we see that H2 = N(A) and H3 = R(A), while H1 and H4 are arbitrary closed complements of N(A) and R(A), respectively. Since an arbitrary A ∈ B(H) can be represented by (6), where A1 is injective with dense range, we have that if A ≤− B, then the relative regularity of B implies the relative regularity of A, while the converse is not true in general. An example showing that the converse fails can exist only in the case when dim N(A) = ∞, taking for B1 : N(A) → H4 any operator with non-closed range.

2. Šemrl's result allows us, for an operator A given by (6), to describe the set of all operators B ∈ B(H) such that A ≤− B:

SA≤− = { B ∈ B(H) : B = \begin{bmatrix} A1 & 0 \\ 0 & B1 \end{bmatrix} : \begin{bmatrix} H1 \\ N(A) \end{bmatrix} → \begin{bmatrix} R(A) \\ H4 \end{bmatrix} }.
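To make this description concrete, here is a small NumPy illustration (not from the paper), stated with orthogonal complements, which suffices in finite dimensions: fixing A in the block form (6) and varying the block B1 produces operators B with A ≤− B, as rank subtractivity confirms.

```python
# Illustration (not from the paper): members of S_{A<=-} obtained from the
# block description above; rank subtractivity confirms A <=- B for each B1.
import numpy as np

rng = np.random.default_rng(0)
rank = np.linalg.matrix_rank

A1 = np.array([[1.0, 1.0],
               [0.0, 2.0]])                 # injective block of A
A = np.block([[A1, np.zeros((2, 2))],
              [np.zeros((2, 2)), np.zeros((2, 2))]])

for _ in range(3):
    B1 = rng.standard_normal((2, 2))        # arbitrary block acting on N(A)
    B = np.block([[A1, np.zeros((2, 2))],
                  [np.zeros((2, 2)), B1]])
    print(rank(B - A) == rank(B) - rank(A)) # True each time
```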

Another way to describe the set of all operators B ∈ B(H) such that A ≤− B is the following:

SA≤− = {B ∈ B(H) : B = A + (I − PR(A),H4)XPN(A),H1, X ∈ B(H), H1 ∈ C(N(A)), H4 ∈ C(R(A))}.

If we are interested in what could be seen as the dual problem, namely if given B ∈ B(H) we wish to describe the set of all operators A ∈ B(H) such that A ≤− B, things will not go so smoothly in general. Let B ∈ B(H) be given. Observe that B has the following matrix representation with respect to the orthogonal sum H = R(B) ⊕ N(B∗):

B = \begin{bmatrix} B1 & B2 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix} → \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix}.    (8)

In the following theorem, for B ∈ B(H) given by (8), we present a representation of A ∈ B(H) which satisfies A ≤− B.

Theorem 3.1. Let B ∈ B(H) be given by (8). If for A ∈ B(H) the inequality A ≤− B holds, then A has the form

A = \begin{bmatrix} PB1 & PB2 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix} → \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix},    (9)

for some projection P ∈ B(R(B)).


Proof. Suppose that for some A ∈ B(H) the inequality A ≤− B holds. Without loss of generality, we can assume that A is represented by

A = \begin{bmatrix} A1 & A2 \\ A3 & A4 \end{bmatrix} : \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix} → \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix}.    (10)

By Theorem 2 from [12] it follows that R(A) ⊆ R(B), which implies A3 = 0 and A4 = 0. Since there exists a projection P ∈ B(H) such that R(P) = R(A) and A = PB, we can suppose that P is given by

P = \begin{bmatrix} P1 & P2 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix} → \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix},

where P1 ∈ B(R(B)) is also a projection. From PA = A = PB, using the above decompositions of A, B and P, we get that A1 = P1A1 = P1B1 and A2 = P1A2 = P1B2. Hence, there exists a projection P1 ∈ B(R(B)) such that

A = \begin{bmatrix} P1B1 & P1B2 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix} → \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix}. □

We will prove that in the case when B ∈ Breg(H), the converse of Theorem 3.1 is valid. Notice that under the condition B ∈ Breg(H), we have that

B = \begin{bmatrix} B1 & B2 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix} → \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix}.    (11)

Since B is relatively regular, it follows that R(B) is closed, and since

B′ = \begin{bmatrix} B1 & B2 \end{bmatrix} : \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix} → R(B)

and R(B) = R(B′), we have that B′ is right invertible. Now, D = B′(B′)∗ = B1B1∗ + B2B2∗ ∈ B(R(B)) is an invertible operator.

Theorem 3.2. Let B ∈ Breg(H) be given by (11). For A ∈ B(H), we have that A ≤− B if and only if A has the form

A = \begin{bmatrix} PB1 & PB2 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix} → \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix},    (12)

for some projection P ∈ B(R(B)). In addition, the projection P is unique, when it exists.


Proof. (⇒:) By Theorem 3.1.

(⇐:) Notice that

B = \begin{bmatrix} I \\ 0 \end{bmatrix} \begin{bmatrix} B1 & B2 \end{bmatrix} : \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix} → \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix},

where

P1 = \begin{bmatrix} I \\ 0 \end{bmatrix} : R(B) → \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix},  Q1 = \begin{bmatrix} B1 & B2 \end{bmatrix} : \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix} → R(B).

Since P1 is left and Q1 is right invertible, we have that B = P1Q1 is a full-rank decomposition of B. Suppose that A is given by (12), for some projection P ∈ B(R(B)). Evidently, A ∈ Breg(H) and A = P1PQ1. By [10, Corollary 3.2], it follows that A ≤− B.

To prove the uniqueness of the projection P, assume that P′ ∈ B(R(B)) is also a projection such that

A = \begin{bmatrix} P′B1 & P′B2 \\ 0 & 0 \end{bmatrix}.

From \begin{bmatrix} PB1 & PB2 \end{bmatrix} = \begin{bmatrix} P′B1 & P′B2 \end{bmatrix} and the right invertibility of \begin{bmatrix} B1 & B2 \end{bmatrix}, we have that P = P′. □

Now, we can completely characterize the set S≤−B of all operators which are below a given B ∈ Breg(H) of the form (11) with respect to the minus partial order:

S≤−B = { A ∈ B(H) : A = \begin{bmatrix} PB1 & PB2 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix} → \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix}, P² = P ∈ B(R(B)) }.
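As a finite-dimensional illustration of Theorem 3.2 (again not part of the paper), one can take a regular B in the block form (11), choose any, possibly oblique, projection P on R(B), and check numerically that the operator A from (12) satisfies A ≤− B.

```python
# Illustration of Theorem 3.2 (not from the paper): B is written with respect
# to H = R(B) + N(B*) and P is an oblique (non-Hermitian) projection on R(B);
# the resulting A = [[P B1, P B2], [0, 0]] satisfies A <=- B.
import numpy as np

rank = np.linalg.matrix_rank

B1 = np.array([[1.0, 1.0],
               [0.0, 2.0]])
B2 = np.array([[1.0],
               [1.0]])
B = np.block([[B1, B2],
              [np.zeros((1, 2)), np.zeros((1, 1))]])

P = np.array([[1.0, 3.0],
              [0.0, 0.0]])                  # P^2 = P, but P is not Hermitian
assert np.allclose(P @ P, P)

A = np.block([[P @ B1, P @ B2],
              [np.zeros((1, 2)), np.zeros((1, 1))]])

print("A <=- B:", rank(B - A) == rank(B) - rank(A))   # True
```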

In [4], using Definition 2.2, the authors proved the analogue for the star partial order of the result concerning the minus partial order given in [12]:

Theorem 3.3 (See [4]). For A, B ∈ B(H), A ≤∗ B if and only if there exist direct sum decompositions H = H1 ⊕ H1⊥ and H = H2 ⊕ H2⊥ (Hi, i = 1, 2, are closed subspaces) such that

A = \begin{bmatrix} A1 & 0 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} H1 \\ H1⊥ \end{bmatrix} → \begin{bmatrix} H2 \\ H2⊥ \end{bmatrix}

and

B = \begin{bmatrix} A1 & 0 \\ 0 & B1 \end{bmatrix} : \begin{bmatrix} H1 \\ H1⊥ \end{bmatrix} → \begin{bmatrix} H2 \\ H2⊥ \end{bmatrix},

where A1 is injective with dense range.

It is evident that in the representation above we have that H1 = R(A∗) and H2 = R(A). So, for a given A ∈ B(H), the set of all B ∈ B(H) such that A ≤∗ B is given by

SA≤∗ = {B ∈ B(H) : B = A + PN(A∗)XPN(A), X ∈ B(H)},

where we have to remark that PN(A∗) and PN(A) are orthogonal projections.

In the case of the star partial order, we can completely describe the set S≤∗B without the additional assumption that B is regular:

Theorem 3.4. Let B ∈ B(H) be an operator given by (8). For A ∈ B(H), we have that A ≤∗ B if and only if A is given by (9), for some orthogonal projection P ∈ B(R(B)) which commutes with D = B1B1∗ + B2B2∗ ∈ B(R(B)).

Proof. (⇒:) Suppose that A ≤∗ B. Without loss of generality, we can assume that A is represented by (10). By Theorem 3.3 we have that R(A) ⊆ R(B), so A3 = 0 and A4 = 0. By Definition 2.2, there exists an orthogonal projection P ∈ B(H) such that R(P) = R(A) and A = PB. Such P must be given by

P = \begin{bmatrix} P1 & 0 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix} → \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix},

where P1 ∈ B(R(B)) is also an orthogonal projection. From PA = A = PB, using the above decompositions of A, B and P, we get that A1 = P1A1 = P1B1 and A2 = P1A2 = P1B2. Hence, there exists an orthogonal projection P1 ∈ B(R(B)) such that

A = \begin{bmatrix} P1B1 & P1B2 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix} → \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix}.

Now, using AA∗ = BA∗, we get that P1DP1 = DP1, which implies P1D = DP1.

(⇐:) Assume that A has the form (9) for some orthogonal projection P ∈ B(R(B)) which commutes with D. It can be checked that AA∗ = BA∗ and A∗A = A∗B. Hence, A ≤∗ B. □

Using Lemma 2.1 and Theorem 3.1, we obtain the following result:


Theorem 3.5. Let B ∈ B(H) be given by (8). If for a group invertible A ∈ B(H) we have A ≤# B, then A has the form (9) for some projection P ∈ B(R(B)) satisfying PB1² = B1PB1 and PB1B2 = B1PB2.

Proof. By Lemma 2.1, A ≤# B if and only if A ≤− B and AB = BA. So, by Theorem 3.1, A has the form (9), for some projection P ∈ B(R(B)). Now, AB = BA holds if and only if PB1² = B1PB1 and PB1B2 = B1PB2. □

Notice that in the case when B is relatively regular, the conditions PB1² = B1PB1 and PB1B2 = B1PB2 are equivalent to PB1 = B1P, so by Theorem 3.2 we have the following corollary:

Corollary 3.1. Let B ∈ Breg(H) be given by (11). For A ∈ B(H) which is group invertible, we have that A ≤# B if and only if A has the form (12), for some projection P ∈ B(R(B)) which satisfies PB1 = B1P.

Now, we will generalize the definition of the core partial order to the algebra B(H) without the additional condition of the existence of the core inverse.

Definition 3.1. For A, B ∈ B(H), we write A ≤⊕ B if there exist an orthogonal projection P ∈ B(H) and a projection Q ∈ B(H) such that

(i) R(P) = R(A),
(ii) R(Q) = R(A), N(Q) = N(A),
(iii) PA = PB,
(iv) AQ = BQ.

Evidently Definition 3.1 makes sense only for the operators belonging to the set {A ∈ B(H) : R(A) ⊕ N(A) = H}. In a moment, we will see that on the set {A ∈ B(H) : R(A) ⊕ N(A) = H}, Definitions 3.1 and (5) are equivalent. In the following theorem, we give a characterization of the core order defined by Definition 3.1.

Theorem 3.6. Let A, B ∈ B(H) be given. We have that A ≤⊕ B if and only if H = R(A) ⊕ N(A),

A = \begin{bmatrix} A1 & 0 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} R(A) \\ N(A) \end{bmatrix} → \begin{bmatrix} R(A) \\ N(A∗) \end{bmatrix}    (13)

and

B = \begin{bmatrix} A1 & 0 \\ 0 & B1 \end{bmatrix} : \begin{bmatrix} R(A) \\ N(A) \end{bmatrix} → \begin{bmatrix} R(A) \\ N(A∗) \end{bmatrix}.    (14)


Proof. (⇒:) Suppose that A ≤⊕ B in the sense of Definition 3.1. The existence of a projection Q such that R(Q) = R(A) and N(Q) = N(A) implies that H can be decomposed as H = R(A) ⊕ N(A) and that A can be given by (13), where A1 is injective with dense range. Now, the projections P and Q from Definition 3.1 are given by the following:

P = \begin{bmatrix} I & 0 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} R(A) \\ N(A∗) \end{bmatrix} → \begin{bmatrix} R(A) \\ N(A∗) \end{bmatrix},  Q = \begin{bmatrix} I & 0 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} R(A) \\ N(A) \end{bmatrix} → \begin{bmatrix} R(A) \\ N(A) \end{bmatrix}.

Let

B = \begin{bmatrix} B4 & B2 \\ B3 & B1 \end{bmatrix} : \begin{bmatrix} R(A) \\ N(A) \end{bmatrix} → \begin{bmatrix} R(A) \\ N(A∗) \end{bmatrix}.

From PA = PB and AQ = BQ, we get that B4 = A1, B2 = 0 and B3 = 0. Hence, B is given by (14).

(⇐:) This is trivial to verify. □

Hence,

SA≤⊕ = {B ∈ B(H) : B = A + PN(A∗)X(I − PR(A),N(A)), X ∈ B(H)},

where PN(A∗) is an orthogonal projection. In the case when A is group invertible (the core inverse of A exists), we have that A is given by

A = \begin{bmatrix} A1 & 0 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} R(A) \\ N(A) \end{bmatrix} → \begin{bmatrix} R(A) \\ N(A∗) \end{bmatrix},

where A1 is invertible, and in that case A⊕ is given by

A⊕ = \begin{bmatrix} A1⁻¹ & 0 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} R(A) \\ N(A∗) \end{bmatrix} → \begin{bmatrix} R(A) \\ N(A) \end{bmatrix}.

Thus, Definitions 3.1 and (5) are equivalent. As before, if A ≤⊕ B, then the relative regularity of B implies the relative regularity of A, which is in this case equivalent to the group invertibility of A. Evidently A ≤⊕ B implies that A ≤− B.
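For a finite-dimensional illustration (not from the paper) of this description of SA≤⊕: both projections are computable, since PR(A),N(A) = AA# and I − AA† is the orthogonal projection onto N(A∗), so members of SA≤⊕ can be generated and the conditions (5) verified directly.

```python
# Illustration (not from the paper) of the description of S_{A<=core}:
# P_{N(A*)} = I - A A^dagger and P_{R(A),N(A)} = A A^#, so members
# B = A + P_{N(A*)} X (I - P_{R(A),N(A)}) can be generated and (5) checked.
import numpy as np

pinv = np.linalg.pinv
rng = np.random.default_rng(1)

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])          # index-one matrix

Ag = A @ pinv(A @ A @ A) @ A             # group inverse
Ac = Ag @ A @ pinv(A)                    # core inverse A^# A A^dagger

P_orth = np.eye(3) - A @ pinv(A)         # orthogonal projection onto N(A*)
P_obl = A @ Ag                           # projection onto R(A) along N(A)

X = rng.standard_normal((3, 3))
B = A + P_orth @ X @ (np.eye(3) - P_obl)

# Core order (5) with respect to the core inverse of A:
print(np.allclose(Ac @ A, Ac @ B) and np.allclose(A @ Ac, B @ Ac))   # True
```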


Now, we give a characterization of the set S≤⊕B in the case when B is group invertible:

Theorem 3.7. Let A, B ∈ B(H) be group invertible operators and let B be given by (11). We have that A ≤⊕ B if and only if A is given by (12) for some orthogonal projection P ∈ B(R(B)) which satisfies PB1P = B1P.

Proof. (⇒:) Since A ≤− B, by Theorem 3.2 it follows that A is given by (12) for some projection P ∈ B(R(B)). Now, we need to prove that P is orthogonal and that PB1P = B1P. By the group invertibility of B we have the group invertibility of A, and we can check that the group and the Moore–Penrose inverse of A are given by

A# = \begin{bmatrix} (PB1)# & ((PB1)#)²PB2 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix} → \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix}    (15)

and

A† = \begin{bmatrix} B1∗P∗(PDP∗)† & 0 \\ B2∗P∗(PDP∗)† & 0 \end{bmatrix} : \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix} → \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix},

where D = B1B1∗ + B2B2∗. Since A⊕ = A#AA†, we obtain that

A⊕ = \begin{bmatrix} (PB1)#PDP∗(PDP∗)† & 0 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix} → \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix}.    (16)

Using A⊕A = A⊕B, we get that A = AA†B, which is equivalent to

P = PDP∗(PDP∗)†.    (17)

Hence, P is orthogonal. From AA⊕ = BA⊕ we have that A = BA#A, which using (15) implies that P = B1(PB1)#P. Now

PB1PB1 = B1(PB1)#PB1PB1 = B1PB1.

By the group invertibility of B it follows that B1 = B|R(B) is invertible, which from the last equality implies that PB1P = B1P.

(⇐:) Suppose that A is given by (12) for some orthogonal projection P ∈ B(R(B)) which satisfies PB1P = B1P. Then A⊕ is given by (16), with the remark that P∗ can be replaced by P. Using that (PDP)† = (PDP)†P, it follows that A⊕A = A⊕B. Using (16), we conclude that AA⊕ = BA⊕ is equivalent to

PB1(PB1)#PDP∗(PDP∗)† = B1(PB1)#PDP∗(PDP∗)†.


To prove that the last equality is satisfied we will prove that PB1(PB1)# = B1(PB1)#:

PB1(PB1)# = PB1PB1[(PB1)#]² = B1PB1[(PB1)#]² = B1(PB1)#.

Hence, A ≤⊕ B. □
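Theorem 3.7 can also be sanity-checked numerically in finite dimensions (illustration only, not part of the paper): for a group invertible B in the block form (11) and an orthogonal projection P on R(B) with PB1P = B1P, the operator A from (12) is indeed below B in the core order.

```python
# Illustration of Theorem 3.7 (not from the paper): P is an orthogonal
# projection on R(B) with P B1 P = B1 P, and A from (12) satisfies A <=core B.
import numpy as np

pinv = np.linalg.pinv

B1 = np.diag([1.0, 2.0])
B2 = np.array([[1.0],
               [0.0]])
B = np.block([[B1, B2],
              [np.zeros((1, 2)), np.zeros((1, 1))]])   # group invertible B

P = np.diag([1.0, 0.0])                  # orthogonal projection on R(B)
assert np.allclose(P @ B1 @ P, B1 @ P)

A = np.block([[P @ B1, P @ B2],
              [np.zeros((1, 2)), np.zeros((1, 1))]])

Ag = A @ pinv(A @ A @ A) @ A             # group inverse of A
Ac = Ag @ A @ pinv(A)                    # core inverse of A

print(np.allclose(Ac @ A, Ac @ B) and np.allclose(A @ Ac, B @ Ac))   # True
```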

In the next theorem we give a characterization of the core order related to Definition 3.1 on the set of the group invertible operators:

Theorem 3.8. Let A, B ∈ B(H) be group invertible operators. Then A ≤⊕ B if and only if there exists an orthogonal projection Q ∈ B(H) such that

(i) R(Q) = R(A),
(ii) QA = QB,
(iii) AQ = BQ.

Proof. (⇒:) Assume that B is represented by (11). Then, by Theorem 3.7, A is given by (12) for some orthogonal projection P ∈ B(R(B)) such that PB1P = B1P. Let

Q = \begin{bmatrix} P & 0 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix} → \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix}.    (18)

Then Q² = Q = Q∗, A = QB and AQ = BQ. By the right invertibility of \begin{bmatrix} B1 & B2 \end{bmatrix} and (12), we have that R(A) = R(Q).

(⇐:) Notice that A⊕ = A#AA† = A#Q. From QA = QB it follows that A#QA = A#QB, i.e., A⊕A = A⊕B. Similarly,

AA⊕ = AA† = AQAA#A† = BQAA#A† = BAA#A† = BA⊕. □

Notice that some interesting results concerning core inverses can be found in [8].

Theorem 3.9. Let A, B ∈ B(H) be group invertible operators such that A ≤⊕ B. The following are equivalent:

(i) AB = BA;
(ii) A² ≤⊕ B²;
(iii) A^j ≤⊕ B^j for every integer j ≥ 2.

Proof. (i) ⇒ (iii): By Theorem 3.7, A and B are given by (12) and (11), respectively, for some orthogonal projection P ∈ B(R(B)) such that PB1P = B1P. From AB = BA, we have PB1 = B1P, which implies that for every integer j ≥ 2, we have

A^j = \begin{bmatrix} (PB1)^j & (PB1)^{j−1}PB2 \\ 0 & 0 \end{bmatrix} = \begin{bmatrix} PB1^j & PB1^{j−1}B2 \\ 0 & 0 \end{bmatrix}.

Define an orthogonal projection Q by (18). Then A^j = QB^j, A^jQ = B^jQ and R(A^j) = R(Q). Now, by Theorem 3.8, we conclude that A^j ≤⊕ B^j.

(iii) ⇒ (ii): This is obvious.

(ii) ⇒ (i): Since A ≤⊕ B, we have that

A = AA†B and A² = BA.    (19)

From A² ≤⊕ B² and (A²)⊕ = (A⊕)², it follows that (A⊕)²A² = (A⊕)²B², which is equivalent to

A#A†A² = A#A†B².

Now, if we multiply the last equality by A² from the left side, we have A² = AA†B². By (19), it follows that A² = AB, i.e., AB = BA. □

Theorem 3.10. Let B ∈ B(H) be a group invertible operator given by (11). For an EP operator A ∈ B(H), A ≤⊕ B if and only if A is given by

A = \begin{bmatrix} PB1 & 0 \\ 0 & 0 \end{bmatrix} : \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix} → \begin{bmatrix} R(B) \\ N(B∗) \end{bmatrix},    (20)

for an orthogonal projection P ∈ B(R(B)) which satisfies PB1 = B1P and PB2 = 0.

Proof. (⇒:) By Theorem 3.7, there exists an orthogonal projection P ∈ B(R(B)) such that A is given by (12) and PB1P = B1P. Since A is EP, we have that A⊕ = A#, which is by (16) and (15) equivalent to

(PB1)#PDP(PDP)† = (PB1)#,    (21)
[(PB1)#]²PB2 = 0.    (22)

From (22), we get PB1PB2 = 0, which implies PB2 = 0. By (21) and using that (PDP)† = (PDP)†P, we have (PB1)# = (PB1)#P. Hence, PB1 = PB1P = B1P.

(⇐:) By Theorem 3.7, it follows that A ≤⊕ B. □

Similarly to [6, Proposition 27], we can prove the following:

Theorem 3.11. Let A, B ∈ B(H) be group invertible and let B be given by (11). If A ≤⊕ B, then the following statements are equivalent:

(i) B − A ≤⊕ B,
(ii) PB1 = B1P,


where P ∈ B(R(B)) is an orthogonal projection whose existence is guaranteed by Theorem 3.7.

Proof. The hypothesis A ≤⊕ B and Theorem 3.7 imply that there exists an orthogonal projection P ∈ B(R(B)) such that A is given by (12) and PB1P = B1P. So,

B − A = \begin{bmatrix} (I − P)B1 & (I − P)B2 \\ 0 & 0 \end{bmatrix}.    (23)

Using Theorem 3.7 again, B − A ≤⊕ B if and only if there exists an orthogonal projection P1 ∈ B(R(B)) such that

B − A = \begin{bmatrix} P1B1 & P1B2 \\ 0 & 0 \end{bmatrix}    (24)

and P1B1P1 = B1P1. If we suppose that there exists such a P1, then from (23) and (24) we conclude that I − P = P1. So, (I − P)B1(I − P) = B1(I − P), which is equivalent to PB1 = PB1P, i.e., PB1 = B1P. If PB1 = B1P, then (I − P)B1(I − P) = B1(I − P), so by Theorem 3.7 it follows that B − A ≤⊕ B. □

Applying Theorems 3.1, 3.4, 3.7 and 3.11 and Corollary 3.1, we obtain the next results:

Lemma 3.1. Let A, B ∈ B(H) be group invertible operators. Then the following statements hold:

(i) A ≤⊕ B and AB = BA if and only if A ≤# B and A∗A = A∗B.
(ii) A ≤⊕ B and BA∗ is Hermitian if and only if A ≤∗ B and A² = BA.
(iii) If A ≤⊕ B, then B − A ≤⊕ B if and only if A ≤# B.

Proof. (i) Assume that B is given by (11). If A ≤⊕ B and AB = BA, by Theorem 3.7, A is given by (12) for an orthogonal projection P ∈ B(R(B)) such that PB1 = B1P. Hence, by Corollary 3.1, A ≤# B. The condition A∗A = A∗B is equivalent to the fact that P is Hermitian, so it is also satisfied. If A ≤# B and A∗A = A∗B, by Corollary 3.1 it follows that A is given by (12) for an orthogonal projection P ∈ B(R(B)) such that PB1 = B1P. So by Theorem 3.7, it follows that A ≤⊕ B. It is easy to check that AB = BA.

(ii) If A ≤⊕ B or A ≤∗ B, it follows by Theorems 3.7 and 3.4 that A is given by (12) for some orthogonal projection P ∈ B(R(B)). In both cases, the fact that BA∗ is Hermitian is equivalent to DP = PD, where D = B1B1∗ + B2B2∗, while A² = BA is equivalent to PB1P = B1P. Now, by Theorems 3.7 and 3.4 we have that the equivalence holds.

(iii) It follows by Theorem 3.11 and Corollary 3.1. □


Let us remark that under the assumption that B − A and B are group invertible and B − A ≤⊕ B, we can conclude that A ≤⊕ B if and only if B − A ≤# B. If A ∈ B(H) is an EP operator, then evidently A ≤⊕ B ⇔ A ≤∗ B ⇔ A ≤# B. Also, since A ≤∗ B ⇔ B − A ≤∗ B, we have the following:

Lemma 3.2. Let A, B ∈ B(H) be such that A and B − A are EP. Then the following statements are equivalent:

(i) A ≤⊕ B;
(ii) A ≤∗ B;
(iii) A ≤# B;
(iv) B − A ≤⊕ B;
(v) B − A ≤∗ B;
(vi) B − A ≤# B.

Notice that the equivalence of statements (ii) and (v) in the previous lemma always holds. The equivalence of statements (iii) and (vi) in the previous lemma for the group invertible elements of a Rickart ring was proved in Theorem 12 of [9].

Acknowledgement

The authors would like to thank the anonymous reviewer for his/her useful suggestions, which helped to improve the original version of this paper.

References

[1] O.M. Baksalary, G. Trenkler, Core inverse of matrices, Linear Multilinear Algebra 58 (6) (2010) 681–697.
[2] A. Ben-Israel, T.N.E. Greville, Generalized Inverses: Theory and Applications, 2nd ed., Springer-Verlag, New York, 2003.
[3] M.P. Drazin, Natural structures on semigroups with involution, Bull. Amer. Math. Soc. 84 (1978) 139–141.
[4] G. Dolinar, J. Marovt, Star partial order on B(H), Linear Algebra Appl. 434 (2011) 319–326.
[5] R.E. Hartwig, How to partially order regular elements?, Math. Jpn. 25 (1980) 1–13.
[6] S.B. Malik, L. Rueda, N. Thome, Further properties on the core partial order and other matrix partial orders, Linear Multilinear Algebra 62 (12) (2014) 1629–1648.
[7] M. Nashed, Generalized Inverses and Applications, Academic Press, New York, 1976.
[8] D.S. Rakić, Generalization of sharp and core partial order using annihilators, Banach J. Math. Anal. 9 (3) (2015) 228–242.
[9] J. Marovt, On partial orders in Rickart rings, Linear Multilinear Algebra 63 (9) (2015) 1707–1723.
[10] D.S. Rakić, D.S. Djordjević, Space pre-order and minus partial order for operators on Banach spaces, Aequationes Math. 85 (2013) 429–448.
[11] D.S. Rakić, N.Č. Dinčić, D.S. Djordjević, Core inverse and core partial order of Hilbert space operators, Appl. Math. Comput. 244 (2014) 283–302.
[12] P. Šemrl, Automorphisms of B(H) with respect to minus partial order, J. Math. Anal. Appl. 369 (2010) 205–213.