We first introduce some notation. Let $C^{n\times m}$ denote the set of all $n\times m$ complex matrices, $R^{n\times m}$ the set of all $n\times m$ real matrices, and $OR^{n\times n}$ the set of all $n\times n$ orthogonal matrices. The symbols $A^{T}$, $A^{+}$, $A^{-}$, $R(A)$, $N(A)$ and $r(A)$ stand for the transpose, the Moore-Penrose generalized inverse, any generalized inverse, the range (column space), the null space and the rank of $A\in R^{n\times m}$, respectively. The symbols $E_{A}$ and $F_{A}$ stand for the two projectors $E_{A}=I-AA^{-}$ and $F_{A}=I-A^{-}A$ induced by $A$. The matrices $I$ and $0$ denote the identity and zero matrices of sizes implied by context. We use $\langle A, B\rangle={\rm{trace}}(B^TA)$ to define the inner product of matrices $A$ and $B$ in $R^{n\times m}$; then $R^{n\times m}$ is a Hilbert inner product space, and the norm generated by this inner product is the Frobenius norm $\|\cdot\|$, that is, $\|A\|=\sqrt{\langle A, A\rangle}=({\rm{trace}} (A^TA))^{\frac{1}{2}}$.
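As a numerical illustration of this notation (a sketch only; the test matrix is our own, and the generalized inverse $A^{-}$ is taken to be the particular choice $A^{+}$, computed by `numpy.linalg.pinv`), the trace formula for the Frobenius norm and the projector properties of $E_{A}$ and $F_{A}$ can be checked as follows:

```python
import numpy as np

# A rank-deficient test matrix of our own choosing; numpy.linalg.pinv
# computes the Moore-Penrose inverse A^+, used here as A^-.
A = np.array([[1.0, 2.0, 0.0],
              [2.0, 4.0, 0.0]])
Ap = np.linalg.pinv(A)

# Frobenius norm from the trace inner product <A, A> = trace(A^T A).
fro_from_trace = np.sqrt(np.trace(A.T @ A))
assert np.isclose(fro_from_trace, np.linalg.norm(A))  # norm() defaults to Frobenius

# Projectors induced by A.
E_A = np.eye(2) - A @ Ap     # E_A = I - A A^-, projects onto R(A)^perp
F_A = np.eye(3) - Ap @ A     # F_A = I - A^- A, projects onto N(A)
assert np.allclose(E_A @ E_A, E_A)   # idempotent
assert np.allclose(F_A @ F_A, F_A)
assert np.allclose(E_A @ A, 0)       # E_A annihilates the columns of A
assert np.allclose(A @ F_A, 0)       # F_A annihilates the rows of A
```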
Research on extreme ranks of solutions to linear matrix equations has been actively pursued for more than 30 years. For instance, Mitra [1] considered solutions with fixed ranks of the matrix equations $AX=B$ and $AXB=C$; Mitra [2] gave common solutions of minimal rank of the pair of matrix equations $AX=C$, $XB=D$; Uhlig [3] gave the maximal and minimal ranks of solutions of the equation $AX=B$; Mitra [4] examined common solutions of minimal rank of the pair of matrix equations $A_{1}X_{1}B_{1}=C_{1}$ and $A_{2}X_{2}B_{2}=C_{2}$. More recently, by applying the matrix rank method, Tian [5] obtained the minimal rank of solutions to the matrix equation $A=BX+YC$. In 2003, Tian [6, 7] investigated the extremal rank solutions to the complex matrix equation $AXB=C$ and gave some applications. In 2006, Lin and Wang [8] studied the extreme ranks of solutions to the system of matrix equations $A_{1}X=C_{1}$, $XB_{2}=C_{2}$, $A_{3}XB_{3}=C_{3}$ over an arbitrary division ring, a system also investigated in [9] and [10]. Recently, Xiao et al. considered the extremal (i.e. maximal and minimal) ranks of solutions to the equation $AX=B$ (see, e.g., [11-15]).
In this paper, we consider the extremal rank solutions of the matrix equations
$$AX=B,\qquad XC=D,\eqno{(1.1)}$$
where $A\in R^{p\times m}, B\in R^{p\times n}, C\in R^{n\times q}, D\in R^{m\times q}$ are given matrices.
The paper is organized as follows. In Section 2, we introduce several lemmas that will be used in the later sections. In Section 3, applying the matrix rank method, we discuss the extremal ranks of the general solution to the matrix equations $AX=B$, $XC=D$, where $A\in R^{p\times m}, B\in R^{p\times n}, C\in R^{n\times q}, D\in R^{m\times q}$ are given matrices.
Lemma 2.1 (see [6]) Let $A$, $B$, $C$, and $D$ be $m\times n$, $m\times k$, $l\times n$, $l\times k$ matrices, respectively. Then
$$r\begin{pmatrix}A & B\\ C & D\end{pmatrix}=r(A)+r(G)+r(H)+r\left(E_{G}\left(D-CA^{-}B\right)F_{H}\right),\eqno{(2.1)}$$
where $G=CF_{A}$ and $H=E_{A}B$.
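Lemma 2.1 is the classical Marsaglia–Styan block-rank formula. As a hedged numerical sketch (the random test matrices are our own, and we take the particular generalized inverse $A^{-}=A^{+}$; the identity holds for any choice of $A^{-}$), it can be checked as follows:

```python
import numpy as np

rng = np.random.default_rng(0)
# Random test data; A and C are made rank-deficient so that the
# projector factors F_A, E_A, E_G, F_H are all nontrivial.
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 5))
B = rng.standard_normal((4, 3))
C = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 5))
D = rng.standard_normal((4, 3))

Am = np.linalg.pinv(A)                        # particular choice A^- = A^+
G = C @ (np.eye(5) - Am @ A)                  # G = C F_A
H = (np.eye(4) - A @ Am) @ B                  # H = E_A B
EG = np.eye(4) - G @ np.linalg.pinv(G)        # E_G = I - G G^+
FH = np.eye(3) - np.linalg.pinv(H) @ H        # F_H = I - H^+ H

r = np.linalg.matrix_rank
lhs = r(np.block([[A, B], [C, D]]))
rhs = r(A) + r(G) + r(H) + r(EG @ (D - C @ Am @ B) @ FH)
assert lhs == rhs                             # the block-rank identity
```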
Lemma 2.2 (see [16]) Given $A\in R^{p\times m}, B\in R^{p\times n}, C\in R^{n\times q}, D\in R^{m\times q}$, let the singular value decomposition of $A$ be
$$A=U\begin{pmatrix}\Sigma & 0\\ 0 & 0\end{pmatrix}V^{T}=U_{1}\Sigma V_{1}^{T},\eqno{(2.3)}$$
where $U=(U_{1}, U_{2})\in OR^{p\times p}$, $U_{1}\in R^{p\times k}$, $V=(V_{1}, V_{2})\in OR^{m\times m}$, $V_{1}\in R^{m\times k}$, $k=r(A)$, $\Sigma={\rm diag}(\sigma_{1}, \sigma_{2}, \cdots, \sigma_{k})$, $\sigma_{1}\geq\cdots\geq\sigma_{k}>0$, and let the singular value decomposition of $C$ be
$$C=P\begin{pmatrix}\Gamma & 0\\ 0 & 0\end{pmatrix}Q^{T}=P_{1}\Gamma Q_{1}^{T},\eqno{(2.4)}$$
where $P=(P_{1}, P_{2})\in OR^{n\times n}$, $P_{1}\in R^{n\times t}$, $Q=(Q_{1}, Q_{2})\in OR^{q\times q}$, $Q_{1}\in R^{q\times t}$, $t=r(C)$, $\Gamma={\rm diag}(\gamma_{1}, \gamma_{2}, \cdots, \gamma_{t})$, $\gamma_{1}\geq\cdots\geq\gamma_{t}>0$. Then the matrix equations (1.1) have a solution in $R^{m\times n}$ if and only if
$$AA^{+}B=B,\qquad DC^{+}C=D,\qquad AD=BC.\eqno{(2.5)}$$
Moreover, its general solution can be expressed as
$$X=A^{+}B+DC^{+}-A^{+}ADC^{+}+(I-A^{+}A)Z(I-CC^{+}),\eqno{(2.6)}$$
where $Z\in R^{m\times n}$ is arbitrary.
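The general solution of (1.1) quoted above — which we take in the standard form $X=A^{+}B+DC^{+}-A^{+}ADC^{+}+(I-A^{+}A)Z(I-CC^{+})$, with $Z$ arbitrary — can be checked numerically. In the sketch below, the consistent test data are our own construction ($B$ and $D$ are generated from a common $X_{\rm true}$, which forces the solvability conditions to hold):

```python
import numpy as np

rng = np.random.default_rng(1)
p, m, n, q = 3, 5, 4, 3
X_true = rng.standard_normal((m, n))
A = rng.standard_normal((p, m))        # p < m: A has a nontrivial null space
C = rng.standard_normal((n, q))
B = A @ X_true                          # forces AX = B to be consistent
D = X_true @ C                          # forces XC = D, and AD = BC

Ap, Cp = np.linalg.pinv(A), np.linalg.pinv(C)
Z = rng.standard_normal((m, n))         # arbitrary parameter matrix
X = (Ap @ B + D @ Cp - Ap @ A @ D @ Cp
     + (np.eye(m) - Ap @ A) @ Z @ (np.eye(n) - C @ Cp))

assert np.allclose(A @ X, B)            # X solves AX = B
assert np.allclose(X @ C, D)            # X solves XC = D
```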
Lemma 2.3 Suppose that the matrix equations (1.1) are consistent, let the singular value decompositions of $A$ and $C$ be given by (2.3) and (2.4), respectively, and denote by $X$ a solution of the matrix equations (1.1). Then the matrix $V^{T}XP$ can be partitioned as
$$V^{T}XP=\begin{pmatrix}X_{11} & X_{12}\\ X_{21} & X_{22}\end{pmatrix},\eqno{(2.7)}$$
where $X_{11}\in R^{k\times t}$, $X_{12}\in R^{k\times (n-t)}$, $X_{21}\in R^{(m-k)\times t}$, and $X_{22}\in R^{(m-k)\times (n-t)}$
is arbitrary.
Proof Since $Z$ in (2.6) is arbitrary, we conclude from (2.3), (2.4) and (2.7) that $X_{22}$ is arbitrary as well. We omit the details.
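Lemma 2.3 can be illustrated numerically: for two solutions of (1.1) produced by different choices of the free parameter, the blocks $X_{11}$, $X_{12}$, $X_{21}$ of $V^{T}XP$ coincide, while $X_{22}$ differs. A sketch (the consistent data and the standard-form general solution used below are our own construction, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
p, m, n, q = 3, 5, 4, 3
X_true = rng.standard_normal((m, n))
A = rng.standard_normal((p, m))
C = rng.standard_normal((n, q))
B, D = A @ X_true, X_true @ C           # consistent right-hand sides

Ap, Cp = np.linalg.pinv(A), np.linalg.pinv(C)

def solution(Z):
    # standard-form general solution of AX = B, XC = D (Z arbitrary)
    return (Ap @ B + D @ Cp - Ap @ A @ D @ Cp
            + (np.eye(m) - Ap @ A) @ Z @ (np.eye(n) - C @ Cp))

Xa = solution(np.zeros((m, n)))
Xb = solution(rng.standard_normal((m, n)))

U, _, Vt = np.linalg.svd(A); V = Vt.T   # A = U Sigma V^T
P, _, Qt = np.linalg.svd(C)             # C = P Gamma Q^T
k, t = np.linalg.matrix_rank(A), np.linalg.matrix_rank(C)

Ma, Mb = V.T @ Xa @ P, V.T @ Xb @ P
assert np.allclose(Ma[:k, :], Mb[:k, :])                 # X11, X12 are fixed
assert np.allclose(Ma[:, :t], Mb[:, :t])                 # X11, X21 are fixed
assert np.linalg.norm(Ma[k:, t:] - Mb[k:, t:]) > 1e-8    # X22 varies with Z
```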
Assume that the matrix equations (1.1) have a solution $X\in R^{m\times n}$. By Lemma 2.3, the general solution can be written as
$$X=V\begin{pmatrix}X_{11} & X_{12}\\ X_{21} & X_{22}\end{pmatrix}P^{T}.\eqno{(3.1)}$$
Let $G_{1}=X_{21}F_{X_{11}}$, $H_{1}=E_{X_{11}}X_{12}$, and let the singular value decompositions of $G_{1}$ and $H_{1}^{+}$ be, respectively,
$$G_{1}=U_{G_{1}}\begin{pmatrix}\Sigma_{1} & 0\\ 0 & 0\end{pmatrix}V_{G_{1}}^{T}=U_{11}\Sigma_{1}V_{11}^{T},\eqno{(3.2)}$$
where $U_{G_{1}}=(U_{11}, U_{12})\in OR^{(m-k)\times (m-k)}$, $U_{11}\in R^{(m-k)\times k_{1}}$, $V_{G_{1}}=(V_{11}, V_{12})\in OR^{t\times t}$, $V_{11}\in R^{t\times k_{1}}$, $k_{1}=r(G_{1})$, $\Sigma_{1}={\rm diag}(\alpha_{11}, \alpha_{21}, \cdots, \alpha_{k_{1}1})$, $\alpha_{11}\geq\cdots\geq\alpha_{k_{1}1}>0$, and
$$H_{1}^{+}=P_{H_{1}^{+}}\begin{pmatrix}\Gamma_{1} & 0\\ 0 & 0\end{pmatrix}Q_{H_{1}^{+}}^{T}=P_{11}\Gamma_{1}Q_{11}^{T},\eqno{(3.3)}$$
where $P_{H_{1}^{+}}=(P_{11}, P_{12})\in OR^{(n-t)\times (n-t)}$, $P_{11}\in R^{(n-t) \times t_{1}}$, $Q_{H_{1}^{+}}=(Q_{11}, Q_{12})\in OR^{k\times k}$, $Q_{11}\in R^{k\times t_{1}}$, $t_{1}=r(H_{1}^{+})$, $\Gamma_{1}={\rm diag}(\beta_{11}, \beta_{21}, \cdots, \beta_{t_{1}1})$, $\beta_{11}\geq\cdots\geq\beta_{t_{1}1}>0$.
Now we can establish the existence theorems as follows.
Theorem 3.1 Given $A\in R^{p\times m}, B\in R^{p\times n}, C\in R^{n\times q}, D\in R^{m\times q}$, let the singular value decompositions of the matrices $A$, $C$ and $G_{1}$, $H_{1}^{+}$ be given by $(2.3)$, $(2.4)$ and $(3.2)$, $(3.3)$, respectively. Then the equations $(1.1)$ have a solution $X$ if and only if
In this case, let $\Omega$ denote the set of all solutions of the equations $(1.1)$. Then the extreme ranks of $X\in\Omega$ are as follows:
$(1)$ The minimal rank of $X$ is
The general expression of $X$ satisfying $(3.5)$ is
where $X_{0}=DC^{+}+A^{+}B-A^{+}ADC^{+}+(I-A^{+}A)DC^{+}(A^{+}BCC^{+})^{+}A^{+}B(I-CC^{+})$, and $\tilde{Y}\in R^{(m-k)\times (n-t)}$ is an arbitrary matrix.
$(2)$ The maximal rank of $X$ is
The general expression of $X$ satisfying $(3.7)$ is
and the arbitrary matrix $Y\in R^{(m-k)\times (n-t)}$ satisfies
Proof Suppose the matrix equations (1.1) have a solution $X$; then it follows from Lemma 2.2 that (3.4) hold. In this case, let $\Omega$ be the set of all solutions of the equations $(1.1)$. By (3.1),
By Lemma 2.1, we have
$$r(X)=r(X_{11})+r(G_{1})+r(H_{1})+r\left(E_{G_{1}}\left(X_{22}-X_{21}X_{11}^{-}X_{12}\right)F_{H_{1}}\right),\eqno{(3.10)}$$
where $G_{1}=X_{21}F_{X_{11}}$, $H_{1}=E_{X_{11}}X_{12}$.
$(1)$ By (3.10),
Then (3.4) hold. By Lemma 2.2, the general expression of $X$ satisfying $(3.5)$ can be expressed as
where $Y\in R^{(m-k)\times (n-t)}$ satisfies $E_{G_{1}}YF_{H_{1}}=0$.
By (3.1),
By (2.3), (2.4), $A^{+}=V_{1}\Sigma^{-1}U_{1}^{T}$, $CC^{+}=P_{1}P_{1}^{T}$. Then
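These two identities, $A^{+}=V_{1}\Sigma^{-1}U_{1}^{T}$ and $CC^{+}=P_{1}P_{1}^{T}$, follow directly from the SVDs (2.3) and (2.4) and can be verified numerically (a sketch; the test matrices are our own):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((3, 5))          # stands in for A in (2.3)
C = rng.standard_normal((4, 3))          # stands in for C in (2.4)

U, s, Vt = np.linalg.svd(A)              # full SVD: A = U Sigma V^T
k = np.linalg.matrix_rank(A)
U1, V1 = U[:, :k], Vt.T[:, :k]
Sigma = np.diag(s[:k])

# A^+ = V_1 Sigma^{-1} U_1^T
assert np.allclose(np.linalg.pinv(A), V1 @ np.linalg.inv(Sigma) @ U1.T)

P, g, Qt = np.linalg.svd(C)              # C = P Gamma Q^T
t = np.linalg.matrix_rank(C)
P1 = P[:, :t]
# C C^+ = P_1 P_1^T, the orthogonal projector onto R(C)
assert np.allclose(C @ np.linalg.pinv(C), P1 @ P1.T)
```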
Thus we obtain
By (3.2), (3.3),
Thus $E_{G_{1}}YF_{H_{1}}=0$, i.e. $U_{12}U_{12}^{T}YP_{12}P_{12}^{T}=0$, and we have
$$Y=\tilde{Y}-U_{12}U_{12}^{T}\tilde{Y}P_{12}P_{12}^{T},\eqno{(3.13)}$$
where $\tilde{Y}\in R^{(m-k)\times (n-t)}$ is arbitrary.
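The parametrization of $Y$ solving $U_{12}U_{12}^{T}YP_{12}P_{12}^{T}=0$ — we take the standard form $Y=\tilde{Y}-U_{12}U_{12}^{T}\tilde{Y}P_{12}P_{12}^{T}$ with $\tilde{Y}$ arbitrary — relies only on the idempotency of the two orthogonal projectors. A small sketch with stand-in projectors (our own random construction; the dimensions are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
# Stand-ins for U12 and P12: matrices with orthonormal columns, via QR.
U12, _ = np.linalg.qr(rng.standard_normal((5, 2)))
P12, _ = np.linalg.qr(rng.standard_normal((4, 1)))
PU, PP = U12 @ U12.T, P12 @ P12.T       # orthogonal projectors, PU^2 = PU etc.

Y_t = rng.standard_normal((5, 4))       # an arbitrary \tilde{Y}
Y = Y_t - PU @ Y_t @ PP                 # proposed general solution
assert np.allclose(PU @ Y @ PP, 0)      # solves U12 U12^T Y P12 P12^T = 0
```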
Substituting (3.12) and (3.13) into (3.11) yields (3.6).
$(2)$ By (3.10),
Since $E_{G_{1}}$ and $F_{H_{1}}$ are idempotent matrices, we have
Then the maximal rank of the solutions to the matrix equations (1.1) is
By Lemma 2.2, the general expression of $X$ satisfying $(3.7)$ can be expressed as
where $X_{0}=DC^{+}+A^{+}B-A^{+}ADC^{+}+(I-A^{+}A)DC^{+}(A^{+}BCC^{+})^{+}A^{+}B(I-CC^{+})$, and the arbitrary matrix $Y\in R^{(m-k)\times (n-t)}$ satisfies
The proof is completed.
The result in (3.5) yields Theorem 3 of [2] as a corollary.
Corollary Assume that $r(B)\leq r(D)$ and that the matrix equations (1.1) are consistent. Then the matrix equations (1.1) have a solution with rank $r(D)$ if and only if $r(BC)=r(B)$.