I find that writing up these solutions to the exercises and problems costs me a great deal of time... I am sorry that I could not be persistent in doing it... I hope I can pick it up again later on. [Bhatia. Matrix Analysis. Solutions to Exercises and Problems] PrI.6.1…
Let $x,y,z$ be linearly independent vectors in $\scrH$. Find a necessary and sufficient condition that a vector $w$ must satisfy in order that the bilinear functional $$\bex F(u,v)=\sef{x,u}\sef{y,v}+\sef{z,u}\sef{w,v} \eex$$ is elementary. Solution.…
For every matrix $A$, the matrix $$\bex \sex{\ba{cc} I&A\\ 0&I \ea} \eex$$ is invertible and its inverse is $$\bex \sex{\ba{cc} I&-A\\ 0&I \ea}. \eex$$ Use this to show that if $A,B$ are any two $n\times n$ matrices, then $$\bex \sex{\ba{c…
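As a quick check of the claimed inverse, block multiplication gives $$\bex \sex{\ba{cc} I&A\\ 0&I \ea}\sex{\ba{cc} I&-A\\ 0&I \ea}=\sex{\ba{cc} I&-A+A\\ 0&I \ea}=\sex{\ba{cc} I&0\\ 0&I \ea}, \eex$$ and the product in the reverse order is the identity as well.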
Every $k\times k$ positive matrix $A=(a_{ij})$ can be realised as a Gram matrix, i.e., vectors $x_j$, $1\leq j\leq k$, can be found so that $a_{ij}=\sef{x_i,x_j}$ for all $i,j$. Solution. By Exercise I.2.2, $A=B^*B$ for some $B$. Let $$\bex B=(x_1,\c…
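Concretely, if $B=(x_1,\cdots,x_k)$ has the vectors $x_j$ as its columns, then $$\bex (B^*B)_{ij}=x_i^*x_j=\sef{x_i,x_j}, \eex$$ so $a_{ij}=(B^*B)_{ij}=\sef{x_i,x_j}$, as required.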
Show that the inner product $$\bex \sef{x_1\vee \cdots \vee x_k,y_1\vee \cdots\vee y_k} \eex$$ is equal to the permanent of the $k\times k$ matrix $\sex{\sef{x_i,y_j}}$. Solution. $$\beex \bea &\quad \sef{x_1\vee \cdots \vee x_k,y_1\vee \cdots \vee y…
Show that the inner product $$\bex \sef{x_1\wedge \cdots \wedge x_k,y_1\wedge \cdots\wedge y_k} \eex$$ is equal to the determinant of the $k\times k$ matrix $\sex{\sef{x_i,y_j}}$. Solution. $$\beex \bea &\quad \sef{x_1\wedge\cdots \wedge x_k,y_1\wedg…
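For instance, when $k=2$ the claim reads $$\bex \sef{x_1\wedge x_2,y_1\wedge y_2}=\sef{x_1,y_1}\sef{x_2,y_2}-\sef{x_1,y_2}\sef{x_2,y_1}=\det\sex{\sef{x_i,y_j}}. \eex$$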
Let $A$ and $B$ be two matrices (not necessarily of the same size). Relative to the lexicographically ordered basis on the space of tensors, the matrix for $A\otimes B$ can be written in block form as follows: if $A=(a_{ij})$, then $$\bex A\otimes B=…
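For example, in this block form $$\bex \sex{\ba{cc} 1&2\\ 3&4 \ea}\otimes B=\sex{\ba{cc} B&2B\\ 3B&4B \ea}, \eex$$ which is a $2p\times 2q$ matrix when $B$ is $p\times q$.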
(1). There is a natural isomorphism between the spaces $\scrK\otimes \scrH^*$ and $\scrL(\scrH,\scrK)$ in which the elementary tensor $k\otimes h^*$ corresponds to the linear map that takes a vector $u$ of $\scrH$ to $\sef{h,u}k$. This linear transfor…
For any matrix $A$ the series $$\bex \exp A=I+A+\frac{A^2}{2!}+\cdots+\frac{A^n}{n!}+\cdots \eex$$ converges. This is called the exponential of $A$. The matrix $\exp A$ is always invertible and $$\bex (\exp A)^{-1}=\exp(-A). \eex$$ Conversely, every inver…
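A small worked example: for the nilpotent matrix $A=\sex{\ba{cc} 0&t\\ 0&0 \ea}$ the series terminates, and $$\bex \exp A=\sex{\ba{cc} 1&t\\ 0&1 \ea},\quad \exp(-A)=\sex{\ba{cc} 1&-t\\ 0&1 \ea},\quad (\exp A)(\exp(-A))=I, \eex$$ consistent with $(\exp A)^{-1}=\exp(-A)$.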
The set of all invertible matrices is a dense open subset of the set of all $n\times n$ matrices. The set of all unitary matrices is a compact subset of all $n\times n$ matrices. These two sets are also groups under multiplication. They are called th…
If $\sen{A}<1$, then $I-A$ is invertible, and $$\bex (I-A)^{-1}=I+A+A^2+\cdots, \eex$$ a convergent power series. This is called the Neumann series. Solution.  Since $\sen{A}<1$, $$\bex \sum_{n=0}^\infty \sen{A}^n=\frac{1}{1-\sen{A}}<\infty. \ee…
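Absolute convergence also justifies summing the series: for every $N$, $$\bex (I-A)(I+A+\cdots+A^N)=I-A^{N+1}, \eex$$ and $\sen{A^{N+1}}\leq \sen{A}^{N+1}\to 0$, so letting $N\to\infty$ (and noting the same identity with the factors in the other order) shows that the sum of the series is indeed $(I-A)^{-1}$.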
Show that matrices with distinct eigenvalues are dense in the space of all $n\times n$ matrices. (Use the Schur triangularisation) Solution.  By the Schur triangularisation, for each matrix $A$, there exists a unitary $U$ such that $$\bex A=U\sex{\ba…
For fixed bases in $\scrH$ and $\scrK$, the matrix of $A^*$ is the conjugate transpose of the matrix of $A$. Solution.  $$\beex \bea (A^*)_{ij}&=e_i^*A^*f_j\\ &=\sef{e_i,A^*f_j}_\scrH\\ &=\sef{Ae_i,f_j}_\scrK\\ &=\overline{\sef{f_j,Ae_i}}\…
Let $X$ be any basis of $\scrH$ and let $Y$ be the basis biorthogonal to it. Using matrix multiplication, $X$ gives a linear transformation from $\bbC^n$ to $\scrH$. The inverse of this is given by $Y^*$. In the special case when $X$ is orthonormal (…
Given a basis $U=(u_1,\cdots,u_n)$, not necessarily orthonormal, in $\scrH$, how would you compute the biorthogonal basis $\sex{v_1,\cdots,v_n}$? Find a formula that expresses $\sef{v_j,x}$ for each $x\in\scrH$ and $j=1,\cdots,n$ in terms of Gram matr…
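One way to organise the computation (a sketch, writing $U$ also for the matrix whose columns are $u_1,\cdots,u_n$): the biorthogonal basis $V=(v_1,\cdots,v_n)$ is characterised by $V^*U=I$, so $V=U(U^*U)^{-1}$ and hence $$\bex \sef{v_j,x}=\sex{(U^*U)^{-1}U^*x}_j, \eex$$ which expresses the answer in terms of the Gram matrix $U^*U$.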
(Schur's Theorem) If $A$ is positive, then $$\bex \per(A)\geq \det A. \eex$$ Solution. By Exercise I.2.2, $A=T^*T$ for some upper triangular $T$ with non-negative diagonals. Thus $$\beex \bea \det A&=\det T^*\cdot \det T\\ &=\per T^*\cdot \per T\\…
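A $2\times 2$ sanity check: for a positive matrix $A=\sex{\ba{cc} a&b\\ \bar b&c \ea}$ we have $$\bex \per(A)=ac+|b|^2\geq ac-|b|^2=\det A. \eex$$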
Prove that for any matrices $A,B$ we have $$\bex |\per (AB)|^2\leq \per (AA^*)\cdot \per (B^*B). \eex$$ (The corresponding relation for determinants is an easy equality.) Solution. Let $$\bex A=\sex{\ba{cc} \al_1\\ \vdots\\ \al_n \ea},\quad B=\sex{\b…
Prove that for any vectors $$\bex u_1,\cdots,u_k,\quad v_1,\cdots,v_k, \eex$$ we have $$\bex |\det(\sef{u_i,v_j})|^2 \leq \det\sex{\sef{u_i,u_j}}\cdot \det \sex{\sef{v_i,v_j}}, \eex$$ $$\bex |\per(\sef{u_i,v_j})|^2 \leq \per\sex{\sef{u_i,u_j}}\cdot \…
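For $k=1$ both inequalities reduce to the Cauchy-Schwarz inequality $$\bex |\sef{u_1,v_1}|^2\leq \sef{u_1,u_1}\sef{v_1,v_1}, \eex$$ of which they are the Gram-matrix generalisations.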
Let $A$ be a nilpotent operator. Show how to obtain, from a Jordan basis for $A$, a Jordan basis of $\wedge^2A$. Solution. Since $A$ is nilpotent, each eigenvalue of $A$ is zero, and thus there exists a basis $e_1,\cdots,e_n$ of $\scrH$ such that $$\be…
If $\dim \scrH=3$, then $\dim \otimes^3\scrH =27$, $\dim \wedge^3\scrH =1$ and $\dim \vee^3\scrH =10$. In terms of an orthonormal basis of $\scrH$, write an element of $(\wedge^3\scrH \oplus \vee^3\scrH)^\perp$. Solution. Let $e_1,e_2,e_3$ be an ort…
Let $\scrM$ be a $p$-dimensional subspace of $\scrH$ and $\scrN$ its orthogonal complement. Choosing $j$ vectors from $\scrM$ and $k-j$ vectors from $\scrN$ and forming the linear span of the antisymmetric tensor products of all such vectors, we get…
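The dimension count behind this decomposition is the Vandermonde identity: with $\dim \scrH=n$ and $\dim \scrM=p$, $$\bex \sum_{j=0}^k\binom{p}{j}\binom{n-p}{k-j}=\binom{n}{k}=\dim \wedge^k\scrH. \eex$$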
The elementary tensors $x\otimes \cdots \otimes x$, with all factors equal, are all in the subspace $\vee^k\scrH$. Do they span it? Solution. Yes. Indeed, take $$\beex \bea &\quad (x+y)\otimes (x+y)-x\otimes x-y\otimes y\\ &=x\otimes y+y\otimes x\…
Suppose it is known that $\scrM$ is an invariant subspace for $A$. What invariant subspaces for $A\otimes A$ can be obtained from this information alone? Solution. It is $\scrM\otimes \scrM$ that is an invariant subspace of $A\otimes A$. Indeed, if $…
If $A$ is a contraction, show that $$\bex A^*(I-AA^*)^{1/2}=(I-A^*A)^{1/2}A^*. \eex$$ Use this to show that if $A$ is a contraction on $\scrH$, then the operators $$\bex U=\sex{\ba{cc} A&(I-AA^*)^{1/2}\\ (I-A^*A)^{1/2}&-A^* \ea}, \eex$$ $$\bex V=\…
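One way to see the first identity (a sketch): $A^*(AA^*)^n=(A^*A)^nA^*$ for every $n\geq 0$, hence $A^*p(AA^*)=p(A^*A)A^*$ for every polynomial $p$; since $A$ is a contraction, $AA^*$ and $A^*A$ have spectra in $[0,1]$, and approximating $t\mapsto (1-t)^{1/2}$ uniformly on $[0,1]$ by polynomials gives $$\bex A^*(I-AA^*)^{1/2}=(I-A^*A)^{1/2}A^*. \eex$$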
Let $A=A_1\oplus A_2$. Show that (1). $W(A)$ is the convex hull of $W(A_1)$ and $W(A_2)$; i.e., the smallest convex set containing $W(A_1)\cup W(A_2)$. (2). $$\beex \bea \sen{A}&=\max\sed{\sen{A_1},\sen{A_2}},\\ \spr(A)&=\max\sed{\spr(A_1),\spr(A_…
(1). The numerical radius defines a norm on $\scrL(\scrH)$. (2). $w(UAU^*)=w(A)$ for all $U\in \U(n)$. (3). $w(A)\leq \sen{A}\leq 2w(A)$ for all $A$. (4). $w(A)=\sen{A}$ if (but not only if) $A$ is normal. Solution. (1). We only need to show that $$\…
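The factor $2$ in (3) is sharp: for the nonnormal matrix $A=\sex{\ba{cc} 0&1\\ 0&0 \ea}$, the set $W(A)$ is the closed disc of radius $\frac{1}{2}$ centred at $0$, so $w(A)=\frac{1}{2}$ while $\sen{A}=1$, i.e., $\sen{A}=2w(A)$.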
(1). When $A$ is normal, the set $W(A)$ is the convex hull of the eigenvalues of $A$. For nonnormal matrices, $W(A)$ may be bigger than the convex hull of its eigenvalues. For Hermitian operators, the first statement says that $W(A)$ is the closed int…
(1). The singular value decomposition leads to the polar decomposition: Every operator $A$ can be written as $A=UP$, where $U$ is unitary and $P$ is positive. In this decomposition the positive part $P$ is unique, $P=|A|$. The unitary part $U$ is uni…
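The passage from one decomposition to the other is a regrouping: writing a singular value decomposition as $A=WSV^*$ with $W,V$ unitary and $S$ diagonal with nonnegative entries, $$\bex A=(WV^*)(VSV^*)=UP,\quad U=WV^*,\quad P=VSV^*, \eex$$ and $P^2=VS^2V^*=A^*A$, so $P=(A^*A)^{1/2}=|A|$.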
(1). Let $\sed{A_\al}$ be a family of mutually commuting operators. Then, there exists a common Schur basis for $\sed{A_\al}$. In other words, there exists a unitary $Q$ such that $Q^*A_\al Q$ is upper triangular for all $\al$. (2). Let $\sed{A_\al}$…
Show that the following statements are equivalent: (1). $A$ is positive. (2). $A=B^*B$ for some $B$. (3). $A=T^*T$ for some upper triangular $T$. (4). $A=T^*T$ for some upper triangular $T$ with nonnegative diagonal entries. If $A$ is positive defini…
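The implications $(4)\Rightarrow(3)\Rightarrow(2)$ are immediate, and $(2)\Rightarrow(1)$ is the one-line computation $$\bex \sef{x,Ax}=\sef{x,B^*Bx}=\sef{Bx,Bx}=\sen{Bx}^2\geq 0 \eex$$ for every $x$; the substance of the exercise is producing the triangular factorisation from positivity.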