Linear algebra paper 2019

 
1. Let $V$ be a finite-dimensional vector space over $ℂ$ and let $T: V → V$ be a linear transformation.
   (a) Define the minimal polynomial $m_T(x)$ of $T$. Show that $λ ∈ ℂ$ is a root of $m_T(x)$ if and only if $λ$ is an eigenvalue of $T$.
   (b) Show that $m_T(x)$ has distinct roots if and only if $T$ is diagonalizable (i.e. $V$ has a basis consisting of eigenvectors of $T$).
   (c) Assume that $\dim V > 1$. Prove that there exists a linear transformation $B: V → V$ such that for every polynomial $p(x) ∈ ℂ[x]$ we have $B ≠ p(T)$.
   [You may use without proof properties of polynomials over fields and the Cayley-Hamilton theorem, provided you state them clearly. If you use the Primary Decomposition Theorem you should prove it.]
2. For a field $𝔽$ we denote by $SL(2, 𝔽)$ the set of $2 × 2$ matrices with entries in $𝔽$ and determinant $1$.
   (a) Suppose that $A ∈ SL(2, ℂ)$ is diagonalizable. Show that $A = B^2$ for some $B ∈ SL(2, ℂ)$.
   (b) Show that the map $A ↦ A^2$ for $A ∈ SL(2, ℂ)$ is not surjective onto $SL(2, ℂ)$.
   (c) Suppose now $𝔽 = 𝔽_p$ is a finite field of size $p$, where $p$ is a prime number. Does there exist $p$ such that the map $A ↦ A^2$ with $A ∈ SL(2, 𝔽_p)$ is surjective onto $SL(2, 𝔽_p)$? Justify your answer.
3. Let $V, W$ be two vector spaces over a field $𝔽$ and let $V'$ and $W'$ denote the dual spaces of $V$ and $W$ respectively. Let $T: V → W$ be a linear transformation and let $T': W' → V'$ be the dual of $T$.
   (a) Assume that both $V$ and $W$ are finite-dimensional.
       (i) Let $B$ be a basis of $V$ and let $C$ be a basis of $W$. Define the dual basis $B'$ to $B$ in $V'$. Let $C'$ be the dual basis to $C$ in $W'$. Prove that the matrix $_{B'}[T']_{C'}$ is the transpose of $_C[T]_B$.
       (ii) Suppose $V = W$. Show that $T$ and $T'$ have the same minimal polynomial and the same characteristic polynomial.
       (iii) For a subspace $U$ of $V$ define its annihilator $U^0$ in $V'$ and prove that $\dim U + \dim U^0 = \dim V$.
   (b) (i) Suppose $\dim V$ and $\dim W$ are both finite. Prove that $\dim \operatorname{Im}(T) = \dim \operatorname{Im}(T')$.
       (ii) Now suppose $\dim V$ is infinite but $\dim W$ is finite. Is it still true that $\dim \operatorname{Im}(T) = \dim \operatorname{Im}(T')$? Give a proof or a counterexample.
   (c) Assume that $\dim V$ is finite. Let $f_1, …, f_k ∈ V'$. Show that $\{f_1, …, f_k\}$ spans $V'$ if and only if $\bigcap_{i=1}^k \ker f_i = \{0\}$.
4. Let $V$ be a finite-dimensional inner product space over $ℂ$ and let $N: V → V$ be a linear transformation.
   (a) For a subspace $U ⊆ V$, prove that $V = U ⊕ U^⊥$. [You may assume that $U$ has an orthonormal basis.]
   (b) (i) Define the adjoint $N^*$ of $N$. [You do not have to show its existence or uniqueness.]
       (ii) Suppose that $U$ is a subspace of $V$ such that $N(U) ⊆ U$. Show that $N^*(U^⊥) ⊆ U^⊥$.
       (iii) Prove that $N$ can be written uniquely as a sum $N = S + A$ where $S^* = S$ and $A^* = -A$. Show that $N$ and $N^*$ commute (i.e. $N N^* = N^* N$) if and only if $S$ and $A$ commute.
   (c) (i) Suppose that $N N^* = N^* N$. Show that $\ker N = \ker N^*$ and $\operatorname{Im}(N) = \operatorname{Im}(N^*)$.
       (ii) Suppose that $\|N(v)\| = \|N^*(v)\|$ for all $v ∈ V$. Show that $N N^* = N^* N$.

Solution

1. (a) The minimal polynomial $m_T(x)$ is defined to be the monic polynomial $f(x)$ of least degree such that $f(T) = 0$. It exists since the Cayley-Hamilton theorem states that $χ(T) = 0$, where $χ(x) = \det(xI - T)$ is the characteristic polynomial of $T$. Now if $λ$ is an eigenvalue of $T$ with eigenvector $v$, then $0 = m_T(T)(v) = m_T(λ) v$ and hence $m_T(λ) = 0$. Conversely, if $λ$ is a root of $m_T$ then $m_T(x) = (x - λ) g(x)$ with $\deg g < \deg m_T$. By the minimality of $m_T$ we have $g(T) ≠ 0$, so we can find a vector $v$ such that $w := g(T) v ≠ 0$. But then $(T - λ) w = (T - λ) g(T) v = m_T(T) v = 0$ and so $λ$ is an eigenvalue of $T$ with eigenvector $w$.
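   For illustration, take $T = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$ acting on $ℂ^2$. Then $$χ_T(x) = x^2 - 1 = (x-1)(x+1),$$ and since $T$ is not a scalar multiple of the identity no monic polynomial of degree $1$ annihilates $T$, so $m_T(x) = x^2 - 1$; its roots $1$ and $-1$ are exactly the eigenvalues of $T$ (with eigenvectors $(1,1)$ and $(1,-1)$).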
   (b) If $T$ is diagonalizable with respect to some basis $B$, so that $X = {}_B[T]_B$ is a diagonal matrix, let $f(x)$ be the product of the linear factors $(x - μ)$ where $μ$ ranges over the distinct diagonal entries of $X$. Then $f(T) = 0$, so $m_T$ divides $f$; since by (a) every eigenvalue of $T$ is a root of $m_T$, we deduce that $f = m_T$, and $f$ has distinct roots by construction. Conversely suppose $m_T = \prod_{i=1}^k (x - μ_i)$ has distinct roots. We argue by induction on $k$, the case $k = 1$ being clear. Write $m_T(x) = (x - μ_1) g(x)$; then $g(μ_1) ≠ 0$. Let $U_1 = \ker(T - μ_1)$ and $U_2 = \ker g(T)$. If $v ∈ U_1 ∩ U_2$ then $0 = g(T) v = g(μ_1) v$ and so $v = 0$, since $μ_1$ is not a root of $g(x)$. We now show $V = U_1 + U_2$. Let $v ∈ V$ and define $v_1 = g(μ_1)^{-1} g(T) v$ and $v_2 = v - v_1$. Since $0 = m_T(T) v = (T - μ_1) g(T) v$, it follows that $T v_1 = μ_1 v_1$, i.e. $v_1 ∈ U_1$. Also $g(T) v_2 = g(T) v - g(T) v_1 = g(T) v - g(μ_1) v_1 = 0$ by the definition of $v_1$. So $v_2 ∈ U_2$. Therefore $V = U_1 ⊕ U_2$. Now $T$ acts as the scalar $μ_1$ on $U_1$, and the minimal polynomial of $T$ restricted to $U_2$ divides $g$, hence has distinct roots and degree less than $k$. By induction we can choose a basis diagonalizing $T$ on $U_2$, and adding any basis of $U_1$ we are finished.
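   For illustration, the Jordan block $J = \begin{pmatrix} μ & 1 \\ 0 & μ \end{pmatrix}$ satisfies $J - μ\,\mathrm{Id} ≠ 0$ but $(J - μ\,\mathrm{Id})^2 = 0$, so $$m_J(x) = (x - μ)^2$$ has a repeated root, and indeed $J$ is not diagonalizable: its only eigenvectors are the nonzero multiples of $(1, 0)$.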
   (c) Let $v$ be an eigenvector of $T$, and note that $v$ is still an eigenvector of $p(T)$ for any polynomial $p$. Since $\dim V > 1$ we can extend $v$ to a basis $v, w_1, …, w_{n-1}$ of $V$ and define $B(v) = w_1$ and $B(w_i) = 0$. Then $v$ is not an eigenvector of $B$ and so $B ≠ p(T)$ for any polynomial $p(x)$. Students can also argue using the fact that the dimension of the subspace of $\operatorname{End}(V)$ spanned by $\{1, T, T^2, …\}$ is exactly $\deg m_T ≤ n$, which is strictly smaller than $\dim \operatorname{End}(V) = n^2$ when $n > 1$.
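   For illustration, take $T = \mathrm{Id}$ on $ℂ^2$, so that every $p(T)$ is a scalar multiple of $\mathrm{Id}$. With $v = e_1$ and $w_1 = e_2$ the construction gives $$B = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix},$$ which sends $e_1 ↦ e_2$ and hence is not equal to any $p(T)$.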
2. (a) If $A$ is diagonalizable, then for some invertible matrix $P$ the matrix $P^{-1} A P$ is diagonal with entries $λ, λ^{-1}$ for some nonzero $λ ∈ ℂ$ (their product is $\det A = 1$). Take $μ ∈ ℂ$ such that $μ^2 = λ$ and let $X$ be the diagonal matrix with entries $μ, μ^{-1}$, so that $X^2 = P^{-1} A P$ and $\det X = 1$. Take $B = P X P^{-1}$; then $B^2 = P X^2 P^{-1} = A$ and $B ∈ SL(2, ℂ)$.
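   For illustration, $A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} ∈ SL(2, ℂ)$ has distinct eigenvalues $±i$, hence is diagonalizable, and one valid choice of square root produced by this construction is $$B = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}, \qquad B^2 = A, \qquad \det B = 1.$$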
   (b) Take $A = \begin{pmatrix} -1 & 1 \\ 0 & -1 \end{pmatrix}$ and suppose $A = B^2$. The eigenvalues of $B$ square to the unique eigenvalue $-1$ of $A$, so each is $±i$, and since $\det(B) = 1$ is their product they must be distinct: $i$ and $-i$. Having distinct eigenvalues, $B$ is similar to the diagonal matrix with entries $i, -i$, and so $B^2 = -\mathrm{Id} ≠ A$. Contradiction. (Note that $A$ itself is not diagonalizable, so this does not contradict (a).)
   (c) No such $p$ exists. $SL(2, 𝔽_p)$ is a finite set, so if the map $A ↦ A^2$ were surjective it would also be injective. But it is not injective: for odd $p$ we have $\mathrm{Id}^2 = (-\mathrm{Id})^2$ with $\mathrm{Id} ≠ -\mathrm{Id}$, while if $p = 2$ then $\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}^2 = \begin{pmatrix} 1 & 2 \\ 0 & 1 \end{pmatrix} = \mathrm{Id} = \mathrm{Id}^2$ although $\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} ≠ \mathrm{Id}$.
3. (a) (i) Suppose $B = \{b_1, …, b_n\}$ is a basis of $V$ and $C = \{c_1, …, c_m\}$ is a basis of $W$. We define $B' = \{b_1', …, b_n'\}$, where $b_i' ∈ V'$ is the functional determined by $b_i'(b_j) = δ_{ij}$; this is a basis of $V'$.
   Similarly we take $C' = \{c_1', …, c_m'\}$. Suppose $_C[T]_B = (a_{ij})$, so that $T(b_j) = \sum_{i=1}^m a_{ij} c_i$. We compute $$T'(c_j')(b_s) = c_j'(T(b_s)) = c_j'\left(\sum_{i=1}^m a_{is} c_i\right) = a_{js}.$$ This gives $T'(c_j') = \sum_{i=1}^n a_{ji} b_i'$ and so $_{B'}[T']_{C'}$ is the transpose of $_C[T]_B$.
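   For illustration, if $V = W = 𝔽^2$ with $B = C$ the standard basis and $_C[T]_B = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$, the computation above gives $T'(c_1') = b_1' + 2 b_2'$ and $T'(c_2') = 3 b_1' + 4 b_2'$, so $$_{B'}[T']_{C'} = \begin{pmatrix} 1 & 3 \\ 2 & 4 \end{pmatrix} = \left({}_C[T]_B\right)^t.$$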
   (ii) For a polynomial $f(x)$ and a square matrix $X$ we have $f(X^t) = (f(X))^t$, and so $f(X) = 0$ if and only if $f(X^t) = 0$. Since by (i) the matrix of $T'$ is the transpose of the matrix of $T$, it follows that $m_T = m_{T'}$. Let $A = {}_B[T]_B$. Then $χ_T(x) = \det(x\,\mathrm{Id} - A) = \det\left((x\,\mathrm{Id} - A)^t\right) = \det(x\,\mathrm{Id} - A^t) = χ_{T'}(x)$.
   (iii) We define $U^0 := \{f ∈ V' ∣ f(u) = 0 \ ∀\, u ∈ U\}$. Let $b_1, …, b_k$ be a basis of $U$ and extend it to a basis $B = \{b_1, …, b_n\}$ of $V$. We claim that $U^0$ has basis $b_{k+1}', …, b_n'$. Indeed these functionals are linearly independent (being a subset of the basis $B'$), and for $f = \sum_{i=1}^n α_i b_i'$ the condition $f ∈ U^0$ is equivalent to $f(b_i) = 0$ for $i = 1, …, k$, which is equivalent to $α_1 = ⋯ = α_k = 0$. This proves the claim. Hence $\dim U^0 = n - k = \dim V - \dim U$ and we are done.
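   For illustration, if $V = 𝔽^3$ with standard basis $b_1, b_2, b_3$ and $U = \mathrm{span}(b_1)$, then $$U^0 = \{f ∈ V' ∣ f(b_1) = 0\} = \mathrm{span}(b_2', b_3'),$$ and indeed $\dim U + \dim U^0 = 1 + 2 = 3 = \dim V$.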
   (b) (i) There are many ways to argue this; here is an argument which also applies to (ii).
   We note that $(\operatorname{Im}(T))^0 = \ker T'$, the annihilator taken in $W'$. Indeed $f ∈ (\operatorname{Im}(T))^0$ iff $f(Tv) = 0$ for all $v ∈ V$, iff $f ∘ T = 0$, iff $f ∈ \ker T'$. Now by part (a)(iii) applied to $\operatorname{Im}(T) ⊆ W$, together with $\dim W' = \dim W$ and the Rank-Nullity theorem applied to $T'$, we have $$\dim \operatorname{Im}(T) = \dim W - \dim(\operatorname{Im}(T))^0 = \dim W' - \dim \ker T' = \dim \operatorname{Im}(T').$$
   (ii) The above argument only uses that $\dim W$ (and hence $\dim W'$) is finite, so the result remains true even if $\dim V$ is infinite.
   (c) Let $U = \bigcap_{i=1}^k \ker f_i$ and let $L$ be the subspace of $V'$ spanned by the $f_i$. Observe that $U = \bigcap_{h ∈ L} \ker h$. Now if $L = V'$ then, choosing a basis $B$ of $V$ and considering the dual basis $B'$, we get $U = \bigcap_{b' ∈ B'} \ker b' = \{0\}$.
   For the converse the students may argue using the natural isomorphism between $V$ and $V''$. Here is an alternative short argument. Suppose $\dim V = n$ and $L ≠ V'$. Choose a basis $g_1, …, g_m$ of $L$ and note $m < n$. Thus $$U = \bigcap_{i=1}^m \ker g_i = \ker φ,$$ where $φ: V → 𝔽^m$ is the linear map $φ(v) = (g_1(v), g_2(v), …, g_m(v))$. Since $m < \dim V$, the Rank-Nullity Theorem applied to $φ$ gives $\dim U = \dim \ker φ ≥ n - m > 0$, so $U ≠ \{0\}$. Contradiction; therefore $L = V'$.
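   For illustration, in $V = 𝔽^3$ the functionals $f_1(x, y, z) = x$ and $f_2(x, y, z) = y$ cannot span the $3$-dimensional space $V'$, and correspondingly $$\ker f_1 ∩ \ker f_2 = \{(0, 0, z) ∣ z ∈ 𝔽\} ≠ \{0\},$$ as predicted by the rank-nullity bound $\dim \ker φ ≥ 3 - 2 > 0$.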
4. (a) If $v ∈ U ∩ U^⊥$ then $⟨v, v⟩ = 0$ and hence $v = 0$, since the inner product is positive definite. Hence $U ∩ U^⊥ = \{0\}$. We now show that $V = U + U^⊥$.
   Let $v ∈ V$. Let $e_1, …, e_k$ be an orthonormal basis of $U$ and define $v_1 = \sum_{i=1}^k ⟨v, e_i⟩ e_i$ (the orthogonal projection of $v$ onto $U$). Then $v_1 ∈ U$ and $⟨v, e_i⟩ = ⟨v_1, e_i⟩$ for all $i$, which implies that $v - v_1$ is orthogonal to each $e_i$, i.e. $v - v_1 ∈ U^⊥$. Hence $V = U + U^⊥$ and therefore $V = U ⊕ U^⊥$.
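   For illustration, take $V = ℂ^2$, $U = \mathrm{span}(e)$ with $e = \frac{1}{\sqrt{2}}(1, 1)$, and $v = (1, 0)$. Then $$v_1 = ⟨v, e⟩ e = \left(\tfrac{1}{2}, \tfrac{1}{2}\right), \qquad v - v_1 = \left(\tfrac{1}{2}, -\tfrac{1}{2}\right) ∈ U^⊥,$$ exhibiting the decomposition $v = v_1 + (v - v_1) ∈ U ⊕ U^⊥$.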
   (b) (i) The adjoint $N^*$ is the unique linear transformation $N^*: V → V$ such that $⟨N^*(v), w⟩ = ⟨v, N(w)⟩$ for all $v, w ∈ V$.
   (ii) Fix $w ∈ U^⊥$ and let $v ∈ U$. We have $⟨N^*(w), v⟩ = ⟨w, N(v)⟩ = 0$ since $N(v) ∈ U$. This holds for all $v ∈ U$ and hence $N^*(w) ∈ U^⊥$. The vector $w ∈ U^⊥$ was arbitrary and so $N^*(U^⊥) ⊆ U^⊥$.
   (iii) If $N = S + A$ as required then $N^* = S^* + A^* = S - A$, and so $S = (N + N^*)/2$, $A = (N - N^*)/2$; thus $A$ and $S$ are uniquely determined by $N$. Conversely we check $\left(\frac{N+N^*}{2}\right)^* = \frac{N+N^*}{2}$ and $\left(\frac{N-N^*}{2}\right)^* = -\frac{N-N^*}{2}$, so $A$ and $S$ exist for any $N$. Now if $N N^* = N^* N$ then we check $$\frac{N+N^*}{2} \cdot \frac{N-N^*}{2} = \frac{N^2 - (N^*)^2}{4} = \frac{N-N^*}{2} \cdot \frac{N+N^*}{2}.$$ Conversely if $A$ and $S$ commute then $N N^* = (S+A)(S-A) = S^2 - A^2 = (S-A)(S+A) = N^* N$.
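   For illustration, take $N = \begin{pmatrix} 1 & 2 \\ 0 & 1 \end{pmatrix}$ on $ℂ^2$ with the standard inner product, so that $N^* = \begin{pmatrix} 1 & 0 \\ 2 & 1 \end{pmatrix}$. Then $$S = \frac{N + N^*}{2} = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}, \qquad A = \frac{N - N^*}{2} = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix},$$ and one checks directly that $S^* = S$ and $A^* = -A$.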
   (c) (i) Suppose $v ∈ \ker N$. Then $\|N^*(v)\|^2 = ⟨N^* v, N^* v⟩ = ⟨v, N N^* v⟩ = ⟨v, N^* N v⟩ = 0$ and so $N^*(v) = 0$, giving $v ∈ \ker N^*$. Hence $\ker N ⊆ \ker N^*$. The same argument applied with $N^*$ instead of $N$ (using $N^{**} = N$) gives the opposite containment, and hence $\ker N^* = \ker N$.
   Let $U = \ker N = \ker N^*$. Since both $N$ and $N^*$ send $U$ into $U$ (indeed to $\{0\}$), part (b)(ii) gives $N(U^⊥) ⊆ U^⊥$ and $N^*(U^⊥) ⊆ U^⊥$. Also both $N$ and $N^*$ are injective when restricted to $U^⊥$, since $\ker N ∩ U^⊥ = U ∩ U^⊥ = \{0\}$, and hence both restrictions are bijections of $U^⊥$. Finally $N(V) = N(U + U^⊥) = N(U^⊥) = U^⊥$, and arguing with $N^*$ in place of $N$ we get $\operatorname{Im}(N) = \operatorname{Im}(N^*) = U^⊥$.
   (ii) We have $\|N^*(v)\|^2 = ⟨v, N N^*(v)⟩$ and $\|N(v)\|^2 = ⟨v, N^* N(v)⟩$. Therefore if we set $A = N N^* - N^* N$, the hypothesis gives $⟨v, A v⟩ = 0$ for all $v ∈ V$. From this point the students can argue with the spectral theorem to deduce $A = 0$, but there is a direct way. Let $u, v ∈ V$ and apply the above equality to $u + v$, so $0 = ⟨u+v, A(u+v)⟩$. Using $⟨v, A v⟩ = ⟨u, A u⟩ = 0$ we obtain $$⟨u, A(v)⟩ + ⟨v, A(u)⟩ = 0.$$ Now replace $v$ with $iv$ to obtain $$i⟨u, A(v)⟩ - i⟨v, A(u)⟩ = 0.$$ Solving the two equations we get $⟨u, A(v)⟩ = 0$ for all $u, v ∈ V$; taking $u = A(v)$ gives $\|A(v)\|^2 = 0$, so $A = 0$ and $N N^* = N^* N$.
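   For illustration of the contrapositive, take $N = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$. Then $$N N^* = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} ≠ \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix} = N^* N,$$ and correspondingly $\|N(e_2)\| = 1$ while $\|N^*(e_2)\| = 0$.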