# Section 1.11

7. Show that two 2-dimensional subspaces of a 3-dimensional vector space must have nontrivial intersection.

Proof:
(by contradiction) Suppose $M,N$ are both 2-dimensional subspaces of a 3-dimensional vector space $V$, and assume that $M,N$ have trivial intersection. Then $M+N$ is also a subspace of $V$, and since $M,N$ have trivial intersection, $M+N=M\oplus N$ . But then:

$\dim(M+N)=\dim M+\dim N=2+2=4$ . However, a subspace's dimension can be at most the dimension of the whole vector space, and $4>3$ . This is a contradiction, and so $M,N$ must have nontrivial intersection.
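As a concrete illustration of this argument (a numerical sketch; the two example planes are my own choice, not from the text), we can take two 2-dimensional subspaces of $\mathbb{R}^3$ and check that their sum has dimension at most 3, forcing a 1-dimensional intersection:

```python
import numpy as np

# Two 2-dimensional subspaces of R^3, each given by a basis (columns).
# These particular planes are an illustrative choice.
M = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])   # the xy-plane
N = np.array([[1.0, 0.0],
              [0.0, 0.0],
              [0.0, 1.0]])   # the xz-plane

# dim(M + N) is the rank of the combined spanning set; it cannot exceed 3.
dim_sum = np.linalg.matrix_rank(np.hstack([M, N]))

# By the dimension formula of the next problem,
# dim(M ∩ N) = dim M + dim N - dim(M + N).
dim_intersection = 2 + 2 - dim_sum
print(dim_sum, dim_intersection)  # 3 1
```

Here the intersection is the line spanned by $(1,0,0)^{T}$, which is indeed nontrivial.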

8. Let $M_{1},M_{2}\subset V$ be subspaces of a finite dimensional vector space $V$ . Show that $\dim(M_{1}\cap M_{2})+\dim(M_{1}+M_{2})=\dim M_{1}+\dim M_{2}$ .

Proof:
Define the linear map $L:M_{1}\times M_{2}\to V$ by $L(x_{1},x_{2})=x_{1}-x_{2}$ . Then by the dimension formula, $\dim(M_{1}\times M_{2})=\dim \ker(L)+\dim {\text{im}}(L)$ . First note that in general $\dim(V\times W)=\dim V+\dim W$ . I won't prove this fact here, but it is why $\dim \mathbb {R} ^{2}=1+1=2$ .

Now $\ker(L)=\{(x_{1},x_{2}):L(x_{1},x_{2})=0\}$ . That is, $(x_{1},x_{2})\in \ker(L)$ iff $x_{1}-x_{2}=0$ , i.e. $x_{1}=x_{2}$ . But since $x_{1}\in M_{1}$ , $x_{2}\in M_{2}$ , and they are actually the same vector, we must have $x_{1}=x_{2}\in M_{1}\cap M_{2}$ . That says the elements of the kernel are ordered pairs whose first and second components are equal and lie in $M_{1}\cap M_{2}$ , so we can write $\ker(L)=\{(x,x):x\in M_{1}\cap M_{2}\}$ . I claim that this is isomorphic to $M_{1}\cap M_{2}$ . To prove this, consider the function $\phi :M_{1}\cap M_{2}\to \ker(L)$ given by $\phi (x)=(x,x)$ . This map $\phi$ is an isomorphism, which you can check. Since we have an isomorphism, the dimensions must be equal, and so $\dim(M_{1}\cap M_{2})=\dim \ker(L)$ .

Finally, let us examine ${\text{im}}(L)=\{x_{1}-x_{2}:x_{1}\in M_{1},x_{2}\in M_{2}\}$ . I claim that ${\text{im}}(L)=M_{1}+M_{2}$ . Note, this is equality, not just isomorphism. To see this, we note that if $x_{2}\in M_{2}$ then $-x_{2}\in M_{2}$ by the subspace property. So any $x_{1}+x_{2}\in M_{1}+M_{2}$ is also equal to $x_{1}-(-x_{2})\in {\text{im}}(L)$ . So these sets do indeed contain exactly the same elements, and hence $\dim(M_{1}+M_{2})=\dim {\text{im}}(L)$ . Putting this all together gives:

$\dim M_{1}+\dim M_{2}=\dim(M_{1}\times M_{2})=\dim \ker(L)+\dim {\text{im}}(L)=\dim(M_{1}\cap M_{2})+\dim(M_{1}+M_{2})$ .
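The proof can be checked numerically (a sketch with numpy; the two subspaces of $\mathbb{R}^4$ below are my own illustrative choice). The matrix of $L$ in coordinates has the columns of a basis of $M_1$ followed by the negated basis of $M_2$; its rank is $\dim(M_1+M_2)$, and the first block of each null-space vector gives $B_1$-coordinates of a vector in $M_1\cap M_2$, exactly as in the isomorphism $\phi$:

```python
import numpy as np

# Bases (as columns) for two subspaces of R^4; an illustrative choice.
B1 = np.array([[1., 0.], [0., 1.], [0., 0.], [0., 0.]])  # M1, dim 2
B2 = np.array([[1., 0.], [0., 0.], [0., 1.], [0., 0.]])  # M2, dim 2

# The map L(x1, x2) = x1 - x2 in coordinates: columns of B1, then -B2.
A = np.hstack([B1, -B2])

# im(L) = M1 + M2, so its dimension is the rank of A.
dim_sum = np.linalg.matrix_rank(A)

# ker(L) via SVD: the rows of Vt past the rank span the null space.
U, s, Vt = np.linalg.svd(A)
K = Vt[(s > 1e-10).sum():].T            # columns span ker(L)

# The first block of each kernel vector gives B1-coordinates of a
# vector lying in M1 ∩ M2, mirroring the isomorphism in the proof.
dim_intersection = np.linalg.matrix_rank(B1 @ K[:2, :]) if K.size else 0

# dim(M1 ∩ M2) + dim(M1 + M2) == dim M1 + dim M2
assert dim_intersection + dim_sum == 2 + 2
print(dim_intersection, dim_sum)  # 1 3
```

Here $M_1\cap M_2$ is the line spanned by $(1,0,0,0)^{T}$, and the identity reads $1+3=2+2$.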

16. Show that the matrix
${\begin{bmatrix}0&1\\0&0\end{bmatrix}}$ as a linear map satisfies $\ker(L)={\text{im}}(L)$ .

Proof:
The matrix is already in echelon form and has one pivot, in the second column. That means a basis for the column space, which is the same as the image, is the second column. In other words, ${\text{im}}(L)={\text{Span}}\left({\begin{bmatrix}1\\0\end{bmatrix}}\right)$ . Now for the kernel. Writing out the equation $Lx=0$ reads $0x_{1}+1x_{2}=0$ , or in other words $x_{2}=0$ . Then an arbitrary element of the kernel is ${\begin{bmatrix}x_{1}\\x_{2}\end{bmatrix}}={\begin{bmatrix}x_{1}\\0\end{bmatrix}}=x_{1}{\begin{bmatrix}1\\0\end{bmatrix}}$ . So again $\ker(L)={\text{Span}}\left({\begin{bmatrix}1\\0\end{bmatrix}}\right)$ . In other words, $\ker(L)={\text{im}}(L)$ .
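As a quick numerical sanity check of the computation above (a sketch with numpy): the image basis found is the second column $(1,0)^{T}$ and the kernel is cut out by $x_{2}=0$, which corresponds to the matrix ${\begin{bmatrix}0&1\\0&0\end{bmatrix}}$:

```python
import numpy as np

L = np.array([[0., 1.],
              [0., 0.]])

e1 = np.array([1., 0.])

assert np.allclose(L @ np.array([0., 1.]), e1)   # (1,0)^T is in im(L)
assert np.allclose(L @ e1, 0)                    # (1,0)^T is in ker(L)
assert np.linalg.matrix_rank(L) == 1             # both spaces are 1-dim
print("ker(L) = im(L) = Span((1,0)^T)")
```

Since both kernel and image are 1-dimensional and contain $(1,0)^{T}$, they coincide.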

17. Show that
${\begin{bmatrix}0&0\\\alpha &1\end{bmatrix}}$ defines a projection for all $\alpha \in \mathbb {F}$ . Compute the kernel and image.

Proof:
First, $L$ is a projection for every $\alpha$ , since $L^{2}={\begin{bmatrix}0&0\\\alpha &1\end{bmatrix}}{\begin{bmatrix}0&0\\\alpha &1\end{bmatrix}}={\begin{bmatrix}0&0\\\alpha &1\end{bmatrix}}=L$ . Now for the kernel and image, I will first deal with the case $\alpha =0$ . In this case the matrix is ${\begin{bmatrix}0&0\\0&1\end{bmatrix}}$ and we see by the procedure in the last problem that: ${\text{im}}(L)={\text{Span}}\left({\begin{bmatrix}0\\1\end{bmatrix}}\right)$ and $\ker(L)={\text{Span}}\left({\begin{bmatrix}1\\0\end{bmatrix}}\right)$ .

Now for the case $\alpha \neq 0$ . We still have only one pivot, and either column forms a basis for the image. Using the second column makes it look nicer and matches the previous case: ${\text{im}}(L)={\text{Span}}\left({\begin{bmatrix}0\\1\end{bmatrix}}\right)$ . The difference is that when we write out the equation $Lx=0$ to find the kernel, we get $\alpha x_{1}+x_{2}=0$ . With $x_{2}$ as our free variable this means $x_{1}=-{\frac {1}{\alpha }}x_{2}$ , so that $\ker(L)={\text{Span}}\left({\begin{bmatrix}-{\frac {1}{\alpha }}\\1\end{bmatrix}}\right)$ .
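All three claims (the projection property $L^{2}=L$, the image basis, and the kernel basis) can be verified numerically for a few sample values of $\alpha$ (a sketch with numpy; the sample values are arbitrary):

```python
import numpy as np

# Check L^2 = L and the kernel/image bases derived above for several
# sample values of alpha, including the alpha = 0 case.
for alpha in [0.0, 1.0, -2.0, 3.5]:
    L = np.array([[0.0,   0.0],
                  [alpha, 1.0]])
    assert np.allclose(L @ L, L)             # L is a projection

    e2 = np.array([0.0, 1.0])
    assert np.allclose(L @ e2, e2)           # (0,1)^T spans im(L), fixed by L

    # kernel basis vector: (1,0)^T when alpha = 0, else (-1/alpha, 1)^T
    k = np.array([1.0, 0.0]) if alpha == 0 else np.array([-1/alpha, 1.0])
    assert np.allclose(L @ k, 0)             # k spans ker(L)
print("projection property and bases verified")
```

Note that a projection fixes every vector in its image, which is why checking $Le_{2}=e_{2}$ suffices for the image.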