# Section 1.11

7. Show that two 2-dimensional subspaces of a 3-dimensional vector space must have nontrivial intersection.

Proof:
(by contradiction) Suppose ${\displaystyle M,N}$ are both 2-dimensional subspaces of a 3-dimensional vector space ${\displaystyle V}$, and assume that ${\displaystyle M,N}$ have trivial intersection. Then ${\displaystyle M+N}$ is also a subspace of ${\displaystyle V}$, and since ${\displaystyle M,N}$ have trivial intersection, ${\displaystyle M+N=M\oplus N}$. But then:

${\displaystyle \dim(M+N)=\dim M+\dim N=2+2=4}$. However, a subspace of ${\displaystyle V}$ can have dimension at most ${\displaystyle \dim V=3}$, and ${\displaystyle 4>3}$. This is a contradiction, and so ${\displaystyle M,N}$ must have nontrivial intersection.
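As a numerical sanity check (not part of the proof), a short script, assuming NumPy is available, confirms the dimension count for two concrete planes in ${\displaystyle \mathbb {R} ^{3}}$:

```python
import numpy as np

# Two 2-dimensional subspaces of R^3, given by basis columns.
M = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])   # the xy-plane
N = np.array([[1.0, 0.0],
              [0.0, 0.0],
              [0.0, 1.0]])   # the xz-plane

# dim(M + N) is the rank of the stacked bases; it can be at most 3.
dim_sum = np.linalg.matrix_rank(np.hstack([M, N]))

# dim(M ∩ N) = dim M + dim N - dim(M + N) >= 4 - 3 = 1.
dim_int = 2 + 2 - dim_sum
print(dim_sum, dim_int)  # 3 1
```

Any other choice of two planes through the origin gives `dim_int` at least 1, which is exactly the nontrivial intersection.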

8. Let ${\displaystyle M_{1},M_{2}\subset V}$ be subspaces of a finite dimensional vector space ${\displaystyle V}$. Show that ${\displaystyle \dim(M_{1}\cap M_{2})+\dim(M_{1}+M_{2})=\dim M_{1}+\dim M_{2}}$.

Proof:
Define the linear map ${\displaystyle L:M_{1}\times M_{2}\to V}$ by ${\displaystyle L(x_{1},x_{2})=x_{1}-x_{2}}$. Then by the dimension formula (rank–nullity), ${\displaystyle \dim(M_{1}\times M_{2})=\dim \ker(L)+\dim {\text{im}}(L)}$. First note that in general ${\displaystyle \dim(V\times W)=\dim V+\dim W}$. I won't prove this fact here, but it is why, for example, ${\displaystyle \dim \mathbb {R} ^{2}=1+1=2}$.

Now ${\displaystyle \ker(L)=\{(x_{1},x_{2}):L(x_{1},x_{2})=0\}}$. That is, ${\displaystyle (x_{1},x_{2})\in \ker(L)}$ iff ${\displaystyle x_{1}-x_{2}=0}$, i.e. ${\displaystyle x_{1}=x_{2}}$. But since ${\displaystyle x_{1}\in M_{1}}$ and ${\displaystyle x_{2}\in M_{2}}$ are the same vector, we must have ${\displaystyle x_{1}=x_{2}\in M_{1}\cap M_{2}}$. That says the elements of the kernel are ordered pairs whose two components are equal and lie in ${\displaystyle M_{1}\cap M_{2}}$, so we can write ${\displaystyle \ker(L)=\{(x,x):x\in M_{1}\cap M_{2}\}}$.

I claim that this is isomorphic to ${\displaystyle M_{1}\cap M_{2}}$. To prove this, consider the function ${\displaystyle \phi :M_{1}\cap M_{2}\to \ker(L)}$ given by ${\displaystyle \phi (x)=(x,x)}$. One can check that ${\displaystyle \phi }$ is linear, injective, and surjective, hence an isomorphism. Since we have an isomorphism, the dimensions must be equal, and so ${\displaystyle \dim(M_{1}\cap M_{2})=\dim(\ker(L))}$.

Finally, let us examine ${\displaystyle {\text{im}}(L)=\{x_{1}-x_{2}:x_{1}\in M_{1},x_{2}\in M_{2}\}}$. I claim that ${\displaystyle {\text{im}}(L)=M_{1}+M_{2}}$; note this is equality, not just isomorphism. To see this, observe that if ${\displaystyle x_{2}\in M_{2}}$ then ${\displaystyle -x_{2}\in M_{2}}$ by the subspace property. So any ${\displaystyle x_{1}+x_{2}\in M_{1}+M_{2}}$ is also equal to ${\displaystyle x_{1}-(-x_{2})\in {\text{im}}(L)}$, and conversely any ${\displaystyle x_{1}-x_{2}\in {\text{im}}(L)}$ equals ${\displaystyle x_{1}+(-x_{2})\in M_{1}+M_{2}}$. So these sets do indeed contain the exact same elements, which means ${\displaystyle \dim(M_{1}+M_{2})=\dim {\text{im}}(L)}$.
Putting this all together gives:

${\displaystyle \dim M_{1}+\dim M_{2}=\dim(M_{1}\times M_{2})=\dim \ker(L)+\dim {\text{im}}(L)=\dim(M_{1}\cap M_{2})+\dim(M_{1}+M_{2})}$.
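As a numerical illustration (assuming NumPy), one can check the formula on concrete subspaces of ${\displaystyle \mathbb {R} ^{4}}$: with basis columns collected in matrices ${\displaystyle A}$ and ${\displaystyle B}$, the rank of ${\displaystyle [A\mid B]}$ gives ${\displaystyle \dim(M_{1}+M_{2})}$, while the nullity of ${\displaystyle [A\mid -B]}$, which represents the map ${\displaystyle L}$ from the proof in coordinates, gives ${\displaystyle \dim(M_{1}\cap M_{2})}$:

```python
import numpy as np

# Subspaces of R^4 given by linearly independent basis columns.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0],
              [0.0, 0.0]])          # M1 = span{e1, e2}, dim 2
B = np.array([[0.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])          # M2 = span{e2, e3}, dim 2

dim_M1 = np.linalg.matrix_rank(A)
dim_M2 = np.linalg.matrix_rank(B)

# dim(M1 + M2): rank of the combined spanning set.
dim_sum = np.linalg.matrix_rank(np.hstack([A, B]))

# dim(M1 ∩ M2): nullity of [A | -B]. A null vector (u, v) satisfies
# A u = B v, i.e. it names a single vector lying in both subspaces,
# mirroring ker(L) for L(x1, x2) = x1 - x2.
stacked = np.hstack([A, -B])
dim_int = stacked.shape[1] - np.linalg.matrix_rank(stacked)

print(dim_int + dim_sum == dim_M1 + dim_M2)  # True
```

Here the intersection is the line spanned by ${\displaystyle e_{2}}$, so the identity reads ${\displaystyle 1+3=2+2}$.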

16. Show that the matrix
${\displaystyle {\begin{bmatrix}0&1\\0&0\end{bmatrix}}}$ as a linear map satisfies ${\displaystyle \ker(L)={\text{im}}(L)}$.

Proof:
The matrix is already in echelon form and has one pivot, in the second column. That means the second column is a basis for the column space, which is the same as the image. In other words, ${\displaystyle {\text{im}}(L)={\text{Span}}\left({\begin{bmatrix}1\\0\end{bmatrix}}\right)}$. Now for the kernel. Writing out the equation ${\displaystyle Lx=0}$ reads ${\displaystyle 0x_{1}+1x_{2}=0}$, or in other words ${\displaystyle x_{2}=0}$. Then an arbitrary element of the kernel is ${\displaystyle {\begin{bmatrix}x_{1}\\0\end{bmatrix}}=x_{1}{\begin{bmatrix}1\\0\end{bmatrix}}}$. So again ${\displaystyle \ker(L)={\text{Span}}\left({\begin{bmatrix}1\\0\end{bmatrix}}\right)}$. In other words, ${\displaystyle \ker(L)={\text{im}}(L)}$.
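A quick check with NumPy (assumed available) confirms the computation for the nilpotent matrix ${\displaystyle {\begin{bmatrix}0&1\\0&0\end{bmatrix}}}$: it squares to zero (which is exactly ${\displaystyle {\text{im}}(L)\subseteq \ker(L)}$), it kills its own image vector, and rank and nullity agree, forcing equality of the two subspaces:

```python
import numpy as np

L = np.array([[0.0, 1.0],
              [0.0, 0.0]])

# L^2 = 0 means L(Lx) = 0 for all x, i.e. im(L) ⊆ ker(L).
print(np.allclose(L @ L, np.zeros((2, 2))))  # True

# im(L) is spanned by (1, 0)^T, and L sends it to zero.
print(L @ np.array([1.0, 0.0]))              # [0. 0.]

# rank = dim im(L), nullity = dim ker(L); both are 1 here,
# so the containment im(L) ⊆ ker(L) is an equality.
rank = np.linalg.matrix_rank(L)
nullity = 2 - rank
print(rank, nullity)  # 1 1
```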

17. Show that
${\displaystyle {\begin{bmatrix}0&0\\\alpha &1\end{bmatrix}}}$ defines a projection for all ${\displaystyle \alpha \in \mathbb {F} }$. Compute the kernel and image.

Proof:
First we verify that the matrix is idempotent, which is what it means to be a projection. For every ${\displaystyle \alpha }$: ${\displaystyle {\begin{bmatrix}0&0\\\alpha &1\end{bmatrix}}{\begin{bmatrix}0&0\\\alpha &1\end{bmatrix}}={\begin{bmatrix}0&0\\\alpha &1\end{bmatrix}}}$, so ${\displaystyle L^{2}=L}$. Now I will deal with the case ${\displaystyle \alpha =0}$. In this case the matrix is ${\displaystyle {\begin{bmatrix}0&0\\0&1\end{bmatrix}}}$ and we see by the procedure in the last problem that: ${\displaystyle {\text{im}}(L)={\text{Span}}\left({\begin{bmatrix}0\\1\end{bmatrix}}\right)}$ and ${\displaystyle \ker(L)={\text{Span}}\left({\begin{bmatrix}1\\0\end{bmatrix}}\right)}$.

Now for the case ${\displaystyle \alpha \neq 0}$. We still have only one pivot, and either column spans the image. Using the second column makes it look nicer and matches the previous case: ${\displaystyle {\text{im}}(L)={\text{Span}}\left({\begin{bmatrix}0\\1\end{bmatrix}}\right)}$. The difference is that when we write out the equation ${\displaystyle Lx=0}$ to find the kernel, we get ${\displaystyle \alpha x_{1}+x_{2}=0}$. With ${\displaystyle x_{2}}$ as our free variable this means ${\displaystyle x_{1}=-{\frac {1}{\alpha }}x_{2}}$, so that ${\displaystyle \ker(L)={\text{Span}}\left({\begin{bmatrix}-{\frac {1}{\alpha }}\\1\end{bmatrix}}\right)}$.
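A short NumPy script (assumed available) can sanity-check all of this at once: idempotence for several values of ${\displaystyle \alpha }$, that the image vector ${\displaystyle (0,1)^{T}}$ is fixed by ${\displaystyle L}$ (as it must be for a projection), and that the claimed kernel vector is annihilated:

```python
import numpy as np

# Check L^2 = L (the projection property) for several alphas,
# and verify the computed kernel and image vectors.
for alpha in [0.0, 1.0, -3.0, 0.5]:
    L = np.array([[0.0, 0.0],
                  [alpha, 1.0]])
    assert np.allclose(L @ L, L)          # idempotent: a projection

    # (0, 1)^T spans the image and is fixed by L.
    assert np.allclose(L @ np.array([0.0, 1.0]), np.array([0.0, 1.0]))

    if alpha != 0.0:
        # (-1/alpha, 1)^T spans the kernel.
        k = np.array([-1.0 / alpha, 1.0])
        assert np.allclose(L @ k, np.zeros(2))

print("all checks passed")
```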