Notes about math, research, and more.
Suppose that \(A,B \in M_{n}(\mathbb{C})\) are commuting matrices. Let
\(V_{\lambda}\) be an eigenspace of \(A\). Show that \(V_{\lambda}\) is
\(B\text{-invariant}\).
Let \(v \in V_{\lambda}\). Since \(A\) and \(B\) commute, \(A(Bv) = B(Av)\). Since \(v \in V_{\lambda}\), \(Av = \lambda v\), so
\[
A(Bv) = B(Av) = B(\lambda v) = \lambda (Bv),
\]
which means \(Bv \in V_{\lambda}\).
Hence, \(V_{\lambda}\) is \(B\)-invariant.
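To see the phenomenon concretely (an illustration, not part of the exercise), take \(A\) with a repeated eigenvalue, so that the eigenspace \(V_{1}\) is two-dimensional and \(B\) can act nontrivially inside it:

```latex
% A has eigenvalue 1 with eigenspace V_1 = span{e_1, e_2}.
A = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 2 \end{pmatrix},
\qquad
B = \begin{pmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 7 \end{pmatrix},
\qquad
AB = BA.
% For v = (a, b, 0)^T in V_1, Bv = (b, a, 0)^T, which again lies in V_1,
% even though B does not act as a scalar on V_1.
```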
Let \(V\) be an inner product space and let \(W \leq V\) be a
subspace. Let \(v \in V\) and define \(\hat{v} \in W\) as in the proof of
Proposition 2.2.3. Prove that if \(w \in W\) with \(w\neq \hat{v}\), then
\(\left\| v-\hat{v} \right\| < \left\| v-w \right\|\). Deduce that
\(\hat{v}\) is independent of the choice of orthonormal basis for
\(W\). It is called the orthogonal projection of \(v\) onto \(W\).
Hint: Use the notation of the proof of Proposition 2.2.3. Let \(\{
e_{m+1},\ldots,e_{n} \}\) be an orthonormal basis for \(W^\perp\). Then
\(\{ e_{1},\ldots,e_{n} \}\) is an orthonormal basis for \(V\). Say \(v =
\sum_{i=1}^n a_i e_{i}\) and \(w = \sum_{i=1}^m b_{i}e_{i}\). To compute
the norms of \(v - \hat{v}\) and \(v-w\), express both vectors as linear
combinations of \(e_{1},\ldots,e_{n}\).
Let \(B = \{ e_{1},\ldots,e_{n} \}\) be an orthonormal basis for \(V\) and
let \(W\) be the subspace of \(V\) spanned by \(\{ e_{m+1},\ldots,e_{n} \}\).
(Here \(W\)'s basis is indexed by \(m+1,\ldots,n\), the complement of the
hint's convention; the argument is identical either way.) Let
- \(v = \sum_{i=1}^n a_{i}e_{i}\),
- \(\hat{v} = \sum_{i=m+1}^n a_{i} e_{i}\), and
- \(W \ni w =\sum_{i=m+1}^{n} b_{i}e_{i} \neq \hat{v}\).
We calculate \(\left\| v-\hat{v}\right\|\) and \(\left\| v-w \right\|\):
\begin{align*}
\left\| v-\hat{v} \right\| = \left\|\sum_{i=1}^{n} a_{i} e_{i} - \sum_{i=m+1}^{n} a_{i} e_{i} \right\| = \left\|\sum_{i=1}^{m} a_{i} e_{i} \right \|
\end{align*} and
\begin{align*}
\left\| v-w \right\|^2 &= \left\| \sum_{i=1}^n a_{i}e_{i} - \sum_{i=m+1}^{n} b_{i}e_{i} \right\|^2 \\
&= \left\| \sum_{i=1}^m a_{i}e_{i} + \sum_{i=m+1}^{n} (a_{i}-b_{i})e_{i} \right\|^2 \\
&= \left\| \sum_{i=1}^m a_{i}e_{i}\right\|^2 + \left\|\sum_{i=m+1}^{n} (a_{i}-b_{i})e_{i} \right\|^2 \text{ by the Pythagorean theorem (the two sums are orthogonal);} \\
\left\| v-w \right\| &= \sqrt{\left\| v-\hat{v} \right\|^2 + \left\|\sum_{i=m+1}^{n} (a_{i}-b_{i})e_{i} \right\|^2}.
\end{align*}
Since \(w \neq \hat{v}\), there is some \(i \in \{ m+1,\ldots,n \}\) with
\(a_{i}\neq b_{i}\), so \(\left\|\sum_{i=m+1}^{n}
(a_{i}-b_{i})e_{i}\right\|^2 > 0\). Hence, \(\left\| v-\hat{v} \right\| <
\left\| v-w \right\|\). For independence of the basis: the distance
\(\left\| v-w \right\|\) makes no reference to any basis, and the strict
inequality just proved shows that \(\hat{v}\) is the unique element of \(W\)
at minimal distance from \(v\). If \(\hat{v}'\) is the vector produced by
another orthonormal basis for \(W\), the same argument shows \(\hat{v}'\) is
also the unique minimizer, so \(\hat{v}' = \hat{v}\).
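As a small worked case of the inequality (using the solution's indexing, with \(n = 3\), \(m = 1\), so \(W\) is the span of \(\{e_2, e_3\}\) in \(\mathbb{R}^3\)):

```latex
v = (3, 1, 2)^{T}, \qquad \hat{v} = (0, 1, 2)^{T}.
% For any w = (0, b_2, b_3)^T in W:
\left\| v - w \right\|^2 = 3^2 + (1-b_2)^2 + (2-b_3)^2
  \ge 9 = \left\| v - \hat{v} \right\|^2,
% with equality exactly when b_2 = 1 and b_3 = 2, i.e. when w = \hat{v}.
```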
If \(B\) is an orthonormal basis of an inner product space \(V\), and if \(T \in \text{End}(V)\), then
\[
[T^*]_{B} = ([T]_{B})^*.
\]
Hint: Say \(B = \{ v_{1},\ldots,v_{n} \}\). Then
\[
([T^*]_{B})_{ij} = (i^{\text{th}} \text{ entry of } [T^*v_{j}]_{B}) = \langle T^* v_{j}, v_{i} \rangle.
\]
We show that \([T^*]_{B} = ([T]_{B})^*\) by showing that for each \(i,j \in \{1,\ldots,n \}\),
\(([T^*]_{B})_{ij} = \overline{([T]_{B})_{ji}}\). Let \(i,j \in \{1,\ldots,n \}\). By the hint,
\begin{align*}
([T^*]_{B})_{ij} &= \langle T^* v_{j}, v_{i} \rangle\\
&= \langle v_{j}, Tv_{i} \rangle \text{ by definition of the adjoint, since } (T^*)^* = T; \\
&= \overline{\langle Tv_{i}, v_{j} \rangle} \text{ by conjugate symmetry} \\
&= \overline{([T]_{B})_{ji}} \text{ by the hint.}
\end{align*}
Hence, for each \(i,j \in \{1,\ldots,n \}\), \(([T^*]_{B})_{ij} = \overline{([T]_{B})_{ji}}\) which implies \([T^*]_{B} =
\overline{([T]_{B})^T} = ([T]_{B})^*\).
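A quick sanity check of the identity on \(\mathbb{C}^{2}\) with the standard (orthonormal) basis, taking \(T\) to be matrix multiplication by \([T]_{B}\):

```latex
[T]_{B} = \begin{pmatrix} 1 & i \\ 0 & 2 \end{pmatrix}
\implies
[T^*]_{B} = \overline{([T]_{B})^{T}} = \begin{pmatrix} 1 & 0 \\ -i & 2 \end{pmatrix}.
% Checking one entry against the hint:
% ([T^*]_B)_{21} = <T^* e_1, e_2> = <e_1, T e_2>
%               = conj(<T e_2, e_1>) = conj(i) = -i.
```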
Let \(V\) be an inner product space and \(T \in \text{End}(V)\). Assume
that for some orthonormal basis \(B\) of \(V\), the matrix \([T]_{B}\) is
self-adjoint. Then \(T\) is a self-adjoint linear operator on \(V\).
To show \(T\) is self-adjoint we must show that for all \(v,w \in V\),
\(\langle Tv, w \rangle = \langle v, Tw \rangle\). Denote the elements of
\(B\) as \(\{ v_{1},\ldots,v_{n} \}\). For each \(i,j \in \{1,\ldots,n \}\),
\begin{align*}
\left\langle T v_{i}, v_{j} \right\rangle &= ([T]_{B})_{ji} \text{ by the hint in Exercise A;}\\
&= \overline{([T]_{B})_{ij}} \text{ by assumption;} \\
&= \overline{\left\langle Tv_{j}, v_{i} \right\rangle} \\
&= \left\langle v_{i}, Tv_{j} \right\rangle.
\end{align*}
Let \(v = \sum_{i=1}^{n} a_{i}v_{i}, w = \sum_{i=1}^n b_{i}v_{i} \in V\). From above,
\begin{align*}
\left\langle Tv, w \right\rangle &= \left\langle T\sum_{i=1}^{n} a_{i}v_{i}, \sum_{j=1}^{n} b_{j}v_{j} \right\rangle \\
&= \sum_{i=1}^{n} \sum_{j=1}^{n} a_{i}\overline{b_{j}} \left\langle Tv_{i}, v_{j} \right\rangle \text{ by linearity of \(T\) and sesquilinearity of the inner product;}\\
&= \sum_{i=1}^{n} \sum_{j=1}^{n} a_{i}\overline{b_{j}} \left\langle v_{i}, Tv_{j} \right\rangle \text{ from above;}\\
&= \langle v, Tw \rangle.
\end{align*}
Since \(T^*\) is the unique operator such that \(\langle Tv, w \rangle =
\langle v, T^*w \rangle\), we deduce \(T = T^*\).
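For instance (a sanity check, not part of the exercise): any matrix equal to its own conjugate transpose in one orthonormal basis represents a self-adjoint operator, e.g.

```latex
[T]_{B} = \begin{pmatrix} 2 & 1+i \\ 1-i & 3 \end{pmatrix} = ([T]_{B})^{*},
% so the operator T it represents is self-adjoint. As expected for a
% self-adjoint operator, its eigenvalues are real:
% \lambda^2 - 5\lambda + 4 = 0, i.e. \lambda = 1 or \lambda = 4.
```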