Linear Algebra for Team-Based Inquiry Learning

2023 Edition

Steven Clontz and Drew Lewis
University of South Alabama

August 24, 2023

Section 5.3: Eigenvalues and Characteristic Polynomials (GT3)

Activity 5.3.1 (~5 min)

An invertible matrix \(M\) and its inverse \(M^{-1}\) are given below:

\begin{equation*} M=\left[\begin{array}{cc}1&2\\3&4\end{array}\right] \hspace{2em} M^{-1}=\left[\begin{array}{cc}-2&1\\3/2&-1/2\end{array}\right] \end{equation*}

Which of the following is equal to \(\det(M)\det(M^{-1})\text{?}\)

  1. \(\displaystyle -1\)

  2. \(\displaystyle 0\)

  3. \(\displaystyle 1\)

  4. \(\displaystyle 4\)

Fact 5.3.1

For every invertible matrix \(M\text{,}\)

\begin{equation*} \det(M)\det(M^{-1})= \det(I)=1 \end{equation*}
so \(\det(M^{-1})=\frac{1}{\det(M)}\text{.}\)

Furthermore, a square matrix \(M\) is invertible if and only if \(\det(M)\not=0\text{.}\)
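For instance, with the invertible matrix \(M\) from Activity 5.3.1, a direct computation confirms this:

\begin{equation*} \det(M)\det(M^{-1})= \big((1)(4)-(2)(3)\big)\left((-2)\left(-\frac{1}{2}\right)-(1)\left(\frac{3}{2}\right)\right)= (-2)\left(-\frac{1}{2}\right)=1\text{.} \end{equation*}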

Observation 5.3.2

Consider the linear transformation \(A : \IR^2 \rightarrow \IR^2\) given by the matrix \(A = \left[\begin{array}{cc} 2 & 2 \\ 0 & 3 \end{array}\right]\text{.}\)

Figure 1. Transformation of the unit square by the linear transformation \(A\)

It is easy to see geometrically that

\begin{equation*} A\left[\begin{array}{c}1 \\ 0 \end{array}\right] = \left[\begin{array}{cc} 2 & 2 \\ 0 & 3 \end{array}\right]\left[\begin{array}{c}1 \\ 0 \end{array}\right]= \left[\begin{array}{c}2 \\ 0 \end{array}\right]= 2 \left[\begin{array}{c}1 \\ 0 \end{array}\right]\text{.} \end{equation*}

It is less obvious (but easily checked once you find it) that

\begin{equation*} A\left[\begin{array}{c} 2 \\ 1 \end{array}\right] = \left[\begin{array}{cc} 2 & 2 \\ 0 & 3 \end{array}\right]\left[\begin{array}{c}2 \\ 1 \end{array}\right]= \left[\begin{array}{c} 6 \\ 3 \end{array}\right] = 3\left[\begin{array}{c} 2 \\ 1 \end{array}\right]\text{.} \end{equation*}
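Both products can also be verified numerically; the following is a minimal sketch assuming NumPy is available.

import numpy as np

# The matrix from Observation 5.3.2
A = np.array([[2, 2],
              [0, 3]])

# A stretches [1, 0] by a factor of 2 and [2, 1] by a factor of 3
print(A @ np.array([1, 0]))  # [2 0] = 2 * [1 0]
print(A @ np.array([2, 1]))  # [6 3] = 3 * [2 1]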

Definition 5.3.3

Let \(A \in M_{n,n}\text{.}\) An eigenvector for \(A\) is a vector \(\vec{x} \in \IR^n\) such that \(A\vec{x}\) is parallel to \(\vec{x}\text{.}\)

Figure 2. The map \(A\) stretches out the eigenvector \(\left[\begin{array}{c}2 \\ 1 \end{array}\right]\) by a factor of \(3\) (the corresponding eigenvalue).

In other words, \(A\vec{x}=\lambda \vec{x}\) for some scalar \(\lambda\text{.}\) If \(\vec x\not=\vec 0\text{,}\) then we say \(\vec x\) is a nontrivial eigenvector and we call this \(\lambda\) an eigenvalue of \(A\text{.}\)

Activity 5.3.2 (~5 min)

Finding the eigenvalues \(\lambda\) that satisfy

\begin{equation*} A\vec x=\lambda\vec x=\lambda(I\vec x)=(\lambda I)\vec x \end{equation*}
for some nontrivial eigenvector \(\vec x\) is equivalent to finding nonzero solutions for the matrix equation
\begin{equation*} (A-\lambda I)\vec x =\vec 0\text{.} \end{equation*}

Part 1.

If \(\lambda\) is an eigenvalue, and \(T\) is the transformation with standard matrix \(A-\lambda I\text{,}\) which of these must contain a nonzero vector?

  1. The kernel of \(T\)

  2. The image of \(T\)

  3. The domain of \(T\)

  4. The codomain of \(T\)

Part 2.

Therefore, what can we conclude?

  1. \(A\) is invertible

  2. \(A\) is not invertible

  3. \(A-\lambda I\) is invertible

  4. \(A-\lambda I\) is not invertible

Part 3.

And what else?

  1. \(\displaystyle \det A=0\)

  2. \(\displaystyle \det A=1\)

  3. \(\displaystyle \det(A-\lambda I)=0\)

  4. \(\displaystyle \det(A-\lambda I)=1\)

Fact 5.3.4

The eigenvalues \(\lambda\) for a matrix \(A\) are exactly the values that make \(A-\lambda I\) non-invertible.

Thus the eigenvalues \(\lambda\) for a matrix \(A\) are the solutions to the equation

\begin{equation*} \det(A-\lambda I)=0. \end{equation*}

Definition 5.3.5

The expression \(\det(A-\lambda I)\) is called the characteristic polynomial of \(A\text{.}\)

For example, when \(A=\left[\begin{array}{cc}1 & 2 \\ 5 & 4\end{array}\right]\text{,}\) we have

\begin{equation*} A-\lambda I= \left[\begin{array}{cc}1 & 2 \\ 5 & 4\end{array}\right]- \left[\begin{array}{cc}\lambda & 0 \\ 0 & \lambda\end{array}\right]= \left[\begin{array}{cc}1-\lambda & 2 \\ 5 & 4-\lambda\end{array}\right]\text{.} \end{equation*}

Thus the characteristic polynomial of \(A\) is

\begin{equation*} \det\left[\begin{array}{cc}1-\lambda & 2 \\ 5 & 4-\lambda\end{array}\right] = (1-\lambda)(4-\lambda)-(2)(5) = \lambda^2-5\lambda-6 \end{equation*}
and its eigenvalues are the solutions \(-1,6\) to \(\lambda^2-5\lambda-6=0\text{.}\)
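This example can be double-checked numerically; the sketch below assumes NumPy is available. Note that np.poly returns the coefficients of \(\det(\lambda I-A)\text{,}\) which for this \(2\times 2\) matrix is the same polynomial as \(\det(A-\lambda I)\text{.}\)

import numpy as np

A = np.array([[1, 2],
              [5, 4]])

coeffs = np.poly(A)            # [ 1. -5. -6.], i.e. lambda^2 - 5*lambda - 6
print(np.roots(coeffs))        # [ 6. -1.]
print(np.linalg.eigvals(A))    # [-1.  6.]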

Activity 5.3.3 (~10 min)

Let \(A = \left[\begin{array}{cc} 5 & 2 \\ -3 & -2 \end{array}\right]\text{.}\)

Part 1.

Compute \(\det (A-\lambda I)\) to determine the characteristic polynomial of \(A\text{.}\)

Part 2.

Set this characteristic polynomial equal to zero and factor to determine the eigenvalues of \(A\text{.}\)

Activity 5.3.4 (~5 min)

Find all the eigenvalues for the matrix \(A=\left[\begin{array}{cc} 3 & -3 \\ 2 & -4 \end{array}\right]\text{.}\)

Activity 5.3.5 (~5 min)

Find all the eigenvalues for the matrix \(A=\left[\begin{array}{cc} 1 & -4 \\ 0 & 5 \end{array}\right]\text{.}\)

Activity 5.3.6 (~10 min)

Find all the eigenvalues for the matrix \(A=\left[\begin{array}{ccc} 3 & -3 & 1 \\ 0 & -4 & 2 \\ 0 & 0 & 7 \end{array}\right]\text{.}\)
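Answers to Activities 5.3.3 through 5.3.6 can be checked numerically once found; the following is a minimal sketch assuming NumPy is available.

import numpy as np

# The matrices from Activities 5.3.3 through 5.3.6; np.linalg.eigvals
# returns the eigenvalues, i.e. the roots of det(A - lambda I) = 0.
matrices = [
    np.array([[5, 2], [-3, -2]]),
    np.array([[3, -3], [2, -4]]),
    np.array([[1, -4], [0, 5]]),
    np.array([[3, -3, 1], [0, -4, 2], [0, 0, 7]]),
]
for A in matrices:
    print(np.linalg.eigvals(A))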