
Section 5.1 Eigenvalues and eigenvectors: preliminaries

Eigenvalues and eigenvectors are defined for a square matrix \(A.\)

Definition 5.1.1. The eigenvalue of a matrix.

A number \(\lambda\) is an eigenvalue of a square matrix \(A\) if

\begin{equation*} A\vec x=\lambda\vec x \end{equation*}

for some \(\vec x\not=\vec0\text{.}\)

Definition 5.1.2. The eigenvector of a matrix.

If

\begin{equation*} A\vec x=\lambda\vec x \end{equation*}

for some \(\vec x\not=\vec0\text{,}\) then \(\vec x\) is called an eigenvector of \(A\) corresponding to the eigenvalue \(\lambda\text{.}\)

Notice that if \(\vec x=\vec 0\text{,}\) then \(A\vec x=\lambda\vec x\) is simply the equation \(\vec0=\vec0\) for any value of \(\lambda\text{.}\) This is not too interesting, and so we always have the restriction \(\vec x\not=\vec0\text{.}\)

For example, let \(A=\begin{bmatrix} 5\amp -1\amp -2\\ 1\amp 3\amp -2\\ -1\amp -1\amp 4 \end{bmatrix}\text{.}\) Then

\begin{equation*} \begin{bmatrix} 5\amp -1\amp -2\\ 1\amp 3\amp -2\\ -1\amp -1\amp 4 \end{bmatrix} \begin{bmatrix} 1\\1\\1 \end{bmatrix} = \begin{bmatrix} 2\\2\\2 \end{bmatrix}=2 \begin{bmatrix} 1\\1\\1 \end{bmatrix} \end{equation*}

which makes \(\vec x=\begin{bmatrix}1\\1\\1\end{bmatrix}\) an eigenvector with eigenvalue \(\lambda=2\text{.}\)

\begin{equation*} \begin{bmatrix} 5\amp -1\amp -2\\ 1\amp 3\amp -2\\ -1\amp -1\amp 4 \end{bmatrix} \begin{bmatrix} 1\\-1\\1 \end{bmatrix} = \begin{bmatrix} 4\\-4\\4 \end{bmatrix}=4 \begin{bmatrix} 1\\-1\\1 \end{bmatrix} \end{equation*}

which makes \(\vec x=\begin{bmatrix} 1\\-1\\1 \end{bmatrix}\) an eigenvector with eigenvalue \(\lambda=4\text{.}\)

\begin{equation*} \begin{bmatrix} 5\amp -1\amp -2\\ 1\amp 3\amp -2\\ -1\amp -1\amp 4 \end{bmatrix} \begin{bmatrix} -1\\-1\\1 \end{bmatrix}= \begin{bmatrix} -6\\-6\\6 \end{bmatrix}=6 \begin{bmatrix} -1\\-1\\1 \end{bmatrix} \end{equation*}

which makes \(\vec x=\begin{bmatrix}-1\\-1\\1\end{bmatrix}\) an eigenvector with eigenvalue \(\lambda=6\text{.}\)

We have now computed three eigenvalues of \(A\text{:}\) \(\lambda=2\text{,}\) \(\lambda=4\) and \(\lambda=6\text{.}\) With a little more theory, we will see that there are no others.

Let \(A=\begin{bmatrix}2\amp 1\amp 4\\ 0\amp 3\amp 0\\ 2\amp -2\amp -5 \end{bmatrix}\text{.}\) Then we have

\begin{equation*} \begin{bmatrix} 2\amp 1\amp 4\\ 0\amp 3\amp 0\\ 2\amp -2\amp -5\end{bmatrix} \begin{bmatrix}1\\1\\0\end{bmatrix}= \begin{bmatrix}3\\3\\0\end{bmatrix} =3 \begin{bmatrix}1\\1\\0\end{bmatrix} \end{equation*}

which makes \(\vec x=\begin{bmatrix}1\\1\\0\end{bmatrix}\) an eigenvector with eigenvalue \(\lambda=3\text{.}\) Also

\begin{equation*} \begin{bmatrix} 2\amp 1\amp 4\\ 0\amp 3\amp 0\\ 2\amp -2\amp -5\end{bmatrix} \begin{bmatrix}4\\0\\1\end{bmatrix}= \begin{bmatrix}12\\0\\3\end{bmatrix}=3 \begin{bmatrix}4\\0\\1\end{bmatrix} \end{equation*}

which makes \(\vec x=\begin{bmatrix}4\\0\\1\end{bmatrix}\) an eigenvector with eigenvalue \(\lambda=3\text{.}\) Finally,

\begin{equation*} \begin{bmatrix} 2\amp 1\amp 4\\ 0\amp 3\amp 0\\ 2\amp -2\amp -5\end{bmatrix} \begin{bmatrix}-1\\0\\2\end{bmatrix}= \begin{bmatrix}6\\0\\-12\end{bmatrix}=-6 \begin{bmatrix}-1\\0\\2\end{bmatrix} \end{equation*}

which makes \(\vec x=\begin{bmatrix}-1\\0\\2\end{bmatrix}\) an eigenvector with eigenvalue \(\lambda=-6\text{.}\) Hence the demonstrated eigenvalues are \(\lambda=3\) and \(\lambda=-6\text{.}\) We will soon see that there are no others.
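Note that here the single eigenvalue \(\lambda=3\) has two different eigenvectors. As before, the three products can be checked numerically; a small Python sketch (the helper `matvec` is my own):

```python
def matvec(A, x):
    # Multiply the matrix A (a list of rows) by the vector x.
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[2, 1, 4],
     [0, 3, 0],
     [2, -2, -5]]

# Two distinct eigenvectors for lambda = 3, and one for lambda = -6.
pairs = [([1, 1, 0], 3), ([4, 0, 1], 3), ([-1, 0, 2], -6)]
for x, lam in pairs:
    assert matvec(A, x) == [lam * xi for xi in x]
```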

Let \(A=\begin{bmatrix} 0\amp -1\\ 1\amp 0\end{bmatrix}\) and consider the equation \(A\vec x=\lambda\vec x\text{.}\) Setting \(\vec x=\begin{bmatrix} x_1\\x_2\end{bmatrix}\text{,}\)

\begin{equation*} \begin{bmatrix} 0\amp -1\\ 1\amp 0\end{bmatrix}\begin{bmatrix} x_1\\x_2\end{bmatrix} =\lambda\begin{bmatrix} x_1\\x_2\end{bmatrix} \end{equation*}

This is equivalent to the system of equations

\begin{equation*} \begin{array}{rl} \lambda x_1 \amp = -x_2\\ \lambda x_2 \amp = x_1. \end{array} \end{equation*}

This implies that \((\lambda^2+1)x_1 = \lambda^2 x_1 + x_1 = -\lambda x_2 +\lambda x_2=0 \) and \((\lambda^2+1)x_2 = \lambda^2 x_2 + x_2 = \lambda x_1 -\lambda x_1=0.\) Since \(\vec x \not=\vec 0\text{,}\) either \(x_1\not=0\) or \(x_2\not=0\text{,}\) and consequently \(\lambda^2=-1\text{.}\) Since no real number satisfies this equation, we conclude that there are no real eigenvalues, and hence no eigenvectors, for \(A\text{.}\) An aside: if we consider complex numbers, then \(i\) and \(-i\) are both eigenvalues of \(A\) with \(\begin{bmatrix} 1\\-i\end{bmatrix}\) and \(\begin{bmatrix} 1\\i\end{bmatrix}\) as corresponding eigenvectors.
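The complex aside can be verified directly, since Python has built-in complex arithmetic (`1j` denotes \(i\)); the helper `matvec` below is my own:

```python
def matvec(A, x):
    # Multiply the matrix A (a list of rows) by the vector x.
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[0, -1],
     [1,  0]]

# Over the reals, A x = lam x forces lam**2 == -1, so no real eigenvalue exists.
# Over the complex numbers, lam = i and lam = -i work:
assert matvec(A, [1, -1j]) == [1j * xi for xi in [1, -1j]]
assert matvec(A, [1, 1j]) == [-1j * xi for xi in [1, 1j]]
```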

Consider the matrix

\begin{equation*} A= \begin{bmatrix} 1 \amp 1 \amp 1 \amp 1\\ 0 \amp 2 \amp 2 \amp 2\\ 0 \amp 0 \amp 3 \amp 3\\ 0 \amp 0 \amp 0 \amp 4 \end{bmatrix} \end{equation*}

Then it's easy to verify that

\begin{equation*} \begin{bmatrix} 1 \amp 1 \amp 1 \amp 1\\ 0 \amp 2 \amp 2 \amp 2\\ 0 \amp 0 \amp 3 \amp 3\\ 0 \amp 0 \amp 0 \amp 4 \end{bmatrix} \begin{bmatrix} 1\\0\\0\\0\end{bmatrix} = \begin{bmatrix} 1\\0\\0\\0\end{bmatrix} = 1\begin{bmatrix} 1\\0\\0\\0\end{bmatrix} \end{equation*}
\begin{equation*} \begin{bmatrix} 1 \amp 1 \amp 1 \amp 1\\ 0 \amp 2 \amp 2 \amp 2\\ 0 \amp 0 \amp 3 \amp 3\\ 0 \amp 0 \amp 0 \amp 4 \end{bmatrix} \begin{bmatrix} 1\\1\\0\\0\end{bmatrix} = \begin{bmatrix} 2\\2\\0\\0\end{bmatrix} =2\begin{bmatrix} 1\\1\\0\\0\end{bmatrix} \end{equation*}
\begin{equation*} \begin{bmatrix} 1 \amp 1 \amp 1 \amp 1\\ 0 \amp 2 \amp 2 \amp 2\\ 0 \amp 0 \amp 3 \amp 3\\ 0 \amp 0 \amp 0 \amp 4 \end{bmatrix} \begin{bmatrix} 3\\4\\2\\0\end{bmatrix} = \begin{bmatrix} 9\\12\\6\\0\end{bmatrix} = 3\begin{bmatrix} 3\\4\\2\\0\end{bmatrix} \end{equation*}
\begin{equation*} \begin{bmatrix} 1 \amp 1 \amp 1 \amp 1\\ 0 \amp 2 \amp 2 \amp 2\\ 0 \amp 0 \amp 3 \amp 3\\ 0 \amp 0 \amp 0 \amp 4 \end{bmatrix} \begin{bmatrix} 8\\12\\9\\3\end{bmatrix} = \begin{bmatrix} 32\\48\\36\\12\end{bmatrix} =4\begin{bmatrix} 8\\12\\9\\3\end{bmatrix} \end{equation*}

and so we have \(1\text{,}\) \(2\text{,}\) \(3\) and \(4\) as eigenvalues. Notice that in this case the eigenvalues are just the diagonal elements and that the matrix is upper triangular. We shall see in Theorem 5.5.1 that for every upper (or lower) triangular matrix, the eigenvalues are the diagonal entries.
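The four products above can be bundled into one numeric check; a small Python sketch (the helper `matvec` is my own):

```python
def matvec(A, x):
    # Multiply the matrix A (a list of rows) by the vector x.
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

# An upper triangular matrix: its eigenvalues are its diagonal entries.
A = [[1, 1, 1, 1],
     [0, 2, 2, 2],
     [0, 0, 3, 3],
     [0, 0, 0, 4]]

pairs = [([1, 0, 0, 0], 1),
         ([1, 1, 0, 0], 2),
         ([3, 4, 2, 0], 3),
         ([8, 12, 9, 3], 4)]
for x, lam in pairs:
    assert matvec(A, x) == [lam * xi for xi in x]
```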

Definition 5.1.7. Eigenspaces.

Suppose that \(A\) is a square matrix of order \(n\text{.}\) Then for any real number \(\lambda\text{,}\) we define the eigenspace \(E_\lambda\) by

\begin{equation*} E_\lambda=\{\vec x\in \R^n\mid A\vec x=\lambda \vec x\}. \end{equation*}

Clearly \(\vec0\) is in \(E_\lambda\) for any value of \(\lambda\text{,}\) and \(\lambda\) is an eigenvalue if and only if there is some \(\vec x\not=\vec0\) in \(E_\lambda\text{.}\)

The eigenspace \(E_\lambda\) is in fact a subspace of \(\R^n\text{.}\) From Definition 4.9.17, it is sufficient to show that the two properties of closure under addition and closure under scalar multiplication are satisfied. Suppose \(\vec x\) and \(\vec y\) are in \(E_\lambda\) and \(r\) is a scalar.

  • Closure under addition:

    \begin{align*} A(\vec x+\vec y)\amp=A(\vec x)+A(\vec y)\\ \amp=\lambda \vec x + \lambda \vec y \\ \amp= \lambda (\vec x + \vec y), \end{align*}

    so \(\vec x+\vec y\) is in \(E_\lambda\text{.}\)
  • Closure under scalar multiplication:

    \begin{align*} A(r\vec x)\amp=rA(\vec x)\\ \amp=r(\lambda \vec x)\\ \amp= \lambda (r\vec x), \end{align*}

    so \(r\vec x\) is in \(E_\lambda\text{.}\)
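The two closure computations can be illustrated numerically with the earlier matrix whose eigenvalue \(\lambda=3\) had the two eigenvectors \(\begin{bmatrix}1\\1\\0\end{bmatrix}\) and \(\begin{bmatrix}4\\0\\1\end{bmatrix}\text{.}\) A small Python sketch (the helper `matvec` and the scalar choice `r = 7` are my own):

```python
def matvec(A, x):
    # Multiply the matrix A (a list of rows) by the vector x.
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[2, 1, 4],
     [0, 3, 0],
     [2, -2, -5]]
x, y, lam = [1, 1, 0], [4, 0, 1], 3

# Closure under addition: x + y lies in E_3 as well.
s = [xi + yi for xi, yi in zip(x, y)]
assert matvec(A, s) == [lam * si for si in s]

# Closure under scalar multiplication: r x lies in E_3 as well.
r = 7
rx = [r * xi for xi in x]
assert matvec(A, rx) == [lam * v for v in rx]
```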