Eigenvalues and Eigenvectors#
Recap from last class#
A system of linear equations can be represented as \(\arr{A}\vec{x}=\vec{b}\):
- Non-homogeneous system if \(\vec{b}\neq\vec{0}\)
  - If \(\det(\arr{A})\neq0\) (i.e. \(\arr{A}\) is invertible), then a unique solution exists.
 
- Homogeneous system if \(\vec{b}=\vec{0}\)
  - If \(\det(\arr{A})\neq0 \rightarrow\) has only the trivial solution \(\vec{x} = \vec{0}\)
  - If \(\det(\arr{A})=0 \rightarrow\) also has a series of nontrivial solutions (see the quick numerical check below)
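As a quick numerical illustration of these conditions, here is a minimal numpy sketch (the specific matrices are made-up examples, not from class):

```python
import numpy as np

# Invertible matrix: det != 0, so A x = b has a unique solution
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(np.linalg.det(A))        # 5.0 (nonzero)
print(np.linalg.solve(A, b))   # the unique solution

# Singular matrix: det == 0, so the homogeneous system A x = 0
# also has nontrivial solutions (any multiple of [1, -1] here)
A_singular = np.array([[1.0, 1.0],
                       [2.0, 2.0]])
print(np.linalg.det(A_singular))                 # 0.0
x = np.array([1.0, -1.0])
print(np.allclose(A_singular @ x, np.zeros(2)))  # True
```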
 
Intro to eigenvalues and eigenvectors#
- For engineering applications, eigenvalue problems are among the most important problems concerning matrices.
- For example, wherever there are vibrations, there are eigenvalues, which are the natural frequencies of the vibration.
- If you’ve ever tuned a guitar, you’ve solved an eigenvalue problem! 
- Google’s page-rank algorithm for determining which pages are important is also built on eigenvectors of a very specific matrix. 
- Quantum mechanics 
- Face detection 
 
- For the following expression: \(\arr{A}\vec{x}=\lambda\vec{x}\)

\(\vec{x}=\vec{0}\) is always a solution (trivial).

If a solution \(\vec{x}\neq\vec{0}\) exists, then \(\lambda\) is an eigenvalue of \(\arr{A}\) and \(\vec{x}\) is an eigenvector of \(\arr{A}\).
- Often, there are many eigenvectors, which, together with the \(\vec{0}\) vector, form the eigenspace of \(\arr{A}\). 
- How do we find \(\lambda\)’s and associated \(\vec{x}\)’s? What do they mean? 
- To find \(\lambda\), rearrange our key equation: \(\arr{A}\vec{x}-\lambda\vec{x}=\vec{0}\)

\(\arr{A}\) is a matrix and \(\lambda\) is a scalar, so we can only factor out \(\vec{x}\) with the help of the identity matrix \(\arr{I}\): \((\arr{A}-\lambda\arr{I})\vec{x}=\vec{0}\)

This is a homogeneous system. We just learned that if the determinant of the matrix on the LHS is zero, then there is a nontrivial solution (in fact, a whole series of them).
\(\therefore\) if \(|\arr{A} - \lambda\arr{I}|=0\),
then there is a nontrivial \(\vec{x}\), and a set of \(\lambda\), \(\vec{x}\) pairs that satisfy \(\arr{A}\vec{x}=\lambda\vec{x}\)
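Before working examples by hand, here is a minimal numpy sketch of how eigenvalues and eigenvectors can be found numerically (the matrix below is just a made-up illustration; `np.linalg.eig` returns the eigenvalues and a matrix whose columns are the eigenvectors):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigvals[i] pairs with the column eigvecs[:, i]
eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)   # eigenvalues of A (3 and 1 for this matrix)

# Check the defining equation A x = lambda x for the first pair
x = eigvecs[:, 0]
print(np.allclose(A @ x, eigvals[0] * x))  # True
```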
Example for a 2x2 matrix#
\(\arr{A} = \begin{bmatrix}-5 & 0\\ 1 & 2\end{bmatrix}\)
We want

\[|\arr{A}-\lambda\arr{I}| = \begin{vmatrix}-5-\lambda & 0\\ 1 & 2-\lambda\end{vmatrix} = (-5-\lambda)(2-\lambda) = 0\]

which is the “characteristic equation” of \(\arr{A}\).
- The equation is satisfied by \(\lambda_1=2\) and \(\lambda_2=-5\) 
- Now, there is one eigenvector associated with each eigenvalue: \(\vec{x}^{(1)}\) and \(\vec{x}^{(2)}\) 
- Let’s start with \(\lambda_1=2\): \((\arr{A}-2\arr{I})\vec{x}^{(1)} = \begin{bmatrix}-7 & 0\\ 1 & 0\end{bmatrix}\begin{bmatrix}x_1\\x_2\end{bmatrix} = \begin{bmatrix}0\\0\end{bmatrix}\)

\(\therefore x_1^{(1)} =0\) and \(x_2^{(1)}\) can be anything but zero, since the solution is non-trivial.
\(\therefore \vec{x}^{(1)} = \begin{bmatrix}0\\\alpha\end{bmatrix}\) where \(\alpha\neq 0\). This \(\vec{x}^{(1)}\) is married to \(\lambda_1=2\).
- Now, \(\lambda_2=-5:\) \((\arr{A}+5\arr{I})\vec{x}^{(2)} = \begin{bmatrix}0 & 0\\ 1 & 7\end{bmatrix}\begin{bmatrix}x_1\\x_2\end{bmatrix} = \begin{bmatrix}0\\0\end{bmatrix}\)

which gives \(0=0\) and \(x_1 + 7x_2 = 0\), i.e. \(x_1 = -7x_2\).
\(\therefore \vec{x}^{(2)} = \begin{bmatrix}-7\beta\\ \beta \end{bmatrix}\), where \(\beta \neq 0\). This \(\vec{x}^{(2)}\) is married to \(\lambda_2 = -5\)
- Make sure to check that the \(\lambda\), \(\vec{x}\) pairs satisfy the original equation \(\arr{A}\vec{x}=\lambda\vec{x}\)!
- Eigenspaces are: \(\lambda_1=2,\ \vec{x}^{(1)} = \begin{bmatrix}0\\ \alpha\end{bmatrix}\) and \(\lambda_2=-5,\ \vec{x}^{(2)} = \begin{bmatrix}-7\beta\\ \beta\end{bmatrix}\), together with \(\vec{x}=\vec{0}\).

The basis vectors of the two eigenspaces are \(\begin{bmatrix} 0 \\ 1\end{bmatrix}\) and \(\begin{bmatrix} -7 \\ 1\end{bmatrix}\), respectively.
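As a sanity check, here is a short numpy sketch verifying the eigenpairs we just found by hand (the scalings \(\alpha=\beta=1\) are arbitrary choices):

```python
import numpy as np

A = np.array([[-5.0, 0.0],
              [1.0, 2.0]])

x1 = np.array([0.0, 1.0])    # eigenvector for lambda_1 = 2
x2 = np.array([-7.0, 1.0])   # eigenvector for lambda_2 = -5

print(np.allclose(A @ x1, 2 * x1))    # True
print(np.allclose(A @ x2, -5 * x2))   # True
print(np.linalg.eig(A)[0])            # eigenvalues 2 and -5 (in some order)
```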
Recap#
- For any n\(\times\)n matrix \(\arr{A}\), you’ll get an \(n\)th-degree polynomial to solve in \(\lambda\) from \(|\arr{A} - \lambda \arr{I}| = 0\). 
- An n\(\times\)n matrix has at least one eigenvalue (possibly complex) and at most \(n\) distinct eigenvalues. 
- Recap of the steps for solving eigenvalue problems (see the symbolic sketch below):
  - Set up \(|\arr{A} - \lambda \arr{I}| = 0\)
  - Determine the characteristic equation and solve it for \(\lambda\)
  - For each \(\lambda\), determine \(\vec{x}\)
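Here is a sketch of the same three steps done symbolically. The notes only say “python,” so using sympy here is an assumption, but it mirrors the hand calculation closely:

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[-5, 0],
               [1, 2]])

# Steps 1-2: set up |A - lambda*I| = 0 and factor the characteristic polynomial
char_poly = (A - lam * sp.eye(2)).det()
print(sp.factor(char_poly))   # (lambda - 2)*(lambda + 5)

# Step 3: for each lambda, find the eigenvector(s).
# Returns a list of (eigenvalue, multiplicity, [basis vectors]) tuples.
print(A.eigenvects())
```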
 
Example for a 3x3#
- Let’s try a larger example: 
 \(\arr{A} = \begin{bmatrix}6&10&6 \\ 0&8&12 \\ 0&0&2 \end{bmatrix} \rightarrow\) we want \(\lambda\), \(\vec{x}\) such that \(\arr{A} \vec{x} = \lambda \vec{x}\)
We choose to expand the determinant along column 1:

\[|\arr{A}-\lambda\arr{I}| = \begin{vmatrix} 6-\lambda & 10 & 6 \\ 0 & 8-\lambda & 12 \\ 0 & 0 & 2-\lambda \end{vmatrix} = (6-\lambda)(8-\lambda)(2-\lambda) = 0\]

We don’t really need to go further unless we want the expanded characteristic equation.

\(\therefore\) the eigenvalues are \(\lambda_1=6\), \(\lambda_2=8\), \(\lambda_3=2\). We can get a maximum of 3 eigenvalues since \(n=3\).
- Now, find \(\vec{x}^{(i)}\) for each \(\lambda_i\) using \((\arr{A}- \lambda_i \arr{I})\vec{x}^{(i)} = \vec{0}\)
- For \(\lambda_1=6\):

\[\begin{align} \begin{bmatrix} 0&10&6 \\ 0&2&12 \\ 0&0&-4 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} \end{align}\]

- Which gives:

\[\begin{align} 10x_2 + 6x_3 = 0;\\ 2 x_2 + 12 x_3 = 0;\\ -4x_3 = 0 \end{align}\]

\[\begin{align} \implies x_3 = x_2 = 0 &&\text{and}&& \text{$x_1=$ arbitrary} \end{align}\]

\[\begin{align} \vec{x}^{(1)} = \begin{bmatrix} \alpha \\ 0 \\ 0 \end{bmatrix} & \text{, where $\alpha\neq0$} \end{align}\]

- or the basis vector \(\vec{x}^{(1)} = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}\) (really a series of solutions)
- For \(\lambda_2=8\):

\[\begin{align} \begin{bmatrix} -2&10&6 \\ 0&0&12 \\ 0&0&-6 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} \end{align}\]

- Which gives:

\[\begin{align} -2x_1 + 10x_2 + 6x_3= 0;\\ 12x_3 = 0;\\ -6x_3 = 0 \end{align}\]

\[\begin{align} \implies x_3 = 0, && x_1 = 5x_2 + 3x_3 &&\text{and}&& \text{$x_2=$ arbitrary} \end{align}\]

\[\begin{align} \vec{x}^{(2)} = \begin{bmatrix} 5\beta \\ \beta \\ 0 \end{bmatrix}& \text{, $\beta\neq0$} \end{align}\]

- or the basis vector \(\vec{x}^{(2)} = \begin{bmatrix} 5 \\ 1 \\ 0 \end{bmatrix}\)
- For \(\lambda_3=2\):

\[\begin{align} \begin{bmatrix} 4&10&6 \\ 0&6&12 \\ 0&0&0 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} \end{align}\]

- Which gives:

\[\begin{align} 4x_1 + 10x_2 + 6x_3 = 0;\\ 6x_2 + 12x_3 = 0;\\ 0 = 0 \end{align}\]

\[\begin{align} \implies x_2 &= -2 x_3;\\ x_1 &= -\frac{10}{4}(-2x_3) - \frac{6}{4}x_3 \\ &= 5x_3 -\frac{3}{2}x_3\\ x_1 &= \frac{7}{2}x_3 \end{align}\]

\[\begin{align} \vec{x}^{(3)} = \begin{bmatrix} \frac{7}{2}\delta \\ -2\delta \\ \delta \end{bmatrix}& \text{, $\delta\neq0$} \end{align}\]

- or the basis vector \(\vec{x}^{(3)} = \begin{bmatrix} \frac{7}{2} \\ -2 \\ 1 \end{bmatrix} = \begin{bmatrix} 7 \\ -4 \\ 2 \end{bmatrix}\)
Eigenspace corresponding to \(\arr{A}\):

\[\begin{array}{llll} \lambda_1=6, \begin{bmatrix} 1\\0\\0 \end{bmatrix}; & \lambda_2=8, \begin{bmatrix} 5\\1\\0 \end{bmatrix}; & \lambda_3=2, \begin{bmatrix} 7\\-4\\2 \end{bmatrix}; & \vec{x}=\vec{0} \end{array}\]
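A quick numpy check (a sketch) that the three basis eigenvectors above really satisfy \(\arr{A}\vec{x}=\lambda\vec{x}\):

```python
import numpy as np

A = np.array([[6.0, 10.0, 6.0],
              [0.0, 8.0, 12.0],
              [0.0, 0.0, 2.0]])

# (eigenvalue, basis eigenvector) pairs derived above
pairs = [(6, np.array([1.0, 0.0, 0.0])),
         (8, np.array([5.0, 1.0, 0.0])),
         (2, np.array([7.0, -4.0, 2.0]))]

for lam, x in pairs:
    print(lam, np.allclose(A @ x, lam * x))  # each line should print True
```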
Example with a double root#
Consider \(\arr{A}=\begin{bmatrix}-2&2&-3\\2&1&-6\\-1&-2&0\end{bmatrix}\)
Find the characteristic equation#
Calculating this determinant, we get \(\lambda^3+\lambda^2-21\lambda-45=0\). There’s no easy way for us to find the roots of this cubic by hand!
Using python, we find the roots are \(\lambda_1=5,\lambda_2=\lambda_3=-3\). Note the double root!
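The notes don’t show the code that was used, but one way to get these roots in python is numpy’s `np.roots` on the coefficients of the characteristic polynomial (a sketch):

```python
import numpy as np

# Coefficients of lambda^3 + lambda^2 - 21*lambda - 45 (highest power first)
print(np.roots([1, 1, -21, -45]))   # approximately 5, -3, -3
```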
Find the eigenvector for \(\lambda_1=5\), corresponding to \((\arr{A}-5\arr{I})\vec{x}^{(1)}=\vec{0}\)#
We need to use Gaussian elimination to solve this:

\[(\arr{A}-5\arr{I})\vec{x} = \begin{bmatrix}-7 & 2 & -3\\ 2 & -4 & -6\\ -1 & -2 & -5\end{bmatrix}\begin{bmatrix}x_1\\x_2\\x_3\end{bmatrix}=\begin{bmatrix}0\\0\\0\end{bmatrix}\]
- \(R_1=R_2/2, R_2=R_1\) 
- \(R_2=R_2+7R_1\), \(R_3=R_3+R_1\) 
- \(R_3=-R_3/4, R_2=-R_2/12\) 
- \(R_3=R_3-R_2, R_1=R_1+2R_2\) 
So \(x_1+x_3=0\) and \(x_2+2x_3=0\). Let’s let \(x_3=\alpha\). That gives us \(x_1=-\alpha\) and \(x_2=-2\alpha\), so

\[\vec{x}^{(1)} = \alpha\begin{bmatrix}-1\\-2\\1\end{bmatrix}, \quad \alpha\neq0\]
We can do a quick check that this is correct:
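For example, with \(\alpha=1\) a short numpy sketch confirms \(\arr{A}\vec{x}^{(1)}=5\vec{x}^{(1)}\):

```python
import numpy as np

A = np.array([[-2.0, 2.0, -3.0],
              [2.0, 1.0, -6.0],
              [-1.0, -2.0, 0.0]])
x1 = np.array([-1.0, -2.0, 1.0])   # alpha = 1

print(A @ x1)                        # [-5. -10.  5.]
print(5 * x1)                        # [-5. -10.  5.]
print(np.allclose(A @ x1, 5 * x1))   # True
```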
Eigenvector for \(\lambda_2=\lambda_3=-3\)#
For these, we get:

\[(\arr{A}+3\arr{I})\vec{x} = \begin{bmatrix}1 & 2 & -3\\ 2 & 4 & -6\\ -1 & -2 & 3\end{bmatrix}\begin{bmatrix}x_1\\x_2\\x_3\end{bmatrix}=\begin{bmatrix}0\\0\\0\end{bmatrix}\]

After Gaussian elimination we get:

\[\begin{bmatrix}1 & 2 & -3\\ 0 & 0 & 0\\ 0 & 0 & 0\end{bmatrix}\begin{bmatrix}x_1\\x_2\\x_3\end{bmatrix}=\begin{bmatrix}0\\0\\0\end{bmatrix}\]
This tells us that \(x_1+2x_2-3x_3=0\). This is one equation with three unknowns! So, let’s write our solution as a system of three equations:

\[\begin{align} x_1 &= -2x_2 + 3x_3\\ x_2 &= x_2\\ x_3 &= x_3 \end{align}\]

We can write this as a linear combination of two linearly independent vectors:

\[\begin{bmatrix}x_1\\x_2\\x_3\end{bmatrix} = x_2\begin{bmatrix}-2\\1\\0\end{bmatrix} + x_3\begin{bmatrix}3\\0\\1\end{bmatrix}\]

where \(x_2\) and \(x_3\) are arbitrary. This gives us two eigenvectors:

\[\vec{x}^{(2)} = \begin{bmatrix}-2\\1\\0\end{bmatrix}, \quad \vec{x}^{(3)} = \begin{bmatrix}3\\0\\1\end{bmatrix}\]
So, the full eigenspace of \(\arr{A}\) is: \(\lambda_1=5\) with basis \(\begin{bmatrix}-1\\-2\\1\end{bmatrix}\); \(\lambda_2=\lambda_3=-3\) with basis \(\begin{bmatrix}-2\\1\\0\end{bmatrix}\), \(\begin{bmatrix}3\\0\\1\end{bmatrix}\); together with \(\vec{x}=\vec{0}\).
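A similar numpy sketch confirms that both basis vectors for the repeated eigenvalue satisfy \(\arr{A}\vec{x}=-3\vec{x}\):

```python
import numpy as np

A = np.array([[-2.0, 2.0, -3.0],
              [2.0, 1.0, -6.0],
              [-1.0, -2.0, 0.0]])

for x in (np.array([-2.0, 1.0, 0.0]), np.array([3.0, 0.0, 1.0])):
    print(np.allclose(A @ x, -3 * x))   # True, True
```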
Recap of the characteristic equation#
\(|\arr{A}-\lambda \arr{I}|=0\) gives us the characteristic equation of \(\arr{A}\), which is a polynomial. An nth order polynomial has several solution possibilities:
- \(n\) distinct, real roots 
- redundant, real roots 
- complex roots (i.e. roots with imaginary parts) 
Consider the case of complex roots:#
\( \arr{A} = \begin{bmatrix} 1&2 \\ -2&1 \end{bmatrix} \)
Find \(\lambda\)’s by solving \(|\arr{A} - \lambda \arr{I}| = 0\)#
\(|\arr{A} - \lambda \arr{I}| = \begin{vmatrix}1-\lambda & 2\\ -2 & 1-\lambda\end{vmatrix} = (1-\lambda)^2 + 4 = 0 \implies \lambda_1 = 1+2i,\ \lambda_2 = 1-2i\)
Now find eigenvectors:#
- Find \(\vec{x}^{(1)}\) by solving \((\arr{A} - \lambda_1 \arr{I}) \vec{x} = \begin{bmatrix}-2i & 2\\ -2 & -2i\end{bmatrix}\begin{bmatrix}x_1\\x_2\end{bmatrix} = \begin{bmatrix}0\\0\end{bmatrix}\) 
- \(R_1 = \frac{1}{2}R_1\) and \(R_2=\frac{1}{2}iR_2\): 
- \(R_2 = R_2 - R_1\):
or, \(\vec{x}^{(1)} = \begin{bmatrix}1\\ i\end{bmatrix}\) (up to an arbitrary nonzero scaling)
- Find \(\vec{x}^{(2)}\) by solving \((\arr{A} - \lambda_2 \arr{I}) \vec{x} = \begin{bmatrix}2i & 2\\ -2 & 2i\end{bmatrix}\begin{bmatrix}x_1\\x_2\end{bmatrix} = \begin{bmatrix}0\\0\end{bmatrix}\) 
\(R_1 = \frac{1}{2}R_1\) and \(R_2=\frac{1}{2}iR_2\):
\(R_2 = R_2 + R_1\):
or, \(\vec{x}^{(2)} = \begin{bmatrix}1\\ -i\end{bmatrix}\) (up to an arbitrary nonzero scaling)
\(\therefore\) Eigenspace of \(\arr{A}\) is: \(\lambda_1 = 1+2i\) with \(\begin{bmatrix}1\\ i\end{bmatrix}\); \(\lambda_2 = 1-2i\) with \(\begin{bmatrix}1\\ -i\end{bmatrix}\); together with \(\vec{x}=\vec{0}\).
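Numerically, numpy handles the complex case without any special treatment. A short sketch (numpy’s eigenvectors will be scaled differently from the hand-derived ones, but they span the same eigenspaces):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [-2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)   # a complex-conjugate pair: 1+2j and 1-2j

# Verify A x = lambda x for the first eigenpair
lam, x = eigvals[0], eigvecs[:, 0]
print(np.allclose(A @ x, lam * x))   # True
```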
