
### Section 5-3 : Review : Eigenvalues & Eigenvectors

If you get nothing else out of this quick review of linear algebra, make sure you get this section. Without this section you will not be able to do any of the differential equations work that is in this chapter.

So, let’s start with the following. If we multiply an \(n \times n\) matrix by an \(n \times 1\) vector we will get a new \(n \times 1\) vector back. In other words,

\[A\,\vec \eta = \vec y\]What we want to know is if it is possible for the following to happen. Instead of just getting a brand new vector out of the multiplication is it possible instead to get the following,

\[\begin{equation}A\,\vec \eta = \lambda \vec \eta \label{eq:eq1} \end{equation} \]In other words, is it possible, at least for certain \(\lambda \) and \(\vec \eta \), to have matrix multiplication be the same as just multiplying the vector by a constant? Of course, we probably wouldn’t be talking about this if the answer was no. So, it is possible for this to happen, however, it won’t happen for just any value of \(\lambda \) or \(\vec \eta \). If we do happen to have a \(\lambda \) and \(\vec \eta \) for which this works (and they will always come in pairs) then we call \(\lambda\) an **eigenvalue** of \(A\) and \(\vec \eta \) an **eigenvector** of \(A\).
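As a quick aside, this defining property is easy to check numerically. Here is a short sketch using NumPy (the sample matrix is our illustrative choice, and this is only a check, not how we'll do the hand computations below); `np.linalg.eig` returns the eigenvalues and unit-length eigenvectors of a matrix:

```python
import numpy as np

# A sample 2x2 matrix (illustrative choice).
A = np.array([[2.0, 7.0],
              [-1.0, -6.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors
# (the eigenvectors are the columns of the second return value).
eigvals, eigvecs = np.linalg.eig(A)

# Each (lambda, eta) pair satisfies A @ eta = lambda * eta, i.e. matrix
# multiplication acts like multiplication by a constant on an eigenvector.
for lam, eta in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ eta, lam * eta)
```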

So, how do we go about finding the eigenvalues and eigenvectors for a matrix? Well first notice that if \(\vec \eta = \vec 0\) then \(\eqref{eq:eq1}\) is going to be true for any value of \(\lambda \) and so we are going to make the assumption that \(\vec \eta \ne \vec 0\). With that out of the way let’s rewrite \(\eqref{eq:eq1}\) a little.

\[\begin{align*}A\vec \eta - \lambda \vec \eta & = \vec 0\\ A\vec \eta - \lambda {I_n}\vec \eta & = \vec 0\\ \left( {A - \lambda {I_n}} \right)\vec \eta & = \vec 0\end{align*}\]Notice that before we factored out the \(\vec \eta \) we added in the appropriately sized identity matrix. This is equivalent to multiplying things by a one and so doesn’t change the value of anything. We needed to do this because without it we would have had the difference of a matrix, \(A\), and a constant, \(\lambda \), and this can’t be done. We now have the difference of two matrices of the same size which can be done.

So, with this rewrite we see that

\[\begin{equation}\left( {A - \lambda {I_n}} \right)\vec \eta = \vec 0 \label{eq:eq2}\end{equation}\]is equivalent to \(\eqref{eq:eq1}\). In order to find the eigenvectors for a matrix we will need to solve a homogeneous system. Recall the fact from the previous section: we will either have exactly one solution (\(\vec \eta = \vec 0\)) or we will have infinitely many nonzero solutions. Since we’ve already said that we don’t want \(\vec \eta = \vec 0\), this means that we want the second case.

Knowing this will allow us to find the eigenvalues for a matrix. Recall from this fact that we will get the second case only if the matrix in the system is singular. Therefore, we will need to determine the values of \(\lambda \) for which we get,

\[\det \left( {A - \lambda I} \right) = 0\]Once we have the eigenvalues we can then go back and determine the eigenvectors for each eigenvalue. Let’s take a look at a couple of quick facts about eigenvalues and eigenvectors.

#### Fact

If \(A\) is an \(n \times n\) matrix then \(\det \left( {A - \lambda I} \right)\) is an \(n^{\text{th}}\) degree polynomial in \(\lambda\). This polynomial is called the **characteristic polynomial**.

To find the eigenvalues of a matrix all we need to do is solve a polynomial equation. That’s generally not too bad provided we keep \(n\) small. Likewise, this fact also tells us that for an \(n \times n\) matrix, \(A\), we will have \(n\) eigenvalues if we include all repeated eigenvalues.
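For a numerical cross-check (not how we'll work by hand), NumPy can produce the characteristic polynomial directly: `np.poly(A)` gives the coefficients of \(\det \left( {\lambda I - A} \right)\), which has the same roots as \(\det \left( {A - \lambda I} \right)\), and `np.roots` then returns the eigenvalues, repeated according to multiplicity:

```python
import numpy as np

# The 2x2 matrix worked in the first example below.
A = np.array([[2.0, 7.0],
              [-1.0, -6.0]])

# Coefficients of det(lambda*I - A), highest degree first:
# here lambda^2 + 4*lambda - 5.
coeffs = np.poly(A)

# The roots of the characteristic polynomial are the eigenvalues,
# listed with their multiplicities.
roots = np.roots(coeffs)
```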

#### Fact

If \({\lambda _{\,1}}, {\lambda _{\,2}}, \ldots ,{\lambda _{\,n}}\) is the complete list of eigenvalues for \(A\) (including all repeated eigenvalues) then,

- If \(\lambda \) occurs only once in the list then we call \(\lambda \) **simple**.
- If \(\lambda \) occurs \(k>1\) times in the list then we say that \(\lambda \) has **multiplicity \(k\)**.
- If \({\lambda _{\,1}},{\lambda _{\,2}}, \ldots ,{\lambda _{\,k}}\) (\(k \le n\)) are the simple eigenvalues in the list with corresponding eigenvectors \({\vec \eta ^{\left( 1 \right)}}\), \({\vec \eta ^{\left( 2 \right)}}\), …, \({\vec \eta ^{\left( k \right)}}\) then the eigenvectors are all linearly independent.
- If \(\lambda \) is an eigenvalue of multiplicity \(k > 1\) then \(\lambda \) will have anywhere from 1 to \(k\) linearly independent eigenvectors.
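Both of the last two facts are easy to test numerically. Here is a hedged NumPy sketch (the matrices are the ones worked in the examples below): eigenvectors of simple eigenvalues stack into a full-rank matrix, and for a repeated eigenvalue the number of linearly independent eigenvectors is \(n - \operatorname{rank}\left( {A - \lambda I} \right)\):

```python
import numpy as np

# Two simple eigenvalues (-5 and 1): the eigenvectors are linearly
# independent, so stacked as columns they give a full-rank matrix.
A = np.array([[2.0, 7.0],
              [-1.0, -6.0]])
_, V = np.linalg.eig(A)
assert np.linalg.matrix_rank(V) == 2

# An eigenvalue of multiplicity 2 (lambda = 1/3 for this matrix): the
# count of independent eigenvectors is n - rank(A - lambda*I), which
# can be anywhere from 1 up to the multiplicity. Here it is 1.
B = np.array([[1.0, -1.0],
              [4.0 / 9.0, -1.0 / 3.0]])
n_indep = 2 - np.linalg.matrix_rank(B - (1.0 / 3.0) * np.eye(2))
```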

The usefulness of these facts will become apparent when we get back into differential equations since in that work we will want linearly independent solutions.

Let’s work a couple of examples now to see how we actually go about finding eigenvalues and eigenvectors.

**Example 1** Find the eigenvalues and eigenvectors of the following matrix.

\[A = \left( {\begin{array}{*{20}{c}}2&7\\{ - 1}&{ - 6}\end{array}} \right)\]

The first thing that we need to do is find the eigenvalues. That means we need the following matrix,

\[A - \lambda I = \left( {\begin{array}{*{20}{c}}2&7\\{ - 1}&{ - 6}\end{array}} \right) - \lambda \left( {\begin{array}{*{20}{c}}1&0\\0&1\end{array}} \right) = \left( {\begin{array}{*{20}{c}}{2 - \lambda }&7\\{ - 1}&{ - 6 - \lambda }\end{array}} \right)\]In particular we need to determine where the determinant of this matrix is zero.

\[\det \left( {A - \lambda I} \right) = \left( {2 - \lambda } \right)\left( { - 6 - \lambda } \right) + 7 = {\lambda ^2} + 4\lambda - 5 = \left( {\lambda + 5} \right)\left( {\lambda - 1} \right)\]So, it looks like we will have two simple eigenvalues for this matrix, \({\lambda _{\,1}} = - 5\) and \({\lambda _{\,2}} = 1\). We will now need to find the eigenvectors for each of these. Also note that according to the fact above, the two eigenvectors should be linearly independent.

To find the eigenvectors we simply plug each eigenvalue into \(\eqref{eq:eq2}\) and solve. So, let’s do that.

\({\lambda _{\,1}} = - 5\) :

In this case we need to solve the following system.

\[\left( {\begin{array}{*{20}{c}}7&7\\{ - 1}&{ - 1}\end{array}} \right)\left( {\begin{array}{*{20}{c}}{{\eta _1}}\\{{\eta _2}}\end{array}} \right) = \left( {\begin{array}{*{20}{c}}0\\0\end{array}} \right)\]

Recall that officially to solve this system we use the following augmented matrix.

\[\left( {\begin{array}{*{20}{c}}7&7&0\\{ - 1}&{ - 1}&0\end{array}} \right)\begin{array}{*{20}{c}}{\frac{1}{7}{R_1} + {R_2}}\\ \Rightarrow \end{array}\left( {\begin{array}{*{20}{c}}7&7&0\\0&0&0\end{array}} \right)\]Upon reducing down we see that we get a single equation

\[7{\eta _1} + 7{\eta _2} = 0\hspace{0.25in} \Rightarrow \hspace{0.25in}{\eta _1} = - {\eta _2}\]that will yield an infinite number of solutions. This is expected behavior. Recall that we picked the eigenvalues so that the matrix would be singular and so we would get infinitely many solutions.
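By hand we row reduce; numerically, one way to get the same infinite family of solutions is to compute the null space of \(A - \lambda I\) from an SVD (the right singular vectors attached to zero singular values span the null space). A sketch for \({\lambda _{\,1}} = - 5\), with the zero tolerance chosen by us:

```python
import numpy as np

A = np.array([[2.0, 7.0],
              [-1.0, -6.0]])
M = A + 5.0 * np.eye(2)        # A - lambda*I with lambda = -5

# Rows of Vt whose singular value is (numerically) zero span the
# null space of M, i.e. the eigenvector directions.
_, s, Vt = np.linalg.svd(M)
null_vecs = Vt[s < 1e-10]

# One null direction, proportional to (-1, 1) as found above.
assert null_vecs.shape == (1, 2)
assert np.allclose(M @ null_vecs[0], 0.0)
```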

Notice as well that we could have identified this from the original system. This won’t always be the case, but in the \(2 \times 2\) case we can see from the system that one row will be a multiple of the other and so we will get infinite solutions. From this point on we won’t be actually solving systems in these cases. We will just go straight to the equation and we can use either of the two rows for this equation.

Now, let’s get back to the eigenvector, since that is what we were after. In general then the eigenvector will be any vector that satisfies the following,

\[\vec \eta = \left( {\begin{array}{*{20}{c}}{{\eta _1}}\\{{\eta _2}}\end{array}} \right) = \left( {\begin{array}{*{20}{c}}{ - {\eta _2}}\\{{\eta _2}}\end{array}} \right)\hspace{0.25in},{\eta _2} \ne 0\]To get this we used the solution to the equation that we found above.

We really don’t want a general eigenvector however so we will pick a value for \({\eta _{\,2}}\) to get a specific eigenvector. We can choose anything (except \({\eta _{\,2}} = 0\)), so pick something that will make the eigenvector “nice”. Note as well that since we’ve already assumed that the eigenvector is not zero we must choose a value that will not give us zero, which is why we want to avoid \({\eta _{\,2}} = 0\) in this case. Here’s the eigenvector for this eigenvalue.

\[{\vec \eta ^{\left( 1 \right)}} = \left( {\begin{array}{*{20}{c}}{ - 1}\\1\end{array}} \right),\hspace{0.25in}{\mbox{using }}{\eta _2} = 1\]Now we get to do this all over again for the second eigenvalue.

\({\lambda _{\,2}} = 1\) :

We’ll do much less work with this part than we did with the previous part. We will need to solve the following system.

\[\left( {\begin{array}{*{20}{c}}1&7\\{ - 1}&{ - 7}\end{array}} \right)\left( {\begin{array}{*{20}{c}}{{\eta _1}}\\{{\eta _2}}\end{array}} \right) = \left( {\begin{array}{*{20}{c}}0\\0\end{array}} \right)\]

Clearly both rows are multiples of each other and so we will get infinitely many solutions. We can work with either row; we’ll use the first to avoid having too many minus signs floating around. Doing this gives us,

\[{\eta _1} + 7{\eta _2} = 0\hspace{0.25in} \Rightarrow \hspace{0.25in}{\eta _1} = - 7{\eta _2}\]Note that we can solve this for either of the two variables. However, with an eye towards working with these later on let’s try to avoid as many fractions as possible. The eigenvector is then,

\[\vec \eta = \left( {\begin{array}{*{20}{c}}{{\eta _1}}\\{{\eta _2}}\end{array}} \right) = \left( {\begin{array}{*{20}{c}}{ - 7{\eta _2}}\\{{\eta _2}}\end{array}} \right)\hspace{0.25in},{\eta _2} \ne 0\] \[{\vec \eta ^{\left( 2 \right)}} = \left( {\begin{array}{*{20}{c}}{ - 7}\\1\end{array}} \right),\hspace{0.25in}{\mbox{using }}{\eta _2} = 1\]Summarizing we have,

\[\begin{align*}{\lambda _1} & = - 5 & \hspace{0.25in}{{\vec \eta }^{\left( 1 \right)}} & = \left( {\begin{array}{*{20}{c}}{ - 1}\\1\end{array}} \right)\\{\lambda _2} & = 1 & \hspace{0.25in}{{\vec \eta }^{\left( 2 \right)}} & = \left( {\begin{array}{*{20}{c}}{ - 7}\\1\end{array}} \right)\end{align*}\]Note that the two eigenvectors are linearly independent as predicted.

**Example 2** Find the eigenvalues and eigenvectors of the following matrix.

\[A = \left( {\begin{array}{*{20}{c}}1&{ - 1}\\{\frac{4}{9}}&{ - \frac{1}{3}}\end{array}} \right)\]

This matrix has fractions in it. That’s life so don’t get excited about it. First, we need the eigenvalues.

\[\begin{align*}\det \left( {A - \lambda I} \right) & = \left| {\begin{array}{*{20}{c}}{1 - \lambda }&{ - 1}\\{\frac{4}{9}}&{ - \frac{1}{3} - \lambda }\end{array}} \right|\\ & = \left( {1 - \lambda } \right)\left( { - \frac{1}{3} - \lambda } \right) + \frac{4}{9}\\ & = {\lambda ^2} - \frac{2}{3}\lambda + \frac{1}{9}\\ & = {\left( {\lambda - \frac{1}{3}} \right)^2}\hspace{0.25in} \Rightarrow \hspace{0.25in}{\lambda _{1,2}} = \frac{1}{3}\end{align*}\]So, it looks like we’ve got an eigenvalue of multiplicity 2 here. Remember that the power on the term will be the multiplicity.

Now, let’s find the eigenvector(s). This one is going to be a little different from the first example. There is only one eigenvalue so let’s do the work for that one. We will need to solve the following system,

\[\left( {\begin{array}{*{20}{c}}{\frac{2}{3}}&{ - 1}\\{\frac{4}{9}}&{ - \frac{2}{3}}\end{array}} \right)\left( {\begin{array}{*{20}{c}}{{\eta _1}}\\{{\eta _2}}\end{array}} \right) = \left( {\begin{array}{*{20}{c}}0\\0\end{array}} \right)\hspace{0.25in} \Rightarrow \hspace{0.25in}{R_1} = \frac{3}{2}{R_2}\]So, the rows are multiples of each other. We’ll work with the first equation in this example to find the eigenvector.

\[\frac{2}{3}{\eta _1} - {\eta _2} = 0\hspace{0.25in} \Rightarrow \hspace{0.25in}{\eta _2} = \frac{2}{3}{\eta _1}\]Recall in the last example we decided that we wanted to make these as “nice” as possible and so should avoid fractions if we can. Sometimes, as in this case, we simply can’t so we’ll have to deal with it. In this case the eigenvector will be,

\[\vec \eta = \left( {\begin{array}{*{20}{c}}{{\eta _1}}\\{{\eta _2}}\end{array}} \right) = \left( {\begin{array}{*{20}{c}}{{\eta _1}}\\{\frac{2}{3}{\eta _1}}\end{array}} \right),\hspace{0.25in}{\eta _1} \ne 0\] \[{\vec \eta ^{\left( 1 \right)}} = \left( {\begin{array}{*{20}{c}}3\\2\end{array}} \right),\hspace{0.25in}{\mbox{using }}{\eta _1} = 3\]Note that by careful choice of the variable in this case we were able to get rid of the fraction that we had. In general, whether or not we do this doesn’t matter much. However, when we get back to differential equations it will be easier on us if we don’t have any fractions so we will usually try to eliminate them at this step.

Also, in this case we are only going to get a single (linearly independent) eigenvector. We can get other eigenvectors, by choosing different values of \({\eta _{\,1}}\). However, each of these will be linearly dependent with the first eigenvector. If you’re not convinced of this try it. Pick some values for \({\eta _{\,1}}\) and get a different vector and check to see if the two are linearly dependent.

Recall from the fact above that an eigenvalue of multiplicity \(k\) will have anywhere from 1 to \(k\) linearly independent eigenvectors. In this case we got one. For most of the \(2 \times 2\) matrices that we’ll be working with this will be the case, although it doesn’t have to be. We can, on occasion, get two.
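The “try it” suggestion above is quick to do with a little code: two 2-vectors are linearly dependent exactly when the determinant of the matrix with them as columns is zero. A sketch with two choices of \({\eta _{\,1}}\) (the second value is ours):

```python
import numpy as np

# Two eigenvectors for lambda = 1/3, built from (eta_1, (2/3)*eta_1).
v_a = np.array([3.0, 2.0])      # eta_1 = 3, the choice made above
v_b = np.array([-6.0, -4.0])    # eta_1 = -6, a different choice

# Zero determinant => the columns are multiples of each other,
# i.e. the two vectors are linearly dependent.
dep = np.linalg.det(np.column_stack([v_a, v_b]))
assert np.isclose(dep, 0.0)
```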

**Example 3** Find the eigenvalues and eigenvectors of the following matrix.

\[A = \left( {\begin{array}{*{20}{c}}{ - 4}&{ - 17}\\2&2\end{array}} \right)\]

So, we’ll start with the eigenvalues.

\[\begin{align*}\det \left( {A - \lambda I} \right) & = \left| {\begin{array}{*{20}{c}}{ - 4 - \lambda }&{ - 17}\\2&{2 - \lambda }\end{array}} \right|\\ & = \left( { - 4 - \lambda } \right)\left( {2 - \lambda } \right) + 34\\ & = {\lambda ^2} + 2\lambda + 26\end{align*}\]This doesn’t factor, so upon using the quadratic formula we arrive at,

\[{\lambda _{1,2}} = - 1 \pm 5i\]In this case we get complex eigenvalues which are definitely a fact of life with eigenvalue/eigenvector problems so get used to them.

Finding eigenvectors for complex eigenvalues is identical to the previous two examples, but it will be somewhat messier. So, let’s do that.

\({\lambda _{\,1}} = - 1 + 5\,i\) :

The system that we need to solve this time is

\[\left( {\begin{array}{*{20}{c}}{ - 3 - 5i}&{ - 17}\\2&{3 - 5i}\end{array}} \right)\left( {\begin{array}{*{20}{c}}{{\eta _1}}\\{{\eta _2}}\end{array}} \right) = \left( {\begin{array}{*{20}{c}}0\\0\end{array}} \right)\]

Now, it’s not super clear that the rows are multiples of each other, but they are. In this case we have,

\[{R_1} = - \frac{1}{2}\left( {3 + 5i} \right){R_2}\]This is not something that you need to worry about, we just wanted to make the point. For the work that we’ll be doing later on with differential equations we will just assume that we’ve done everything correctly and we’ve got two rows that are multiples of each other. Therefore, all that we need to do here is pick one of the rows and work with it.

We’ll work with the second row this time.

\[2{\eta _1} + \left( {3 - 5i} \right){\eta _2} = 0\]Now we can solve for either of the two variables. However, again looking forward to differential equations, we are going to need the “\(i\)” in the numerator so solve the equation in such a way as this will happen. Doing this gives,

\[\begin{align*}2{\eta _1} & = - \left( {3 - 5i} \right){\eta _2}\\ {\eta _1} & = - \frac{1}{2}\left( {3 - 5i} \right){\eta _2}\end{align*}\]So, the eigenvector in this case is

\[\vec \eta = \left( {\begin{array}{*{20}{c}}{{\eta _1}}\\{{\eta _2}}\end{array}} \right) = \left( {\begin{array}{*{20}{c}}{ - \frac{1}{2}\left( {3 - 5i} \right){\eta _2}}\\{{\eta _2}}\end{array}} \right),\hspace{0.25in}{\eta _2} \ne 0\] \[{\vec \eta ^{\left( 1 \right)}} = \left( {\begin{array}{*{20}{c}}{ - 3 + 5i}\\2\end{array}} \right),\hspace{0.25in}{\eta _2} = 2\]As with the previous example we choose the value of the variable to clear out the fraction.

Now, the work for the second eigenvector is almost identical and so we’ll not dwell on that too much.

\({\lambda _{\,2}} = - 1 - 5\,i\) :

The system that we need to solve here is

\[\left( {\begin{array}{*{20}{c}}{ - 3 + 5i}&{ - 17}\\2&{3 + 5i}\end{array}} \right)\left( {\begin{array}{*{20}{c}}{{\eta _1}}\\{{\eta _2}}\end{array}} \right) = \left( {\begin{array}{*{20}{c}}0\\0\end{array}} \right)\]

Working with the second row again gives,

\[2{\eta _1} + \left( {3 + 5i} \right){\eta _2} = 0\hspace{0.25in} \Rightarrow \hspace{0.25in}{\eta _1} = - \frac{1}{2}\left( {3 + 5i} \right){\eta _2}\]The eigenvector in this case is

\[\vec \eta = \left( {\begin{array}{*{20}{c}}{{\eta _1}}\\{{\eta _2}}\end{array}} \right) = \left( {\begin{array}{*{20}{c}}{ - \frac{1}{2}\left( {3 + 5i} \right){\eta _2}}\\{{\eta _2}}\end{array}} \right),\hspace{0.25in}{\eta _2} \ne 0\] \[{\vec \eta ^{\left( 2 \right)}} = \left( {\begin{array}{*{20}{c}}{ - 3 - 5i}\\2\end{array}} \right),\hspace{0.25in}{\eta _2} = 2\]Summarizing,

\[\begin{align*}{\lambda _{\,1}} & = - 1 + 5i & \hspace{0.25in}{{\vec \eta }^{\left( 1 \right)}} & = \left( {\begin{array}{*{20}{c}}{ - 3 + 5i}\\2\end{array}} \right)\\ {\lambda _{\,2}} & = - 1 - 5i & \hspace{0.25in}{{\vec \eta }^{\left( 2 \right)}} & = \left( {\begin{array}{*{20}{c}}{ - 3 - 5i}\\2\end{array}} \right)\end{align*}\]There is a nice fact that we can use to simplify the work when we get complex eigenvalues. We need a bit of terminology first however.
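Numerically nothing changes for complex eigenvalues; NumPy simply returns complex values, in conjugate pairs for a real matrix. A quick check of this example's summary:

```python
import numpy as np

A = np.array([[-4.0, -17.0],
              [2.0, 2.0]])
eigvals, eigvecs = np.linalg.eig(A)

# The eigenvalues are the conjugate pair -1 +/- 5i.
assert np.allclose(np.sort_complex(eigvals), [-1.0 - 5.0j, -1.0 + 5.0j])

# A @ eta = lambda * eta still holds, now over the complex numbers.
for lam, eta in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ eta, lam * eta)
```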

If we start with a complex number,

\[z = a + bi\]then the **complex conjugate** of \(z\) is

\[\bar z = a - bi\]

To compute the complex conjugate of a complex number we simply change the sign on the term that contains the “\(i\)”. The complex conjugate of a vector is just the conjugate of each of the vector’s components.
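In code the conjugate is built in; Python's complex type and NumPy both follow this sign-flip rule, and `np.conj` applied to a vector conjugates each component:

```python
import numpy as np

z = 3 - 5j
assert z.conjugate() == 3 + 5j          # flip the sign on the "i" term

# Conjugating a vector conjugates each of its components.
eta = np.array([-3 + 5j, 2 + 0j])
assert np.array_equal(np.conj(eta), np.array([-3 - 5j, 2 + 0j]))
```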

We now have the following fact about complex eigenvalues and eigenvectors.

#### Fact

If \(A\) is an \(n \times n\) matrix with only real entries and \({\lambda _{\,1}} = a + bi\) is an eigenvalue with eigenvector \({\vec \eta ^{\left( 1 \right)}}\), then \({\lambda _{\,2}} = \overline {{\lambda _{\,1}}} = a - bi\) is also an eigenvalue and its eigenvector is the conjugate of \({\vec \eta ^{\left( 1 \right)}}\).

This fact is something that you should feel free to use as you need to in our work.
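A quick numerical check of this fact with the eigenpair found in the last example: conjugating both the eigenvalue and the eigenvector gives another valid eigenpair of the real matrix.

```python
import numpy as np

A = np.array([[-4.0, -17.0],
              [2.0, 2.0]])
lam = -1 + 5j                           # lambda_1 from the example above
eta = np.array([-3 + 5j, 2 + 0j])       # its eigenvector

# The original pair works...
assert np.allclose(A @ eta, lam * eta)
# ...and so does the conjugate pair, as the fact promises.
assert np.allclose(A @ np.conj(eta), np.conj(lam) * np.conj(eta))
```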

Now, we need to work one final eigenvalue/eigenvector problem. To this point we’ve only worked with \(2 \times 2\) matrices and we should work at least one that isn’t \(2 \times 2\). Also, we need to work one in which we get an eigenvalue of multiplicity greater than one that has more than one linearly independent eigenvector.

**Example 4** Find the eigenvalues and eigenvectors of the following matrix.

\[A = \left( {\begin{array}{*{20}{c}}0&1&1\\1&0&1\\1&1&0\end{array}} \right)\]

Despite the fact that this is a \(3 \times 3\) matrix, it still works the same as the \(2 \times 2\) matrices that we’ve been working with. So, start with the eigenvalues

\[\begin{align*}\det \left( {A - \lambda I} \right) & = \left| {\begin{array}{*{20}{c}}{ - \lambda }&1&1\\1&{ - \lambda }&1\\1&1&{ - \lambda }\end{array}} \right|\\ & = - {\lambda ^3} + 3\lambda + 2\\ & = - \left( {\lambda - 2} \right){\left( {\lambda + 1} \right)^2}\hspace{0.25in} \Rightarrow \hspace{0.25in}{\lambda _1} = 2,\,\,\,{\lambda _{2,3}} = - 1\end{align*}\]So, we’ve got a simple eigenvalue and an eigenvalue of multiplicity 2. Note that we used the same method of computing the determinant of a \(3 \times 3\) matrix that we used in the previous section. We just didn’t show the work.

Let’s now get the eigenvectors. We’ll start with the simple eigenvector.

\({\lambda _{\,1}} = 2\) :

Here we’ll need to solve,

\[\left( {\begin{array}{*{20}{c}}{ - 2}&1&1\\1&{ - 2}&1\\1&1&{ - 2}\end{array}} \right)\left( {\begin{array}{*{20}{c}}{{\eta _1}}\\{{\eta _2}}\\{{\eta _3}}\end{array}} \right) = \left( {\begin{array}{*{20}{c}}0\\0\\0\end{array}} \right)\]

This time, unlike the \(2 \times 2\) cases we worked earlier, we actually need to solve the system. So let’s do that.

\[\left( {\begin{array}{*{20}{c}}{ - 2}&1&1&0\\1&{ - 2}&1&0\\1&1&{ - 2}&0\end{array}} \right)\begin{array}{*{20}{c}}{{R_1} \leftrightarrow {R_2}}\\ \Rightarrow \end{array}\left( {\begin{array}{*{20}{c}}1&{ - 2}&1&0\\{ - 2}&1&1&0\\1&1&{ - 2}&0\end{array}} \right)\begin{array}{*{20}{c}}{{R_2} + 2{R_1}}\\{{R_3} - {R_1}}\\ \Rightarrow \end{array}\left( {\begin{array}{*{20}{c}}1&{ - 2}&1&0\\0&{ - 3}&3&0\\0&3&{ - 3}&0\end{array}} \right)\] \[\begin{array}{*{20}{c}}{ - \frac{1}{3}{R_2}}\\ \Rightarrow \end{array}\left( {\begin{array}{*{20}{c}}1&{ - 2}&1&0\\0&1&{ - 1}&0\\0&3&{ - 3}&0\end{array}} \right)\begin{array}{*{20}{c}}{{R_3} - 3{R_2}}\\{{R_1} + 2{R_2}}\\ \Rightarrow \end{array}\left( {\begin{array}{*{20}{c}}1&0&{ - 1}&0\\0&1&{ - 1}&0\\0&0&0&0\end{array}} \right)\]Going back to equations gives,

\[\begin{align*}{\eta _1} - {\eta _3} & = 0 & \Rightarrow \hspace{0.25in}{\eta _1} & = {\eta _3}\\ {\eta _2} - {\eta _3} & = 0 & \Rightarrow \hspace{0.25in}{\eta _2} & = {\eta _3}\end{align*}\]So, again we get infinitely many solutions as we should for eigenvectors. The eigenvector is then,

\[\vec \eta = \left( {\begin{array}{*{20}{c}}{{\eta _1}}\\{{\eta _2}}\\{{\eta _3}}\end{array}} \right) = \left( {\begin{array}{*{20}{c}}{{\eta _3}}\\{{\eta _3}}\\{{\eta _3}}\end{array}} \right),\hspace{0.25in}{\eta _3} \ne 0\] \[{\vec \eta ^{\left( 1 \right)}} = \left( {\begin{array}{*{20}{c}}1\\1\\1\end{array}} \right),\hspace{0.25in}{\eta _3} = 1\]Now, let’s do the other eigenvalue.

\({\lambda _{\,2}} = - 1\) :

Here we’ll need to solve,

\[\left( {\begin{array}{*{20}{c}}1&1&1\\1&1&1\\1&1&1\end{array}} \right)\left( {\begin{array}{*{20}{c}}{{\eta _1}}\\{{\eta _2}}\\{{\eta _3}}\end{array}} \right) = \left( {\begin{array}{*{20}{c}}0\\0\\0\end{array}} \right)\]

Okay, in this case it is clear that all three rows are the same and so there isn’t any reason to actually solve the system since we can clear out the bottom two rows to all zeroes in one step. The equation that we get then is,

\[{\eta _1} + {\eta _2} + {\eta _3} = 0\hspace{0.25in} \Rightarrow \hspace{0.25in}{\eta _1} = - {\eta _2} - {\eta _3}\]So, in this case we get to pick two of the values for free and will still get infinitely many solutions. Here is the general eigenvector for this case,

\[\vec \eta = \left( {\begin{array}{*{20}{c}}{{\eta _1}}\\{{\eta _2}}\\{{\eta _3}}\end{array}} \right) = \left( {\begin{array}{*{20}{c}}{ - {\eta _2} - {\eta _3}}\\{{\eta _2}}\\{{\eta _3}}\end{array}} \right),\hspace{0.25in}{\eta _2}{\mbox{ and }}{\eta _3}{\mbox{ not both zero}}\]Notice the restriction this time. Recall that we only require that the eigenvector not be the zero vector. This means that we can allow one or the other of the two variables to be zero, we just can’t allow both of them to be zero at the same time!

What this means for us is that we are going to get two linearly independent eigenvectors this time. Here they are.

\[{\vec \eta ^{\left( 2 \right)}} = \left( {\begin{array}{*{20}{c}}{ - 1}\\0\\1\end{array}} \right)\hspace{0.25in}{\eta _2} = 0{\mbox{ and }}{\eta _3} = 1\] \[{\vec \eta ^{\left( 3 \right)}} = \left( {\begin{array}{*{20}{c}}{ - 1}\\1\\0\end{array}} \right)\hspace{0.25in}{\eta _2} = 1{\mbox{ and }}{\eta _3} = 0\]Now, when we talked about linearly independent vectors in the last section we only looked at \(n\) vectors each with \(n\) components. We can still talk about linear independence in this case however. Recall that when we did linear independence for functions we saw that if two functions were linearly dependent then they were multiples of each other. Well, the same thing holds true for vectors. Two vectors will be linearly dependent if they are multiples of each other. In this case there is no way to get \({\vec \eta ^{\left( 2 \right)}}\) by multiplying \({\vec \eta ^{\left( 3 \right)}}\) by a constant. Therefore, these two vectors must be linearly independent.

So, summarizing up, here are the eigenvalues and eigenvectors for this matrix

\[\begin{align*}{\lambda _1} & = 2 & \hspace{0.25in}{{\vec \eta }^{\left( 1 \right)}} & = \left( {\begin{array}{*{20}{c}}1\\1\\1\end{array}} \right)\\ {\lambda _2} & = - 1 & \hspace{0.25in}{{\vec \eta }^{\left( 2 \right)}} & = \left( {\begin{array}{*{20}{c}}{ - 1}\\0\\1\end{array}} \right)\\ {\lambda _3} & = - 1 & \hspace{0.25in}{{\vec \eta }^{\left( 3 \right)}} & = \left( {\begin{array}{*{20}{c}}{ - 1}\\1\\0\end{array}} \right)\end{align*}\]
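Closing with a numerical check of this last example: the eigenvalues come out as \(2, - 1, - 1\), and the repeated eigenvalue has two linearly independent eigenvectors, which we can count as \(n - \operatorname{rank}\left( {A - \lambda I} \right)\):

```python
import numpy as np

A = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])

# Eigenvalues: one simple (2) and one of multiplicity 2 (-1).
eigvals = np.linalg.eigvals(A)
assert np.allclose(np.sort(eigvals.real), [-1.0, -1.0, 2.0])

# Number of independent eigenvectors for lambda = -1:
# n - rank(A + I) = 3 - 1 = 2, matching the two found above.
geo_mult = 3 - np.linalg.matrix_rank(A + np.eye(3))
assert geo_mult == 2
```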