Section 7.6: Systems of Differential Equations
In this section we want to take a brief look at systems of differential equations that are larger than \(2 \times 2\). The problem here is that, unlike the first few sections where we looked at \(n\)th order differential equations, we can’t really come up with a set of formulas that will always work for every system. So, with that in mind, we’re going to look at all possible cases for a \(3 \times 3\) system (leaving some details for you to verify at times) and then make a couple of quick comments about \(4 \times 4\) systems to illustrate how to extend things out to even larger systems. We’ll leave it to you to actually extend things out if you’d like to.
We will also not be working through any full examples in this section. The point of this section is just to show how to extend what we know about \(2 \times 2\) systems to larger systems.
Initially the process is identical regardless of the size of the system. So, for a system of 3 differential equations with 3 unknown functions we first put the system into matrix form,
\[\vec x' = A\,\vec x\]where the coefficient matrix, \(A\), is a \(3 \times 3\) matrix. We next need to determine the eigenvalues and eigenvectors for \(A\) and because \(A\) is a \(3 \times 3\) matrix we know that there will be 3 eigenvalues (including repeated eigenvalues if there are any).
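In practice, finding the eigenvalues and eigenvectors is usually the most tedious part of the process and it is easy to check (or automate) with a computer. Here is a minimal sketch using Python and NumPy; the coefficient matrix is purely hypothetical and is only there to illustrate the call.

```python
import numpy as np

# A hypothetical 3x3 coefficient matrix. It is lower triangular, so the
# eigenvalues (2, 3 and 4) can be read off of the diagonal.
A = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 4.0]])

# np.linalg.eig returns the eigenvalues along with a matrix whose
# columns are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for i, lam in enumerate(eigenvalues):
    eta = eigenvectors[:, i]
    # Each eigenvalue/eigenvector pair gives one solution e^(lambda t) * eta.
    print(f"lambda = {lam}, eta = {eta}")
```

Keep in mind that a numerical routine like this only returns (normalized) eigenvectors; it won’t, by itself, produce the generalized eigenvectors we’ll need in the repeated eigenvalue cases below.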
This is where the process from the \(2 \times 2\) systems starts to vary. We will need a total of 3 linearly independent solutions to form the general solution. Some of what we know from the \(2 \times 2\) systems can be brought forward to this point. For instance, we know that solutions corresponding to simple eigenvalues (i.e. they only occur once in the list of eigenvalues) will be linearly independent. We know that solutions from a set of complex conjugate eigenvalues will be linearly independent. We also know how to get a set of linearly independent solutions from a double eigenvalue with a single eigenvector.
There are also a couple of facts about eigenvalues/eigenvectors that we need to review here as well. First, provided \(A\) has only real entries (which it always will here), all complex eigenvalues will occur in conjugate pairs (i.e. \(\lambda = \alpha \pm \beta \,i\)) and their associated eigenvectors will also be complex conjugates of each other. Next, if an eigenvalue has multiplicity \(k \ge 2\) (i.e. occurs at least twice in the list of eigenvalues) then there will be anywhere from 1 to \(k\) linearly independent eigenvectors for the eigenvalue.
With all these ideas in mind let’s start going through all of the possible combinations of eigenvalues that we can have for a \(3 \times 3\) case. Let’s also note that for a \(3 \times 3\) system it is impossible to have only 2 real distinct eigenvalues: since complex eigenvalues must occur in conjugate pairs, the third eigenvalue would also have to be real and would either be distinct from the first two or repeat one of them (giving a double eigenvalue). The only possibilities are to have 1 or 3 real distinct eigenvalues.
Here are all the possible cases.
3 Real Distinct Eigenvalues
In this case we’ll have the real, distinct eigenvalues \({\lambda _{\,1}} \ne {\lambda _{\,2}} \ne {\lambda _{\,3}}\) and their associated eigenvectors, \({\vec \eta _1}\), \({\vec \eta _2}\) and \({\vec \eta _3}\) are guaranteed to be linearly independent and so the three linearly independent solutions we get from this case are,
\[{{\bf{e}}^{{\lambda _{\,1}}t}}{\vec \eta _1}\hspace{0.25in}{{\bf{e}}^{{\lambda _{\,2}}t}}{\vec \eta _2}\hspace{0.25in}{{\bf{e}}^{{\lambda _{\,3}}t}}{\vec \eta _3}\]
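Just as with the \(2 \times 2\) systems, the general solution is then simply a linear combination of these,

\[\vec x\left( t \right) = {c_1}{{\bf{e}}^{{\lambda _{\,1}}t}}{\vec \eta _1} + {c_2}{{\bf{e}}^{{\lambda _{\,2}}t}}{\vec \eta _2} + {c_3}{{\bf{e}}^{{\lambda _{\,3}}t}}{\vec \eta _3}\]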
1 Real and 2 Complex Eigenvalues

From the real eigenvalue/eigenvector pair, \({\lambda _{\,1}}\) and \({\vec \eta _1}\), we get one solution,
\[{{\bf{e}}^{{\lambda _{\,1}}t}}{\vec \eta _1}\]We get the other two solutions in the same manner that we did with the \(2 \times 2\) case. If the eigenvalues are \({\lambda _{\,2,3}} = \alpha \pm \beta \,i\) with eigenvectors \({\vec \eta _2}\) and \({\vec \eta _3} = \overline {\left( {{{\vec \eta }_2}} \right)} \) we can get two real-valued solutions by using Euler’s formula to expand,
\[{{\bf{e}}^{{\lambda _{\,2}}t}}{\vec \eta _2} = {{\bf{e}}^{\left( {\alpha + \beta \,i} \right)t}}{\vec \eta _2} = {{\bf{e}}^{\alpha \,t}}\left( {\cos \left( {\beta t} \right) + i\sin \left( {\beta t} \right)} \right){\vec \eta _2}\]into its real and imaginary parts, \(\vec u + i\,\vec v\). The final two real valued solutions we need are then,
\[\vec u\hspace{0.25in}\vec v\]
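For reference, if we write \({\vec \eta _2} = \vec a + i\,\vec b\), where \(\vec a\) and \(\vec b\) are real vectors, then multiplying out the expression above shows that these two solutions are,

\[\vec u = {{\bf{e}}^{\alpha \,t}}\left( {\vec a\cos \left( {\beta t} \right) - \vec b\sin \left( {\beta t} \right)} \right)\hspace{0.25in}\vec v = {{\bf{e}}^{\alpha \,t}}\left( {\vec a\sin \left( {\beta t} \right) + \vec b\cos \left( {\beta t} \right)} \right)\]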
1 Real Distinct and 1 Double Eigenvalue with 1 Eigenvector

From the real eigenvalue/eigenvector pair, \({\lambda _{\,1}}\) and \({\vec \eta _1}\), we get one solution,
\[{{\bf{e}}^{{\lambda _{\,1}}t}}{\vec \eta _1}\]From our work in the \(2 \times 2\) systems we know that from the double eigenvalue \({\lambda _{\,2}}\) with single eigenvector, \({\vec \eta _2}\), we get the following two solutions,
\[{{\bf{e}}^{{\lambda _{\,2}}t}}{\vec \eta _2}\hspace{0.25in}t\,{{\bf{e}}^{{\lambda _{\,2}}t}}\vec \xi + {{\bf{e}}^{{\lambda _{\,2}}t}}\vec \rho \]where \(\vec \xi \) and \(\vec \rho \) must satisfy the following equations,
\[\left( {A - {\lambda _{\,2}}I} \right)\vec \xi = \vec 0\hspace{0.25in}\left( {A - {\lambda _{\,2}}I} \right)\vec \rho = \vec \xi \]Note that the first equation simply tells us that \(\vec \xi \) must be the single eigenvector for this eigenvalue, \({\vec \eta _2}\), and we usually just say that the second solution we get from the double root case is,
\[t\,{{\bf{e}}^{{\lambda _{\,2}}t}}{\vec \eta _2} + {{\bf{e}}^{{\lambda _{\,2}}t}}\vec \rho \hspace{0.25in}{\mbox{where}}\,\vec \rho {\mbox{ satisfies}}\,\,\left( {A - {\lambda _{\,2}}I} \right)\vec \rho = {\vec \eta _2}\]
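Since \(A - {\lambda _{\,2}}I\) is singular we can’t simply invert it to find \(\vec \rho \), and there isn’t a unique answer anyway since adding any multiple of \({\vec \eta _2}\) to \(\vec \rho \) gives another valid choice. Here is a rough numerical sketch of this step in Python; the matrix, eigenvalue and eigenvector are hypothetical and only there for illustration.

```python
import numpy as np

# Hypothetical system: lambda = 1 is a double eigenvalue of A with the
# single eigenvector eta = (1, 0, 0), and lambda = 3 is the third eigenvalue.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 3.0]])
lam = 1.0
eta = np.array([1.0, 0.0, 0.0])

B = A - lam * np.eye(3)

# B is singular, so use a least squares solve to pick out one particular
# solution of B * rho = eta instead of trying to invert B.
rho, *_ = np.linalg.lstsq(B, eta, rcond=None)

# Check that rho really does satisfy the condition (i.e. the system was
# actually consistent).
assert np.allclose(B @ rho, eta)
print("rho =", rho)
```

For this particular matrix any vector of the form \(\left( {r,1,0} \right)\) satisfies the condition; the least squares routine simply picks one of them.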
1 Real Distinct and 1 Double Eigenvalue with 2 Linearly Independent Eigenvectors

We didn’t look at this case back when we were examining the \(2 \times 2\) systems but it is easy enough to deal with. In this case we’ll have a single real distinct eigenvalue/eigenvector pair, \({\lambda _{\,1}}\) and \({\vec \eta _1}\), as well as a double eigenvalue \({\lambda _{\,2}}\) and the double eigenvalue has two linearly independent eigenvectors, \({\vec \eta _2}\) and \({\vec \eta _3}\).
In this case all three eigenvectors are linearly independent and so we get the following three linearly independent solutions,
\[{{\bf{e}}^{{\lambda _{\,1}}t}}{\vec \eta _1}\hspace{0.25in}{{\bf{e}}^{{\lambda _{\,\,2}}t}}{\vec \eta _2}\hspace{0.25in}{{\bf{e}}^{{\lambda _{\,\,2}}t}}{\vec \eta _3}\]

We are now out of the cases that correspond to those that we did with \(2 \times 2\) systems and we now need to move into the brand new case that we pick up for \(3 \times 3\) systems. This new case involves eigenvalues with multiplicity of 3. As we noted above we can have 1, 2, or 3 linearly independent eigenvectors and so we actually have 3 sub cases to deal with here. So, let’s go through these final 3 cases for a \(3 \times 3\) system.
1 Triple Eigenvalue with 1 Eigenvector
The eigenvalue/eigenvector pair in this case are \(\lambda \) and \(\vec \eta \). Because the eigenvalue is real we know that the first solution we need is,
\[{{\bf{e}}^{\lambda t}}\vec \eta \]We can use the work from the double eigenvalue with one eigenvector to get that a second solution is,
\[t\,{{\bf{e}}^{\lambda t}}\vec \eta + {{\bf{e}}^{\lambda t}}\vec \rho \hspace{0.25in}{\mbox{where}}\,\vec \rho {\mbox{ satisfies}}\,\,\left( {A - \lambda I} \right)\vec \rho = \vec \eta \]For a third solution we can take a clue from how we dealt with \(n\)th order differential equations with roots of multiplicity 3. In those cases, we multiplied the original solution by a \({t^2}\). However, just as with the double eigenvalue case, that won’t be enough to get us a solution. In this case the third solution will be,
\[\frac{1}{2}{t^2}{{\bf{e}}^{\lambda t}}\vec \xi + t{{\bf{e}}^{\lambda t}}\vec \rho + {{\bf{e}}^{\lambda t}}\vec \mu \]where \(\vec \xi \), \(\vec \rho \), and \(\vec \mu \) must satisfy,
\[\left( {A - \lambda I} \right)\vec \xi = \vec 0\hspace{0.25in}\left( {A - \lambda I} \right)\vec \rho = \vec \xi \hspace{0.25in}\left( {A - \lambda I} \right)\vec \mu = \vec \rho \]You can verify that this is a solution, and that these conditions are in fact required, by taking the derivative and plugging into the system.
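Briefly, if \(\vec x = \frac{1}{2}{t^2}{{\bf{e}}^{\lambda t}}\vec \xi + t{{\bf{e}}^{\lambda t}}\vec \rho + {{\bf{e}}^{\lambda t}}\vec \mu \) then differentiating gives,

\[\vec x' = \frac{1}{2}{t^2}\lambda {{\bf{e}}^{\lambda t}}\vec \xi + t{{\bf{e}}^{\lambda t}}\left( {\vec \xi + \lambda \vec \rho } \right) + {{\bf{e}}^{\lambda t}}\left( {\vec \rho + \lambda \vec \mu } \right)\]while,

\[A\vec x = \frac{1}{2}{t^2}{{\bf{e}}^{\lambda t}}A\vec \xi + t{{\bf{e}}^{\lambda t}}A\vec \rho + {{\bf{e}}^{\lambda t}}A\vec \mu \]Setting these equal and matching the coefficients of \(\frac{1}{2}{t^2}{{\bf{e}}^{\lambda t}}\), \(t{{\bf{e}}^{\lambda t}}\) and \({{\bf{e}}^{\lambda t}}\) gives \(\left( {A - \lambda I} \right)\vec \xi = \vec 0\), \(\left( {A - \lambda I} \right)\vec \rho = \vec \xi \) and \(\left( {A - \lambda I} \right)\vec \mu = \vec \rho \), which are exactly the three conditions above.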
Now, the first condition simply tells us that \(\vec \xi = \vec \eta \) because we only have a single eigenvector here and so we can reduce this third solution to,
\[\frac{1}{2}{t^2}{{\bf{e}}^{\lambda t}}\vec \eta + t{{\bf{e}}^{\lambda t}}\vec \rho + {{\bf{e}}^{\lambda t}}\vec \mu \]where \(\vec \rho \), and \(\vec \mu \) must satisfy,
\[\left( {A - \lambda I} \right)\vec \rho = \vec \eta \hspace{0.25in}\left( {A - \lambda I} \right)\vec \mu = \vec \rho \]and finally notice that we will have already solved the first of these conditions in determining the second solution above and so all we really need to do here is solve the final condition for \(\vec \mu \).
As a final note in this case, the \(\frac{1}{2}\) is in the solution solely to keep any extra constants from appearing in the conditions which in turn allows us to reuse previous results.
1 Triple Eigenvalue with 2 Linearly Independent Eigenvectors
In this case we’ll have the eigenvalue \(\lambda \) with the two linearly independent eigenvectors \({\vec \eta _1}\) and \({\vec \eta _2}\) so we get the following two linearly independent solutions,
\[{{\bf{e}}^{\lambda t}}{\vec \eta _1}\hspace{0.25in}{{\bf{e}}^{\lambda t}}{\vec \eta _2}\]We now need a third solution. The third solution will be in the form,
\[t\,{{\bf{e}}^{\lambda t}}\vec \xi + {{\bf{e}}^{\lambda t}}\vec \rho \]where \(\vec \xi \) and \(\vec \rho \) must satisfy the following equations,
\[\left( {A - \lambda I} \right)\vec \xi = \vec 0\hspace{0.25in}\left( {A - \lambda I} \right)\vec \rho = \vec \xi \]We’ve already verified that this will be a solution with these conditions in the double eigenvalue case (that work only required a repeated eigenvalue, not necessarily a double one).
However, unlike the previous times we’ve seen this we can’t just say that \(\vec \xi \) is an eigenvector. In all the previous cases in which we’ve seen this condition we had a single eigenvector and this time we have two linearly independent eigenvectors. This means that the most general possible solution to the first condition is,
\[\vec \xi = {c_1}{\vec \eta _1} + {c_2}{\vec \eta _2}\]This creates problems in solving the second condition. The second condition will not have solutions for every choice of \({c_1}\) and \({c_2}\) and the choice that we use will be dependent upon the eigenvectors. So upon solving the first condition we would need to plug the general solution into the second condition and then proceed to determine conditions on \({c_1}\) and \({c_2}\) that would allow us to solve the second condition.
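One way to see what is going on here is to notice that the second condition will have a solution exactly when \(\vec \xi \) lies in the column space of \(A - \lambda I\). Here is a rough sketch of that check in Python; the matrix and eigenvectors are hypothetical and only there for illustration.

```python
import numpy as np

# Hypothetical matrix with the triple eigenvalue lambda = 2 and only two
# linearly independent eigenvectors, eta1 = (1, 0, 0) and eta2 = (0, 0, 1).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 2.0]])
lam = 2.0
eta1 = np.array([1.0, 0.0, 0.0])
eta2 = np.array([0.0, 0.0, 1.0])

B = A - lam * np.eye(3)

def second_condition_solvable(c1, c2):
    """True if (A - lam*I) rho = c1*eta1 + c2*eta2 has a solution rho."""
    xi = c1 * eta1 + c2 * eta2
    # Consistent exactly when appending xi as an extra column does not
    # increase the rank of A - lam*I.
    return np.linalg.matrix_rank(np.column_stack([B, xi])) == np.linalg.matrix_rank(B)

print(second_condition_solvable(1.0, 0.0))  # True:  xi = eta1 works
print(second_condition_solvable(0.0, 1.0))  # False: xi = eta2 does not
```

For this particular matrix the check only succeeds when \({c_2} = 0\), so only multiples of \({\vec \eta _1}\) can be used for \(\vec \xi \) in the second condition.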
1 Triple Eigenvalue with 3 Linearly Independent Eigenvectors
In this case we’ll have the eigenvalue \(\lambda \) with the three linearly independent eigenvectors \({\vec \eta _1}\), \({\vec \eta _2}\), and \({\vec \eta _3}\) so we get the following three linearly independent solutions,
\[{{\bf{e}}^{\lambda t}}{\vec \eta _1}\hspace{0.25in}{{\bf{e}}^{\lambda t}}{\vec \eta _2}\hspace{0.25in}{{\bf{e}}^{\lambda t}}{\vec \eta _3}\]

\(4 \times 4\) Systems
We’ll close this section out with a couple of comments about \(4 \times 4\) systems. In these cases we will have 4 eigenvalues and will need 4 linearly independent solutions in order to get a general solution. The vast majority of the cases here are natural extensions of the \(3 \times 3\) system cases and in fact will reuse a good deal of that work.
There are a couple of new cases that we should comment briefly on, however. With \(4 \times 4\) systems it will now be possible to have two different sets of double eigenvalues and two different sets of complex conjugate eigenvalues. In either situation we can treat each one as a separate case and use our previous knowledge about double eigenvalues and complex eigenvalues to get the solutions we need.
It is also now possible to have a “double” complex eigenvalue. In other words, we can have \(\lambda = \alpha \pm \beta \,i\) each occur twice in the list of eigenvalues. The solutions for this case aren’t too bad. We get two solutions in the normal way of dealing with complex eigenvalues. The remaining two solutions will come from the work we did for a double eigenvalue. The work we did in that case did not require the eigenvalue/eigenvector pair to be real. Therefore, if the eigenvector associated with \(\lambda = \alpha + \beta \,i\) is \(\vec \eta \) then the second solution will be,
\[t\,{{\bf{e}}^{\left( {\alpha + \beta \,i} \right)t}}\vec \eta + {{\bf{e}}^{\left( {\alpha + \beta \,i} \right)t}}\vec \rho \hspace{0.25in}{\mbox{where}}\,\vec \rho {\mbox{ satisfies}}\,\,\left( {A - \lambda I} \right)\vec \rho = \vec \eta \]and once we’ve determined \(\vec \rho \) we can again split this up into its real and imaginary parts using Euler’s formula to get two new real valued solutions.
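If we write \(\vec \eta = \vec a + i\,\vec b\) and \(\vec \rho = \vec p + i\,\vec q\), with \(\vec a\), \(\vec b\), \(\vec p\) and \(\vec q\) all real vectors, these two new real valued solutions are,

\[{{\bf{e}}^{\alpha \,t}}\left( {\left( {t\,\vec a + \vec p} \right)\cos \left( {\beta t} \right) - \left( {t\,\vec b + \vec q} \right)\sin \left( {\beta t} \right)} \right)\hspace{0.25in}{{\bf{e}}^{\alpha \,t}}\left( {\left( {t\,\vec a + \vec p} \right)\sin \left( {\beta t} \right) + \left( {t\,\vec b + \vec q} \right)\cos \left( {\beta t} \right)} \right)\]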
Finally, with \(4 \times 4\) systems we can now have eigenvalues with multiplicity of 4. In these cases, we can have 1, 2, 3, or 4 linearly independent eigenvectors and we can use our work with \(3 \times 3\) systems to see how to generate solutions for these cases. The one issue that you’ll need to pay attention to is that the conditions for the 2 and 3 eigenvector cases will have the same complications that the 2 eigenvector case has in the \(3 \times 3\) systems.
So, we’ve discussed some of the issues involved in systems larger than \(2 \times 2\) and it is hopefully clear that when we move into larger systems the work can become vastly more complicated.