How would we solve equations like:

\begin{equation} \begin{cases} y'' - 2xy' + 2\lambda y = 0 \\ y'' - xy = 0 \end{cases} \end{equation}

## Taylor Series

It's time for a blast from the past! Taylor Series time.

\begin{equation} p_{n}(x) = \sum_{i=0}^{n} \frac{f^{(i)}(0) x^{i}}{i!} \end{equation}

Taylor's Theorem with Remainder gives us that, for each \(n\), \(|f(x) - p_{n}(x)|\) is bounded. For the first-order approximation, for instance:

\begin{equation} |x(t+h) - (x(t) + h x'(t))| \leq Ch^{2} \end{equation}
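As a quick numerical sanity check (a sketch; the test function \(x(t) = e^{t}\) and sample step sizes are my own choices), the error of the first-order approximation shrinks quadratically in \(h\):

```python
import math

# Test function x(t) = e^t, so x'(t) = e^t as well.
def x(t):
    return math.exp(t)

t = 1.0
for h in [0.1, 0.01, 0.001]:
    approx = x(t) + h * x(t)  # first-order approximation x(t) + h x'(t); x' = x here
    err = abs(x(t + h) - approx)
    print(f"h={h}: error={err:.2e}, error/h^2={err / h**2:.4f}")
```

Shrinking \(h\) by a factor of 10 shrinks the error by roughly a factor of 100, consistent with a remainder that is quadratic in \(h\).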

Two constraints:

- need \(f^{(n)}\) to exist for all \(n\) (i.e. \(f\) must be infinitely differentiable)
- and even then, not every infinitely differentiable function is representable by its Taylor Series (such as \(e^{-\frac{1}{|x|}}\), whose Taylor Series at \(0\) is identically zero)
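A quick numerical illustration (a sketch; the sample point and exponents are my own choices): \(e^{-1/|x|}\) vanishes near \(0\) faster than any power of \(x\), which is why all of its Taylor coefficients at \(0\) are zero even though the function is not:

```python
import math

def f(x):
    # e^(-1/|x|), extended continuously by f(0) = 0
    return math.exp(-1 / abs(x)) if x != 0 else 0.0

# f(x)/x^k -> 0 as x -> 0 for every k, so every Taylor coefficient
# at 0 vanishes: the Taylor series is identically zero, but f is not.
x = 0.01
for k in [1, 5, 10]:
    print(k, f(x) / x**k)  # tiny even after dividing by x^k
```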

## variable-coefficient ODEs

\begin{equation} \dv[2]{y}{x} + a(x) \dv{y}{x} + b(x) y = 0 \end{equation}

We can no longer use the linearization machinery we developed before: matrix exponentiation (i.e. the eigenvalue trick) no longer works well, because with non-constant coefficients the powers of the independent variable inside the expression actually have consequences now.

## Solving ODEs via power series

if \(a_0(t), \ldots, a_{n}(t), f(t)\) are all convergent power series on an interval centered at \(t_0\), then solutions of \(a_{n}(t)y^{(n)} + \ldots + a_0(t)y = f(t)\) are also convergent power series on an interval centered at \(t_{0}\), provided that \(a_{n}(t)\) doesn't vanish on that interval.

- write down solutions in terms of \(y(t) = \sum_{n=0}^{\infty} c_{n}(t-t_0)^{n}\)
- take enough derivatives of that expression \(y(t)\) above
- solve for \(c_0\), \(c_1\), etc. by using the fact that \(c_{n} = \frac{y^{(n)}(t_0)}{n!}\) (i.e. plug in the given \(y^{(n)}\) from the IVP and solve for \(c_{j}\))
- plug what you have in terms of derivatives as well as the initial coefficients, and relate to a general power series
- notice patterns

### Case Study

Take \(y' = 2y\). Consider:

\begin{equation} y = \sum_{n=0}^{\infty} a_{n}x^{n} \end{equation}

We hope that our solution function can be fit to this form.

If we differentiate:

\begin{equation} y' = \sum_{n=0}^{\infty} a_{n} n x^{n-1} \end{equation}

We want to line up powers of \(x\), which makes life easier. Because this is an infinite series, and the \(n=0\) term of the differentiated series vanishes, we can simply shift the index over by one:

\begin{equation} y' = \sum_{n=0}^{\infty} a_{n+1} (n+1) x^{n} \end{equation}

We can now plug the whole thing into our original equation:

\begin{equation} \sum_{n=0}^{\infty} a_{n+1} (n+1) x^{n} = \sum_{n=0}^{\infty} 2a_{n}x^{n} \end{equation}

Because these are two power series that are equal, corresponding coefficients must match:

\begin{equation} a_{n+1}(n+1) = 2a_{n} \end{equation}

So, we have:

\begin{equation} a_{n+1} = \frac{2a_{n}}{n+1} \end{equation}

Since \(y(0)=a_{0}\), we can start the recursion at any initial condition we'd like.

We notice that the value:

\begin{equation} a_{n} = \frac{2^{n}}{n!} a_{0} \end{equation}

satisfies the recursion above, which means we can write out the general answer as \(a_0 \sum_{n=0}^{\infty} \frac{2^{n}x^{n}}{n!} = a_0 \sum_{n=0}^{\infty} \frac{(2x)^{n}}{n!} = a_0 e^{2x}\).
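We can sanity-check this numerically (a sketch; variable names are my own): build the coefficients from the recursion, compare against the closed form, and check that the partial sum approximates the known solution \(y = a_0 e^{2x}\) of \(y' = 2y\):

```python
import math

a0 = 1.0
a = [a0]
for n in range(20):                       # a_{n+1} = 2 a_n / (n + 1)
    a.append(2 * a[n] / (n + 1))

# each coefficient matches the closed form a_n = 2^n / n! * a_0
for n in range(21):
    assert abs(a[n] - 2**n / math.factorial(n) * a0) < 1e-9

# the partial sum at x = 0.5 approximates the exact solution e^{2x}
x = 0.5
partial = sum(a[n] * x**n for n in range(21))
print(partial, math.exp(2 * x))  # both approximately 2.71828
```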

### Case Study 2

We have:

\begin{equation} y'' - 2xy' + 2\lambda y = 0 \end{equation}

Let’s calculate our Taylor series:

\begin{equation} y = \sum_{n=0}^{\infty} a_{n} x^{n} \end{equation}

\begin{equation} y' = \sum_{n=0}^{\infty} n a_{n}x^{n-1} \end{equation}

\begin{equation} y'' = \sum_{n=0}^{\infty} n(n-1)a_{n}x^{n-2} \end{equation}

Reindexing:

\begin{equation} y'' = \sum_{n=0}^{\infty} (n+2)(n+1) a_{n+2} x^{n} \end{equation}

Because \(2xy’\) appears in the equation, we can actually write:

\begin{equation} -2xy' = -\sum_{n=0}^{\infty} 2n a_{n} x^{n} \end{equation}

and the final term:

\begin{equation} 2\lambda y = \sum_{n=0}^{\infty} 2\lambda a_{n} x^{n} \end{equation}

Adding the whole thing up, we obtain that:

\begin{equation} \sum_{n=0}^{\infty} \qty[(n+2)(n+1) a_{n+2} - 2n a_{n} + 2\lambda a_{n}] x^{n} = 0 \end{equation}

Setting each coefficient to zero, we get a recursion relationship:

\begin{equation} a_{n+2} = \frac{2(n-\lambda)}{(n+2)(n+1)} a_{n} \end{equation}
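One thing worth noticing, which we can sketch numerically (the helper name and test values below are my own choices): when \(\lambda\) is a nonnegative integer, the recursion gives \(a_{\lambda + 2} = 0\), so one of the two solution branches terminates in a polynomial.

```python
def coefficients(lam, a0, a1, count):
    """a_0 .. a_{count-1} for y'' - 2xy' + 2*lam*y = 0, via
    a_{n+2} = 2(n - lam) / ((n + 2)(n + 1)) * a_n."""
    a = [a0, a1]
    for n in range(count - 2):
        a.append(2 * (n - lam) / ((n + 2) * (n + 1)) * a[n])
    return a

# lam = 4 with the even branch (a1 = 0): the series stops after x^4
a = coefficients(lam=4, a0=1.0, a1=0.0, count=10)
print(a)  # [1.0, 0.0, -4.0, 0.0, 1.333..., 0.0, 0.0, 0.0, 0.0, 0.0]
```

These terminating solutions are, up to normalization, the Hermite polynomials (the equation here is Hermite's equation); \(1 - 4x^{2} + \frac{4}{3}x^{4}\) is proportional to \(H_4\).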