Definition (Ordinary Differential Equation)
An Ordinary Differential Equation (ODE) is a mathematical equation that relates a function of a single independent variable (in physics, almost always time, $t$) to its derivatives. Formally, for a state vector $\mathbf{x}(t) \in \mathbb{R}^d$, an ODE specifies a constraint of the form
$$F\!\left(t, \mathbf{x}, \dot{\mathbf{x}}, \ddot{\mathbf{x}}, \dots, \mathbf{x}^{(n)}\right) = \mathbf{0}.$$
To solve an ODE means finding the specific trajectory $\mathbf{x}(t)$ that satisfies this constraint for all moments in time, given a specific starting state. In particular, the trajectory must satisfy the initial condition $\mathbf{x}(t_0) = \mathbf{x}_0$.
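As a minimal sketch of what "solving" means in practice, the code below steps the scalar ODE $\dot{x} = a\,x$ forward from an initial state and compares against the known exponential solution. The equation, the values of `a` and `x0`, and the step size are all assumed for illustration, not taken from the text.

```python
import math

# Illustrative scalar ODE: dx/dt = a*x with x(0) = x0 (assumed example).
a, x0 = -0.5, 2.0

def euler(f, x0, t_end, dt):
    """Integrate dx/dt = f(x) from t = 0 to t_end with forward Euler steps."""
    x, t = x0, 0.0
    while t < t_end - 1e-12:
        x += dt * f(x)
        t += dt
    return x

x_numeric = euler(lambda x: a * x, x0, t_end=1.0, dt=1e-4)
x_exact = x0 * math.exp(a * 1.0)  # known closed-form solution x(t) = x0 * e^(a t)
print(abs(x_numeric - x_exact) < 1e-3)  # small discretization error
```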
Classification by Order
We can classify ODEs by their order $n$, which is the highest derivative that appears in the equation. If $n = 1$, we call the ODE a first-order ODE. If $n = 2$, we call it a second-order ODE, and so on.
Classification by Dimensionality
This defines the “size” $d$ of the state space, or the degrees of freedom of the system. If $d = 1$, we call the ODE a scalar ODE. If $d > 1$, we call it a system of ODEs, or a vector ODE.
Classification by Linearity
Linearity dictates how easily a system can be solved analytically and how predictably it behaves.
- Linear ODE: The right-hand side is a linear combination of the state variables and their derivatives. There are no variables multiplied by each other, squared, or trapped inside transcendental functions like sine or cosine. They can generally be written in matrix form:
$$\dot{\mathbf{x}} = A(t)\,\mathbf{x} + \mathbf{b}(t).$$
- Nonlinear ODE: The variables interact in non-proportional ways (e.g., $x^2$, $x\,\dot{x}$, or $\sin x$). The pendulum equation ($\ddot{\theta} = -\frac{g}{L}\sin\theta$) is strictly nonlinear. Nonlinear systems can exhibit chaos and multiple equilibrium states, and they are generally impossible to solve with exact formulas, making numerical step-by-step simulation strictly necessary.
Classification by Time-Dependence (Autonomy)
The ODE is called autonomous if the function $f$ does not explicitly depend on time $t$, i.e. $\dot{\mathbf{x}} = f(\mathbf{x})$. In this case, the rules governing the system’s evolution are fixed and do not change as time progresses.
Classification by Homogeneity
This distinction is primarily used when finding analytical (exact) solutions to linear equations.
- Homogeneous: The system has no external driving forces; the forcing term is zero ($\mathbf{b}(t) = \mathbf{0}$), so $\dot{\mathbf{x}} = A(t)\,\mathbf{x}$. If the system starts at rest at the origin, it will stay there forever: $\mathbf{x}(t) \equiv \mathbf{0}$ is always a solution.
- Inhomogeneous: The system includes an isolated forcing term ($\mathbf{b}(t) \neq \mathbf{0}$) that drives the system independently of its current state. The complete solution requires finding the general homogeneous solution (how the system naturally wants to ring or oscillate) and adding a particular solution (how the system responds directly to the external force).
Theorem (Homogeneous Solutions Form a Vector Space)
If $\mathbf{x}_1(t)$ and $\mathbf{x}_2(t)$ are solutions to a homogeneous linear ODE, then any linear combination
$$\mathbf{x}(t) = c_1\,\mathbf{x}_1(t) + c_2\,\mathbf{x}_2(t)$$
will also be a solution. That is, the set of all solutions to a homogeneous linear ODE forms a Vector Space. For a first-order system $\dot{\mathbf{x}} = A(t)\,\mathbf{x}$ of size $d$, the dimension of this vector space is $d$.
Also known as the Superposition Principle, this is a fundamental property of linear systems that allows us to construct new solutions from known ones.
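The superposition principle can be checked numerically. Assuming the illustrative homogeneous equation $\ddot{x} + x = 0$, whose fundamental solutions are $\cos t$ and $\sin t$, an arbitrary linear combination should make the residual $\ddot{x} + x$ vanish (the equation and the constants below are assumed examples):

```python
import math

# Superposition check for the homogeneous linear ODE x'' + x = 0.
c1, c2 = 1.7, -0.4  # arbitrary constants: any combination must work

def x(t):
    return c1 * math.cos(t) + c2 * math.sin(t)

def second_derivative(f, t, h=1e-4):
    """Central finite-difference approximation of f''(t)."""
    return (f(t + h) - 2 * f(t) + f(t - h)) / (h * h)

# The residual x'' + x vanishes (up to finite-difference error)
# at every time, for any choice of the constants c1, c2.
for t in (0.0, 0.5, 1.3, 2.9):
    residual = second_derivative(x, t) + x(t)
    print(round(residual, 3))
```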
Definition (Fundamental Solutions)
A basis of the solution space of a homogeneous linear ODE is called a set of fundamental solutions.
Theorem (Affine Solution)
If the ODE is linear (but inhomogeneous), then the solution space is an affine space. That is, all solutions can be written as one particular solution plus linear combinations of the fundamental solutions of the homogeneous part of the ODE:
$$\mathbf{x}(t) = \mathbf{x}_p(t) + \sum_i c_i\,\mathbf{x}_i(t).$$
Visually, think of affine spaces as a shifted version of a vector space. In a vector space, $\mathbf{x}(t) \equiv \mathbf{0}$ is always a solution. In an affine space, there is some particular solution $\mathbf{x}_p(t)$ that serves as the “origin,” and all other solutions can be reached by adding linear combinations of the homogeneous solutions to this particular solution.
Separable ODEs
A first-order ODE is called separable if it can be written in the form
$$\frac{dx}{dt} = g(t)\,h(x).$$
For example, $\dot{x} = t\,x$ separates into $\int \frac{dx}{x} = \int t\,dt$, which integrates to $\ln|x| = \frac{t^2}{2} + C$, i.e. $x(t) = x_0\,e^{t^2/2}$.
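As a sanity check, suppose the separable ODE is $\dot{x} = t\,x$ (an assumed example): separation of variables gives the closed form $x(t) = x_0\,e^{t^2/2}$, which a simple forward-Euler integration should reproduce:

```python
import math

# Numerical check of the separable example dx/dt = t*x, x(0) = x0
# (the example ODE and all values are assumed illustrations).
x0, dt, t_end = 1.5, 1e-4, 1.0

x, t = x0, 0.0
while t < t_end - 1e-12:
    x += dt * (t * x)   # f(t, x) = t * x
    t += dt

exact = x0 * math.exp(t_end ** 2 / 2)  # closed form from separation of variables
print(abs(x - exact) / exact < 1e-3)
```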
Remark (Non-Separable ODEs)
If a first-order ODE is not separable, we can still solve it when it is linear: first solve the homogeneous part, then solve the full system using variation of constants.
Theorem (Reduction to First-Order Systems)
An $n$-th order ODE system of size $d$ is equivalent to a first-order ODE system of size $n \cdot d$.
For example, consider the second-order ODE for a pendulum:
$$mL^2\,\ddot{\theta} = -mgL\,\sin\theta.$$
We see that the motion is independent of the mass $m$, so we can simplify to
$$\ddot{\theta} = -\frac{g}{L}\,\sin\theta.$$
To eliminate the second derivative $\ddot{\theta}$, we define a new variable $\omega := \dot{\theta}$ to represent the first derivative, or angular velocity. By substituting $\omega$ into the original equation, we can express the second-order ODE as a system of two first-order ODEs:
$$\dot{\theta} = \omega, \qquad \dot{\omega} = -\frac{g}{L}\,\sin\theta.$$
Finally, we can write this system in vector form as
$$\frac{d}{dt}\begin{pmatrix}\theta \\ \omega\end{pmatrix} = \begin{pmatrix}\omega \\ -\frac{g}{L}\,\sin\theta\end{pmatrix}.$$
Note that this system is still strictly nonlinear.
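A short simulation of this first-order system shows the expected small-angle oscillation. The values of $g$, $L$, the initial angle, and the use of semi-implicit Euler steps are all illustrative choices:

```python
import math

# Simulate the pendulum as a first-order system in (theta, omega).
g, L = 9.81, 1.0
theta, omega = 0.1, 0.0      # small initial angle, released from rest
dt, steps = 1e-4, 20000      # integrate to t = 2.0 seconds

for _ in range(steps):
    omega += dt * (-(g / L) * math.sin(theta))  # d(omega)/dt = -(g/L) sin(theta)
    theta += dt * omega                          # d(theta)/dt = omega

# For small angles the motion is close to theta(t) = theta0 * cos(sqrt(g/L) * t).
approx = 0.1 * math.cos(math.sqrt(g / L) * 2.0)
print(abs(theta - approx) < 1e-2)
```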
Remark (Reduction of Higher-Order Systems)
The conversion preserves notions of linearity, homogeneity, and autonomy. At higher dimensions, linear ODEs can all be transformed to
$$\dot{\mathbf{x}} = A(t)\,\mathbf{x} + \mathbf{b}(t).$$
When $A$ depends on $t$, there is no easy way of solving it (unless the system is scalar, $d = 1$). When $A$ is constant, we can solve its “eigenvalue problem” for the homogeneous solutions. Explicitly, the homogeneous solutions are given by the matrix exponential $\mathbf{x}(t) = e^{At}\,\mathbf{x}_0$. To find a solution with a nontrivial $\mathbf{b}(t)$, use variation of constants.
To solve $\dot{\mathbf{x}} = A\,\mathbf{x}$, the idea is that we can assume the solution takes the form of an exponential function, similar to scalar ODEs, but involving vectors:
$$\mathbf{x}(t) = \mathbf{v}\,e^{\lambda t}.$$
Here, $\mathbf{v}$ is a constant vector and $\lambda$ is a scalar constant. Differentiation gives $\dot{\mathbf{x}} = \lambda\,\mathbf{v}\,e^{\lambda t}$. By substituting into the original equation, we get
$$\lambda\,\mathbf{v}\,e^{\lambda t} = A\,\mathbf{v}\,e^{\lambda t}.$$
Since $e^{\lambda t}$ is never zero, we can divide both sides by it to get
$$A\,\mathbf{v} = \lambda\,\mathbf{v}.$$
Solving this equation amounts to finding the eigenvalues $\lambda$ and corresponding eigenvectors $\mathbf{v}$ of the matrix $A$.
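This recipe can be sketched end-to-end for an assumed 2×2 companion matrix (all numbers below are illustrative): find the eigenpairs, assemble the general solution, and verify that it satisfies $\dot{\mathbf{x}} = A\,\mathbf{x}$.

```python
import math

# Illustrative 2x2 companion matrix A = [[0, 1], [-2, -3]].
A = [[0.0, 1.0], [-2.0, -3.0]]

# Eigenvalues of a 2x2 matrix from its characteristic polynomial
# lambda^2 - tr(A)*lambda + det(A) = 0.
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = math.sqrt(tr * tr - 4 * det)
lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2   # here: -1 and -2

# For a companion matrix, (1, lambda) is an eigenvector for each lambda.
v1, v2 = (1.0, lam1), (1.0, lam2)

def x(t, c1=1.0, c2=2.0):
    """General solution x(t) = c1*e^(lam1*t)*v1 + c2*e^(lam2*t)*v2."""
    e1, e2 = c1 * math.exp(lam1 * t), c2 * math.exp(lam2 * t)
    return (e1 * v1[0] + e2 * v2[0], e1 * v1[1] + e2 * v2[1])

# Check x'(t) = A x(t) with a central finite difference at t = 0.3.
t, h = 0.3, 1e-6
xd = [(a - b) / (2 * h) for a, b in zip(x(t + h), x(t - h))]
Ax = [A[0][0] * x(t)[0] + A[0][1] * x(t)[1],
      A[1][0] * x(t)[0] + A[1][1] * x(t)[1]]
print(max(abs(p - q) for p, q in zip(xd, Ax)) < 1e-6)
```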
General Analytical Methods
Every ODE can be converted into the first-order form $\dot{\mathbf{x}} = f(\mathbf{x}, t)$. Indeed, $f$ can be thought of as a Vector Field on $\mathbf{x}$-space, and the solution is a path in this space tangential to the vector field (this picture works best when the system is autonomous). For autonomous systems, points $\mathbf{x}^*$ where $f(\mathbf{x}^*) = \mathbf{0}$ are called steady states or static states.
Rather than solving the entire system, we can analyze how the system behaves very close to these steady states through linearization. By introducing a small perturbation $\delta\mathbf{x} = \mathbf{x} - \mathbf{x}^*$ around the steady state $\mathbf{x}^*$, the function $f$ can be approximated using a Taylor series expansion,
$$f(\mathbf{x}^* + \delta\mathbf{x}) \approx f(\mathbf{x}^*) + J\,\delta\mathbf{x} = J\,\delta\mathbf{x},$$
and get
$$\dot{\delta\mathbf{x}} \approx J\,\delta\mathbf{x}.$$
Note that $J = \left.\frac{\partial f}{\partial \mathbf{x}}\right|_{\mathbf{x} = \mathbf{x}^*}$ is the Jacobian matrix of $f$ evaluated at the steady state $\mathbf{x}^*$. Indeed, this allows us to analyze the stability of the steady state by looking at the eigenvalues of $J$.
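As a one-dimensional sketch, take the logistic equation $\dot{x} = x(1 - x)$ (an assumed example): its steady states are $x^* = 0$ and $x^* = 1$, and the sign of $f'(x^*)$, the 1D analogue of the Jacobian's eigenvalues, decides stability:

```python
def f(x):
    # Illustrative scalar ODE right-hand side: x' = x * (1 - x).
    return x * (1.0 - x)

def jacobian(f, x_star, h=1e-6):
    """1D 'Jacobian': central finite-difference derivative f'(x*)."""
    return (f(x_star + h) - f(x_star - h)) / (2 * h)

for x_star in (0.0, 1.0):
    J = jacobian(f, x_star)
    # Perturbations evolve as delta_x(t) ~ e^(J t) * delta_x(0),
    # so a negative eigenvalue means the perturbation decays.
    print(x_star, "stable" if J < 0 else "unstable")
# → 0.0 unstable
# → 1.0 stable
```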
For harder, more complex ODEs, we’ll need Numerical Methods to solve them.
Non-Autonomous Systems
For non-autonomous systems, one can still track stable “quasi-static” states that depend on time $t$. These are points $\mathbf{x}^*(t)$ where $f(\mathbf{x}^*(t), t) = \mathbf{0}$. This approach requires the assumption that the function $f$ changes with respect to time at a much slower rate than the system takes to relax back to its steady state.
Consider the population of a species with the presence of an adversarial predator, modeled by the Lotka-Volterra equations
$$\dot{x} = \alpha\,x - \beta\,x\,y, \qquad \dot{y} = \delta\,x\,y - \gamma\,y,$$
where $y$ is the population of the predator and $x$ is the population of the prey.
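Assuming the model is the standard Lotka-Volterra system $\dot{x} = \alpha x - \beta x y$, $\dot{y} = \delta x y - \gamma y$ (prey $x$, predator $y$; the parameter values below are purely illustrative), the coexistence steady state and the oscillation frequency predicted by linearization can be computed directly:

```python
import math

# Assumed illustrative parameters for the Lotka-Volterra model.
alpha, beta, gamma, delta = 1.0, 0.5, 0.8, 0.4

def f(x, y):
    """Right-hand side (prey growth rate, predator growth rate)."""
    return (alpha * x - beta * x * y, delta * x * y - gamma * y)

# Besides extinction (0, 0), the coexistence steady state solves f = 0:
x_star, y_star = gamma / delta, alpha / beta
print(f(x_star, y_star))   # both components vanish at the steady state

# The Jacobian at coexistence has trace 0 and determinant alpha*gamma,
# so its eigenvalues are purely imaginary: +/- i*sqrt(alpha*gamma).
# Linearization therefore predicts neutral oscillations around the
# steady state with this angular frequency:
print(math.sqrt(alpha * gamma))
```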