# Differentiation

## Overview
Differentiation is a fundamental operation in calculus that measures how a function changes as its input varies. The derivative of a function at a point represents the instantaneous rate of change or the slope of the tangent line at that point. This concept extends naturally to multivariable functions through partial derivatives, which measure how the function changes with respect to one variable while holding others constant.
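In symbols, the derivative of $f$ at a point $a$, and the partial derivative of a multivariable $f$ with respect to $x_i$, are the limits

$$
f'(a) = \lim_{h \to 0} \frac{f(a+h) - f(a)}{h},
\qquad
\frac{\partial f}{\partial x_i}(\mathbf{x}) = \lim_{h \to 0} \frac{f(\mathbf{x} + h\,\mathbf{e}_i) - f(\mathbf{x})}{h},
$$

where $\mathbf{e}_i$ is the $i$-th standard basis vector.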
In computational and applied mathematics, differentiation serves critical roles across diverse domains. In optimization, derivatives identify where functions reach minima or maxima. In sensitivity analysis, they quantify how model outputs respond to parameter changes. In physics and engineering, derivatives describe velocities, accelerations, gradients, and fluxes. In machine learning, derivatives drive gradient-based training algorithms for neural networks.
**Implementation:** These tools compute derivatives symbolically through CasADi and support numerical differentiation through NumPy and SciPy. Symbolic differentiation yields exact derivative expressions, while numerical methods offer flexibility when a symbolic expression of the function is unavailable.
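As a minimal sketch of the two approaches (the toy function and variable names here are illustrative, not part of the tools' interface):

```python
import casadi as ca
import numpy as np
from scipy.optimize import approx_fprime

# Symbolic: build the expression once, differentiate it exactly.
x = ca.SX.sym("x")
f_sym = ca.sin(x) * x**2
dfdx = ca.Function("dfdx", [x], [ca.jacobian(f_sym, x)])
print(float(dfdx(1.0)))  # exact: cos(1) + 2*sin(1) ~ 2.2232

# Numerical: finite differences need only function evaluations.
f_num = lambda v: np.sin(v[0]) * v[0] ** 2
print(approx_fprime(np.array([1.0]), f_num, 1e-8)[0])  # close, but not exact
```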
When working with multivariate functions, first and second derivatives organize into matrix structures (the Jacobian and the Hessian) that characterize the local behavior of the function. Figure 1 illustrates the hierarchy of derivative structures.
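Concretely, for a vector-valued function $f:\mathbb{R}^n \to \mathbb{R}^m$ the Jacobian is the $m \times n$ matrix of first partials, and for a scalar function $f:\mathbb{R}^n \to \mathbb{R}$ the Hessian is the symmetric $n \times n$ matrix of second partials:

$$
J_{ij} = \frac{\partial f_i}{\partial x_j},
\qquad
H_{ij} = \frac{\partial^2 f}{\partial x_i \, \partial x_j}.
$$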
**First-Order Derivatives:** The JACOBIAN tool captures first-order information: the Jacobian defines the linear map that best approximates the function near a point. Use Jacobian matrices when linearizing multivariate functions for sensitivity analysis, optimization constraints, or understanding local behavior.
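A short sketch of the underlying CasADi computation (the toy map and names such as `J_fn` are illustrative, not the tool's actual interface):

```python
import casadi as ca

x = ca.SX.sym("x", 3)                      # input vector in R^3
f = ca.vertcat(x[0] * x[1], ca.sin(x[2]))  # toy map R^3 -> R^2
J = ca.jacobian(f, x)                      # 2x3 symbolic Jacobian
J_fn = ca.Function("J_fn", [x], [J])       # compiled evaluator
print(J_fn([1.0, 2.0, 0.0]))               # [[2, 1, 0], [0, 0, 1]]
```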
**Second-Order Derivatives:** The HESSIAN tool captures second-order curvature information. The Hessian reveals the local geometric structure of a function: where it is positive definite, the function is locally convex (candidate minima); where it is negative definite, locally concave (candidate maxima); where it is indefinite, the point is a saddle. Second derivatives are essential for Newton's method, trust-region optimization, and convergence analysis.
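The definiteness check can be performed numerically on the evaluated Hessian. A sketch with an illustrative convex quadratic (note that `ca.hessian` returns both the Hessian and the gradient):

```python
import casadi as ca
import numpy as np

x = ca.SX.sym("x", 2)
f = x[0]**2 + x[0] * x[1] + 2 * x[1]**2   # convex quadratic
H, g = ca.hessian(f, x)                   # CasADi returns (Hessian, gradient)
H_fn = ca.Function("H_fn", [x], [H])
H_val = np.array(H_fn([0.0, 0.0]))        # constant here: [[2, 1], [1, 4]]
eigs = np.linalg.eigvalsh(H_val)          # eigenvalues of a symmetric matrix
print("positive definite" if np.all(eigs > 0) else "not positive definite")
```

Testing definiteness through eigenvalues is robust at small dimensions; for large problems, attempting a Cholesky factorization is a cheaper positive-definiteness test.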
**Parameter Sensitivity:** The SENSITIVITY tool specializes in computing how scalar model outputs respond to parameter changes, a critical capability for uncertainty quantification, robustness analysis, and gradient-based parameter estimation.
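A sketch of the parameter-sensitivity computation in CasADi (the exponential-decay model and the names `x`, `p` are hypothetical):

```python
import casadi as ca

x = ca.SX.sym("x")             # model input
p = ca.SX.sym("p", 2)          # parameters (amplitude, decay rate)
y = p[0] * ca.exp(-p[1] * x)   # scalar model output
dy_dp = ca.Function("dy_dp", [x, p], [ca.gradient(y, p)])
print(dy_dp(1.0, [2.0, 0.5]))  # dy/dp0 = exp(-0.5), dy/dp1 = -2*exp(-0.5)
```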
## Tools
| Tool | Description |
|---|---|
| HESSIAN | Compute the Hessian matrix (second derivatives) of a scalar function using CasADi symbolic differentiation. |
| JACOBIAN | Calculate the Jacobian matrix of mathematical expressions with respect to specified variables. |
| SENSITIVITY | Compute the sensitivity of a scalar model with respect to its parameters using CasADi. |