Linear Algebra

Overview

Linear algebra is the mathematical study of vectors, matrices, and linear transformations. It provides the foundational framework for solving systems of linear equations, performing coordinate transformations, and analyzing multidimensional data. Modern applications span machine learning (where data and model parameters are represented as matrices), computer graphics (3D transformations and projections), signal processing (filtering and compression), quantum mechanics (state vectors and operators), and econometrics (multiple regression and cointegration analysis).

Implementation: These tools are built on NumPy for core linear algebra operations and SciPy, which provides highly optimized, production-grade implementations of matrix decompositions and linear system solvers. SciPy’s scipy.linalg module wraps BLAS and LAPACK libraries—the same numerical kernels used in MATLAB, Julia, and R—ensuring consistency and reliability across scientific computing ecosystems.

At the heart of linear algebra are several core matrix decompositions that factorize a matrix into simpler, more interpretable components. These decompositions not only reveal the underlying structure of data but also enable efficient and numerically stable algorithms for solving complex problems. The choice of decomposition depends on the matrix properties and the problem at hand. Figure 1 illustrates two fundamental decompositions that serve different purposes.

Orthogonal Decompositions: The QR decomposition factors any matrix $ A $ into $ A = QR $, where $ Q $ is orthogonal (columns are mutually perpendicular unit vectors) and $ R $ is upper triangular. QR decomposition is numerically stable and forms the foundation for solving least-squares problems, eigenvalue computations, and the QR algorithm. It’s particularly valuable when the matrix is rectangular or when numerical stability is critical.
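As an illustrative sketch (the QR tool presumably wraps this SciPy call), the factorization of a rectangular matrix looks like:

```python
import numpy as np
from scipy.linalg import qr

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])  # rectangular 3x2 matrix

# "economic" mode returns Q with the same shape as A and a small square R
Q, R = qr(A, mode="economic")

# Q has orthonormal columns: Q^T Q = I
assert np.allclose(Q.T @ Q, np.eye(2))
# R is upper triangular and Q @ R reconstructs A
assert np.allclose(Q @ R, A)
```

The `mode="economic"` option avoids forming the full square Q, which matters when A is tall and thin, as in least-squares problems.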

Singular Value Decomposition (SVD): The SVD decomposes any matrix $ A $ (rectangular or square) into $ A = U \Sigma V^T $, where $ U $ and $ V $ are orthogonal and $ \Sigma $ is diagonal with non-negative singular values. SVD reveals the effective rank of a matrix, enables low-rank approximation for compression, computes pseudoinverses via the PINV tool, and underlies principal component analysis (PCA). SVD is the most general and robust decomposition available.
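A minimal sketch with scipy.linalg.svd (the matrix here is an arbitrary example): the singular values come back as a 1-D array in descending order, so $ \Sigma $ must be assembled explicitly to verify the factorization.

```python
import numpy as np
from scipy.linalg import svd

A = np.array([[3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])  # 2x3 rectangular matrix

U, s, Vt = svd(A)  # s holds singular values in descending order

# Rebuild the rectangular diagonal matrix Sigma and check A = U @ Sigma @ V^T
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)
assert np.allclose(U @ Sigma @ Vt, A)

# Effective rank: count singular values above a relative tolerance
rank = int(np.sum(s > 1e-10 * s[0]))
```

Truncating $ \Sigma $ to its $ k $ largest singular values yields the best rank-$ k $ approximation of $ A $ in the least-squares sense, which is the basis of SVD-based compression and PCA.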

Solving Linear Systems: When solving $ Ax = b $, the choice of tool depends on matrix structure. The LSTSQ tool solves overdetermined systems (more equations than unknowns) using least-squares, while LSQ_LINEAR handles bounded least-squares problems where solution components are constrained. For symmetric positive-definite matrices (common in optimization and statistics), the Cholesky decomposition computes $ A = LL^T $, enabling fast forward-backward substitution without full matrix inversion.
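The two routes above can be sketched on a small line-fitting problem (the data points are invented for illustration): LSTSQ solves the overdetermined system directly, while Cholesky factorization solves the equivalent symmetric positive-definite normal equations $ A^T A x = A^T b $.

```python
import numpy as np
from scipy.linalg import lstsq, cho_factor, cho_solve

# Overdetermined system: fit y = c0 + c1*x through four points
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.9, 5.1, 7.0])
A = np.column_stack([np.ones_like(x), x])  # 4x2 design matrix

# Route 1: least-squares solution of A c = y
coef, residues, rank, sv = lstsq(A, y)

# Route 2: Cholesky on the symmetric positive-definite normal equations
c_and_lower = cho_factor(A.T @ A)          # A^T A = L L^T
coef_chol = cho_solve(c_and_lower, A.T @ y)  # forward-backward substitution

assert np.allclose(coef, coef_chol)
```

Cholesky is roughly twice as fast as a general LU factorization for SPD matrices, but forming $ A^T A $ squares the condition number, which is why QR-based least-squares is preferred when stability is a concern.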

Matrix Functions: Beyond decomposition, computing functions of matrices—such as exponentials—is essential in systems theory, differential equations, and physics. The EXPM tool computes the matrix exponential $ e^A $, which governs the solution of linear systems of differential equations $ dx/dt = Ax $.
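A short sketch of that connection: for $ dx/dt = Ax $ with initial state $ x(0) = x_0 $, the solution is $ x(t) = e^{At} x_0 $. The example matrix below generates a rotation, so the trajectory stays on the unit circle.

```python
import numpy as np
from scipy.linalg import expm

# dx/dt = A x describes circular motion when A is a rotation generator
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
x0 = np.array([1.0, 0.0])

# Solution at time t: x(t) = expm(A * t) @ x0
t = np.pi / 2
x_t = expm(A * t) @ x0

# A quarter turn maps (1, 0) to (0, -1) under this generator
assert np.allclose(x_t, [0.0, -1.0])
```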

These decompositions and tools work synergistically: SVD can compute pseudoinverses more robustly than other methods, QR provides a numerically stable foundation for least-squares, and Cholesky offers efficiency for special matrix classes. Understanding when and how to apply each decomposition is key to writing robust numerical code.

Figure 1: Matrix Decompositions and Their Structure: (A) QR decomposition factors a matrix into orthogonal Q (with orthonormal columns) and upper triangular R, shown here for a 4×3 matrix. (B) SVD decomposes any matrix into orthogonal factors U and V and a diagonal matrix of singular values Σ, revealing intrinsic rank and structure.

Tools

| Tool | Description |
|------|-------------|
| CHOLESKY | Compute the Cholesky decomposition of a real, symmetric positive-definite matrix. |
| EXPM | Compute the matrix exponential of a square matrix using scipy.linalg.expm. |
| LSQ_LINEAR | Solve a bounded linear least-squares problem. |
| LSTSQ | Compute the least-squares solution to $ Ax = b $ using scipy.linalg.lstsq. |
| PINV | Compute the Moore-Penrose pseudoinverse of a matrix using singular value decomposition (SVD). |
| QR | Compute the QR decomposition of a matrix and return either Q or R. |
| SVD | Compute the Singular Value Decomposition (SVD) of a matrix using scipy.linalg.svd. |
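To illustrate the difference between the two remaining least-squares variants in the table, here is a sketch (the rank-deficient matrix is an invented example) comparing the SVD-based pseudoinverse with bounded least-squares via scipy.optimize.lsq_linear:

```python
import numpy as np
from scipy.linalg import pinv
from scipy.optimize import lsq_linear

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])  # rank-deficient: column 2 = 2 * column 1
b = np.array([1.0, 2.0, 3.0])

# Pseudoinverse picks the minimum-norm least-squares solution
x_pinv = pinv(A) @ b

# Bounded least-squares: constrain every component to [0, 0.5]
res = lsq_linear(A, b, bounds=(0.0, 0.5))

assert np.all(res.x >= -1e-10) and np.all(res.x <= 0.5 + 1e-10)
```

Because $ A $ has infinitely many least-squares solutions here, the pseudoinverse selects the one with smallest Euclidean norm, while lsq_linear selects the best solution inside the box constraints.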