
Numerical Analysis and Linear Algorithms

Stable and efficient Python implementations of linear algebra algorithms, covering matrix decompositions and the computation of eigenvalues and singular values.

Solving a Linear System

Forward and Backward Substitution

A matrix is a compact way of representing a linear system. Forward substitution solves a lower-triangular system, and backward substitution solves an upper-triangular one. Row-oriented implementations of both algorithms are provided.
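The row-oriented versions described above can be sketched as follows; the function names are illustrative, not necessarily those used in the repository:

```python
import numpy as np

def forward_substitution(L, b):
    """Row-oriented forward substitution: solve L x = b for lower-triangular L."""
    n = L.shape[0]
    x = np.zeros(n)
    for i in range(n):
        # subtract contributions of already-solved unknowns, then divide by the diagonal
        x[i] = (b[i] - L[i, :i] @ x[:i]) / L[i, i]
    return x

def backward_substitution(U, b):
    """Row-oriented backward substitution: solve U x = b for upper-triangular U."""
    n = U.shape[0]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x
```

Each row needs only the unknowns already computed, so one left-to-right (or right-to-left) sweep solves the system in O(n^2) time.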

Gaussian Elimination

Gaussian Elimination is the row-reduction method for solving a linear system in matrix form. Implementations by inner product and by outer product are provided. Both compute an LU decomposition, which factorizes a square matrix into the product of a lower-triangular and an upper-triangular matrix.
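A minimal sketch of the outer-product variant, assuming a square matrix whose leading principal minors are nonzero (no pivoting); the function name is hypothetical:

```python
import numpy as np

def lu_outer(A):
    """Outer-product Gaussian elimination: factor A = L @ U, no pivoting."""
    A = A.astype(float).copy()
    n = A.shape[0]
    L = np.eye(n)
    for k in range(n - 1):
        # multipliers that eliminate column k below the pivot
        L[k + 1:, k] = A[k + 1:, k] / A[k, k]
        # rank-1 (outer-product) update of the trailing submatrix
        A[k + 1:, k:] -= np.outer(L[k + 1:, k], A[k, k:])
    U = np.triu(A)
    return L, U
```

At each step the trailing submatrix is updated by a single rank-1 outer product, which is what distinguishes this variant from the inner-product formulation.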

GEPP

GEPP stands for Gaussian Elimination with Partial Pivoting. It exchanges rows while solving the linear system, recording the exchanges in a permutation matrix. GEPP is more stable than outer-product Gaussian Elimination on large-scale linear systems. It can also be viewed as the decomposition PA = LU.
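A sketch of how partial pivoting fits into the outer-product elimination, returning P, L, U with P @ A = L @ U (illustrative names, not necessarily the repository's API):

```python
import numpy as np

def gepp(A):
    """Gaussian elimination with partial pivoting: P @ A = L @ U."""
    A = A.astype(float).copy()
    n = A.shape[0]
    piv = np.arange(n)
    L = np.zeros((n, n))
    for k in range(n - 1):
        # choose the largest entry in column k as pivot to limit element growth
        p = k + np.argmax(np.abs(A[k:, k]))
        if p != k:
            A[[k, p], :] = A[[p, k], :]
            L[[k, p], :k] = L[[p, k], :k]
            piv[[k, p]] = piv[[p, k]]
        L[k + 1:, k] = A[k + 1:, k] / A[k, k]
        A[k + 1:, k:] -= np.outer(L[k + 1:, k], A[k, k:])
    L += np.eye(n)
    U = np.triu(A)
    P = np.eye(n)[piv]
    return P, L, U
```

Pivoting keeps every multiplier in L at most 1 in magnitude, which is the source of the improved stability.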

Computing Eigenvalues

QR Decomposition

QR decomposition factors a matrix into the product of an orthogonal and an upper-triangular matrix. It is often used to solve least-squares problems, and it is the basis of QR iteration for computing eigenvalues. This implementation computes the QR form of a matrix by Householder reflections.
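A compact sketch of Householder QR, accumulating the reflectors into an explicit Q (the repository's implementation may store the reflectors instead):

```python
import numpy as np

def householder_qr(A):
    """QR decomposition by Householder reflections: A = Q @ R."""
    A = A.astype(float)
    m, n = A.shape
    R = A.copy()
    Q = np.eye(m)
    for k in range(min(m, n)):
        x = R[k:, k]
        # reflection vector v maps x onto a multiple of e_1;
        # the sign choice avoids cancellation
        v = x.copy()
        s = 1.0 if x[0] >= 0 else -1.0
        v[0] += s * np.linalg.norm(x)
        norm_v = np.linalg.norm(v)
        if norm_v == 0:
            continue  # column already zero below the diagonal
        v = v / norm_v
        # apply the reflector H = I - 2 v v^T to the trailing block of R,
        # and accumulate it into Q from the right
        R[k:, k:] -= 2.0 * np.outer(v, v @ R[k:, k:])
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)
    return Q, R
```

Each reflection zeros one column below the diagonal while leaving the already-processed columns untouched, so after min(m, n) steps R is upper triangular.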

QR Iteration / Francis Algorithm

QR Iteration, or the Francis Algorithm, is a fast, stable, recursive algorithm that computes the eigenvalues of a matrix via its Schur Decomposition. The algorithm first reduces the matrix to upper-Hessenberg form. It then applies the Rayleigh-quotient shift while preserving this form, the so-called "chasing the bulge" process. When deflation occurs at any element on the sub-diagonal (its magnitude drops below 10e-14), the algorithm recursively calls itself on the two blocks separated by the deflation point, until all the eigenvalues are found.
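A much-simplified sketch of the shift-and-deflate idea, restricted here to real symmetric matrices and omitting the Hessenberg reduction and bulge chasing; the function name and tolerance are illustrative:

```python
import numpy as np

def qr_iteration_eigvals(A, tol=1e-12, max_iter=500):
    """Shifted QR iteration with deflation (symmetric-matrix sketch)."""
    A = A.astype(float).copy()
    n = A.shape[0]
    eigs = []
    while n > 1:
        for _ in range(max_iter):
            # Rayleigh-quotient shift: the trailing diagonal entry
            mu = A[n - 1, n - 1]
            Q, R = np.linalg.qr(A - mu * np.eye(n))
            A = R @ Q + mu * np.eye(n)
            # deflate once the last sub-diagonal entry is negligible
            if abs(A[n - 1, n - 2]) < tol:
                break
        eigs.append(A[n - 1, n - 1])
        A = A[:n - 1, :n - 1]  # recurse on the leading block
        n -= 1
    eigs.append(A[0, 0])
    return sorted(eigs)
```

Each QR-then-recombine step is a similarity transform, so the eigenvalues are preserved while the sub-diagonal entries are driven to zero; a production version does this implicitly on the Hessenberg form.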

Orthogonal Iteration

Orthogonal Iteration is a generalization of the power method. It is very slow when the input is larger than 50×50, because the QR decomposition and matrix multiplication at each step cost O(n^3), and many power-style iterations are needed for the eigenvalues to converge. The algorithm takes the QR form of a random matrix and iteratively replaces the orthogonal factor with the orthogonal factor of the QR form of its product with the input matrix, until the similarity-transformed matrix converges to its Schur Decomposition.
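The loop described above can be sketched as follows, assuming the eigenvalues have distinct magnitudes so the iteration converges (the seed and iteration count are illustrative):

```python
import numpy as np

def orthogonal_iteration(A, num_iter=1000):
    """Orthogonal iteration: drive A toward its Schur form T = Q^T A Q."""
    n = A.shape[0]
    rng = np.random.default_rng(0)
    # start from the orthogonal factor of a random matrix
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    for _ in range(num_iter):
        # multiply by A, then re-orthonormalize: a block power method
        Q, _ = np.linalg.qr(A @ Q)
    return Q.T @ A @ Q  # approximately upper triangular on convergence
```

Each step is one O(n^3) multiply plus one O(n^3) QR factorization, which is why the method becomes slow for larger matrices; the diagonal of the returned matrix holds the eigenvalue estimates.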

Repo