Math-Linux.com

Knowledge base dedicated to Linux and applied mathematics.


Cholesky decomposition


We will study a direct method for solving linear systems: the Cholesky decomposition. Given a symmetric positive definite matrix A, the aim is to build a lower triangular matrix L such that A=LL^T, i.e. the product of L and its transpose is equal to A. Recall that a symmetric matrix A is positive definite if, for any nonzero vector x, the product x^TAx is positive.

The matrix L is sometimes called the "square root" of A. The Cholesky decomposition is often used to calculate the inverse matrix A^{-1} and the determinant of A (equal to the square of the product of the diagonal elements of L).

Example

The symmetric matrix
A=\begin{pmatrix}
1 & 1 & 1  & 1 \cr
1 & 5 & 5  & 5 \cr
1 & 5 & 14 & 14 \cr
1 & 5 & 14 & 15
\end{pmatrix}

is equal to the product of the lower triangular matrix L and its transpose L^T:


\begin{pmatrix}
1 & 1 & 1  & 1 \cr
1 & 5 & 5  & 5 \cr
1 & 5 & 14 & 14 \cr
1 & 5 & 14 & 15
\end{pmatrix}
=
\begin{pmatrix}
1 & 0 & 0  & 0 \cr
1 & 2 & 0  & 0 \cr
1 & 2 & 3  & 0 \cr
1 & 2 & 3  & 1 
\end{pmatrix}
\begin{pmatrix}
1 & 1 & 1  & 1 \cr
0 & 2 & 2  & 2 \cr
0 & 0 & 3  & 3 \cr
0 & 0 & 0  & 1
\end{pmatrix}

with

L=
\begin{pmatrix}
1 & 0 & 0  & 0 \cr
1 & 2 & 0  & 0 \cr
1 & 2 & 3  & 0 \cr
1 & 2 & 3  & 1
\end{pmatrix}
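As a quick check of the example above, a small pure-Python snippet (not part of the original article) can rebuild A entry by entry from the factor L, using a_{ij} = Σ_k l_{ik} l_{jk}:

```python
# Lower triangular factor L from the example above.
L = [[1, 0, 0, 0],
     [1, 2, 0, 0],
     [1, 2, 3, 0],
     [1, 2, 3, 1]]

# Recompute A = L L^T entry by entry: a_ij = sum_k l_ik * l_jk.
n = len(L)
A = [[sum(L[i][k] * L[j][k] for k in range(n)) for j in range(n)]
     for i in range(n)]
print(A)  # reproduces the matrix A of the example
```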

Theorem

Cholesky Factorization:

If A is a symmetric positive definite matrix, there exists at least one real lower triangular matrix L such that:

A=LL^T

We can also impose that the diagonal elements of the matrix L are all positive; the corresponding factorization is then unique.

Algorithm

The Cholesky matrix L is given by:


L=\begin{pmatrix}
l_{11}\cr
l_{21} & l_{22}\cr
\vdots & \vdots & \ddots\cr
l_{n1} & l_{n2} & \cdots & l_{nn}
\end{pmatrix}

The equality A=LL^T gives:
a_{ij}=\left(LL^{T}\right)_{ij}={\sum_{k=1}^{n}l_{ik}l_{jk}}={\sum_{k=1}^{\min\left\{ i,j\right\} }l_{ik}l_{jk}},\;1\leq i,j\leq n

since l_{ij}=0 if 1 \leq i < j \leq n.

The matrix A being symmetric, it is enough that the relations above hold for i\leq j, i.e. the elements l_{ij} of the matrix L must satisfy:
a_{ij}={\sum_{k=1}^{i}l_{ik}l_{jk}},\;1\leq i\leq j\leq n

For j=1, we determine the first column of L :
- (i=1) a_{11}=l_{11}l_{11} so l_{11}=\sqrt{a_{11}}
- (i=2) a_{12}=l_{11}l_{21} so l_{21}=\frac{a_{12}}{l_{11}}
- ...
- (i=n) a_{1n}=l_{11}l_{n1} so l_{n1}=\frac{a_{1n}}{l_{11}}

After having calculated the first (j-1) columns, we determine the j-th column of L:
- (i=j) a_{jj}=l_{j1}l_{j1}+\ldots+l_{jj}l_{jj} so l_{jj}=\sqrt{a_{jj}-{\sum_{k=1}^{j-1}l_{jk}^{2}}}
- (i=j+1) \displaystyle a_{j,j+1}=l_{j1}l_{j+1,1}+\ldots+l_{jj}l_{j+1,j} so l_{j+1,j}=\displaystyle\frac{a_{j,j+1}-{\displaystyle\sum_{k=1}^{j-1}l_{jk}l_{j+1,k}}}{l_{jj}}
- ...
- (i=n) \displaystyle a_{jn}=l_{j1}l_{n1}+\ldots+l_{jj}l_{nj} so l_{nj}=\displaystyle\frac{a_{jn}-{\displaystyle\sum_{k=1}^{j-1}l_{jk}l_{nk}}}{l_{jj}}
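The column-by-column procedure above can be sketched in pure Python (a minimal illustration, not an optimized implementation; for real work a library routine such as LAPACK's would be preferred):

```python
import math

def cholesky(A):
    """Return the lower triangular L with A = L L^T, computed column by
    column as in the derivation above. A must be symmetric positive definite."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for j in range(n):
        # Diagonal entry: l_jj = sqrt(a_jj - sum_{k<j} l_jk^2)
        s = sum(L[j][k] ** 2 for k in range(j))
        L[j][j] = math.sqrt(A[j][j] - s)
        # Below-diagonal entries: l_ij = (a_ij - sum_{k<j} l_ik l_jk) / l_jj
        for i in range(j + 1, n):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = (A[i][j] - s) / L[j][j]
    return L

A = [[1, 1, 1, 1],
     [1, 5, 5, 5],
     [1, 5, 14, 14],
     [1, 5, 14, 15]]
print(cholesky(A))  # recovers the factor L of the example
```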

Solution of a linear system

To solve the linear system Ax=b, we use the decomposition A=LL^T and the system becomes

LL^Tx = b \Leftrightarrow
\left\{\begin{array}{cc}
Ly = b& (1),\\
L^Tx = y &(2).
\end{array}\right.

We solve system (1) to find the vector y, then system (2) to find the vector x. The resolution is made easy by the triangular form of the matrices.
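These two triangular solves can be sketched in pure Python: forward substitution for (1), then back substitution for (2). This is an illustrative sketch; the right-hand side b below is chosen (hypothetically) as A times the vector of ones, so the expected solution is known:

```python
def solve_cholesky(L, b):
    """Solve A x = b given A = L L^T: forward substitution for L y = b,
    then back substitution for L^T x = y."""
    n = len(L)
    # Forward substitution: L y = b
    y = [0.0] * n
    for i in range(n):
        y[i] = (b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
    # Back substitution: L^T x = y  (note (L^T)_{ik} = L[k][i])
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - sum(L[k][i] * x[k] for k in range(i + 1, n))) / L[i][i]
    return x

L = [[1, 0, 0, 0],
     [1, 2, 0, 0],
     [1, 2, 3, 0],
     [1, 2, 3, 1]]
# b = A @ [1, 1, 1, 1] for the example matrix A, so x should be [1, 1, 1, 1].
b = [4, 16, 34, 35]
print(solve_cholesky(L, b))
```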

Calculating the determinant

The Cholesky decomposition also makes it possible to calculate the determinant of A, which is equal to the square of the product of the diagonal elements of the matrix L, since

det(A) = det(L)\times det(L^T)=det(L)^2
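For the example matrix, this amounts to squaring the product of the diagonal of L:

```python
# det(A) = (product of the diagonal of L)^2, with L from the example above.
L = [[1, 0, 0, 0],
     [1, 2, 0, 0],
     [1, 2, 3, 0],
     [1, 2, 3, 1]]
diag_prod = 1
for i in range(len(L)):
    diag_prod *= L[i][i]   # 1 * 2 * 3 * 1 = 6
det_A = diag_prod ** 2
print(det_A)  # 36
```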
