
Conjugate gradient method


In this article I study the Conjugate Gradient Method. This numerical method allows you to solve linear systems whose matrix is symmetric and positive definite. The search for successive descent directions makes it possible to reach the exact solution of the linear system in finitely many steps.

Problem

We want to solve the following linear system:

Ax=b,

where A is a n\times n symmetric and positive definite matrix (A^\top =A and x^\top Ax>0, for all x\in \mathbf{R}^n non-zero).
Let x_\star be the exact solution of this linear system.

Conjugate directions

As the matrix A is symmetric and positive definite, we can define the following scalar product on \mathbf{R}^n:
\langle u,v \rangle_A = u^\top A v.
Two elements u,v\in \mathbf{R}^n are A-conjugate if:

u^\top A v = 0
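For concreteness, here is a small NumPy sketch of this scalar product and of an A-conjugacy check (the matrix A and the vectors u, v below are hypothetical examples, not taken from the article):

import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])  # symmetric positive definite

def inner_A(u, v):
    # A-inner product <u, v>_A = u^T A v
    return u @ A @ v

u = np.array([1.0, 0.0])
v = np.array([1.0, -4.0])  # chosen so that u^T A v = 4 - 4 = 0

print(inner_A(u, u))  # 4.0 > 0, as positive definiteness requires
print(inner_A(u, v))  # 0.0: u and v are A-conjugate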

The Conjugate Gradient Method consists in building a sequence (p_k)_{1\leq k\leq n} of n pairwise A-conjugate vectors. Consequently, the vectors p_1,p_2,\ldots,p_n form a basis of \mathbf{R}^n, and the exact solution x_\star can be expanded as follows:

x_\star = \alpha_1 p_1 + \cdots + \alpha_n p_n

where \alpha_k = \frac{p_k^\top b}{p_k^\top A p_k},k=1,\ldots,n.
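Indeed, taking the A-inner product of this expansion with p_k, the A-conjugacy makes all cross terms vanish:

p_k^\top A x_\star = \alpha_k \, p_k^\top A p_k.

Since A x_\star = b, the left-hand side equals p_k^\top b, which yields the formula for \alpha_k; note that the coefficients can be computed without knowing x_\star.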

Construction of the conjugate directions

The exact solution x_\star is also the unique minimizer of the functional
J(x) = \frac12 x^\top A x - b^\top x, \quad x\in\mathbf{R}^n.
Clearly \nabla J(x) = A x - b, \quad x\in\mathbf{R}^n,
so

\nabla J(x_\star)=0_{\mathbf{R}^n}
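Indeed, for any non-zero v\in\mathbf{R}^n, using Ax_\star = b and the symmetry of A, a direct computation gives

J(x_\star + v) - J(x_\star) = \frac12 v^\top A v > 0,

so x_\star is the unique minimizer of J.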

We define the residual vector of the linear system at step k:
r_k = b - Ax_k = -\nabla J(x_k).

r_k is thus the opposite of the gradient of J at x_k, i.e. the steepest-descent direction of the functional J at x_k.

The new descent direction p_{k+1} is built from the residual r_k, corrected to enforce A-conjugacy with p_k; we then have:

p_{k+1} = r_k - \frac{p_k^\top A r_k}{p_k^\top A p_k} p_k

It is the choice of the coefficient \frac{p_k^\top A r_k}{p_k^\top A p_k} which ensures the A-conjugacy of the successive directions p_k. If you want, you can compute p_k^\top A p_{k+1}: it is zero!
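The check is immediate: multiplying the update formula on the left by p_k^\top A gives

p_k^\top A p_{k+1} = p_k^\top A r_k - \frac{p_k^\top A r_k}{p_k^\top A p_k}\, p_k^\top A p_k = 0.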

Conjugate Gradient Algorithm

We compute the initial residual r_0=b-Ax_0 for an arbitrary starting vector x_0.

For k=1,2,\ldots

If k=1

p_1=r_{0}

Else

\beta_{k-1}= \frac{r_{k-1}^\top r_{k-1}}{r_{k-2}^\top r_{k-2}}

p_k=r_{k-1} + \beta_{k-1}p_{k-1}

EndIf

q_k=A p_k

\alpha_k=\frac{r_{k-1}^\top r_{k-1}}{p_k^\top q_k}

x_k=x_{k-1}+\alpha_k p_k

r_k=r_{k-1}-\alpha_k q_k

We iterate until the residual r_k is small enough.
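Here is a minimal NumPy sketch of this algorithm (the function name conjugate_gradient, the stopping tolerance tol and the small example system at the end are illustrative choices, not part of the original article):

import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    # Solve Ax = b for a symmetric positive definite matrix A.
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    max_iter = n if max_iter is None else max_iter
    r = b - A @ x        # initial residual r_0
    p = r.copy()         # first direction p_1 = r_0
    rs_old = r @ r       # r_{k-1}^T r_{k-1}
    for _ in range(max_iter):
        if np.sqrt(rs_old) < tol:   # stop when the residual is small enough
            break
        q = A @ p                   # q_k = A p_k
        alpha = rs_old / (p @ q)    # alpha_k = r^T r / (p^T q)
        x = x + alpha * p           # x_k = x_{k-1} + alpha_k p_k
        r = r - alpha * q           # r_k = r_{k-1} - alpha_k q_k
        rs_new = r @ r
        p = r + (rs_new / rs_old) * p   # p_{k+1} = r_k + beta_k p_k
        rs_old = rs_new
    return x

# Hypothetical example: a small 2x2 symmetric positive definite system
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
print(x)                       # approximately [0.0909, 0.6364]
print(np.allclose(A @ x, b))   # True

In exact arithmetic the method terminates after at most n iterations, which is why max_iter defaults to n; in floating-point arithmetic one simply stops when the residual norm falls below tol.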
