We will study an iterative method for solving linear systems: the Jacobi method. The aim is to build a sequence of approximations that converges to the true solution.

Iterative method

The Jacobi method is an iterative method for solving linear systems of the form

\[Ax=b\]

To do so, we build a sequence $x^{(k)}$ which converges to the fixed point (the solution) $x$.

Given $x^{(0)}$, we build a sequence $x^{(k)}$ such that $x^{(k+1)}=F(x^{(k)})$ for $k \in \mathbf{N}$.

We split $A=M-N$, where $M$ is an invertible matrix.

\[\begin{array}{cccc} Ax=b \Leftrightarrow Mx=Nx+b \Leftrightarrow & x &=& M^{-1}Nx+M^{-1}b \\ & &=& F(x) \end{array}\]

where $F$ is an affine function.

Algorithm

\[\left\{ \begin{array}{ll} x^{(0)} \textrm{ given}, & \\ x^{(k+1)} = M^{-1}Nx^{(k)}+M^{-1}b, & k \ge 0. \end{array} \right.\]

If $x$ is a solution of $Ax=b$, then $x = M^{-1}Nx+M^{-1}b$, i.e. $x$ is a fixed point of $F$.
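A minimal sketch of this generic splitting iteration in Python with NumPy; the tolerance, iteration cap, and the stopping test on successive iterates are illustrative assumptions, not part of the method above.

```python
import numpy as np

def splitting_iteration(M, N, b, x0, tol=1e-10, max_iter=1000):
    """Iterate x^(k+1) = M^{-1} N x^(k) + M^{-1} b until successive
    iterates stop changing (a simple illustrative criterion)."""
    x = x0.astype(float)
    for _ in range(max_iter):
        # Solve M x_new = N x + b instead of forming M^{-1} explicitly.
        x_new = np.linalg.solve(M, N @ x + b)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```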

Error

Let $e^{(k)}=x^{(k)}-x$ be the error vector, where $x$ is the exact solution. Subtracting the fixed-point relation $x = M^{-1}Nx+M^{-1}b$ from the iteration gives

\[e^{(k+1)}=x^{(k+1)}-x=M^{-1}N(x^{(k)}-x)=M^{-1}Ne^{(k)}\]

We set $B = M^{-1}N$, which gives

\[e^{(k+1)}=Be^{(k)}=B^{k+1}e^{(0)}.\]
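As a quick numerical check, the sketch below (Python with NumPy; the $2\times 2$ system and the splitting $M=\operatorname{diag}(A)$ are assumptions chosen purely for illustration) verifies that the error after $k+1$ steps equals $B^{k+1}e^{(0)}$.

```python
import numpy as np

# Small example system, with the splitting M = diag(A), N = M - A.
A = np.array([[4.0, 1.0], [2.0, 5.0]])
b = np.array([1.0, 2.0])
M = np.diag(np.diag(A))
N = M - A
B = np.linalg.solve(M, N)          # B = M^{-1} N
x_exact = np.linalg.solve(A, b)

x = np.zeros(2)                    # x^(0)
e0 = x - x_exact                   # e^(0)
for k in range(5):
    x = np.linalg.solve(M, N @ x + b)
    # Error computed directly vs. via B^{k+1} e^(0): both should agree.
    print(np.allclose(x - x_exact, np.linalg.matrix_power(B, k + 1) @ e0))
```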

Convergence

The algorithm converges if $\lim_{k \to \infty} \| e^{(k)} \| = 0$; this holds for every initial guess if and only if $\lim_{k \to \infty} B^k = 0$ (the null matrix).

Theorem: $\lim_{k \to \infty} B^k = 0$ if and only if the spectral radius of the matrix $B$ satisfies

\[\rho(B)<1,\]

Recall that $\rho(B) = \max_{i =1,\ldots,n} \lvert\lambda_i\rvert$, where $\lambda_1,\ldots,\lambda_n$ are the eigenvalues of $B$.
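In practice, $\rho(B)$ can be estimated numerically. A small sketch (Python with NumPy; the example matrix is an assumption chosen for illustration):

```python
import numpy as np

def spectral_radius(B):
    """Largest eigenvalue magnitude of B."""
    return np.max(np.abs(np.linalg.eigvals(B)))

# Jacobi iteration matrix B = D^{-1}(E+F) = I - D^{-1}A for an example A.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 5.0, 2.0],
              [0.0, 2.0, 6.0]])
D = np.diag(np.diag(A))
B = np.linalg.solve(D, D - A)
print(spectral_radius(B) < 1)      # True: the iteration converges
```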

Theorem: If $A$ is strictly diagonally dominant, i.e.

\[\left| a_{ii} \right| > \sum_{j \ne i} \left| a_{ij} \right|, \qquad \forall i=1,\ldots,n,\]

then for all $x^{(0)}$ the Jacobi algorithm converges to the solution $x$ of the system $Ax=b.$
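This hypothesis is easy to test in code. A sketch (Python with NumPy; the function name is ours):

```python
import numpy as np

def is_strictly_diagonally_dominant(A):
    """Check |a_ii| > sum_{j != i} |a_ij| for every row i."""
    off_diag = np.sum(np.abs(A), axis=1) - np.abs(np.diag(A))
    return np.all(np.abs(np.diag(A)) > off_diag)
```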

Jacobi Method

We decompose $A$ in the following way: $A=D-E-F$, with

  • $D$ the diagonal
  • $-E$ the strictly lower triangular part of $A$
  • $-F$ the strictly upper triangular part of $A$.

In the Jacobi method, we choose $M = D$ and $N = E+F$ (in the Gauss-Seidel method, $M = D-E$ and $N = F$).

\[x^{(k+1)}=D^{-1}(E+F) x^{(k)}+D^{-1}b\]
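A sketch of this decomposition and of one matrix-form Jacobi step (Python with NumPy; the helper names are ours, and forming $D$, $E$, $F$ explicitly is only for illustration):

```python
import numpy as np

def jacobi_matrices(A):
    """Return D, E, F with A = D - E - F (D diagonal, -E strictly lower,
    -F strictly upper part of A)."""
    D = np.diag(np.diag(A))
    E = -np.tril(A, k=-1)
    F = -np.triu(A, k=1)
    return D, E, F

def jacobi_step(A, b, x):
    """One matrix-form Jacobi update x^(k+1) = D^{-1}((E+F) x^(k) + b)."""
    D, E, F = jacobi_matrices(A)
    return np.linalg.solve(D, (E + F) @ x + b)
```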

The $i$-th row of $D^{-1}(E+F)$ is $-\left(\dfrac{a_{i,1}}{a_{i,i}},\cdots, \dfrac{a_{i,i-1}}{a_{i,i}},0,\dfrac{a_{i,i+1}}{a_{i,i}},\cdots, \dfrac{a_{i,n}}{a_{i,i}}\right)$

We obtain:

\[x^{(k+1)}_i= -\frac{1}{a_{ii}} \sum_{j=1,j \ne i}^n a_{ij}x^{(k)}_j + \frac{b_i}{a_{ii}}\]
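A componentwise implementation sketch of this update (Python with NumPy; the stopping test anticipates the relative-residual criterion given below, and the default tolerance and iteration cap are illustrative assumptions):

```python
import numpy as np

def jacobi(A, b, x0, tol=1e-10, max_iter=1000):
    """Componentwise Jacobi iteration."""
    n = len(b)
    x = x0.astype(float)
    for _ in range(max_iter):
        x_new = np.empty(n)
        for i in range(n):
            s = A[i, :] @ x - A[i, i] * x[i]   # sum_{j != i} a_ij x_j^(k)
            x_new[i] = (b[i] - s) / A[i, i]
        x = x_new
        # Stop when the relative residual is small enough.
        if np.linalg.norm(b - A @ x) < tol * np.linalg.norm(b):
            break
    return x
```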

Residual vector

Let $r^{(k)}=b-Ax^{(k)}$ be the residual vector. We can write $x_i^{(k+1)}=\frac{r_i^{(k)}}{a_{ii}} + x_i^{(k)}$, where the residual itself is updated by the recurrence

\[r_i^{(k+1)}=-\sum_{j=1,j \ne i}^n a_{ij} \frac{r_j^{(k)}}{a_{jj}}\]
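A sketch of this residual-based form of the iteration (Python with NumPy; the tolerance and iteration cap are illustrative assumptions):

```python
import numpy as np

def jacobi_residual_form(A, b, x0, tol=1e-10, max_iter=1000):
    """Jacobi iteration written in terms of the residual r = b - A x."""
    d = np.diag(A)
    x = x0.astype(float)
    r = b - A @ x
    for _ in range(max_iter):
        x = x + r / d              # x_i^(k+1) = x_i^(k) + r_i^(k)/a_ii
        r = r - A @ (r / d)        # equals -sum_{j != i} a_ij r_j^(k)/a_jj
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
    return x
```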

Stopping criterion

For the stopping criterion, we can use the residual vector, which gives, for a given precision $\epsilon$:

\[\dfrac{\|r^{(k)} \|}{\|b\|}=\dfrac{\|b-Ax^{(k)} \|}{\|b\|} < \epsilon\]
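For instance, a small driver loop using this relative-residual test (Python with NumPy; the matrix, right-hand side, and $\epsilon$ are chosen purely for illustration, and the matrix is strictly diagonally dominant so the loop terminates):

```python
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 5.0, 2.0],
              [0.0, 2.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])
eps = 1e-8

x = np.zeros(3)
while np.linalg.norm(b - A @ x) / np.linalg.norm(b) >= eps:
    # One Jacobi sweep: x_i <- (b_i - sum_{j != i} a_ij x_j) / a_ii
    x = (b - (A @ x - np.diag(A) * x)) / np.diag(A)

print(x, np.linalg.norm(b - A @ x) / np.linalg.norm(b))
```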