Solving the linear systems that arise from the finite difference or finite element methods reveals the limits of the conjugate gradient method: the spectral condition number of such matrices is too high. The Preconditioned Conjugate Gradient Method addresses this by introducing an auxiliary matrix C.

Problem

We want to solve the following system:

Ax=b,

where A is an n \times n symmetric positive definite matrix (A^T = A and x^T A x > 0 for every nonzero x \in \mathbb{R}^n).

Let x be the exact solution of this system.

Spectral condition number

It sometimes happens that the spectral condition number \kappa(A) is too high (the eigenvalues are not well distributed). Preconditioning consists in introducing an invertible matrix C \in \mathcal{M}_n(\mathbb{R}) and solving the system:

C^{-1}(Ax) = C^{-1}b \iff Ax = b

where, for a judicious choice of the matrix C, the new spectral condition number is smaller.
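
As a quick illustration (the matrix values and the diagonal choice of C below are made up for the example, not taken from the text), one can check numerically that a well-chosen C lowers the condition number:

```python
import numpy as np

# Illustrative symmetric positive definite matrix with a badly scaled diagonal.
A = np.array([[100.0,  1.0,  0.0],
              [  1.0, 10.0,  1.0],
              [  0.0,  1.0,  1.0]])

# A simple invertible choice of C: the diagonal of A.
C = np.diag(np.diag(A))

# Condition number before and after preconditioning.
print("kappa(A)       =", np.linalg.cond(A))                      # large
print("kappa(C^{-1}A) =", np.linalg.cond(np.linalg.solve(C, A)))  # smaller
```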

Preconditioned Conjugate Gradient Method

Let x_0 \in \mathbb{R}^n be an initial vector. The Preconditioned Conjugate Gradient algorithm is the following:

r_0 = b - Ax_0, \quad z_0 = C^{-1}r_0, \quad d_0 = z_0

For k = 0, 1, 2, \ldots

\alpha_k = \frac{z_k^T r_k}{d_k^T A d_k}
x_{k+1} = x_k + \alpha_k d_k
r_{k+1} = r_k - \alpha_k A d_k
z_{k+1} = C^{-1} r_{k+1}
\beta_{k+1} = \frac{z_{k+1}^T r_{k+1}}{z_k^T r_k}
d_{k+1} = z_{k+1} + \beta_{k+1} d_k

EndFor
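
A minimal Python sketch of this iteration, under the assumption that the preconditioner is supplied as a callable apply_Cinv returning C^{-1}r (the function name, tolerance, and stopping test are illustrative choices, not part of the original algorithm statement):

```python
import numpy as np

def preconditioned_cg(A, b, apply_Cinv, x0=None, tol=1e-10, max_iter=1000):
    """Solve Ax = b for symmetric positive definite A.

    apply_Cinv(r) must return C^{-1} r for the chosen preconditioner C.
    """
    x = np.zeros_like(b, dtype=float) if x0 is None else np.array(x0, dtype=float)
    r = b - A @ x                      # r_0 = b - A x_0
    z = apply_Cinv(r)                  # z_0 = C^{-1} r_0
    d = z.copy()                       # d_0 = z_0
    rz = r @ z                         # z_k^T r_k
    for _ in range(max_iter):
        Ad = A @ d
        alpha = rz / (d @ Ad)          # alpha_k = z_k^T r_k / d_k^T A d_k
        x = x + alpha * d              # x_{k+1} = x_k + alpha_k d_k
        r = r - alpha * Ad             # r_{k+1} = r_k - alpha_k A d_k
        if np.linalg.norm(r) < tol:    # stop once the residual is small
            break
        z = apply_Cinv(r)              # z_{k+1} = C^{-1} r_{k+1}
        rz_new = r @ z
        beta = rz_new / rz             # beta_{k+1} = z_{k+1}^T r_{k+1} / z_k^T r_k
        d = z + beta * d               # d_{k+1} = z_{k+1} + beta_{k+1} d_k
        rz = rz_new
    return x
```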

Jacobi Preconditioner

The Jacobi preconditioner consists in taking for the matrix C the diagonal of A, i.e.

C_{ij} = \begin{cases} A_{ii} & \text{if } i = j, \\ 0 & \text{otherwise.} \end{cases}
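
With this choice, applying C^{-1} reduces to dividing element-wise by the diagonal of A. A short sketch, reusing the hypothetical preconditioned_cg function from the previous section:

```python
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])      # illustrative SPD matrix
b = np.array([1.0, 2.0, 3.0])

diag_A = np.diag(A)                  # C = diag(A), stored as a vector
jacobi_Cinv = lambda r: r / diag_A   # C^{-1} r is an element-wise division

x = preconditioned_cg(A, b, jacobi_Cinv)
print(np.allclose(A @ x, b))         # True: x solves Ax = b
```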

The advantages of this preconditioner are its ease of implementation and the little memory it requires. However, other preconditioners can make the solution of the linear system converge faster; this is the case of the SSOR preconditioner.

SSOR Preconditioner (Symmetric Successive Over-Relaxation)

We decompose the symmetric matrix A as follows:

A = L + D + L^T

where L is the strictly lower triangular part of A and D is the diagonal of A. The SSOR preconditioner consists in taking

C = \left(\frac{D}{\omega} + L\right) \frac{\omega}{2 - \omega} D^{-1} \left(\frac{D}{\omega} + L\right)^T

where \omega is a relaxation parameter. A necessary and sufficient condition for the convergence of the preconditioned conjugate gradient algorithm is to take the parameter \omega in the interval ]0, 2[.
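
A sketch of one way to apply this preconditioner in practice: instead of inverting C, form the triangular factor D/\omega + L once and solve two triangular systems per application of C^{-1} (the helper name, the value of \omega, and the use of SciPy are illustrative assumptions):

```python
import numpy as np
from scipy.linalg import solve_triangular

def ssor_Cinv(A, omega=1.5):
    """Return a callable r -> C^{-1} r for the SSOR preconditioner of A."""
    d = np.diag(A)                     # diagonal of A as a vector
    L = np.tril(A, k=-1)               # strictly lower triangular part of A
    M = np.diag(d) / omega + L         # lower triangular factor D/omega + L
    scale = omega / (2.0 - omega)      # scalar factor in C

    def apply(r):
        # C = scale * M D^{-1} M^T, hence C^{-1} r = (1/scale) * M^{-T} D M^{-1} r.
        y = solve_triangular(M, r, lower=True)
        y = d * y                      # multiply by D
        y = solve_triangular(M.T, y, lower=False)
        return y / scale

    return apply
```

For example, with the matrices of the previous sketch one would call preconditioned_cg(A, b, ssor_Cinv(A, omega=1.2)).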