CGS – Conjugate Gradient Squared
Usage:
-ksp_type cgs
Options: none
Benefits: Often converges much faster than BICG and, unlike GMRES, does not have to store an increasing number of vectors.
Drawbacks: Exhibits irregular convergence, often involving cancellation.
Notes: Often requires fewer Newton steps than BCGS or BCGSL when used as the linear solver inside a Newton iteration. The BICG-type methods may solve each linear system faster, but this does not compensate for the increased number of Newton steps. A minimal setup program is sketched below.
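As a concrete illustration of the usage above, here is a minimal sketch of a PETSc program that selects CGS through the KSP interface. The 1-D Laplacian test matrix and right-hand side are illustrative assumptions, not part of the original text; KSPSetFromOptions keeps -ksp_type available on the command line to override the hard-coded choice.

    #include <petscksp.h>

    int main(int argc, char **argv)
    {
      Mat      A;
      Vec      x, b;
      KSP      ksp;
      PetscInt i, n = 100, its;

      PetscInitialize(&argc, &argv, NULL, NULL);

      /* Assemble a 1-D Laplacian as an illustrative test operator. */
      MatCreate(PETSC_COMM_WORLD, &A);
      MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
      MatSetFromOptions(A);
      MatSetUp(A);
      for (i = 0; i < n; i++) {
        if (i > 0)     MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES);
        if (i < n - 1) MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES);
        MatSetValue(A, i, i, 2.0, INSERT_VALUES);
      }
      MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
      MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

      MatCreateVecs(A, &x, &b);
      VecSet(b, 1.0);

      /* Select CGS programmatically; equivalent to -ksp_type cgs. */
      KSPCreate(PETSC_COMM_WORLD, &ksp);
      KSPSetOperators(ksp, A, A);
      KSPSetType(ksp, KSPCGS);
      KSPSetFromOptions(ksp);
      KSPSolve(ksp, b, x);

      KSPGetIterationNumber(ksp, &its);
      PetscPrintf(PETSC_COMM_WORLD, "CGS converged in %d iterations\n", (int)its);

      KSPDestroy(&ksp);
      MatDestroy(&A);
      VecDestroy(&x);
      VecDestroy(&b);
      PetscFinalize();
      return 0;
    }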
The Conjugate Gradient Squared method is a variant of BiCG that applies the updating operations for the A-sequence and the A^T-sequence both to the same vectors. Ideally, this would double the convergence rate, but in practice convergence may be much more irregular than for BiCG, which may sometimes lead to unreliable results. A practical advantage is that the method does not need the multiplications with the transpose of the coefficient matrix.
Often one observes a speed of convergence for CGS that is about twice as fast as for the biconjugate gradient method, which is in agreement with the observation that the same "contraction" operator is applied twice.
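The "twice" can be made precise with the usual polynomial description of the two methods (a standard identity, stated here for context rather than taken from the original text). If P_i denotes the BiCG residual polynomial of degree i, BiCG produces

    r_i = P_i(A) r_0,        rtilde_i = P_i(A^T) rtilde_0,

whereas CGS constructs

    r_i = P_i(A)^2 r_0.

The inner products that drive the recurrences can be rewritten as

    (rtilde_i, r_i) = (P_i(A^T) rtilde_0, P_i(A) r_0) = (rtilde_0, P_i(A)^2 r_0),

so they are computable from products with A alone, and when P_i(A) acts as a contraction, applying it twice accounts for the roughly doubled convergence speed.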
The method tends to diverge if the starting guess is close to the solution.
CGS requires about the same number of operations per iteration as the biconjugate gradient method, but does not involve computations with A^T. Hence, in circumstances where computation with A^T is impractical, CGS may be attractive.
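To make the transpose-free structure explicit, here is a compact, unpreconditioned CGS iteration for a small dense system. This is a self-contained sketch (the dense test matrix, fixed tolerance, and lack of breakdown recovery are illustrative simplifications; it is not the PETSc implementation): each iteration performs exactly two products with A and none with A^T.

    #include <math.h>
    #include <stdio.h>

    #define N 4

    /* y = A*x: the only kind of operator access CGS needs. */
    static void matvec(const double A[N][N], const double *x, double *y) {
      for (int i = 0; i < N; i++) {
        y[i] = 0.0;
        for (int j = 0; j < N; j++) y[i] += A[i][j] * x[j];
      }
    }

    static double dot(const double *x, const double *y) {
      double s = 0.0;
      for (int i = 0; i < N; i++) s += x[i] * y[i];
      return s;
    }

    int main(void) {
      /* Illustrative nonsymmetric, diagonally dominant test problem. */
      double A[N][N] = {{4, 1, 0, 0}, {2, 4, 1, 0}, {0, 2, 4, 1}, {0, 0, 2, 4}};
      double b[N] = {1, 1, 1, 1}, x[N] = {0, 0, 0, 0};
      double r[N], rt[N], u[N], p[N], q[N], v[N], uq[N], t[N];
      double rho, rho_old = 1.0, alpha, beta;

      matvec(A, x, r);                          /* r = b - A*x          */
      for (int i = 0; i < N; i++) { r[i] = b[i] - r[i]; rt[i] = r[i]; }

      for (int it = 0; it < 100; it++) {
        rho = dot(rt, r);                       /* rho = (rtilde, r)    */
        if (rho == 0.0) break;                  /* method breakdown     */
        if (it == 0) {
          for (int i = 0; i < N; i++) { u[i] = r[i]; p[i] = u[i]; q[i] = 0.0; }
        } else {
          beta = rho / rho_old;
          for (int i = 0; i < N; i++) {
            u[i] = r[i] + beta * q[i];
            p[i] = u[i] + beta * (q[i] + beta * p[i]);
          }
        }
        matvec(A, p, v);                        /* first product with A  */
        alpha = rho / dot(rt, v);
        for (int i = 0; i < N; i++) {
          q[i]  = u[i] - alpha * v[i];
          uq[i] = u[i] + q[i];
          x[i] += alpha * uq[i];                /* x += alpha*(u + q)    */
        }
        matvec(A, uq, t);                       /* second product with A */
        for (int i = 0; i < N; i++) r[i] -= alpha * t[i];
        rho_old = rho;
        if (sqrt(dot(r, r)) < 1e-10) break;     /* converged             */
      }

      printf("x = [%g %g %g %g]\n", x[0], x[1], x[2], x[3]);
      return 0;
    }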