Preconditioners.
I needed to do some parallel computations for space-time DG, and the situation regarding preconditioners is not good. Here is an overview of what works and what does not:
`storage=fem`

Jacobi and SOR could and should be implemented, perhaps as a Bachelor thesis. This would also work in parallel.
`storage=istl`

Basically nothing works in parallel. I at least implemented a parallel Jacobi, which needs the fix mentioned in core/dune-istl#102 (closed).
`storage=petsc`

Here at least some things work, but I needed the cmake fix from !500 (merged) to be able to use it.
Jacobi and SOR work out of the box. OAS (Overlapping Additive Schwarz) works but needs the flag `fem.solver.petsc.blockedmode` set to `false`. I am not sure whether this can already be communicated in the same way as the other flags. pcgamg did not work.
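For concreteness, the OAS workaround in a dune-fem parameter file would look roughly like this (the `blockedmode` key is the one named above; the surrounding file layout is assumed):

```
# workaround needed for OAS (Overlapping Additive Schwarz) with storage=petsc
fem.solver.petsc.blockedmode: false
```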
Hypre seemed to work, yet was slower than SOR (25 instead of 9 iterations). However, the results seem not to be forwarded correctly, since the outer Newton solver stagnated:
```
PETSc::KSP: it = 100 res = 8.71793e-10
Converged reason:3
Newton iteration 2: |residual| = 4.69185
```
The way the specific hypre method is selected is unexpected: we select `newton.linear.preconditioning.method: "hypre"` but then `newton.linear.hypre.method: "..."`, where one might expect `newton.linear.preconditioning.hypre.method`. But that is just a minor detail. It would be great if the available parameters could be printed somehow. parasails and pilu-t did not work.
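As a concrete example, the current (slightly surprising) key layout looks like this; the method value is one of those tried above, and note that it did not work in these tests:

```
newton.linear.preconditioning.method: hypre
# note: not under newton.linear.preconditioning.* as one might expect
newton.linear.hypre.method: parasails
```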
ML also needs `fem.solver.petsc.blockedmode` set to `false`, and it did not work because of zeros on some diagonal entries. This could be cured by the blocked mode, which is not available here.