Parallel Preconditioning / Hypre
Hey again! This is kind of a follow-up on this issue: building on the combined space, I am currently trying to solve a model whose simulation performance is very sensitive to the choice of preconditioner. I can probably handle simple 2D benchmarks on one processor, but I will definitely need to run the simulation in parallel later on, which brought me to this "parallel preconditioning" issue.
It is a bit unclear which of the preconditioners are supposed to work in parallel in the current state of DUNE-FEM. In particular the list in the tutorial doesn't seem up-to-date.
For example, the `ildl` preconditioner is marked as "parallel-ready" but gives

```
RuntimeError: InvalidStateException [matrixAdapterObject:/home/benjamin/Software/dune-jul23/dune-fem/dune/fem/operator/matrix/istlpreconditioner.hh:844]: ISTL::SeqILDL not working in parallel computations
```
I am also confused about the hypre situation. I found two apparently competing ways to set up hypre: one via

```
"newton.linear.preconditioning.method": "hypre",
"newton.linear.preconditioning.hypre.method": "boomeramg"
```

and one with `kspoptions` as mentioned here. Playing around with the settings, I am not sure that `"newton.linear.preconditioning.hypre.method"` actually does anything.
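For reference, this is roughly how I pass those parameters to the scheme. A minimal sketch only: the two `preconditioning` keys are the ones quoted above, while the tolerance key and the commented-out scheme call are just my assumptions about the usual tutorial setup, not something I have verified.

```python
# Solver parameters as I currently hand them to the scheme.
# The two "preconditioning" keys are the ones from this post; the
# tolerance key is assumed from the tutorial examples.
parameters = {
    "newton.linear.tolerance": 1e-10,  # assumed key, for illustration
    "newton.linear.preconditioning.method": "hypre",
    "newton.linear.preconditioning.hypre.method": "boomeramg",
}

# Presumed usage (my guess at the petsc solver tuple):
# from dune.fem.scheme import galerkin
# scheme = galerkin(equation, solver=("petsc", "gmres"), parameters=parameters)
```

With these settings, changing `"newton.linear.preconditioning.hypre.method"` between e.g. `"boomeramg"` and other values is what does not seem to have any visible effect for me.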
More specific questions:

- Are there any minimal working examples showing the definitive way to use `hypre`? How can I find out which of the hypre preconditioners are truly supported? In particular, the parallel ILU packages like `Euclid`, `PILUT` and `hypre-ILU` don't seem to work.
- For the other examples like OAS mentioned here, how do I set the flag `fem.solver.petsc.blockedmode`?
- I can confirm the finding that for my nonlinear Navier-Stokes-like problem the Newton solver gets stuck with petsc + hypre even though the linear solver converges, while it converges smoothly with ISTL.
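Regarding the `blockedmode` flag, is it something like the following? The `parameter.append` route is how other global flags are set in the tutorial, but whether `fem.solver.petsc.blockedmode` is read this way, and whether `"false"` is even a valid value, is exactly what I am asking; everything in this sketch is a guess.

```python
# My guess at disabling petsc blocked mode globally. Both the value
# "false" and the fact that this key is picked up via parameter.append
# are assumptions, not verified behaviour.
flags = {"fem.solver.petsc.blockedmode": "false"}

# Presumed usage:
# from dune.fem import parameter
# parameter.append(flags)
```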
Best, Ben