
Revert "Merge branch 'feature/py-improve-parallel-communication-interface' into 'master'"

Merged Timo Koch requested to merge revert-4eee0b3a into master
3 unresolved threads

This reverts merge request !1230 (merged)

!1239 introduces a new dependency on mpi4py in the C++ part without a CMake feature test (a sketch of such a test follows below). HAVE_MPI being true does not mean that the header mpi4py/mpi4py.h is available, which breaks downstream code. Before !1230 (merged), mpi4py was not a build dependency and was only needed at runtime.

Edited by Timo Koch
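
A minimal sketch of what such a configure-time feature test could look like, assuming CMake's standard FindPython3 module; HAVE_MPI4PY and MPI4PY_INCLUDE_DIR are illustrative names, not variables that dune-common actually defines:

    # Illustrative sketch only (not dune-common's actual CMake code):
    # ask the Python interpreter for mpi4py's include directory and
    # record the result, instead of inferring availability from HAVE_MPI.
    find_package(Python3 COMPONENTS Interpreter Development REQUIRED)
    execute_process(
      COMMAND "${Python3_EXECUTABLE}" -c "import mpi4py; print(mpi4py.get_include())"
      RESULT_VARIABLE MPI4PY_IMPORT_RESULT
      OUTPUT_VARIABLE MPI4PY_INCLUDE_DIR
      OUTPUT_STRIP_TRAILING_WHITESPACE
      ERROR_QUIET)
    if(MPI4PY_IMPORT_RESULT EQUAL 0)
      set(HAVE_MPI4PY TRUE)
    else()
      set(HAVE_MPI4PY FALSE)
    endif()

Downstream code could then key the Python bindings on HAVE_MPI4PY rather than on HAVE_MPI alone.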

Activity

  • assigned to @christi

  • Timo Koch changed the description

    • @tkoch why would dune/python/common/mpihelper.hh be included in any C++ code? If the Python bindings are used and HAVE_MPI is true, then mpi4py is needed, otherwise weird errors can occur - that is at least our experience. That is why mpi4py is a required package (either included during configure or tested for when importing dune.common - see python/dune/common/__init__.py). So I don't really see where the problem would come from.

    • Timo Koch (Author, Owner)

      It’s a C++ header, so obviously it’s only included in C++ code. mpi4py is needed, yes, but until now it wasn’t a compile-time requirement; see the description. Until now you just got an error when you ran a test and mpi4py was not found.

      or tested for when importing dune.common - see python/dune/common/__init__.py

      That's where the problem is. With this MR, mpi4py is already a build requirement. Before, mpi4py was only tested for when importing dune.common.

      So I think, before merging this again, there should be a proper CMake way of finding mpi4py, and the bindings then have to be disabled if mpi4py/mpi4py.h is not found (a sketch of such a check follows below). These checks are currently not in place.

      Edited by Timo Koch
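
      A minimal sketch of how such a check could gate the bindings, assuming a HAVE_MPI4PY result from a configure-time test like the one sketched under the description above; the option name DUNE_ENABLE_PYTHONBINDINGS and the way the _common target is handled here are assumptions for illustration only:

          # Illustrative sketch only: disable the bindings with a clear message
          # when MPI is available but the mpi4py headers are not.
          if(HAVE_MPI AND NOT HAVE_MPI4PY)
            message(WARNING
              "MPI was found, but the mpi4py headers were not. Disabling the "
              "Python bindings; install mpi4py (or configure without MPI) to "
              "enable them.")
            set(DUNE_ENABLE_PYTHONBINDINGS FALSE)
          endif()
          if(DUNE_ENABLE_PYTHONBINDINGS)
            # only now add the binding targets and their mpi4py include path
            target_include_directories(_common PRIVATE "${MPI4PY_INCLUDE_DIR}")
          endif()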
    • Timo Koch (Author, Owner)

      mpi4py is currently not listed in Python-Requires: in dune.module (and I think it shouldn't be).

      There was a discussion earlier somewhere (I can't find it now) that MPI shouldn't be a hard dependency for the Python bindings, just as for C++, where the code also works without MPI.

      The problem with this MR is that HAVE_MPI is used, and if that is true it is assumed that mpi4py is also there. But this is a separate feature check. I don't know what the optimal behavior is for the case of HAVE_MPI but no mpi4py; probably the Python bindings should then be disabled with an informative error message.

      Edited by Timo Koch
    • Timo Koch (Author, Owner)

      I guess I missed https://gitlab.dune-project.org/core/dune-common/-/blob/master/python/setup.py.in. Now I don't understand why it fails...

    • Timo Koch (Author, Owner)

      Ah ok, new theory: mpi4py is added as a requirement and installed into the internal virtual env. But when I compile the Python bindings, the venv is not necessarily activated (at least this wasn't necessary until now). This means mpi4py/mpi4py.h is not found because CMake doesn't find the path inside the venv where it is installed. So we have to tell CMake to look in the venv path? (See the sketch below.)
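
      A minimal sketch of that idea, if the venv turns out to be the cause: run the mpi4py query through the internal venv's interpreter rather than the system Python, and hand the resulting include directory to the binding target. DUNE_PYTHON_VENV_EXECUTABLE is a made-up placeholder for wherever the build system records the venv interpreter:

          # Illustrative sketch only: query mpi4py through the venv interpreter
          # (DUNE_PYTHON_VENV_EXECUTABLE is a hypothetical placeholder), so the
          # mpi4py installed into the internal venv is actually found.
          execute_process(
            COMMAND "${DUNE_PYTHON_VENV_EXECUTABLE}" -c "import mpi4py; print(mpi4py.get_include())"
            RESULT_VARIABLE MPI4PY_IMPORT_RESULT
            OUTPUT_VARIABLE MPI4PY_INCLUDE_DIR
            OUTPUT_STRIP_TRAILING_WHITESPACE
            ERROR_QUIET)
          if(MPI4PY_IMPORT_RESULT EQUAL 0)
            # make <mpi4py/mpi4py.h> visible when compiling _common.cc
            target_include_directories(_common PRIVATE "${MPI4PY_INCLUDE_DIR}")
          endif()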

    • @samuel.burbulla @andreas.dedner what is the intended behaviour? How would the "correct" solution work?

    • The general assumption so far is that if you have MPI and you use the python bindings, you also need mpi4py - and the current setup takes care of this assumption when running a python script.

      Until now, mpi4py was not necessary to build the bindings. As @tkoch observed, this is what breaks the downstream code now.

      I think the "correct" solution would be to check if mpi4py is available before building the bindings when MPI is available, then either abort (because of a missing dependency) or add it (to the build environment). I prefer adding it.

      @andreas.dedner What was the reason that mpi4py is not automatically installed into the venv, but only checked at runtime, printing a warning message? All other dependencies are installed into the venv. Would it help to install it at the right time? I also find it a bit surprising to see the warning on the first script execution.

      P.S. Another solution would be to move all mpi4py C++ code into JIT modules.

    • @samuel.burbulla building mpi4py is painful because it is not available as a wheel on PyPI but needs to be compiled. That was at least one reason.

      I also think that the problem is solved by JIT-compiling the communicator on demand, like we do for higher-dimensional FieldVector, for example. The lesson here is that no venv-dependent stuff can be included in files like _common.so, which are built in the dune-foo module itself outside of dune-py, i.e. not in a just-in-time fashion.

  • Timo Koch mentioned in merge request dune-grid!699 (merged)

  • Timo Koch mentioned in merge request !1230 (merged)

  • Timo Koch (Author, Owner)

    Steps to reproduce this error on a system with MPI (but no system-wide mpi4py installation):

    Prints during configure:

    -- Installing python package abstract requirements: pip>=21.a portalocker numpy wheel setuptools>=41 jinja2 mpi4py 

    Fails during make with:

    [ 75%] Building CXX object python/dune/common/CMakeFiles/_common.dir/_common.cc.o
    In file included from dune-common/python/dune/common/_common.cc:14:
    dune-common/dune/python/common/mpihelper.hh:21:10: fatal error: 'mpi4py/mpi4py.h' file not found
    #include <mpi4py/mpi4py.h>
             ^~~~~~~~~~~~~~~~~
    Edited by Timo Koch
  • Timo Koch assigned to @tkoch and unassigned @christi

  • merged

  • Timo Koch mentioned in commit 8867d95c

  • Timo Koch mentioned in commit 64677bc5

  • Christian Engwer mentioned in issue #330
