Enable Python Bindings
This MR enables Python bindings for all modules.
@andreas.dedner I'm not sure what the problem is in the downstream modules with the bindings enabled. The error is of the form

```
Traceback (most recent call last):
  File "geometrytype.py", line 1, in <module>
    from dune.geometry import *
  File "/duneci/modules/dune-geometry/build-cmake/python/dune/geometry/__init__.py", line 2, in <module>
    from ._referenceelements import *
  File "/duneci/modules/dune-geometry/build-cmake/python/dune/geometry/_referenceelements.py", line 1, in <module>
    from ..generator.generator import SimpleGenerator
ModuleNotFoundError: No module named 'dune.generator'
```
Note, the upstream modules (e.g. dune-common for dune-geometry) are installed into `$CI_PROJECT_DIR/modules`, while the currently processed module (e.g. dune-geometry) is located in `/duneci/modules/dune-module`. Might this confuse the Python module lookup? Or do I need to additionally copy the `/duneci/modules/dune-py` directory between stages?

Edited by Simon Praetorius

added 1 commit

- bc263d42 - set DUNE_PY_DIR inside exported artifacts dir
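The `ModuleNotFoundError` above can be reproduced without DUNE at all: the `dune` package is assembled from several directories (one per DUNE module), so every parent directory has to be on `sys.path` for the pieces to merge. A minimal, self-contained sketch with dummy directories, assuming the `dune` package is a PEP 420 namespace package (i.e. no `__init__.py` at the `dune` level):

```python
# Sketch (not DUNE code): why `dune.generator` disappears when the
# upstream module's python dir is missing from sys.path.
import importlib
import os
import sys
import tempfile

root = tempfile.mkdtemp()
# stand-ins for the installed dune-common and the dune-geometry build dir
common = os.path.join(root, "dune-common", "python")
geometry = os.path.join(root, "dune-geometry", "python")
os.makedirs(os.path.join(common, "dune", "generator"))
os.makedirs(os.path.join(geometry, "dune", "geometry"))
open(os.path.join(common, "dune", "generator", "__init__.py"), "w").close()
open(os.path.join(geometry, "dune", "geometry", "__init__.py"), "w").close()

# With only the geometry dir on the path, dune.generator is invisible:
sys.path.insert(0, geometry)
try:
    importlib.import_module("dune.generator")
except ModuleNotFoundError as e:
    print("missing:", e.name)  # -> missing: dune.generator

# Adding the upstream dir as well lets the namespace package merge:
sys.path.insert(0, common)
sys.modules.pop("dune", None)  # drop the partially imported namespace pkg
importlib.invalidate_caches()
importlib.import_module("dune.generator")
importlib.import_module("dune.geometry")
print("both found")
```

This mirrors the CI situation: the install dir and the build dir each contribute part of `dune`, and both must be visible to the interpreter.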
added 2 commits
Now I have created a venv that can be stored in artifacts and thus communicated between stages. This now provides the `dune.generator` module. But now `dune-geometry` fails with a missing `dune.geometry` module:

```
Traceback (most recent call last):
  File "geometrytype.py", line 1, in <module>
    from dune.geometry import *
ModuleNotFoundError: No module named 'dune.geometry'
```

In which order do I need to call `build`, `build_tests`, `test`, `install`, `install_python`?
Edited by Simon Praetorius

The process is the following: in a module with Python bindings, some bindings are precompiled, which happens when calling `make` in the build directory. So you get, for example, `dune-geometry-build/python/dune/geometry/_geometry.so`, which is imported in `dune.geometry.__init__.py`. If you now run `make python_install`, this Python module is installed into the package site together with the other Python scripts from `dune.geometry`. In the CI scripts we (so far) had to use `PYTHONPATH`, e.g., in `dune-grid`:

```
PYTHONPATH: /duneci/modules/dune-common/build-cmake/python:/duneci/modules/dune-geometry/build-cmake/python:$CI_PROJECT_DIR/build-cmake/python
```
That was due to permission issues which made `pip` fail. So you basically need to make sure that:

- `make` is called to generate the `_*.so` files in the build directory,
- either `make python_install` is called (if that is allowed) or `PYTHONPATH` is set,
- then `make test` should work.
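A cheap sanity check for the second step, before `make test` runs, is to ask Python whether it can locate the binding modules at all. A generic sketch (the DUNE module names in the comment are only examples; the names used in the demo call are stdlib stand-ins so the snippet is self-contained):

```python
# Sketch: verify that modules are importable before the tests run.
import importlib.util


def bindings_available(*modules):
    """Return the sublist of modules that Python cannot locate."""
    return [m for m in modules if importlib.util.find_spec(m) is None]


# In a CI step one would call e.g.
#   bindings_available("dune.common", "dune.geometry")
# after setting PYTHONPATH / running make python_install.
missing = bindings_available("json", "no_such_binding")
print(missing)  # -> ['no_such_binding']
```

If the returned list is non-empty, the `PYTHONPATH` or the install step is wrong, and `make test` would fail with the same `ModuleNotFoundError` seen above.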
I tried this, but it still does not work. I have to investigate further. The steps I take are the following:

- create a venv and activate it
- download, build, test, and install dune-common, with the venv enabled; also set `PYTHONPATH=/duneci/modules/dune-common/build-cmake/python`
- upload the installed dune-common and the venv (including the `.cache` dir) to artifacts
- (new system, does not see anything from before)
- download the artifacts, store them in the same location, and activate the existing venv
- download, build, test, and install dune-geometry; also set `PYTHONPATH=/duneci/modules/dune-geometry/build-cmake/python`
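The per-stage `PYTHONPATH` settings above can be sanity-checked directly: every entry of the (colon-joined on Linux) variable must show up in `sys.path` of a freshly started interpreter. A self-contained sketch with dummy directories standing in for the build dirs:

```python
# Sketch: check that PYTHONPATH entries reach the child interpreter,
# as in the CI jobs.  The two temp dirs stand in for
# .../dune-common/build-cmake/python and .../dune-geometry/build-cmake/python.
import os
import subprocess
import sys
import tempfile

a = tempfile.mkdtemp()
b = tempfile.mkdtemp()
env = dict(os.environ, PYTHONPATH=os.pathsep.join([a, b]))

out = subprocess.run(
    [sys.executable, "-c", "import sys\nfor p in sys.path: print(p)"],
    env=env, capture_output=True, text=True, check=True,
).stdout
print(a in out and b in out)  # -> True
```

Running the same one-liner inside the failing CI stage would show immediately whether the artifact paths actually made it onto the interpreter's search path.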
After changing the order to `build`, `install_python`, `test`, `install`, it still fails, but with another error: now dune-common is not found within the Python tests.
Edited by Simon Praetorius
If I understand correctly, you are setting up a venv and installing the bindings there? Would it be possible to change the Docker settings so that installing Python packages from PyPI becomes possible? At the moment one gets

```
WARNING: Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7f7b812feb80>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')': /simple/nbformat/
```

I completely understand if this has to be prohibited for security reasons, but I did want to ask. It would make it easier to test Python packages that require some additional package (at the moment one needs to add everything to the Docker images, which is a bit of a hassle). An alternative might be to clone the required Python package to the GitLab registry and pip install it from there - assuming that is allowed.
> Would it be possible to change the docker settings so that installing some python package from pypi

This is not a setting of the Docker image, but a setting in the Docker configuration of the runners... and there are good reasons why internet access is disabled on most runners. Alternative: you set up a runner yourself and we fix a runner tag to match the Python tests on your runner.
As I said, I was assuming that this is a security issue and should stay that way. It seems to be possible to access the dune-project GitLab, so packages stored there could probably be accessed? Perhaps we could use that to install additional packages, but I'll need to test that - or do you already know that this can't work because some port pip needs is not available?
I'm not sure. I think this is also a runner configuration, so it might work on some machines and not on others. But I have no experience with `pip install` in the CI pipelines. I tried to write the system test without installing anything additional. This should work, because there are probably many systems without direct (or only restricted) internet access for downloading/installing stuff.
The dune modules themselves are downloaded from the GitLab server; that is why I was hoping that accessing the GitLab registry might also be possible with pip. I'll check, and if not, updating one Docker image with the additional Python packages would be enough for my testing purposes.
I have to see if it is feasible. The idea would be to upload the tarballs of the needed Python packages to the GitLab Python package registry and then use the corresponding DUNE GitLab URL in the `pip install` command. At least for testing, that would be very helpful, since I wouldn't have to modify the Docker image, test, notice some other package missing, trigger a further rebuild of the Docker image, and so on. Also, if somebody needs one special package to run their Python tests, this might be a feasible approach to provide it without having to update any Docker image.

As I said, I haven't tested this yet and don't know if it really would work. I have installed DUNE Python packages uploaded to the GitLab Python package registry with pip on my local machine, so I know that that part works - just not whether it works from within the CI.
The major ones are included at least in the debian-11 image; I'm not sure about the others. That includes numpy, scipy, and matplotlib. ufl, for example, is also included, but there I could quickly imagine a version issue, i.e., somebody needing a newer version than the one available through `apt`, which is the one included in the image. For those cases, a repository-local Python package would be needed. I could set up a new image, as is done for pdelab as I saw, but I was hoping an easier approach would be possible.
An initial fix, see core/dune-common!920 (closed), does solve the first problem, i.e., dune-common is then found, but in the next step

```
#include <dune/python/geometry/referenceelements.hh>
```

is not found. I have no idea where the include paths are set.

Am I looking at the setup from https://gitlab.dune-project.org/infrastructure/dune-nightly-test/-/blob/master/.gitlab-ci.yml?

The setup from https://gitlab.dune-project.org/infrastructure/dune-nightly-test/-/blob/python-bindings/.gitlab-ci.yml contains the enabled Python bindings stuff.
My local commands are as follows:

```
source ~/dune-env/bin/activate
dunecontrol --opts=python.opts --only=dune-common all
dunecontrol --opts=python.opts --only=dune-common make install_python
dunecontrol --opts=python.opts --only=dune-common make install
cd dune-geometry
export DUNE_CONTROL_PATH=[DUNE_INSTALL_DIR]
export DUNE_CONTROL=[DUNE_INSTALL_DIR]/bin/dunecontrol
$DUNE_CONTROL --opts=../python.opts --current all
$DUNE_CONTROL --opts=../python.opts --current make install_python
$DUNE_CONTROL --opts=../python.opts --current make build_tests
$DUNE_CONTROL --opts=../python.opts --current make test
```

where `python.opts` is as follows:

```
CMAKE_FLAGS="-DBUILD_SHARED_LIBS=ON -DDUNE_ENABLE_PYTHONBINDINGS=ON \
  -DCMAKE_INSTALL_PREFIX=[DUNE_INSTALL_DIR]"
BUILDDIR="build-python"
```

with `[DUNE_INSTALL_DIR]` set to something writable.

Edited by Simon Praetorius

As I said, I never use this, so I'm struggling right from the start: I added `-DCMAKE_INSTALL_PREFIX=/tmp/DUNE` to my `config.opts`. Then I ran `dunecontrol --opts=config.opts --only=dune-common all`, then `make install` in `dune-common`. Then I removed `dune-common` and ran `/tmp/DUNE/bin/dunecontrol --opts=config.opts --only=dune-geometry all` with `export DUNE_CONTROL_PATH=/tmp/DUNE`. I get

```
----- using default flags $CMAKE_FLAGS from /home/dedner/DUNE/config.opts -----
--- going to build dune-geometry ---
--- calling all for dune-geometry ---
ERROR: could not find /dune.module
--- Failed to build dune-geometry ---
Terminating dunecontrol due to previous errors!
```
What am I missing?
So the main issue: you are calling `make py_sometest` within the dune-geometry build dir. At this point dune-py has not yet been created, so it will be created then. It doesn't know about dune-geometry because that module is not in the `DUNE_CONTROL_PATH`. If I add the `dune-geometry` source directory to `DUNE_CONTROL_PATH`, that problem goes away.

The point is that when Python is started and the first JIT compilation is triggered, dune-py is reconfigured and made to depend on all modules that can be found - so dune-common is found since it is installed, but dune-geometry cannot be found since its dune.module file lies below the build dir and is therefore not found under `.` or anything else in the `DUNE_CONTROL_PATH`.
```
source ~/dune-env/bin/activate
dunecontrol --opts=python.opts --only=dune-common all
dunecontrol --opts=python.opts --only=dune-common make install_python
dunecontrol --opts=python.opts --only=dune-common make install
cd dune-geometry
export DUNE_CONTROL_PATH=[DUNE_INSTALL_DIR]
export DUNE_CONTROL=[DUNE_INSTALL_DIR]/bin/dunecontrol
$DUNE_CONTROL --opts=../python.opts --current all
$DUNE_CONTROL --opts=../python.opts --current make install_python
$DUNE_CONTROL --opts=../python.opts --current make build_tests
$DUNE_CONTROL --opts=../python.opts --current make test
```

In this case the `DUNE_CONTROL_PATH` only contains the installed `dune-common`. I would expect that you use something like this:

```
source ~/dune-env/bin/activate
dunecontrol --opts=python.opts --only=dune-common all
: make install_python
dunecontrol --opts=python.opts --only=dune-common make install
export DUNE_CONTROL_PATH=[DUNE_INSTALL_DIR]:[DUNE_SOURCE_DIR]/dune-geometry
export DUNE_CONTROL=[DUNE_INSTALL_DIR]/bin/dunecontrol
$DUNE_CONTROL --opts=../python.opts --current all
: install_python
$DUNE_CONTROL --opts=../python.opts --current make build_tests
: test
```
OK, it works in my local setup. Now I have to find out what is going wrong in the CI setup, because there both the install dir and the dune-geometry build dir are in the `DUNE_CONTROL_PATH`.

BTW, does it work without the patch core/dune-common!920 (closed)?
They're not comparable in this case - it's like calling dunecontrol from within the dune-geometry build dir; that wouldn't work either. Actually, you should be thinking more of duneproject here. It's like using `duneproject` to construct a module dune-py which is supposed to depend on all available modules. dune-geometry wouldn't be offered as an option by duneproject either if you were within the dune-geometry build dir, I'd guess.

Ok, I see. Thanks for the explanation.
However, I have a few doubts. At least for me the behavior is surprising. I, as a user, call dunecontrol from within the source (or any root) directory, not from within the build directory; that happens internally. I do not even see a note that for Python everything happens somewhere else. But I do not know the internals - and probably the regular user should not have to know them either.

What bothers me is the question of why we need to call something like dunecontrol again. Why do we need to resolve dependencies again? Everything is already configured and all dependencies are already resolved in the call to the main dunecontrol for the dune sources. This information should somehow be forwarded directly to the Python bindings/tests. Whenever the main cmake runs, it should create all the information you need, store it somewhere, and use it when creating the internal Python projects that are compiled in the tests.
Maybe an idea could be to create a separate target, e.g., `dune-common-python`, with a corresponding config file `dune-common-python-config.cmake` that contains everything you need. Then, in the internal project's CMakeLists file, you just call `find_package(dune-common-python)` and everything gets resolved automatically. Or maybe we create a corresponding `dune-common-python.pc` file and let pkg-config do the job. No need to run dunecontrol (with different parameters/dirs) again. Note, in the future we might want to get rid of dunecontrol (and maybe even the dune.module file) completely; then you would be lost with the current setup.

Note, this is just an idea from someone who has no deeper insight into the Python setup. So it might not work in the end, or might be hard and a lot of work to implement.
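To make the pkg-config idea concrete, such a file could look roughly like the following. This is a purely hypothetical sketch - the file name, fields, and flags below are invented for illustration and do not correspond to any existing DUNE file:

```
# hypothetical dune-common-python.pc -- all values are illustrative
prefix=[DUNE_INSTALL_DIR]
includedir=${prefix}/include
libdir=${prefix}/lib

Name: dune-common-python
Description: Flags needed to JIT-compile DUNE Python binding modules
Version: 2.8
Cflags: -I${includedir} -DDUNE_ENABLE_PYTHONBINDINGS
Libs: -L${libdir} -ldunecommon
```

A generator like dune-py could then obtain the flags via `pkg-config --cflags --libs dune-common-python` instead of resolving the module tree a second time.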
Edited by Simon Praetorius

I'm happy to help to look at some change here.
As I said, what we are calling is `duneproject` followed by `dunecontrol`, but note that both are re-implemented in Python. Once we have a better way of using cmake directly to build a project, this can be used for `dune-py` as well, i.e., we don't even need to set up a full dune module anymore. But you would still need to be able to find all required/available dune modules from within a running Python script.

I want to point out that the situation here is an unusual use case for the Python bindings. Normally you would build (and possibly install) your dune modules, then go to your Python project directory containing your Python scripts and run them using `python myscript.py`, or you might have a Jupyter notebook. In practically all projects I'm working on I don't even have a dune project anymore, and consequently also no cmake. Using the Python bindings from within a dune module's build directory is therefore not what we were aiming for. Of course, if some people start using it for embedded Python, that changes; and the build tests are a second issue with its own set of problems we had not envisaged.

I'm not a cmake expert, so I don't know what sort of things are possible. It would be good to have some improvement here, since at the moment using the same cmake flags for dune-py (e.g. CXXFLAGS) is an unresolved issue. I talked with Dominic about that some time ago, and he said that it might become possible with some newer cmake version to extract that information from the cmake build.
Another option for this case is to use the `setup-dunepy` script, i.e.,

```
source ~/dune-env/bin/activate
dunecontrol --opts=python.opts --only=dune-common all
dunecontrol --opts=python.opts --only=dune-common make install_python
dunecontrol --opts=python.opts --only=dune-common make install
cd dune-geometry
export DUNE_CONTROL_PATH=[DUNE_INSTALL_DIR]
export DUNE_CONTROL=[DUNE_INSTALL_DIR]/bin/dunecontrol
export DUNE_PYSETUP=[DUNE_INSTALL_DIR]/bin/setup-dunepy.py
$DUNE_CONTROL --opts=../python.opts --current all
$DUNE_PYSETUP --opts=../python.opts install
$DUNE_CONTROL --opts=../python.opts --current make build_tests
$DUNE_CONTROL --opts=../python.opts --current make test
```

I think this approach is probably better anyway. For this to work I needed to add a small fix to `setup-dunepy.py`, since it tried to install Python bindings for `dune-common` without testing whether that module was installed; the fix was added to the `fix/setup-dunepy` branch.

Edited by Andreas Dedner
mentioned in merge request core/dune-common!919 (merged)
If you give me some hours, I will gladly dig into how to provide a proper PyPI fallback. I already have something working for dune-testtools (https://gitlab.dune-project.org/dominic/package-index), but I will now move it to using the GitLab package registry and have it rebuild itself in CI with nightly triggers.