I'd like to see the Python binding enabled by default for Dune 2.8. Not sure how long we should wait to do this step.
I would also like to do that - any objections? It shouldn't really change a lot, except that:
- running the tests will take slightly longer
- make install will by default also install the Python bindings. There is an issue here: without a virtual env and without DUNE_PYTHON_INSTALL_LOCATION=user, root rights would be required to run make install after this change.
I know what I wrote and still mean it that way. But the added compilation time gets annoying over time.
Further, in many modules the Python tests are failing for me with make build_tests && ctest. I never found the time to investigate or even report this problem. I'll at least open an issue so it won't be forgotten.
I just had a look. The issue is the following: the python embedding tests assume that the dune packages are installed. Running them without installed packages gives
```
terminate called after throwing an instance of 'pybind11::error_already_set'
  what():  ModuleNotFoundError: No module named 'dune'
Aborted (core dumped)
```
I can see two ways of solving this issue:
- add something to the tests that 'fails gracefully' if the python package was not found - can a test return something that indicates a warning that it couldn't be run but that this is not an error?
- don't compile/run the tests by default but have an extra target for the python tests.
Perhaps someone has another idea. I don't think it is feasible to try to find the python packages in the build directories of the different modules. Might work for tests in dune-common but not in dune-grid (which would have to find the dune-common package as well).
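The 'fails gracefully' option could look roughly like this on the Python side (just a sketch, not existing Dune code; the exit code 77 and the use of ctest's SKIP_RETURN_CODE test property are my assumptions):

```python
import importlib.util

# Conventional "skipped" exit code (automake uses 77); ctest can be told to
# treat it as a skip via: set_tests_properties(t PROPERTIES SKIP_RETURN_CODE 77)
SKIP_CODE = 77

def exit_code_for_test():
    """Return SKIP_CODE when the 'dune' package is not importable,
    otherwise run the actual test body and return 0 on success."""
    if importlib.util.find_spec("dune") is None:
        return SKIP_CODE
    # ... actual test body would go here ...
    return 0

# a test wrapper would end with: sys.exit(exit_code_for_test())
```

With that, a missing 'dune' package shows up as "skipped" in the ctest summary instead of as a failure.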
Concerning the compile/run time: the python tests in dune-common are practically instantaneous compared to compiling looptest_vector_BinaryOpsVectorScalar_char.cc and the similar other tests in simd/test. On my machine those seem to each take minutes to compile (I also get quite a few warnings in loop.hh but that is a separate issue).
This is not really a CI problem in and of itself - we have CI tests working with the python tests by simply setting the python paths to the build directories. The issue is with running make tests in the build directory locally. You don't want to install in that case? Although we could of course also run the pip install command during configuration of the dune module, I guess.
Andreas, you are right: That's the error message I get. I don't think it is acceptable to have failing tests unless the modules are installed. Excluding them would be the minimum. But even that is against the idea of testing in the first place.
I am not sure how to proceed.
From my point of view we can simply remove the embedding tests - or move them to a different test target. As far as I know nobody is actually using the option of embedding the dune modules anyway, so we are testing something that is not used. If someone is using them, then how are they carrying out the tests at the moment? Would be nice to know.
Then there is only the issue of what make install does left to resolve. Again, installing the Python packages during make install is not something we need to do; I only use make install_python anyway, and if someone needs both they can run both targets (or we add a make install_all target).
If we do the above, enabling python bindings by default only adds a python subfolder to the build dir and the make install_python target, and that is all that happens. In future versions, if someone actually runs into issues with that setup or we find a way of running embedded tests in build dirs (because someone is actually using that feature), we can improve on how testing/installation is done.
Dune-fufem does use embedded python and would probably start using the python modules if the problem of isolating different builds were solved. Currently I don't want to tell our users that from now on they have to carefully switch python environments manually, because they can no longer trust that build foo is actually used just because they are in the foo build tree.
But @andreas.dedner is right: Since it's currently not used we can probably disable the test. Once the other issue has been solved we (including me) can work on this.
@andreas.dedner So, if you run cmake -DCMAKE_INSTALL_PREFIX=/home/xyz/dune ... and then make install, it does not work without sudo? What about clusters where you never get root rights? Do you have to make extra preparations to call install?
What about having DUNE_PYTHON_INSTALL_LOCATION=user by default? Where is it installed then?
> What about having DUNE_PYTHON_INSTALL_LOCATION=user by default? Where is it installed then?
The issue here is that in that case pip install --user is called which fails in the case of a virtual env.
ERROR: Can not perform a '--user' install. User site-packages are not visible in this virtualenv.
So in the cmake setup one would have to check whether one is in a virtual env; in that case the default would have to be DUNE_PYTHON_INSTALL_LOCATION=none, and otherwise DUNE_PYTHON_INSTALL_LOCATION=user.
That would be perfect because then no additional cmake parameter is needed - except for a system wide install which should be the exception in my view anyway.
I guess it depends on figuring out in cmake whether a virtualenv is active or not. I'm assuming that is possible by running a short python script which returns that information. Once we have that, the default for DUNE_PYTHON_INSTALL_LOCATION can be set accordingly. But perhaps someone with more cmake knowledge will point out where I'm wrong?
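That short Python check might look like this (a sketch; the function name is made up, and the exit-code convention is just one possible choice):

```python
import sys

def in_virtualenv():
    """True when running inside a venv/virtualenv: sys.prefix then points
    into the environment, while sys.base_prefix keeps pointing at the base
    installation (older virtualenv versions set sys.real_prefix instead)."""
    base = getattr(sys, "real_prefix", None) or getattr(sys, "base_prefix", sys.prefix)
    return sys.prefix != base

# CMake could run this script, exit with 1 inside a venv and 0 outside,
# and read the status back via RESULT_VARIABLE.
```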
My suggestion was to add something like this in DunePythonInstallPackage.cmake:

```cmake
# Construct the installation location option string
if(HAVE_PYTHON AND DUNE_PYTHON_INSTALL_LOCATION)
  # this is of course not the right syntax
  dune_execute_process(
    COMMAND ${PYTHON_EXECUTABLE} -c
            "import sys; sys.exit( 1 if sys.prefix != sys.base_prefix else 0 )"
    RESULT_VARIABLE is_in_venv)
  # the script exits with 1 inside a venv, so --user must only be set when
  # the result is 0 (outside a venv), since pip install --user fails in a venv
  if("${is_in_venv}" STREQUAL "0")
    set(DUNE_PYTHON_INSTALL_LOCATION "--user")
  endif()
endif()
```
I think that requiring sudo permissions for local installation is a no-go.
For me there's a vital question that should be answered before enabling the bindings by default: I have (and I always advocate this to colleagues and students) several setups for debug and release builds, varying compilers and standard versions - each with a different build directory. When I last tried the python bindings, there was no automatic solution to also isolate those setups in terms of the built python modules. The only solution was to manually switch between different virtual env setups. Has this problem been solved meanwhile?
If we enable the bindings by default and one has to switch the environment manually this will indeed change a lot. I'm afraid that - even if you don't use the bindings - one will have to do this to avoid a mixup in the tests of the bindings.
I'm not quite sure how this could even be solved. I'm assuming in your scenario above you don't actually install dune anywhere but work in the build directories? I don't see any way of installing dune (with or without bindings) with different build types/compilers/versions. But perhaps I am missing something? At least by switching virtualenvs you could install everything with different versions.
If you want to work in the build directories with the python bindings not installed, then you will need to set the PYTHONPATH to point to the build_dir/dune-module/python folders - there is to my knowledge no other way to find Python modules. You could use direnv and add to the PYTHONPATH when entering the build directory. If you are running a C++ program with embedded Python (as in the test) there might be a way of modifying sys.path to include the python folders in the build directories. But I'm not quite sure how to figure out the right paths to add.
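Modifying sys.path from within the embedded interpreter could be sketched like this (the directory layout - one python subfolder per module below a common build root - is an assumption, and the helper name is made up):

```python
import os
import sys

def add_build_python_dirs(build_root):
    """Prepend every <build_root>/<module>/python directory to sys.path,
    so an embedded interpreter can find the uninstalled bindings."""
    added = []
    for module_dir in sorted(os.listdir(build_root)):
        candidate = os.path.join(build_root, module_dir, "python")
        if os.path.isdir(candidate) and candidate not in sys.path:
            sys.path.insert(0, candidate)
            added.append(candidate)
    return added
```

The open question from above remains: the program still has to know build_root somehow (environment variable, compiled-in path, ...).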
> I'm assuming in your scenario above you don't actually install dune anywhere
Yes, because it doesn't give any benefit, but just leads to problems and restrictions. But that's another story.
> If you want to work in the build directories with the python bindings not installed then you will need to set the PYTHONPATH to point ...
Installing just the python modules to a virtual env seems to be feasible. But I don't think that one should force users to use direnv or even to get familiar with manually switching python environments. This

dunecontrol --opts=release.opts all
dunecontrol --opts=debug.opts all

worked before and should continue working without introducing any issues like mixing/linking stuff from different build setups.
Without knowing how the python modules are currently built, this is what I would consider a viable approach:
- By default python modules are always built/installed into a separate virtualenv for each build configuration.
- For each build configuration you can pass the virtualenv to dunecontrol in your option file.
- If the virtualenv is not set in the option file, dunecontrol creates one in the build directory. This is needed to keep existing option files working. For an absolute BUILDDIR this should be easy. For a relative intra-module BUILDDIR one has to find a central place for the virtualenv. Putting it below the dune-common build dir could be a (nasty but working) solution.
- In the option file you can also explicitly disable virtualenv support to use the global python installation.
- Within each build module, the responsible virtualenv should be available in some way. It should be (temporarily) activated automatically whenever python comes into play, e.g. when calling make or ctest.
And as an add-on which would be nice to have:
- Assist people using embedded python with how to activate the respective virtualenv for the build configuration within your application using the embedded interpreter.
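Creating one venv per build configuration, as proposed in the list above, is cheap with the stdlib venv module (a sketch; the helper name and the dune-env directory name are assumptions):

```python
import os
import venv

def ensure_build_venv(build_dir):
    """Create (once) and return the path of the virtualenv belonging to
    one build configuration, placed inside its build directory."""
    env_dir = os.path.join(build_dir, "dune-env")
    if not os.path.isdir(env_dir):
        # with_pip=False keeps creation fast; pip can be bootstrapped later
        venv.EnvBuilder(with_pip=False).create(env_dir)
    return env_dir
```

dunecontrol (or cmake) could call this once per build directory and export the resulting path to make/ctest.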
Isolating setups automatically is IMO a must, to avoid that people shoot themselves in the foot. If, however, you say that's for experts only who know what they are doing, then it should not be enabled by default.
> Installing just the python modules to a virtual env seems to be feasible. But I don't think that one should force users to use direnv or even to get familiar with manually switching python environments. This
> dunecontrol --opts=release.opts all
> dunecontrol --opts=debug.opts all
> worked before and should continue working without introducing any issues like mixing/linking stuff from different build setups.
We already have support in dune and cmake to set up per-configuration virtualenvs. While I don't like the way it is currently done (I always have issues getting this to work properly), I think we could reuse these venvs to handle the installation.
The reason for the "private" venv was that we might require additional modules and can't require admin privileges. The same reason holds for installing dune python modules.
Perhaps our cmake gurus can judge whether this would be an option?
Unfortunately DUNE_PYTHON_VIRTUALENV_SETUP and the python bindings seem to be completely unrelated. When activating both, DUNE_PYTHON_VIRTUALENV_SETUP and DUNE_ENABLE_PYTHONBINDINGS, a virtualenv will be created in the build directory (which is what I proposed), but is not used by the bindings.
Can someone walk me through this build-dir virtualenv? I get that it is constructed during the build process, but
- is that one virtualenv used for all dune modules, e.g., the one in dune-common?
- how is it activated, i.e., at which point? Since Carsten doesn't want to do this actively, it would have to be activated while executing some build target. So if I run ./embedded_test1, the virtualenv would have to be activated before the program is run (or at least before the python interpreter is started in 'embedded_test1'). Or is there some other feasible approach?
I just don't know about this feature since I've never used it.
Notice that you still need to call activate to persuade setup-dunepy.py to use the environment.
We might move the management of the venv to dunecontrol. That would allow us to activate the venv whenever needed.
If we want to pursue this route, I'd suggest to do something similar to the BUILDDIR variable in the opts file. If the user specifies a DUNEVENV variable, dunecontrol ensures that there is a usable venv. Either the user has already created this venv, or dunecontrol will do so.
Then your above call could be stripped down to something like
./dune-common/bin/dunecontrol --opts=test.opts all setup-py
which would
- create the build directory
- set up the venv
- activate the venv
- run target all for each module (i.e. configure + make)
- set up python for each module and install the python modules
But that doesn't help with activating the right venv in the build directory does it? So the problem Carsten has is still there - or I've missed something.
Also I would prefer not to have to specify a virtualenv in the opts file if I have an active virtualenv, because that would change my workflow. My use case is that I run python scripts outside of any build dir (or even a dune module) because I don't need one. Also the point was a bit to get rid of the opts file. But I would be happy to tell people to set up a virtualenv before building dune, so having the default that a global virtualenv is used when activated would (probably) work for me.
dunecontrol can activate the venv before running any python related code. (I updated my previous comment)
The benefit in my proposal would be that we have an easy possibility to specify any venv and we are free to set it up before running dunecontrol. This allows us to work around problems with strange python setups etc. without the need to fiddle with cmake, which (for the python stuff) never worked reliably for me.
Just to check: so in my workflow I would create my global venv. I would then activate that (I mostly have an activated venv in any case since I'm testing some python script). Then, after some change to the C++ code, I might want to rerun dunecontrol. For this to work I would need to add that virtualenv to the config.opts file so that dunecontrol installs the dune modules into that venv?
If that is correct, then the only suggestion I'm making is that dunecontrol uses an active venv if none is specified in the config.opts file because then I don't need to touch that in my case. In the other setting no venv would be active so one would be created by dunecontrol if no VENV variable was found in config.opts.
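The preference order suggested here could be as simple as the following sketch (choose_venv and the opts_venv parameter are hypothetical; VIRTUAL_ENV is the variable exported by a venv's activate script):

```python
import os

def choose_venv(opts_venv=None):
    """Pick the venv dunecontrol should use: an explicit setting from the
    opts file wins; otherwise fall back to a currently activated venv
    (activate exports VIRTUAL_ENV); None means dunecontrol creates one."""
    if opts_venv:
        return opts_venv
    return os.environ.get("VIRTUAL_ENV")
```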
If you're using DUNE_PYTHON_VIRTUALENV_SETUP the venv is created within CMake and DUNE_PYTHON_VIRTUALENV_PATH and DUNE_PYTHON_VIRTUALENV_EXECUTABLE are provided. This should be available in all downstream modules.
Regarding @christi's alternative: I think creating the venv before (e.g. from dunecontrol) would also be OK. But then it has to be passed to CMake in order to ensure that it's also used when you call make in the build dir manually.
For both approaches the question is whether it's possible to build the bindings within a venv which is provided by a CMake variable. Then this variable could be set to the automatically created venv, or manually by passing it from dunecontrol. The second approach reduces magic and is probably simpler. If the venv is specified in an options file, it should be possible to also use it in setup-dunepy.py.
Perhaps it makes sense to split this into two separate issues?
- making sure tests don't fail when the dune python packages can't be found. One could simply add a check at the beginning of the C++ sources to see whether dune can be found and return 0 if it is not. Getting the embedded python tests to run without installed modules doesn't have to be a show stopper for this issue.
- making sure that make install works without root privileges. This is a show stopper, and some suggestions were made above.