Python packages should not be installed during configure and make
The usual procedure for installing C++-based projects is configure, make, install. With the new Python bindings enabled per default, there is some (IMO) unexpected behaviour:
- During `configure` (e.g. `dunecontrol configure`), Python packages (like `jinja2`, `numpy`) are silently installed into the virtual environment (an external one if a venv is activated, the internal one in dune-common if we are not in a venv). This doesn't even produce output.
- During `make all` (e.g. `dunecontrol make`), Python packages are installed into the virtual environment. This time it's the Dune packages depending on the Python bindings.
- During `make install_python`, packages get installed globally (system-wide) even if a virtual env is activated. EDIT: This only happens when the internal venv is used.
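To see which environment such installs actually land in, a quick check (independent of Dune) is whether the running Python sits inside a venv, and which environment pip itself belongs to:

```shell
# Inside a venv, sys.prefix differs from sys.base_prefix.
python3 -c 'import sys; print("venv" if sys.prefix != sys.base_prefix else "system")'
# The path printed after "from" shows which environment pip installs into.
python3 -m pip --version
```

Running this before and after `dunecontrol configure` makes the silent installs visible.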
Expected behaviour:
- `configure` only configures (and doesn't create a venv)
- `make all` only builds (including the C++ part of the Python bindings, which are enabled per default now)
- `make install_python` installs Python packages (system-wide may be the default, but if a venv is activated it would be nice to install into it per default)
- `make install_python_editable` installs Python packages in editable mode instead (would be nice to have)
Some current issues:
- It currently doesn't seem possible to just configure and build without installing Python modules. This requires an internet connection (OK, I think it prints some warnings and skips the installation if there is no internet). I might want to build my code locally without an internet connection and just compile some C++ code. Why do I need to install Python bindings for that? Or I might start my own virtual env after running configure; then the packages are installed in a now-useless internal venv. Why not wait?
- Or, in a CI pipeline, I might want to separate Python and C++ testing into different build jobs, where the C++ job doesn't need to know anything about the Python job.
- I might want to install my Python modules system-wide with `make install_python`. However, per default everything always gets installed in editable mode inside the internal venv during `make` already, although this is completely unnecessary in that case.
- In the case of the internal venv, the build dependency / flow of information is inverted. Usually, downstream modules depend on upstream modules. Now, the dune-common build directory contains information from all downstream modules.
- Dune now (at least on Ubuntu) requires installing `python3-venv`; otherwise dunecontrol fails.
Building the Python C++ binding modules per default is of course OK. But why is the installation of Python packages not optional anymore (as it was in Dune 2.8)? Also, is there a reason not to delay the installation until the installed Python packages are actually needed?
Suggestion:
- Don't create an internal virtual env during configure or build/make. If an external venv is activated, use that as the default installation destination; if not, install system-wide per default (like for C++). (Alternatively, default-install into an internal documented location like now, but then `make install_python` should create the venv and install there, not `configure` or `make all`.)
- Configure only configures (including configuration of Python packages, which depends on CMake information), and one can pass an installation path for the Python packages (if you don't want to install system-wide or into an externally activated venv)
- `make all` only builds libraries (no C++ tests, and no Python installation)
- `make install_python_editable` installs Python packages into the configured destination in editable mode using pip
- `make install_python` installs Python packages into the configured destination in normal mode using pip
- `make install`? I have no idea what this currently does in combination with Python packages.
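Under this suggestion, the external-venv variant would look something like the following sketch. Note that `install_python` as a target that respects an active venv is part of the proposal above, not current dunecontrol behaviour:

```shell
python3 -m venv ~/my-dune-env                            # external venv, managed by the user
source ~/my-dune-env/bin/activate
./dune-common/bin/dunecontrol --opts=my.opts configure   # configures only, creates no venv
./dune-common/bin/dunecontrol --opts=my.opts make        # builds C++ (incl. binding modules) only
./dune-common/bin/dunecontrol make install_python        # installs into the active venv
```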
I think this would make things simpler to implement, as well as more transparent, consistent, and predictable.
Question:
Is the following workflow already possible (by setting some variables) and I just don't know it?
- disable creation of the internal venv and provide the path to an external venv via CMake (one that is not activated yet)
- run `dunecontrol all`, which only configures and builds without installing, as in 2.8 with Python bindings enabled
- run some install command that installs Python modules and dependencies into the provided path (possibly in editable mode)

I think this is at least one reasonable workflow that should be supported.
User perspective: suggestion vs status quo:
I think in the minimal case the following would be possible (assuming I cloned all modules into a directory):
- Currently (master), running a Python script with Dune is as simple as:

```shell
./dune-common/bin/dunecontrol --opts=my.opts all
source ./dune-common/build-cmake/dune-env/bin/activate
python my_python_test.py
```

- With what is suggested here, I believe the same could be achieved with:

```shell
./dune-common/bin/dunecontrol --opts=my.opts all
./dune-common/bin/dunecontrol make install_python_editable
python my_python_test.py
```
Related:
- Even with `DUNE_ENABLE_PYTHONBINDINGS=0`, a virtual env is installed. It would be nice to have one variable to turn off all the Python stuff (also see #293 (closed)).