The usual procedure for installing C++-based projects is configure, make, install. With the new Python bindings enabled by default, there is some (IMO unexpected) behaviour:
During configure (e.g. dunecontrol configure), Python packages (like jinja2, numpy) are silently installed into the virtual environment (the external one if a venv is activated, the internal one in dune-common if we are not in a venv). This doesn't even produce output.
During make all (e.g. dunecontrol make), Python packages are installed into the virtual environment. This time it's the Dune packages that depend on the Python bindings.
During make install_python, packages get installed globally (system-wide) even if a virtual env is activated. EDIT: This only happens when the internal venv is used.
Expected behaviour:
configure only configures (and doesn't create a venv)
make all only builds (including the C++ part of the Python bindings, which are enabled by default now)
make install_python installs Python packages (system-wide may be the default, but if a venv is activated it would be nice to install into it by default)
make install_python_editable installs Python packages editable instead (would be nice to have)
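In commands, the expected flow would look roughly like this (install_python_editable being the suggested new target), with nothing being pip-installed before the last step:
dunecontrol --opts=my.opts configure
dunecontrol make
make install_python (or make install_python_editable)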
Some current issues
It currently doesn't seem possible to just configure and build without installing Python modules. This requires an internet connection (OK, it prints some warnings if there is no internet and skips it, I think). I might want to build my code locally without an internet connection and compile some C++ stuff; why do I need to install Python bindings for that? Or I might start my own virtual env after running configure; then the packages have already been installed into a now useless internal venv. Why not wait?
Or in a CI pipeline, I might want to separate Python and C++ testing in different build jobs where the C++ job doesn't need to know anything about the Python job.
I might want to install my Python modules system-wide with make install_python. However, by default everything always gets installed editable inside the internal venv during make already, although this is completely unnecessary in that case.
In the case of the internal venv, the build dependency / flow of information is inverted. Usually, downstream modules depend on upstream modules. Now, the dune-common build directory contains information from all downstream modules.
Dune now (at least on Ubuntu) requires installing python3-venv, otherwise dunecontrol fails.
Building the Python C++ binding modules by default is of course OK. But why is the installation of Python packages not optional anymore (like in Dune 2.8)? Also, is there a reason not to delay the installation until the installed Python packages are actually needed?
Suggestion:
Don't create an internal virtual env during configure or build/make. If an external venv is activated, use that as the default installation destination; if not, install system-wide by default (like for C++). (Alternatively, default-install into an internal, documented location like now, but then make install_python should create the venv and install there, not configure or make all.)
configure only configures (including configuration of the Python packages, which depend on CMake information), and one can pass an installation path for the Python packages (if you don't want to install system-wide or into an externally activated venv)
make all only builds libraries (no C++ tests, and no Python installation)
make install_python_editable installs Python packages into configured destination in editable mode using pip
make install_python installs Python packages into configured destination in normal mode using pip
make install? I have no idea what this currently does in combination with Python packages.
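Put together, a from-source session under this suggestion could look like this (the CMake variable for the destination is purely illustrative, e.g. a hypothetical -DDUNE_PYTHON_INSTALL_PATH):
dunecontrol configure (optionally passing the Python installation path via CMAKE_FLAGS)
dunecontrol make
make install_python_editable (or make install_python)
python my_script.py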
I think this would make things simpler to implement and more transparent, consistent, and predictable.
Question:
Is the following workflow already possible (by setting some variables) and I just don't know it?
disable creation of the internal venv and provide the path to an external venv (that is not activated yet) via CMake
run dunecontrol all, which only configures and builds without installing, like in 2.8 with Python bindings enabled
run some install command that installs the Python modules and dependencies into the provided path (possibly in editable mode)
I think this is at least one reasonable workflow that should be supported.
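Spelled out in commands, with made-up variable names (that's part of the question):
python3 -m venv /path/to/venv (created, but not activated)
./dune-common/bin/dunecontrol --opts=my.opts all (with CMAKE_FLAGS containing something like -DDUNE_PYTHON_VENV_PATH=/path/to/venv, and the internal venv disabled)
make python_install_editable (installs the Dune Python packages and dependencies into /path/to/venv)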
User perspective: suggestion vs status quo:
I think in the minimal case the following would be possible (assuming I cloned all modules into a directory):
Currently (master), running a Python script with Dune is as simple as:
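./dune-common/bin/dunecontrol --opts=my.opts all
source ./dune-common/build-cmake/dune-env/bin/activate
python my_python_test.py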
With what is suggested here, I believe, the same could be achieved with:
./dune-common/bin/dunecontrol --opts=my.opts all
./dune-common/bin/dunecontrol make install_python_editable
python my_python_test.py
Related:
Even with DUNE_ENABLE_PYTHONBINDINGS=0 a virtual env is still set up. It would be nice to have one variable to turn off the Python stuff. (also see #293 (closed))
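(For reference, disabling the bindings is done by adding the flag to the opts file, i.e. CMAKE_FLAGS="-DDUNE_ENABLE_PYTHONBINDINGS=0" - but on current master that alone apparently doesn't prevent the venv setup.)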
If you have no internet connection during configuration, python bindings are turned off (at least that is what is supposed to happen). So no problem there.
During 'make' the Dune package is built and installed 'editable' into the active or internal venv. This is required so that 'make tests' can run as expected. This only installs the Dune package, so no internet connection is needed at this stage.
Running python_install should not install anything globally, at least not in the case of an external venv. So if that is happening, then something is going wrong on your system.
(2) Or in a CI pipeline, I might want to separate Python and C++ testing in different build jobs where the C++ job doesn't need to know anything about the Python job.
You can disable the Python bindings; why isn't that enough?
(3) I might want to install my Python modules system-wide with make install_python. However, by default everything always gets installed editable inside the internal venv during make already, although this is completely unnecessary in that case.
In the builddir, Python packages get installed into the internal venv (if no external venv is active), and then make python_install installs system-wide, non-editable.
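Conceptually, the two steps amount to something like this (a sketch, not the literal commands CMake runs; paths and package locations are assumptions):
build-cmake/dune-env/bin/python -m pip install --editable <module>/python (during make, into the internal venv)
python3 -m pip install <module>/python (during make python_install, system-wide)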
This is required so that 'make tests' can run as expected.
Where is make tests used? If I run it in dune-common I get "no targets to make". Why can't I run a hypothetical make python_install_editable before running the tests?
Running python_install should not install anything globally, at least not in the case of an external venv.
So the behaviour is different if I have an external venv activated at configure time? If it's activated, it installs into this venv, but if I rely on the internal venv during configure, it installs system-wide? That would again be unexpected, IMO.
You can disable the Python bindings; why isn't that enough?
I just don't want anything to be installed if I didn't type install. If I disable the Python bindings, then they are not compiled, right? (I'm distinguishing between the bindings' C++ code (needs compilation) and the Python packages (need installation).)
In the builddir, Python packages get installed into the internal venv (if no external venv is active), and then make python_install installs system-wide, non-editable.
That's what I said. IMO this is unnecessary work and unexpected. What makes the internal install conceptually different from make python_install? I can already configure make python_install to install editable into a given location by setting some CMake variables, no? So there are two approaches to do the same thing, and the first install just happens during make all (IMO unexpected behaviour).
I'm trying to understand whether this new behaviour is a necessity or a deliberate choice. In Dune 2.8, compiling and installing were still separated. Was the change in behaviour necessary to make something else work? Or was it just one change among others?
As I said, it was there to make the tests work - if you are running a test in dune-common's builddir, you don't expect to have to first install anything. And yes, the command is make test, not make tests.
After the configure/make stage the packages are now expected to be usable based on the sources - that is why they are installed editable. After make install (python_install) they are expected to work without sources/builddir.
I don't want to run any extra command anymore just to get the python bindings to work - they are supposed to be there after 'make'. That makes complete sense to me. I consider an 'editable installation' no different from having access to the source header files in the make directory. Calling this an 'installation' is semantics for me; the effect is important.
If you can find a way to get 'make python_install' to install into a newly opened venv, then please fix that - I would consider that the better solution as well. I thought about it and couldn't figure out an easy way of doing it, since the Python interpreter is fixed during configuration. I remember discussing this with Dominic and we couldn't figure out how to do it (but perhaps it is straightforward).
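(For illustration: the chosen interpreter path is pinned in the CMake cache - something like grep -i python_executable build-cmake/CMakeCache.txt shows the cached entry - and activating a different venv later does not update that cache entry.)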
make test also doesn't work for the C++ tests unless I build them with make build_tests first, so I guess you also consider that unexpected?
IMO, it would be perfectly fine to say you need to install Python packages in editable mode (or however you want) before running Python tests. From what I know, this is also the standard way to test any Python package: first install, then test (see e.g. tox). The current way of installing during make is much more unexpected behaviour from my perspective.
I don't want to run any extra command anymore just to get the python bindings to work - they are supposed to be there after 'make'.
Ok, so I guess there is just a clear difference in opinion on that point.
Anyway, this issue comes from my perspective as a downstream user who struggled for two weeks to get things running again with Dune master to make our code future-proof. Also from someone who still uses Dune from C++ and now has to manually turn off the Python bindings to avoid pip-installing things during compilation in a minimal setup. Before investing time in further changes, I want to understand what the Dune opinion on this is.
I've already pointed out that these changes were in the pipeline for 9 months before being merged and had been discussed heavily before. If you were not included and didn't get the notifications from the dune-common MRs, then I don't know why that was or how that could be solved. I get notified on every comment made to any dune-common MR, independent of whether I'm involved or not. It requires me to delete quite a few emails, but if that is what it takes to keep up to date and contribute to MRs that change things influencing my workflow, then so be it. If I miss something, I don't try to change things retrospectively (except if I believe that I wasn't left with enough time to figure out what was going on).
I've spent an unsustainable amount of time on these changes over the last 9 months, and I am still trying to help make reasonable changes (still spending too much time on this). I have zero enthusiasm for making any more major changes to the way the setup is now working. If any changes to the current workflow are decided on, I will not do anything to get them to work or to test them.
Set up an MR that changes things in a way you would find less surprising, or at least open an issue with a clear suggestion and ask for a vote, including volunteering someone to actually do it. I'm out of this - I need to do other things.
I think it's perfectly normal that a change gets noticed in some downstream project after it has been merged. Not everyone using Dune is a core developer. It would even be normal if it gets noticed after the next release. Does not being involved in the development of a feature completely disqualify me for noticing an issue or unexpected behavior?
Please understand that your reaction completely discourages me to make contributions to Dune in the future.
Then we are in the same boat - I'm getting discouraged as well.
It is easy to miss a change in such large merge requests. And it is frustrating to not get much feedback before hitting the merge button and then getting it afterwards. I think we all got a fair share of both. Please keep in mind that nobody is acting in bad faith.
We delayed merging the Python changes until after Dune 2.8 to give us as a project some time to test, gather feedback, and maybe adjust. Timo started giving this feedback; let us collect further voices and maybe suggestions on how to improve the current code. We can also discuss and decide at a developer meeting if it does not lead to a consensus here.
In my opinion the current approach is perfectly fine.
Essential ideas of the overhaul were to
introduce an internal venv which can be used to run tests that rely on (embedded) python
reduce the number of commands that are necessary to set up the python bindings.
And I think we got a great default behavior now. It works completely like building DUNE before, but with the full Python functionality directly at hand. Several use cases (like having an external venv, installing into the system, editable, no python) are covered.
The discussion about whether the python bindings should be enabled by default started a long time ago, and it was somehow consensus at the last developer meeting that this would happen after 2.8. It had always been clear that this would lead to some overhead during configuration and building, but no one had expressed serious concerns.
For users that really don’t want to have the python bindings (e.g. for a minimal setup), there is an easy way to disable them - and probably that is what you want to do now if you are using DUNE as a pure C++ library.
Anyway, thanks @gruenich for mediating!
I guess we should discuss further major concerns or proposals on a developer meeting.
Just to be clear, I don't suggest here to necessarily change any of 1. or 2.; I just want it to happen at a different time and possibly with a different command. The internal venv can be created, but not during configure or build. The number of commands can stay the same, but they might be different commands (because the commands configure and build/make carry some expected behaviour in the C++ world, and installation is not part of that).
The discussion about whether the python bindings should be enabled by default started a long time ago, and it was somehow consensus at the last developer meeting that this would happen after 2.8. It had always been clear that this would lead to some overhead during configuration and building, but no one had expressed serious concerns.
I think it was clear that the Python bindings would be configured and built and that this causes some overhead. But at least to me it wasn't clear that Python packages would be installed during both configure and make all. This is what this issue is about.
This is not about making the user experience worse or something. And not about removing features that have been added during the overhaul. And not about making things more complicated. Quite the opposite: it's about reducing build-system complexity and unexpected behaviour, and making things simpler. At least the dumux CI setup got more difficult in 2.9 than it was in 2.8 because of the mix of installation and building. Hope this clarifies things? EDIT: I updated the description a bit in this respect. Thanks for the feedback.
I would like to summarize my position before our next meeting:
I'm happy with the way things are currently and do not have the time or motivation to invest much time into changing things - the overhaul last year has cost an enormous amount of time due to the large number of different use cases that were required to work.
The tests and settings in the nightly build and in the GitHub repo
(https://github.com/adedner/dune-testpypi and https://gitlab.dune-project.org/infrastructure/dune-nightly-test)
are quite carefully thought through and cover input from a number of people.
They combine different ways of using (or not using) the bindings, including having some packages installed with pip and others editable from source, etc. These different workflows should continue to work, because they are based on actual usage scenarios collected from people working with the bindings.
I want the all command of dunecontrol to continue to mean all, i.e., not only cpp-all. It should do everything required to use a dune module, both from the perspective of a C++ user and of a Python user. I personally never use the more fine-grained control made available by dunecontrol, so whether something is done during configure, make, or some new subcommand is not really important to me, as long as everything has been done when using all.
From-source builds should lead to an editable installation of the Python package, because that has led to an essential simplification of the development process of the bindings - at the beginning I worked for some time without it, and that was painful. Calling make install should make the installation non-editable, because that command indicates that the build/source directory should not be used anymore (the two pip modes are sketched after this list).
Naturally pip packaging still has to work.
I don't use the internal venv and have no use case in mind where I would. That was important to others, and the idea was that with multiple build dirs you end up with multiple venvs, so no problem of mixing venvs and build dirs can easily happen. A nice effect of this is that the Python tests in the builddir are nicely self-contained, since no external venv is required. But if someone wants to revamp that setup, that is fine with me.
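For reference, the two pip modes referred to above (plain pip, run with whichever interpreter/venv applies; the package path is only illustrative):
python -m pip install --editable <path-to-package> (the source tree remains the import source, so a git pull takes effect immediately)
python -m pip install <path-to-package> (a self-contained copy, independent of source and build directories)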
Some of the things in the description are simply bugs and others are simply not true (anymore):
No internet connection is needed - the Python bindings are simply deactivated if no internet connection can be established. That point was important to me as well, because offline usage obviously has to work. Whether this is silent or not is a question of figuring out how to get CMake to give a clearer message, which I couldn't figure out how to do.
A point is made about a system-wide install happening even if a virtual env is active, while then saying it only happens if the internal venv is used. If an internal venv is used, then at least during build no external venv was active. It is an open problem how to get the CMake setup to recognize that the Python setup has changed between build and later usage, i.e., building without an active venv and then activating one before calling python_install. If someone finds a way of improving this, that would be nice.
make install_python does install system-wide (the description complains about that while at the same time saying that it should happen). What the issue is with first installing into the internal venv is not clear to me - is it about build time or memory usage?
Dune now (at least on Ubuntu) requires installing python3-venv, otherwise dunecontrol fails.
That is clearly a bug.
Building the Python C++ binding modules by default is of course OK. But why is the installation of Python packages not optional anymore (like in Dune 2.8)? Also, is there a reason not to delay the installation until the installed Python packages are actually needed?
I don't know how the second would work. Otherwise this goes back to my view that dunecontrol all gets me to the point where everything simply works without extra work - and for me and my users that includes being able to use the Python bindings.
In the case of the internal venv, the build dependency / flow of information is inverted. Usually, downstream modules depend on upstream modules. Now, the dune-common build directory contains information from all downstream modules.
Could you explain a bit more which information you are referring to? The fact that downstream modules can use (and that also means install into) the internal venv is the whole point of having it in the first place. From my understanding, the idea was for downstream modules to install specific package versions into this internal venv to generate a well-defined Python setup. But as said above, I don't really use the internal venv and didn't actually add that part of the setup, so I might misunderstand how it is supposed to be used.
I'd like things to be modular and follow the single-responsibility principle. There can always be a single command which does all, so that is not an issue. But there should be the possibility to run fine-grained commands if needed.
I propose
separate target for creating the internal venv: when I ask for it and it doesn't work -> error. This improves the current situation, where the Python bindings are just silently disabled when the internal venv setup fails.
separate targets for installing the Python modules: if I change my venv, I can reconfigure (changing the venv path) and then install the Python packages again without changing anything about my compiled C++ code (including the pybind11 binding code)
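In commands (target names are just placeholders):
make python_venv (create the internal venv on explicit request; hard error if it fails)
make python_install_editable (editable install into the configured destination)
make python_install (regular install into the configured destination)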
Regarding all: I don't think the argument checks out from my point of view. The Python bindings can be used without installation, no? So if all is supposed to mean 'ready' from a user perspective, it doesn't follow that Python packages need to be installed. C++ code can also be installed, but it can also be used without installation.
make install_python (or make install) makes much more sense to me from this perspective than the current situation where make all builds C++ and Python bindings and then installs Python but not C++.
"no extra work" would be nice but is currently not achieved. In addition to the usual Dune setup I need to either (a) start a virtual env myself before running dunecontrol, or (b) activate the automatically created internal venv after running dunecontrol.
Moreover, there is #296 (closed), which means I cannot use the code without a working virtual environment.
The current setup is OK if everything works and everyone knows the intended usage pattern. (And it does of course work in the clean CI setup.) If some part fails, I don't think things are too easy to understand. Explicit is better than implicit, some have said ;). Therefore, I think hiding build-system magic like secret pip installs during configure complicates, has already complicated, and will eventually complicate things. Being explicit about it increases transparency and maintainability.
Using the Python bindings without the installation required setting the PYTHONPATH to the build directory of every single module. I don't see that as an acceptable use pattern - although Robert does that.
Asking a user to run dunecontrol from within a virtual env is no extra work for me, since it is good practice anyway - but that is debatable.
Concerning #296 (closed), we just need to make a decision on what we want to happen instead. We could do an editable installation with --user, for example; then the Python bindings are instantly usable. The main thing is not to require sudo, so no system-wide installation.
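i.e., something along the lines of python -m pip install --user --editable <path-to-package> (package path illustrative), which needs neither a venv nor sudo.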
Concerning the change of the venv without a reconfigure - if you can get that to work within the CMake framework, great. I remember discussing this with Dominic a year ago and we gave up - too many CMake variables are set by the FindPython command to easily adapt without causing other problems. I don't quite remember, but possibly the main issues were with the internal venv. If the people using that can live with a different setup that allows switching horses mid-race, that is fine with me.
For me the main point is that an editable installation is not equal to a full installation, in the same way that a build directory is not an installation but contains the paths to the source files - the editable installation in the current setup does the same. It makes the source files usable from within a well-defined local setup. A real, source-independent installation is done with make install.
None of the ctests and Python tests will run without the (editable) installation having been done first - again I would be surprised if make all did not provide me with a setup which allows me to build and run all tests or start ctest - and testing should include everything.
I am still unclear why the editable installation is an issue and why it should cause problems if dunecontrol all does that step - if that means dunecontrol calls make python_install_editable or whatever at some stage that is fine with me. Having to do an extra step after running all is surprising for the user and I found that difficult to communicate to collaborators in 2.8.
Btw: at the moment we rely on make install doing the installation of the python packages for the installation from pypi. Since we are using an external package to handle the building of the dune libs in the isolated environment pip sets up, and then to do the installation into the venv, I don't know how easy it would be to use a different command here. And I also would find it surprising if make install did not install the Python bindings together with the C++ bindings. For my workflow they are not separate things.
In general, I'm not looking for workarounds and hacks for specific examples. I'm looking for a clean solution for all usage patterns. For example, the pattern that you seem to use with creating a venv first and then running dunecontrol is a good (and IMO probably the best/easiest) pattern to use the Python bindings and should of course work.
Using the Python bindings without the installation required setting the PYTHONPATH to the build directory of every single module
I think this should be possible. Why forbid it if it's not necessary to forbid it? This was one approach that worked with 2.8. It was also possible to set the PYTHONPATH for every single module at once using dunecontrol, so you needed only one command (a bit of a long command, but this could easily be improved upon). And then I don't want a 100+ MB virtual env being set up as well.
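E.g., assuming the default build-cmake build directories and that each module's Python package ends up under build-cmake/python, something like:
for m in dune-*; do export PYTHONPATH=$PYTHONPATH:$PWD/$m/build-cmake/python; done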
again I would be surprised if make all did not provide me with a setup which allows me to build and run all tests or start ctest
Well, you cannot run ctest without make build_tests first, no? All C++ tests will fail. So for C++ tests it's OK to have an additional step, but Python tests need to work immediately? That doesn't seem consistent to me.
I am still unclear why the editable installation is an issue and why it should cause problems if dunecontrol all does that step
This is not an issue. I just want the possibility to do an editable install with a single command, without doing other stuff like make all. If you want dunecontrol all to keep the current behaviour, then I can just stop using it or propose another subcommand for dunecontrol that does less. I just want the option. (And this implies that the implementation must separate concerns.)
Having to do an extra step after running all is surprising for the user and I found that difficult to communicate to collaborators in 2.8.
Side comment: why is it important what it's called? Whether it's all or full_setup or dune+python? Don't you just want a single command to rule them all? Users will run whatever command is written in the docs.
Repeating from above: there can always be a single command which does all, so that is not an issue.
For the internal venv setup you have to run an additional command after all (status quo). Why is the venv not activated? I would expect that from an all-in-one command. (I know that for the "venv first" you don't have to run two dunecontrol commands and I do not plan to change that.)
None of this is relevant to my suggestion/wish, which is: it should also be possible to run the steps separately.
Btw: at the moment we rely on make install doing the installation of the python packages for the installation from pypi. Since we are using an external package to handle the building of the dune libs in the isolated environment pip sets up, and then to do the installation into the venv, I don't know how easy it would be to use a different command here. And I also would find it surprising if make install did not install the Python bindings together with the C++ bindings. For my workflow they are not separate things.
I agree that make install would install everything. Additionally, make install_python could install only Python, and make install_cpp could only install C++ (if need be). Why is this a problem? Why is being modular so bad?
Having Python installation within configure is a surprise.
Having more steps with dedicated names would help.
Having an option to prevent creating a huge venv would be a plus.
make all builds the library, not the tests (if this analogy applies to Python). Even for make install, no test is built in C++, and my expectation for Python would be the same.
Please take this not as a complaint, but as constructive criticism on how to align expectations from a non-Python view with the Python bindings. I won't be able to help much with this, because my Python is not advanced enough. But I want to point out that it is not just Timo who senses room for improvement.
I never said I wanted to forbid setting the PYTHONPATH; that is fine with me and, as I said, is what Robert does - so I want that to work, but it wouldn't be the default behaviour I'd suggest to students.
As I also said, having sub-steps is fine with me as well - no issue with that if somebody wants to put in the work to reorganize the CMake scripts. I would just want to continue supporting the different workflows we are supporting now.
Having an extra command to install into some other venv is also no issue for me - as I said, we discussed that but couldn't figure out how to get it to work. One would have to call FindPython again outside of the configure stage to get the right interpreter without changing the actual configuration. I don't know - I'm not enough of a CMake expert to judge the implications.
If I understand the discussion here correctly, there is possibly not much of an issue. The only problem I perhaps have is that implementing this is going to be quite an involved process until everything works as it should. That will greatly prolong the time to the next release, which is a shame both for the Python users, who have to use master (or a really outdated setup), and for Simon's CMake development. I was really hoping to get a release branched by Easter.
Concerning the tests, the only thing I would say is that there is a certain workflow people would expect to be able to use (and will use), and that is to run make build_tests and make tests and have all tests run, including the Python ones. That is a usage pattern I would like to maintain, so that usual testing also tests the Python base. If the Python tests are simply ignored when no prior install step has been run, that will lead to less testing of the Python setup, I think. Of course for the CI/nightly build we can add some extra command, but for local testing most people would not expect that to be necessary. But perhaps I misunderstand how your suggestion is supposed to work for running the tests.
The internal venv is activated in a sense - all relevant Python tests in the build dir are set up to use it, i.e., activate it before running the tests - that includes C++ code using embedded Python. The setup is nice, as I pointed out above, since the internal venv is then really local to the build setup, i.e., no chance of mixing different compilers etc. That approach makes sense to me. Activating the venv from the outside is not really the idea of the internal venv, I believe (as I said, I don't use it), but is of course possible. The usage, as I understand it, is from within the available CMake setup and local to CMake commands run in the build dir.
For me, a question to discuss in the meeting would be: which of these suggestions/issues are release stoppers, i.e., should be fixed before we make the next release, and which can also be done later? There will always be an evolution of the build system, including the Python part of it. So, what would be a breaking change (and as such require downstream module adaptation), and what would just change the behaviour visible to the user?
One thing I would like to see is the internal venv disabled by default. The minimum for me would be to make it possible to disable it and still get Python bindings (via PYTHONPATH or via pip install --user).
Another thing might be changing the meaning of dunecontrol all back to its 2.8 meaning and providing a different command for the full-blown Python venv setup + installation of Python modules and dependencies. But if all is kept as it is now on master, the other changes could probably be done after a release.
dune-testtools is currently broken (one of the use cases), which may be fixed by fixing #278 (closed), but that wouldn't fit with my proposed approach (no install during configure). Actually, we realized in the discussion in quality/dune-testtools!142 (closed) that the approach currently proposed there won't work, because dunecontrol configure breaks.
Having Python installation within configure is a surprise.
Having more steps with dedicated names would help.
Having an option to prevent creating a huge venv would be a plus.
make all builds the library, not the tests (if this analogy applies to Python). Even for make install, no test is built in C++, and my expectation for Python would be the same.
As I pointed out, I don't consider the editable option an installation, only a way of making the code available to further commands during the configure phase, which is what I expect configure to be there for. That this involves the word 'installation' is perhaps unfortunate, but I would like to concentrate on the process that becomes possible after having run configure, and not on words.
The venv is rather small, but I have no issue with adding a CMake flag to turn it off.
No tests are being built with make all either - the C++ embedded tests are compiled like any other C++ test, and Python code does not need to be built anyway. It's a question of being able to run the tests without an unexpected command being needed. If the Python setup is to be triggered by make build_tests, that would also work, I guess. But make all; make build_tests; make tests should lead to all tests being run, including the Python tests. From the perspective of a Python user/developer, make all; make python_install_editable; make build_tests; make tests would not be an expected requirement.
Just thought I'd add the following in case the setup wasn't clear concerning the default non-Python user experience: if no external venv is active, an internal venv is set up in the build directory and the editable installation happens into that. Nothing is installed into any folder outside of the build directory, and if I remove the builddir, everything is gone. It's not an installation into some part of the system outside of the builddir. As suggested above, I'd like to drop the word 'installation' for this and focus on the desired workflows.
Well, it doesn't help to change words. The setup is clear. pip install package during configure is an installation for me (and it installs packages from PyPI) and creates a big blob in the build directory, and I don't see why I can't achieve the same with something like make venv (i.e. a venv target), optionally, and a bit later.
Because then the tests will not work as expected, and that I find really problematic. And focusing on what we want to achieve instead of on words does help me, at least.
But dunecontrol all runs make all, so that would set up the venv. But it would not install the Python bindings? So the tests would again not work - or is that done in make build_tests?
That is an anti-pattern to me. I run dunecontrol all; make build_tests; make tests. If that works for C++, I want it to work with the Python bindings. Those are not just a nice add-on for me but should be treated equivalently to using C++.
Why can't you run dunecontrol full_setup; make build_tests; make tests? I also said above that dunecontrol all; make build_tests; make tests is OK, as long as I'm not forced to use it. I still don't understand where the anti-pattern is.
@tkoch perhaps you could write down for me exactly what you want me to have to do to get things to work (assuming I'm using dunecontrol all and that it does what it did in 2.8). At the moment my understanding is:
dunecontrol all
make python_install_editable
make build_tests
make tests
Is that right? If it is, then I envision a lot of misunderstanding and surprising behaviour. It starts with people forgetting to run make python_install_editable. The next issue would be people running make python_install by mistake. If that happens, a git pull in the module would not lead to an update of the Python sources. I can just see the email exchange, where I send an email that I fixed a bug and ask someone to pull and check, and they send back that the bug is still there, and I then have to figure out that they have used a non-editable install. At the moment things work as I would expect in that regard: dunecontrol all; git pull updates all my sources, and I can run make build_tests; make tests to see if a patch has fixed the bugs I had.
But perhaps you could describe how that testing together with git pull etc. would work, and how it would differ for people using only C++, C++/Python, or only Python - and why it should differ.
I thought you teach people to open a venv first? The above sequence would use the internal venv, which you claimed you don't use. So if you are asking what I'd suggest you do (which I wouldn't normally do, but do now because you explicitly ask), it would be:
python3 -m venv venv
source venv/bin/activate
dunecontrol full_setup
make build_tests
make tests
which I thought is the status quo, except for all -> full_setup. I will also repeat that staying with all is fine too, if we slightly change the way it works under the hood (it has to call configure, make all, and make python_install_editable).
Or, using the internal venv: I would like to set DUNE_CREATE_PYTHON_VIRTUAL_ENVIRONMENT=ON or pass a --venv option to dunecontrol full_setup:
dunecontrol (--venv) full_setup
make build_tests
make tests
This is in contrast to master (status quo), where, AFAIK, I actually have to do:
dunecontrol all
make build_tests
source ./dune-common/build-cmake/dune-env/bin/activate (is there a shortcut for this?)
make tests
which I think is an anti-pattern. How would I know that a virtual env is created and that I also have to activate it? And how do I activate it? Tests would fail without activation, right? EDIT: the last part is wrong: as pointed out by @andreas.dedner, the tests use a wrapper script to run in the venv without manual activation. However, I have to know this if I want to run a Python script directly.
On the other hand, I would also like other use cases to work, e.g.
dunecontrol configure
dunecontrol make -j all
make build_tests
ctest
would build everything (including Python bindings but not venv installation), build the tests, and skip the Python tests because the packages cannot be imported. Or
dunecontrol configure
dunecontrol make -j all
dunecontrol make set_pythonpath / make python_install_editable
make build_tests
ctest
to have Python working. Both approaches wouldn't set up an internal virtual env, because I didn't ask for one. (There are more complicated use cases like dune-testtools, where I personally think pure Python packages needed at Dune configure time should just be installed like regular Python packages before building Dune.)
For any of the approaches, I don't believe new users will just come up with these commands on their own. They will run whatever the documentation says (which would recommend the simplest option with the fewest commands for beginners and explain what else can be done for experts).
What you describe as the status quo and an anti-pattern makes a lot of sense to me. We discussed this at previous meetings and came up with this solution. The reason is that it automatically isolates different build setups; in contrast, they may get mixed up when you have to manually activate the venv. I very much support this because I don't want to manually fiddle with venvs, even though I don't use the Python bindings - just to avoid them ending up in a broken state.
How would I know that a virtual env is created and that I also have to activate it? And how do I activate it? Tests would fail without activation, right?
I think you give the answer yourself:
...They will run whatever the documentation says...
I'm probably too dumb to get the arguments presented here. Or everyone is talking about different things. I also don't want to fiddle with venvs if I don't use the Python bindings. I don't even want to fiddle with venvs in the case where I do use the Python bindings. I want to be able to not create one in the first place and still compile binding code (#296 (closed)).
I have the impression that arguments (on all sides) are repeated over and over again. It might be that we are just talking at cross-purposes, or just have different expectations of what is "simple", "consistent", and "safe", or maybe it is just terminology. So, let's better talk in person and then summarize the results here.
Not true, because as the above shows, the current setup is not clear, and what I wrote above wasn't understood. It makes sense to me to clear up what the status quo actually is. To really repeat myself:
The internal venv is activated in a sense - all relevant Python tests in the build dir are set up to use it, i.e., activate it before running the tests - that includes C++ code using embedded Python. The setup is nice, as I pointed out above, since the internal venv is then really local to the build setup, i.e., no chance of mixing different compilers etc. That approach makes sense to me. Activating the venv from the outside is not really the idea of the internal venv, I believe (as I said, I don't use it), but is of course possible. The usage, as I understand it, is from within the available CMake setup and local to CMake commands run in the build dir.
The activation is done when running the tests, without the user having to activate it. That was (for Dominic) the whole point. I don't use the venv, but that doesn't mean I regard people wanting to use it as unimportant. The internal venv was something Dominic wanted, and the current setup is his development.