In dune_python_install_package a requirements.txt is created for each module. Usually, this file contains the development requirements of a package. How does this relate to what is stated in Python-requires? Is this the way to add development requirements? And are packaging requirements then stated in pyproject.toml?
Where in the current setup would be the right place to add pylint and flake8, which are only needed during development?
Idea/suggestion: if a requirements.txt is found in the configured Python dune module (in the build directory) next to setup.py, the requirements.txt is used. There are several possibilities for what "used" means; the simplest is that the requirements.txt is passed along during an editable installation, i.e. pip install -e . -r requirements.txt.
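For illustration, the simplest variant could look roughly like this (the package path is hypothetical):

```
# sketch of the suggested behaviour; <builddir>/python/foo stands for the
# configured package directory containing setup.py
cd <builddir>/python/foo
pip install -e . -r requirements.txt   # editable install plus the dev requirements
```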
pyproject.toml is needed during the build of the dune package, while the packages listed in requirements.txt are installed into the venv and are needed while using the package. At least that was the idea.
The requirements.txt is generated in dune-common/cmake/modules/DunePythonInstallPackage.cmake:116 based on the Python_Requires from the dune.module. So I would think that your packages need to go there.
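For reference, such a line in a (hypothetical) dune.module would look roughly like this; the module name and packages are just examples:

```
# dune.module of a hypothetical module
Module: dune-foo
Version: 2.9
Depends: dune-common
Python-requires: numpy matplotlib
```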
Ok, the way I know it is that install_requires in setup.py specifies the minimal requirements for the package to work. requirements.txt additionally contains developer tools and/or pinned-down versions, or even specifies the package index, to create a reproducible test environment (see e.g. https://packaging.python.org/discussions/install-requires-vs-requirements/).
The way I read it, in Dune the two are currently the same thing? And both get the packages from dune.module's Python-requires:, but summed up over all module dependencies. Is this correct?
And a possible way to change this might be to pass a list of development requirements to dune_python_install_package? Or to recognize an existing requirements.txt in the package path if present.
I don't know about 'developer tools', but then I'm not really a Python expert.
I'm happy to have any approach that works and is easy to use with no extra steps.
We want to install some packages, numpy for example. Since dune packages are installed during make, I don't want to install numpy during pip install, to avoid forcing internet access at that time. So the packages are installed during the configure phase, which disables Python package building if no internet access is available.
The same packages are also added to setup.py, to be used during make python_install, since that might happen into a different venv (internal -> external), so the packages might need to be reinstalled at that point.
If it's only about the file name, then I am happy for us to rename requirements.txt to configurationpackages.txt or something else. The user will in general never see that file anyway, since it is autogenerated; we could even remove it again in the configure phase, or perhaps the pip command can be made to read it from stdin?
Whether versions are pinned down, BTW, depends on what was added in dune.module. I've already considered whether we should pin those down so that everyone gets the same numpy (for example) when working with dune.
I'm wondering if it would need to be a global requirements.txt (or if that would at least be more useful/practical). If I want to pin down versions, I would have to use the same matching versions of common dependencies for all modules to be installed, no? (Similar to *.opts files, where I want to build all dune modules with matching compiler options.)
To be honest, I don't quite know what the point of the extra requirements.txt file is if during install one just runs pip install -r requirements.txt anyway. The same thing is already done with the requirements listed in dune.module. But if someone finds it useful, I don't mind either way. Since I don't know what the use case is, I don't know what the right approach would be.
Concerning the versions, I guess we have the same problem with the requirements listed in dune.module as well.
For example, dune-fem requires a specific ufl version; if some other package required a different version, that would lead to a problem in the current setup. But I have no idea how to solve that. The only way I can see is to move ufl into a common upstream module once the problem is noticed. It's no different from the cmake packages, or is it? If istl fixed some parmetis version and dune-alugrid another one in their cmake find_package calls, things would fail, right?
There are two different goals: install_requires in setup.py defines more abstract dependencies and may give version ranges that constitute the minimum requirements for the package to function. requirements.txt may pin a specific version to create a reproducible test environment, and it may also list packages that are only needed during development, e.g. packaging tools, linters (pylint, flake8), auto-formatters (black), test packages (pytest, tox). So install_requires would be the thing you need for a published package that you are just using; requirements.txt would be what you usually use during testing and development of packages.
For example, an extremely useful tool is black, which just auto-formats code in an opinionated way, so that all code uses the same style no matter which developer wrote it. If the developers agree to use black, then you need it during development (to make the CI style check pass). So you install your packages in developer mode (editable) and with the development dependencies (requirements.txt), and develop. However, black is not required by users of the package, so it shouldn't appear in the package's install_requires.
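To make the distinction concrete, a hypothetical package could split this up as follows (all names and versions here are made up):

```python
# setup.py -- abstract dependencies: the minimum a *user* of the package needs
from setuptools import setup

setup(
    name="foo",
    version="0.1",
    install_requires=[
        "numpy>=1.17",  # a version range, not a pin
    ],
)
```

```
# requirements.txt -- concrete development environment
numpy==1.21.2    # pinned so every developer gets the same version
black==21.9b0    # auto-formatter, development only
pylint           # linter, development only
pytest           # test runner, development only
```

A user installing the package only pulls in numpy, while a developer additionally runs pip install -r requirements.txt on top of the editable install.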
Or I want to get exactly the same Python packages as another developer to reproduce a bug. I just do pip freeze > requirements.txt to get all current exact versions of the packages in the venv. Then I give it to someone else and they need to be able to install a local setup with these versions.
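I.e., roughly this round trip:

```
pip freeze > requirements.txt       # record the exact versions in my venv
# hand requirements.txt to the other developer, who then runs
pip install -r requirements.txt     # to recreate the same versions in their venv
```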
I can't install these directly in my local developer setup in my virtual env, because installing the dune Python modules is more involved and needs dunecontrol. But I also don't know how I would get a specific dune-common version then; it can probably only be checked that the version matches the one of the current setup.
Then having a global requirements.txt is perhaps the better option; it could simply be installed while setting up the virtual env, since it doesn't necessarily have anything to do with the bindings?
The question is whether it would contain the dune module versions too. Probably not, because those are determined by the version of the current source code I will be building and cannot easily be changed in an automated way in a development setup.
!1148 (merged) and !1161 (merged) make a clear distinction between abstract and concrete dependencies. Maybe you can build upon these changes to add package dependencies that are only required during development?
I just checked the code of DunePythonInstallPackage again, and concrete dependencies are installed only at build time, not at install time (introduced in !1148 (merged) and !1161 (merged)). You just need to specify them in a requirements.txt file and add the option INSTALL_CONCRETE_DEPENDENCIES to your dune_python_configure_package CMake call. That's exactly what this issue is asking for, so I will close it; feel free to reopen if you think this is not correct.
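For anyone finding this later, a rough sketch of such a call; the PATH value is made up, see !1148 (merged) and !1161 (merged) for the exact signature:

```cmake
# sketch only -- assumes a requirements.txt next to setup.py in the package directory
dune_python_configure_package(
  PATH python                      # hypothetical path to the Python package
  INSTALL_CONCRETE_DEPENDENCIES    # install the requirements.txt packages at build time
)
```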