dune-common issues
https://gitlab.dune-project.org/core/dune-common/-/issues

Issue #322: [Python] set defines in compiled modules (Andreas Dedner, 2023-02-10)
https://gitlab.dune-project.org/core/dune-common/-/issues/322

Problem
-------
The `SimpleGenerator.load` method takes a `defines` argument that can be used to set preprocessor defines in the generated C++ code. At the moment these are added to the files before the includes are written. See `dune-common/python/dune/generator/generator.py`:
```
source += ''.join(["#define " + d + "\n" for d in defines])
source += ''.join(["#include <" + i + ">\n" for i in includes])
```
But these defines are not added to the generated python module instance, so they are not set in modules generated further down the line (the includes, in contrast, are added).
This is probably not the expected behavior.
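To make the asymmetry concrete, here is a minimal standalone sketch (the function and dictionary names are illustrative, not the actual `SimpleGenerator` API): the includes are recorded for downstream use, while the defines are only baked into the source text.

```python
def assemble(defines, includes):
    # defines and includes are prepended to the generated source ...
    source = ''.join("#define " + d + "\n" for d in defines)
    source += ''.join("#include <" + i + ">\n" for i in includes)
    # ... but only the includes are recorded on the module for downstream use
    module_info = {"includes": list(includes)}
    return source, module_info

source, info = assemble(["MY_FLAG 1"], ["dune/common/fvector.hh"])
print("MY_FLAG" in source)   # the define ends up in the source text
print("defines" in info)     # but is not recorded for downstream modules
```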
Solution
--------
We could add a `cppDefines` property. But we should perhaps consider adding a dictionary to the modules collecting the `cpp` properties of the module needed downstream, to make it easier to extend the JIT module generation process.

Issue #318: python -m fix-dunepy not working with new Makefile approach (Andreas Dedner, 2023-02-10)
https://gitlab.dune-project.org/core/dune-common/-/issues/318

One can use `python -m dune fix-dunepy force` as before, but the other inconsistency tests only make sense for the old cmake setup. Need to check whether the inconsistencies that came up with the old setup are still an issue with the new setup, i.e., what can break by a ctrl-c at the wrong time?

Issue #317: Callable in algorithm.run (Robert K, 2022-10-10)
https://gitlab.dune-project.org/core/dune-common/-/issues/317

Currently it is only possible to receive a `std::function<void(...)>` in an algorithm from the Python side, since `pybind11::function` can only be cast into that. This happens when the argument passed to `algorithm.run` has no cppType.
Setting the cppType by hand to, e.g., `std::function<bool(...)>` is a working fix.

Issue #316: Some methods in Communication&lt;T&gt; are unnecessarily mutable (Santiago Ospina De Los Ríos, 2023-03-14)
https://gitlab.dune-project.org/core/dune-common/-/issues/316

Some methods like `send` and `recv` in the dummy `Communication<T>` are not `const`, while their counterparts in `Communication<MPI_Comm>` are const. Although copying the object is not a problem, I don't see a reason why these have to be mutable.

Issue #315: race condition in dune-2.8 python builder (Christian Engwer, 2022-10-10)
https://gitlab.dune-project.org/core/dune-common/-/issues/315

I tried using dune python on an NFS share. After code generation we try to build the file, but sometimes cmake didn't yet realize that `CMakeLists.txt` had changed (as the changes were not yet fully synced).
What helped in my case was to add an `os.sync()` at the beginning of the `compile()` function.
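The workaround can be sketched as follows (a hypothetical `compile` wrapper, not the actual generator code; `os.sync` is POSIX-only and available since Python 3.3):

```python
import os
import subprocess

def compile(build_dir):
    # Flush filesystem buffers so that a freshly written CMakeLists.txt is
    # visible to cmake even on slowly syncing filesystems such as NFS.
    os.sync()
    # then run the actual build as before
    subprocess.run(["cmake", "--build", build_dir], check=True)
```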
I didn't check yet how the situation is on the development branch.

Issue #314: Python builddir check returns wrong error code (Santiago Ospina De Los Ríos, 2023-02-10)
https://gitlab.dune-project.org/core/dune-common/-/issues/314

The following discussion from !1162 should be addressed:
- [ ] @andreas.dedner started a [discussion](https://gitlab.dune-project.org/core/dune-common/-/merge_requests/1162#note_117407): (+4 comments)
> I think this needs to be 77 so that the test is marked as skipped.
The script [`run-in-dune-env.sh`](https://gitlab.dune-project.org/core/dune-common/-/blob/5dcb1e2b96d99e758ad06894218b8ba786725e15/cmake/scripts/run-in-dune-env.sh.in) returns an error code with value 77. This value is chosen so that dune tests are marked as skipped when the python configuration fails; otherwise, the value is arbitrary. Now, with !1148 merged, we can address this situation directly at configuration time and conditionally add tests to the test suite.
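For reference, the skip convention can be demonstrated in isolation: ctest treats an exit code declared via the `SKIP_RETURN_CODE` test property as "skipped" rather than "failed", and 77 is the value `run-in-dune-env.sh` uses. A small standalone sketch:

```python
import subprocess
import sys

# Child process standing in for a test that cannot run (e.g. the python
# configuration failed); it signals "skip" by exiting with code 77.
proc = subprocess.run([sys.executable, "-c", "import sys; sys.exit(77)"])
print(proc.returncode)
```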
Related to #289.

Issue #313: Python virtual environment depends on dune-common python package (Santiago Ospina De Los Ríos, 2023-03-01)
https://gitlab.dune-project.org/core/dune-common/-/issues/313

Currently, the [`run-in-dune-env.sh`](https://gitlab.dune-project.org/core/dune-common/-/blob/5dcb1e2b96d99e758ad06894218b8ba786725e15/cmake/scripts/run-in-dune-env.sh.in) script uses the dune-common python module to check for something on the build dirs (I don't know what). This is problematic because the venv may be used in downstream modules without necessarily assuming that the dune-common python package is installed (e.g. `dune-testtools`, `dune-codegen`).
A simple fix is to conditionally check if the python package is available:
```bash
# check if dune-common is installed
if python -m dune --help &> /dev/null; then
    # test if the build directory matches the installed dune python packages
    python -m dune checkbuilddirs @PROJECT_NAME@ @CMAKE_BINARY_DIR@
    RESULT=$?
    if [ $RESULT -ne 0 ] ; then
        echo "Dune python package could not check build directories"
        exit $RESULT
    fi
fi

# execute the command
"$@"
```
But perhaps someone with more insight into `checkbuilddirs` can give a more appropriate solution.

Assignee: Samuel Burbulla (samuel.burbulla@mathematik.uni-stuttgart.de)

Issue #312: debugallocator.hh: `operator new` not well-behaved; causes test failures with LTO (Ansgar Burchardt, 2022-07-19)
https://gitlab.dune-project.org/core/dune-common/-/issues/312

[dune-common 2.8.0 fails to build in Debian with link-time optimization enabled](https://bugs.debian.org/1015392) due to failures in testdebugallocator.

A backtrace from gdb shows:

```
Program received signal SIGFPE, Arithmetic exception.
0x0000555555556cf1 in Dune::DebugMemory::AllocationManager::allocate<char> (this=<optimized out>, n=<optimized out>) at ./dune/common/debugallocator.hh:124
124 ai.page_ptr = mmap(NULL, ai.pages * page_size,
(gdb) bt
#0 0x0000555555556cf1 in Dune::DebugMemory::AllocationManager::allocate<char> (this=<optimized out>,
n=<optimized out>) at ./dune/common/debugallocator.hh:124
#1 operator new (size=size@entry=64) at ./dune/common/debugallocator.hh:322
#2 0x00007ffff7fb0e82 in __gnu_cxx::new_allocator<unsigned long>::allocate (this=<optimized out>,
__n=8) at /usr/include/c++/11/ext/new_allocator.h:127
#3 std::allocator_traits<std::allocator<bool*> >::allocate (__n=8, __a=<synthetic pointer>...)
at /usr/include/c++/11/bits/alloc_traits.h:464
#4 std::_Deque_base<bool, std::allocator<bool> >::_M_allocate_map (__n=8,
this=0x7ffff7fc36f8 <Dune::dvverb+24>) at /usr/include/c++/11/bits/stl_deque.h:576
#5 std::_Deque_base<bool, std::allocator<bool> >::_M_initialize_map (__num_elements=0,
this=0x7ffff7fc36f8 <Dune::dvverb+24>) at /usr/include/c++/11/bits/stl_deque.h:625
#6 std::_Deque_base<bool, std::allocator<bool> >::_Deque_base (this=0x7ffff7fc36f8 <Dune::dvverb+24>)
at /usr/include/c++/11/bits/stl_deque.h:439
#7 std::deque<bool, std::allocator<bool> >::deque (this=0x7ffff7fc36f8 <Dune::dvverb+24>)
at /usr/include/c++/11/bits/stl_deque.h:834
#8 std::stack<bool, std::deque<bool, std::allocator<bool> > >::stack<std::deque<bool, std::allocator<bool> >, void> (this=<optimized out>, this=<optimized out>) at /usr/include/c++/11/bits/stl_stack.h:163
#9 0x00007ffff7fb0f42 in _sub_I_65535_0.0 ()
from /build/dune-common-GdfBws/dune-common-2.8.0/build/lib/libdunecommon.so.2.8.0
#10 0x00007ffff7fdc0ce in ?? () from /lib64/ld-linux-x86-64.so.2
#11 0x00007ffff7fdc1b0 in ?? () from /lib64/ld-linux-x86-64.so.2
#12 0x00007ffff7fcd08a in ?? () from /lib64/ld-linux-x86-64.so.2
#13 0x0000000000000001 in ?? ()
#14 0x00007fffffffeccd in ?? ()
#15 0x0000000000000000 in ?? ()
```
This is during the construction of global objects, in particular some member of `Dune::dvverb`.
It calls the custom `operator new` from "debugallocator.hh".
The relevant instruction is:
```
=> 0x0000555555556cf1 <+81>: div %r12
```
which should come from this division:
```
ai.pages = (ai.capacity) / page_size + 2;
```
`page_size` is still 0, as the global variables from the "debugallocator.cc" translation unit have not been initialized when the `Dune::dvverb` object gets constructed.
Note that the order in which global variables in different translation units are initialized is undefined according to the C++ standard, but the code currently requires a specific order (`page_size` must be initialized before anything uses the `operator new`).

Issue #311: python packaging fails with strange errors if dependencies are not available in the pypi repository (Christian Engwer, 2023-04-09)
https://gitlab.dune-project.org/core/dune-common/-/issues/311

I tried building a python package for dune-uggrid locally.
The current build infrastructure tries to install the dependencies. The problem is now that `dune-uggrid` is version 2.9 and requires `dune-common` version 2.9. My initial idea was to install `dune-common` 2.9 into my local venv using `pip install ./dune-common` which worked perfectly:
```
> pip list
Package Version
------------------ ---------------
certifi 2022.6.15
charset-normalizer 2.0.12
cmake 3.22.5
distro 1.7.0
dune-common 2.9.dev20220622
idna 3.3
Jinja2 3.1.2
MarkupSafe 2.1.1
mpi4py 3.1.3
ninja 1.10.2.3
numpy 1.22.4
packaging 21.3
pip 22.1.1
portalocker 2.4.0
pyparsing 3.0.9
requests 2.28.0
scikit-build 0.15.0
setuptools 59.6.0
urllib3 1.26.9
wheel 0.37.1
```
... but still `pip install ./dune-uggrid` tries to install `dune-common` and finds only the 2.8 wheel:
```
> pip -v install dnue-uggrid/
Using pip 22.1.1 from /home/christi/Uni/Dune/foobar/lib/python3.10/site-packages/pip (python 3.10)
Processing ./uggrid
Running command pip subprocess to install build dependencies
Collecting cmake>=3.13
Using cached cmake-3.22.5-py2.py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (22.5 MB)
Collecting dune-common
Using cached dune_common-2.8.0.post2-cp310-cp310-linux_x86_64.whl
Collecting ninja
Using cached ninja-1.10.2.3-py2.py3-none-manylinux_2_5_x86_64.manylinux1_x86_64.whl (108 kB)
Collecting pip
Using cached pip-22.1.2-py3-none-any.whl (2.1 MB)
Collecting requests
Using cached requests-2.28.0-py3-none-any.whl (62 kB)
Collecting scikit-build
Using cached scikit_build-0.15.0-py2.py3-none-any.whl (77 kB)
Collecting setuptools
Using cached setuptools-62.6.0-py3-none-any.whl (1.2 MB)
Collecting wheel
Using cached wheel-0.37.1-py2.py3-none-any.whl (35 kB)
Collecting portalocker
Using cached portalocker-2.4.0-py2.py3-none-any.whl (16 kB)
Collecting mpi4py
Using cached mpi4py-3.1.3-cp310-cp310-linux_x86_64.whl
Collecting numpy
Using cached numpy-1.22.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (16.8 MB)
Collecting urllib3<1.27,>=1.21.1
Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting idna<4,>=2.5
Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer~=2.0.0
Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
Using cached certifi-2022.6.15-py3-none-any.whl (160 kB)
Collecting packaging
Using cached packaging-21.3-py3-none-any.whl (40 kB)
Collecting distro
Using cached distro-1.7.0-py3-none-any.whl (20 kB)
Collecting pyparsing!=3.0.5,>=2.0.2
Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Installing collected packages: ninja, cmake, wheel, urllib3, setuptools, pyparsing, portalocker, pip, numpy, mpi4py, idna, distro, charset-normalizer, certifi, requests, packaging, dune-common, scikit-build
```
The build will then later fail with a `cmake` error, as it only finds `dune-common` version 2.8 and not version 2.9.

Is this intended behaviour? Or how should I properly handle this local install?

Issue #309: PoolAllocator is not an allocator (Carsten Gräser, 2022-06-08)
https://gitlab.dune-project.org/core/dune-common/-/issues/309

The `PoolAllocator` class does not satisfy the allocator requirements of the standard. More specifically, the requirements for `A` to be an allocator include:
Assume that `B` is obtained by rebinding `A` to another type and that `a` is an object of type `A`. Then, after constructing a `B` using `B b(a);`, it must hold that `A(b) == a` and `B(a) == b`. This equality constraint especially implies that you can deallocate memory allocated by another instance that compares equal. Hence all allocators obtained in this way from a single allocator essentially must have shared state.
In contrast, different `PoolAllocator` objects will never compare equal, because each one manages its own pool and they never share state. Besides not meeting the requirements, this also prohibits many use cases. E.g. `std::allocate_shared` (creation of a `shared_ptr` with an allocator) will always use a copy of the provided allocator; thus using a `PoolAllocator` in `std::allocate_shared` will construct a new internal pool for every allocation.

Issue #308: MPICommunicator does not allow for size adjustment in async communication (Christian Engwer, 2022-06-02)
https://gitlab.dune-project.org/core/dune-common/-/issues/308

`MPICommunicator::rrecv` allows passing a buffer with insufficient size. It first uses `MPI_Mprobe` and `MPI_Get_count` to extract the size info from the message, then resizes the buffer, and only then actually performs the `MPI_Recv`.
It would be very helpful to have a similar feature for `irecv`.
The approach would roughly look as follows:
- use `MPI_Iprobe` to check for the message in an async fashion
- return a corresponding future
- after `future.ready()` is true the data can be received, e.g. in `future.get()` or any other polling call
- with the `MPI_Iprobe`/`MPI_Get_count` results we resize the buffer and perform a synchronous `MPI_Recv`.

Issue #300: GenerateTypeName is fragile (Christian Engwer, 2023-04-09)
https://gitlab.dune-project.org/core/dune-common/-/issues/300

The `GenerateTypeName` infrastructure in `typeregistry.hh` requires detailed knowledge about the details of the C++ type. Providing these manually is error-prone, see e.g. the global basis in dune-functions (dune-functions#70).
The global basis defines a `DiscreteGlobalBasisFunction`, but sets the type name to `GlobalBasis::DiscreteFunction`. When trying to generate C++ code that can operate on such a discrete function, we get compiler errors, because there is no typename `GlobalBasis::DiscreteFunction` in the concrete `GlobalBasis`.
Couldn't we make `GenerateTypeName` more robust by using `className` to directly extract the correct type name?
The problem is that `GenerateTypeName` is hardly documented, so I'm unsure about the implications of such a change.

Issue #299: make doc fails to build latex docs (Markus Blatt, 2023-02-12)
https://gitlab.dune-project.org/core/dune-common/-/issues/299

Not sure whether it's me, my system, or DUNE, but I get latex compile errors during `make doc`:
```
[ 5%] Building PDF from communication.tex...
Rc files read:
/etc/LatexMk
/home/mblatt/src/dune/current/dune-common/build-cmake/doc/comm/doc_comm_communication_tex.latexmkrc
Latexmk: This is Latexmk, John Collins, 29 September 2020, version: 4.70b.
Set environment variable BIBINPUTS='/home/mblatt/src/dune/current/dune-common/build-cmake/doc/comm:'
Set environment variable TEXINPUTS='/home/mblatt/src/dune/current/dune-common/build-cmake/doc/comm:'
Latexmk: applying rule 'bibtex /home/mblatt/src/dune/current/dune-common/build-cmake/doc/comm/communication'...
Rule 'bibtex /home/mblatt/src/dune/current/dune-common/build-cmake/doc/comm/communication': The following rules & subrules became out-of-date:
'bibtex /home/mblatt/src/dune/current/dune-common/build-cmake/doc/comm/communication'
------------
Run number 1 of rule 'bibtex /home/mblatt/src/dune/current/dune-common/build-cmake/doc/comm/communication'
------------
For rule 'bibtex /home/mblatt/src/dune/current/dune-common/build-cmake/doc/comm/communication', running '&run_bibtex( )' ...
------------
Running '/usr/bin/bibtex "/home/mblatt/src/dune/current/dune-common/build-cmake/doc/comm/communication.aux"'
------------
/usr/bin/bibtex: Not writing to /home/mblatt/src/dune/current/dune-common/build-cmake/doc/comm/communication.blg (openout_any = p).
I couldn't open file name `/home/mblatt/src/dune/current/dune-common/build-cmake/doc/comm/communication.blg'
Latexmk: Errors, so I did not complete making targets
Collected error summary (may duplicate other messages):
bibtex /home/mblatt/src/dune/current/dune-common/build-cmake/doc/comm/communication: Could not open bibtex log file for '/home/mblatt/src/dune/current/dune-common/build-cmake/doc/comm/communication'
Latexmk: Use the -f option to force complete processing,
unless error was exceeding maximum runs, or warnings treated as errors.
make[3]: *** [doc/comm/CMakeFiles/doc_comm_communication_tex.dir/build.make:77: doc/comm/CMakeFiles/doc_comm_communication_tex] Fehler 12
make[2]: *** [CMakeFiles/Makefile2:1977: doc/comm/CMakeFiles/doc_comm_communication_tex.dir/all] Fehler 2
make[1]: *** [CMakeFiles/Makefile2:699: CMakeFiles/doc.dir/rule] Fehler 2
```

Issue #294: remove pybind11 sources from dune-common (Andreas Dedner, 2024-02-22)
https://gitlab.dune-project.org/core/dune-common/-/issues/294

Background
----------
At this point in time the `pybind11` headers are included in `dune-common`. The reason for that was that `pybind11` was very actively developed and incompatible changes were not uncommon at the beginning, so we wanted control over the version being used. That is not much of an issue anymore.
Suggestion
----------
Remove the package from dune-common.
Have a cmake find with a required version for pybind11 (equal or >=). If that is not found on the system, we can install pybind11 into the venv; this can be deactivated by the user via some flag (or the user can install a newer version by hand and have dune install the required version into the venv by setting some flag). For the packages I would suggest having the installation done by dune if the package is not found.
Related: discussion in !811 and the discussion in dune-istl!386.

Issue #291: Fix version numbers for dune package requirements (Andreas Dedner, 2022-01-27)
https://gitlab.dune-project.org/core/dune-common/-/issues/291

At the moment we (mostly) don't fix specific versions for the `PythonRequirements` in `dune.module` (except for `pip>=21` and `setuptools>=41`). I just ran into the problem that the binding code didn't work correctly with a newer `numpy` version due to a change in the `dtype.format` value (changed from `Q` to `<Q`).
Should we fix versions (or at least ranges if possible) for those packages?
**Advantage**: code will be more stable, with fewer issues reported by users.
**Disadvantage**: more difficult to mix code. E.g., we now have some projects with students mixing `fenics` and `dune` or using `petsc4py`; if those packages were to set other version requirements (which would actually be fine for dune), then that becomes impossible.
A possible compromise would be to add specific versions of the required packages to the internal venv but not to an active external venv, making it clear that we support stability with respect to package versions _only_ when using the internal venv. But that might look inconsistent.

Issue #290: [MPI] MPICommunication should clone the communicator in the constructor (Robert K, 2022-01-10)
https://gitlab.dune-project.org/core/dune-common/-/issues/290

As discussed in !1068, the MPICommunication class (https://gitlab.dune-project.org/core/dune-common/-/blob/master/dune/common/parallel/mpicommunication.hh#L110) in dune-common should in my opinion clone (`MPI_Comm_dup`) the communicator passed in the constructor. Otherwise this can lead to some serious deadlock problems, for example when using barriers or Send/Recv without tagging.

Issue #289: python tests should show up as skipped if python is missing (Christian Engwer, 2023-03-01)
https://gitlab.dune-project.org/core/dune-common/-/issues/289

The usual behaviour of tests in DUNE is that they are listed as skipped if certain prerequisites are not met.
Skipped python tests, on the other hand, don't show up in the logs at all. To better judge the coverage, I suggest modifying the `dune_add_python_test` macro such that, if python is not available, tests are listed as skipped.

Issue #282: [python] add a 'test' command to (all) `dune.foo.__main__` scripts (Andreas Dedner, 2022-02-28)
https://gitlab.dune-project.org/core/dune-common/-/issues/282

To test packaging better, we could add a `test` command to the `__main__.py` scripts of each module. A CI pipeline could then run `dunepackaging`, pip install the generated package, and then run `python -m dune.foo test` to check that the package can be used. At the moment, packages are just tested with the available tutorials (grid and fem), so certain packages are not really tested if they are not used there.

Issue #281: [python] install pybind11 from pypi (Andreas Dedner, 2023-03-01)
https://gitlab.dune-project.org/core/dune-common/-/issues/281

Currently we ship the `pybind11` headers with `dune-common`, which means they have to be upgraded occasionally. A possible approach would be to add `pybind11==version` to the `Python-requires` list in `dune.module`, which would install a working version of `pybind11` into the venv. The location of the headers ends up being `pythonEnv/lib/pythonX.X/site-packages/pybind11/include/pybind11`, so this approach would require adding that to the general include path (also for `dune-py`).
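Assuming `pybind11` is pip-installed into the venv, a build system could query the include flags programmatically; a minimal sketch (prints nothing useful if the package is missing):

```python
import subprocess
import sys

# Ask the installed pybind11 package for its include flags, the same way a
# build system wrapper could do it.
proc = subprocess.run(
    [sys.executable, "-m", "pybind11", "--includes"],
    capture_output=True,
    text=True,
)
print(proc.stdout.strip())
```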
After installation this path is the output of calling `python -m pybind11 --includes`.

Issue #279: Python packages should not be installed during configure and make (Timo Koch, 2023-04-09)
https://gitlab.dune-project.org/core/dune-common/-/issues/279

The usual procedure for installing C++-based projects is configure, make, install. With the new Python bindings enabled per default there is some IMO __unexpected behaviour__:
1. During `configure` (e.g. `dunecontrol configure`) Python packages (like `jinja2`, `numpy`) are silently installed into the virtual environment (external if a venv is activated, internal in dune-common if not). This doesn't even produce output.
2. During `make all` (e.g. `dunecontrol make`) Python packages are installed into the virtual environment. This time it's the Dune packages depending on the Python bindings.
3. During `make install_python` packages get installed globally (system-wide) even if a virtual env is activated. EDIT: This only happens when the internal venv is used.
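As an aside, the decision between the external and the internal venv described above could be based on a standard interpreter check; a minimal sketch (not the actual dune build logic):

```python
import sys

# In a virtual environment sys.prefix points at the venv, while
# sys.base_prefix still points at the base interpreter; outside a venv
# the two are equal.
in_venv = sys.prefix != sys.base_prefix
print("external venv active:", in_venv)
```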
__Expected behaviour__:
1. `configure` only configures (and doesn't create a venv)
2. `make all` only builds (also building Python bindings (the C++ part) that are enabled per default now)
3. `make install_python` installs Python packages (system-wide may be the default, but if a venv is activated it would be nice to install into it per default)
4. `make install_python_editable` installs Python packages editable instead (would be nice to have)
__Some current issues__
1. It currently doesn't seem possible to just configure and build without installing Python modules. This requires an internet connection (ok, it prints some warnings and skips the installation if there is no internet, I think). I might want to just build my code locally without an internet connection and compile some C++ stuff. Why do I need to install Python bindings for that? Or I might want to start my own virtual env after running configure; then the packages are installed in a now useless internal venv. Why not wait?
2. Or in a CI pipeline, I might want to separate Python and C++ testing in different build jobs where the C++ job doesn't need to know anything about the Python job.
3. I might want to install my Python modules system-wide with `make install_python`. However, per default everything always gets installed editable inside the internal venv during `make` already, although this is completely unnecessary in that case.
4. In the case of internal venv the build dependency / flow of information is inverted. Usually, downstream modules depend on upstream modules. Now, the dune-common build directory contains information from all downstream modules.
5. Dune now (at least on Ubuntu) requires installing `python-venv`, otherwise dunecontrol fails.
Building the Python C++ binding modules per default is of course ok. But why is the __installation__ of Python packages not optional anymore (like in Dune 2.8)? Also, is there a reason not to delay the installation until the installed Python packages are actually needed?
__Suggestion__:
1. Don't create an internal virtual env during configure or build/make. If an external venv is activated, use that as the default installation destination; if not, install system-wide per default (like for C++). (_Alternatively, default-install into an internal, documented location like now, but then `make install_python` should create the venv and install there, not `configure` or `make all`._)
2. Configure only configures (includes configuration of Python packages which depend on CMake information), and one can pass an installation path for the Python packages (if you don't want to install system-wide or in an external activated venv)
3. `make all` only builds libraries (no C++ tests, and no Python installation)
4. `make install_python_editable` installs Python packages into configured destination in editable mode using `pip`
5. `make install_python` installs Python packages into configured destination in normal mode using `pip`
6. `make install`? I have no idea what this currently does in combination with Python packages.
I think this would make things simpler to implement as well as more transparent, more consistent, and closer to what users expect.
__Question__:
Is the following workflow already possible (by setting some variables) and I just don't know it?
1. disable creation of internal venv and provide path to external venv via CMake (that is not activated yet)
2. run `dunecontrol all` which _only_ configures and builds _without_ installing like in 2.8 with Python bindings enabled
3. run some install command that installs the Python modules and dependencies into the provided path (possibly in editable mode)
I think this is at least one reasonable workflow that should be supported.
__User perspective: suggestion vs status quo__:
I think in the minimal case the following would be possible: (Assuming I cloned all modules into a directory)
* Currently (master), to run a python script with dune is as simple as:
- `./dune-common/bin/dunecontrol --opts=my.opts all`
- `source ./dune-common/build-cmake/dune-env/bin/activate`
- `python my_python_test.py`
* With what is suggested here, I believe, the same could be achieved with:
- `./dune-common/bin/dunecontrol --opts=my.opts all`
- `./dune-common/bin/dunecontrol make install_python_editable`
- `python my_python_test.py`
__Related__:
* Even with `DUNE_ENABLE_PYTHONBINDINGS=0` a virtual env is installed. It would be nice to have one variable to turn off all the Python stuff (also see #293).