The attached file creates a 2-dimensional grid with 3x3 elements. One direction is marked periodic, but this has no effect. Also attached is the output on my machine (Linux 2.6.18-92.1.6.el5 x86_64 GNU/Linux).
To my understanding, node 0 should have the same global number as node 3, and node 4 should have the same global number as node 7, which they don't.
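For reference, here is a minimal sketch of the kind of test the attachment presumably contains (the YaspGrid constructor signature and the grid-view/range iteration API have changed between DUNE releases, so treat the exact names below as assumptions; the point is only to print the global id of every vertex of a 3x3 grid that is periodic in one direction):

#include <array>
#include <bitset>
#include <iostream>
#include <dune/common/fvector.hh>
#include <dune/common/parallel/mpihelper.hh>
#include <dune/grid/yaspgrid.hh>
#include <dune/grid/common/rangegenerators.hh>

int main(int argc, char** argv)
{
  Dune::MPIHelper::instance(argc, argv);

  const int dim = 2;
  Dune::FieldVector<double, dim> upperRight(1.0);  // unit square
  std::array<int, dim> elements = {{3, 3}};        // 3x3 cells
  std::bitset<dim> periodic;
  periodic[0] = true;                              // periodic in x-direction
  int overlap = 1;                                 // periodicity needs overlap >= 1

  Dune::YaspGrid<dim> grid(upperRight, elements, periodic, overlap);
  auto gv = grid.leafGridView();
  const auto& ids = grid.globalIdSet();

  // The report: vertices that are identified across the periodic boundary
  // (e.g. node 0 and node 3) do NOT show the same global id here.
  for (const auto& v : vertices(gv))
    std::cout << "vertex " << gv.indexSet().index(v)
              << " at " << v.geometry().corner(0)
              << " has global id " << ids.id(v) << std::endl;
}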
At first glance, the output looks perfectly correct. If you ignore periodicity for a moment, then this is exactly what you would expect, isn't it?
Now, you have to understand the way DUNE handles periodic boundaries. They are treated similarly to process borders. This means that every interior or border entity has its own index and id. The border entities are then identified in the same manner as two border entities are identified in a parallel setting, i.e., you can use communication to transport data from one border entity to another. You should also note that periodic borders are considered boundaries.
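To see this identification-via-borders in the entity classification itself, here is a small sketch (range-based iteration and the Dune::Partitions sets are newer API, so treat the exact spelling as an assumption): on a purely sequential periodic YaspGrid one already finds border vertices on the periodic boundary and overlap copies of the neighbouring cells, exactly as one would at a process border.

#include <iostream>
#include <dune/grid/common/gridenums.hh>
#include <dune/grid/common/partitionset.hh>
#include <dune/grid/common/rangegenerators.hh>

// Count how the entities of a (possibly sequential) periodic grid view are
// classified. At the periodic border one expects BorderEntity vertices and
// OverlapEntity cell copies, just as at a process border.
template<class GridView>
void printPartitionSummary(const GridView& gv)
{
  int interiorCells = 0, overlapCells = 0;
  int borderVertices = 0, otherVertices = 0;

  for (const auto& e : elements(gv, Dune::Partitions::all))
    if (e.partitionType() == Dune::OverlapEntity) ++overlapCells;
    else ++interiorCells;

  for (const auto& v : vertices(gv, Dune::Partitions::all))
    if (v.partitionType() == Dune::BorderEntity) ++borderVertices;
    else ++otherVertices;

  std::cout << "cells: " << interiorCells << " interior, "
            << overlapCells << " overlap\n"
            << "vertices: " << borderVertices << " border, "
            << otherVertices << " other" << std::endl;
}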
I was told on the mailing list that it's a bug if entities on a periodic boundary are assigned different indices instead of being automatically identified. That's why I submitted the bug report.
So you're suggesting it is expected behavior? Then, if you have to figure it out manually, what's the periodic flag for?
Could you please point me to some sample code where it is handled the way you're describing?
The point is that you don't have to figure it out manually. You simply use the grid's communication to exchange data over the periodic boundary (as you would over a process border). So you basically have to ask yourself how to handle parallelization; periodicity is then handled by the grid (though I'm not sure whether this was ever tested with YaspGrid).
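A sketch of how that communication looks in code (assuming a user-written data handle derived from Dune::CommDataHandleIF; older DUNE spells the size query fixedsize, newer fixedSize, hence both are provided): one value per vertex is gathered on one side of the border and scattered to its partner on the other side, and on a periodic grid the partners are exactly the periodically identified vertices.

#include <cstddef>
#include <vector>
#include <dune/grid/common/datahandleif.hh>

// Ships one double per vertex across the communication interface and
// accumulates it on the receiving side (a typical building block for
// assembling over process -- and hence periodic -- borders).
template<class IndexSet>
class VertexSumHandle
  : public Dune::CommDataHandleIF<VertexSumHandle<IndexSet>, double>
{
  const IndexSet& indexSet_;
  std::vector<double>& data_;   // one entry per vertex
public:
  VertexSumHandle(const IndexSet& is, std::vector<double>& d)
    : indexSet_(is), data_(d) {}

  bool contains(int dim, int codim) const { return codim == dim; } // vertices only
  bool fixedsize(int dim, int codim) const { return fixedSize(dim, codim); }
  bool fixedSize(int, int) const { return true; }

  template<class Entity>
  std::size_t size(const Entity&) const { return 1; }

  template<class Buffer, class Entity>
  void gather(Buffer& buf, const Entity& e) const
  { buf.write(data_[indexSet_.index(e)]); }

  template<class Buffer, class Entity>
  void scatter(Buffer& buf, const Entity& e, std::size_t)
  { double x; buf.read(x); data_[indexSet_.index(e)] += x; }
};

// usage sketch:
//   VertexSumHandle<typename GV::IndexSet> handle(gv.indexSet(), values);
//   gv.communicate(handle, Dune::InteriorBorder_InteriorBorder_Interface,
//                  Dune::ForwardCommunication);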
Martin, I believe you are mistaken. Consider the following case.
seq Grid global ids:
vertices:     cells:
0---1---2     |-0---1-|
The periodic grid with overlap 1 gives the following ids:
cells:
1-|-0---1-|-0
From the definition of the global id I'd expect the following ids for the vertices:
vertices:
--0---1---0--
The periodic vertices don't need to be flagged, as your communication couples vertices with the same global id; but if the global id is not the same, then you still have a problem. And why should the global id of the copied cells be the same, but not the global id of the copied vertices?
Hi Christian!
Here's some moral support. Recently I started playing around with the periodicity feature in YaspGrid, and I found out that it does not do at all what I would expect a periodic grid to do. I would expect a periodic grid to be a topological torus, with no boundary in the periodic directions. I discussed this with others and was told that periodicity really only works with overlap and the communicate routines.
In my view, this is a bad solution, because it mixes two different concepts, namely parallelism and periodicity. Either one of those makes sense without the other, but with the current semantics I am forced to use parallelism concepts when I just want a periodic grid on a single processor. Also, the fact that some nodes are expected to have the same id but different indices violates our grid specification.
If any of this is to change, it will not be before lengthy discussions. For my own research I have started to write a periodic GridView which implements all the torus topology that I need. You are right, the sequential YaspGrid doesn't help at all when doing this. However, I am not going to use communication methods for this, because I find them too unwieldy and error-prone for the task.
This is a delicate question, and I'm not sure if it has really been addressed. From what I have been told or could find in the documentation, (global) ids should be unique within the interior-border partition.
As far as I understood the philosophy, we allow viewing a periodic grid as a nonperiodic one. Therefore, we need to be able to attach different data to vertex 0 and vertex 2 in your example. But this means we need different ids.
Your cell diagram is completely correct, of course. Moreover, I would understand that both cells with id 0 return the same subIds for their vertices. This means that the right border vertex (2 in the nonperiodic case) would have id 0 if you ask the overlap cell. As far as I know, this is consistent with the current implementation of YaspGrid.
In any case, I understand that this was discussed at a developer meeting and that some kind of concept exists. It would be helpful if someone could attach it to this flyspray task.
in your view of a topological torus, what coordinate should the rightmost vertex in Christian's example have? From what you're saying I would expect 0.0 instead of 1.0.
Hi Martin!
In my view, the rightmost vertex would be the same as the leftmost one. Not just a copy, but the very same entity. Therefore you would indeed get the coordinate 0.0, but since we're on a circle, all coordinates only make sense modulo your periodicity length anyway. So in this case 0.0 == 1.0.
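To state that identification in code, a trivial sketch (plain C++, nothing DUNE-specific; the helper name is made up): a coordinate along a periodic direction is only meaningful after reduction modulo the period, so 1.0 and 0.0 name the same point on the unit circle.

#include <cassert>
#include <cmath>

// Map a coordinate along a periodic direction into the fundamental
// interval [0, period); with period == 1.0 this sends 1.0 to 0.0.
double wrap(double x, double period)
{
  double y = std::fmod(x, period);
  return (y < 0.0) ? y + period : y;
}

int main()
{
  assert(wrap(1.0, 1.0) == 0.0);    // the rightmost vertex is the leftmost one
  assert(wrap(-0.25, 1.0) == 0.75);
}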
Hi Oliver,
this is a possible way of seeing periodicity. But it is a little too restrictive in my opinion. What about the Moebius strip? Do we want to be able to represent this in DUNE?
Why not? (Klein bottles?) Off the top of my head I don't see a reason why not. The advantage of this point of view is that it is conceptually clear what is meant by periodicity. And then, if you want to distribute your periodic grid over several processors, you can still add overlaps and exchange data via communicators.
Note that it would be an obvious choice to make the periodic boundary a processor boundary, but you don't actually have to do that.
I recognize, though, that for some applications you need to be able to find out whether a given intersection is on a periodic boundary. I propose to add suitable flags to the intersection class (like isPeriodic()...)
I meant a topological Moebius strip (something like YaspGrid<2> with one periodic direction that is glued inversely).
By the way: isPeriodic is redundant for now because the edge belonging to a periodic intersection has different ids from each side.
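For illustration, a sketch of how one could inspect this today (assuming, as described above, that a periodic border intersection reports boundary() == true while the overlap provides a neighbor, so neighbor() is also true; note that inside()/outside() return entities directly only in recent DUNE versions, entity pointers in older ones):

#include <iostream>
#include <dune/grid/common/rangegenerators.hh>

// gv: leaf grid view of a periodic grid, grid: the grid itself.
// Prints the global id of the codim-1 subentity (the edge in 2D) of every
// periodic intersection, as seen from the inside and the outside element.
template<class GridView, class Grid>
void inspectPeriodicIntersections(const GridView& gv, const Grid& grid)
{
  const auto& ids = grid.globalIdSet();
  for (const auto& element : elements(gv))
    for (const auto& is : intersections(gv, element))
    {
      if (is.boundary() && is.neighbor())   // periodic border intersection
      {
        auto inside  = is.inside();
        auto outside = is.outside();
        auto idInside  = ids.subId(inside,  is.indexInInside(),  1);
        auto idOutside = ids.subId(outside, is.indexInOutside(), 1);
        std::cout << "periodic intersection: edge id from inside " << idInside
                  << ", from outside " << idOutside
                  << (idInside == idOutside ? " (identified)" : " (different)")
                  << std::endl;
      }
    }
}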
Ok, but back to your proposal and Christian's example. You're saying that I have two cells with the following corners: 0.0, 0.5 and 0.5, 0.0. How do I build the correct geometries with this information? Or am I not allowed to view the vertex coordinates as a function on the grid? This would mean that we disallow periodic, parametric grids, wouldn't it?
Of course isPeriodic is not necessary now, but in my view the edge belonging to a periodic intersection has the same id, because it is the same edge.
Your second element goes from 0.5 to 1, and this is how you implement the methods global(), jacobianInverseTransposed(), local(), etc. Only, since we're on a torus, 1 == 0, and if you ask the right vertex itself about its position, it will return 0.
I think I have learned from previous discussions that there are at least two possible applications for periodic grids, and both expect a slightly different behaviour. In my application two coordinates are angles, and therefore the configuration space really is a torus. But periodic media are slightly different. I propose to discuss this further at the next meeting.
I fixed the bug that caused the communicate method to crash in the sequential periodic case (it might even have crashed in some parallel cases if one processor had to send more than one message to the same remote processor).
Christian Engwer was right that the global ids for vertices are not those that one would expect (i.e., vertices that are identified on the periodic boundary do not have the same global id). I cannot fix this at the moment, BUT I have a workaround that uses one communication step to compute the correct global ids, which then have to be stored as user data. I am happy to share this code with anyone who needs it.
Regarding Oliver's remarks concerning the concept of periodic boundaries in YaspGrid: from a sequential perspective you are right that one would like to have periodic boundaries work differently. BUT once you write parallel code for periodic boundaries using iterative solvers, you will find that the way it is done in YaspGrid is quite natural, because your code runs in parallel, with and without periodic boundaries, with almost zero change. However, I never thought enough about P1 elements with periodic boundaries; maybe it gets more difficult then...
File periodicworkaround.cc shows my workaround to compute correct global ids for the vertices of a periodic grid. The idea is to allocate a vector vertexid of global ids. Then communicate is used to compute the minimum of all ids given to a vertex. Of course, YaspGrid should be corrected to deliver correct ids. However, that is not so easy to do in the underlying on-the-fly implementation.
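For completeness, here is a sketch of that idea (an illustrative reimplementation, not the attached periodicworkaround.cc; class and function names are made up, and the CommDataHandleIF method spellings differ between DUNE releases): every vertex is first given its own global id, and one communicate() pass replaces it by the minimum over all copies, so that periodically identified vertices end up with the same value.

#include <algorithm>
#include <cstddef>
#include <vector>
#include <dune/grid/common/datahandleif.hh>
#include <dune/grid/common/rangegenerators.hh>

// Reduces the stored per-vertex id to the minimum over all copies of a vertex.
template<class IndexSet, class IdType>
class MinVertexIdHandle
  : public Dune::CommDataHandleIF<MinVertexIdHandle<IndexSet, IdType>, IdType>
{
  const IndexSet& indexSet_;
  std::vector<IdType>& vertexid_;
public:
  MinVertexIdHandle(const IndexSet& is, std::vector<IdType>& v)
    : indexSet_(is), vertexid_(v) {}

  bool contains(int dim, int codim) const { return codim == dim; } // vertices only
  bool fixedsize(int dim, int codim) const { return fixedSize(dim, codim); }
  bool fixedSize(int, int) const { return true; }

  template<class Entity>
  std::size_t size(const Entity&) const { return 1; }

  template<class Buffer, class Entity>
  void gather(Buffer& buf, const Entity& e) const
  { buf.write(vertexid_[indexSet_.index(e)]); }

  template<class Buffer, class Entity>
  void scatter(Buffer& buf, const Entity& e, std::size_t)
  {
    IdType other; buf.read(other);
    IdType& mine = vertexid_[indexSet_.index(e)];
    mine = std::min(mine, other);
  }
};

// Driver: seed vertexid with the grid's (per-copy) global ids, then let the
// communication identify the copies by taking the minimum.
template<class Grid, class GridView, class IdType>
void computePeriodicVertexIds(const Grid& grid, const GridView& gv,
                              std::vector<IdType>& vertexid)
{
  const auto& ids = grid.globalIdSet();
  vertexid.resize(gv.size(GridView::dimension));   // one entry per vertex
  for (const auto& v : vertices(gv))
    vertexid[gv.indexSet().index(v)] = ids.id(v);

  MinVertexIdHandle<typename GridView::IndexSet, IdType>
      handle(gv.indexSet(), vertexid);
  // All_All so that every copy of a vertex sees the ids of all other copies.
  gv.communicate(handle, Dune::All_All_Interface, Dune::ForwardCommunication);
}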
Peter fixed something back in 2008, but nobody seems to have been interested in this issue for almost three years. Is this still medium severity? Should we start a proper discussion about periodicity in a separate task?