Commit 7fab262f (dune-common)
Authored 11 years ago by Benjamin Bykowski, committed 11 years ago by Oliver Sander
Added methods scatterv, gatherv and allgatherv to collectivecommunication
Parent: cb5be0f2
Showing 2 changed files with 102 additions and 0 deletions:

dune/common/parallel/collectivecommunication.hh     (+74 −0)
dune/common/parallel/mpicollectivecommunication.hh  (+28 −0)
dune/common/parallel/collectivecommunication.hh (+74 −0)
...
@@ -198,6 +198,32 @@ namespace Dune
      return 0;
    }

    /** @brief Gather arrays of variable size on root task.
     *
     * Each process sends its in array of length sendlen to the root process
     * (including the root itself). In the root process these arrays are stored in rank
     * order in the out array.
     * @param[in] in The send buffer with the data to be sent
     * @param[in] sendlen The number of elements to send on each task
     * @param[out] out The buffer to store the received data in. May have length zero on non-root
     *                 tasks.
     * @param[in] recvlen An array with size equal to the number of processes containing the number
     *                    of elements to receive from process i at position i, i.e. the number that
     *                    is passed as sendlen argument to this function in process i.
     *                    May have length zero on non-root tasks.
     * @param[in] displ An array with size equal to the number of processes. Data received from
     *                  process i will be written starting at out+displ[i] on the root process.
     *                  May have length zero on non-root tasks.
     * @param[in] root The root task that gathers the data.
     */
    template<typename T>
    int gatherv (T* in, int sendlen, T* out, int* recvlen, int* displ, int root) const
    {
      for (int i=*displ; i<sendlen; i++)
        out[i] = in[i];
      return 0;
    }
    /** @brief Scatter array from a root to all other tasks.
     *
     * The root process sends the elements with index from k*len to (k+1)*len-1 in its array to
...
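The new gatherv above mirrors MPI_Gatherv, but through the communication-agnostic CollectiveCommunication interface. For orientation only (this is not part of the commit), a usage sketch might look as follows; it assumes a CollectiveCommunication object cc, e.g. obtained via MPIHelper, and uses the pre-existing fixed-size gather() to exchange the per-rank lengths first. All names in the snippet are illustrative.

    // Hypothetical usage sketch, not part of the commit: rank r contributes
    // r+1 integers and rank 0 collects everything in rank order.
    std::vector<int> mine(cc.rank() + 1, cc.rank());
    int sendlen = static_cast<int>(mine.size());

    std::vector<int> counts(cc.size()), displ(cc.size(), 0);
    cc.gather(&sendlen, counts.data(), 1, 0);          // collect the lengths first

    std::vector<int> all;
    if (cc.rank() == 0) {
      for (int i = 1; i < cc.size(); ++i)
        displ[i] = displ[i-1] + counts[i-1];           // running offsets into 'all'
      all.resize(displ.back() + counts.back());
    }
    cc.gatherv(mine.data(), sendlen, all.data(), counts.data(), displ.data(), 0);
    // On rank 0, 'all' now holds the data of rank 0, then rank 1, ... in order.

For simplicity the sketch allocates counts and displ on every rank, although the documentation above only requires them on the root.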
@@ -218,6 +244,31 @@ namespace Dune
      return 0;
    }

    /** @brief Scatter arrays of variable length from a root to all other tasks.
     *
     * The root process sends sendlen[k] elements starting at index displ[k] of its send array
     * to task k, which stores them at index 0 to recvlen-1.
     * @param[in] send The array to scatter. May have length zero on non-root
     *                 tasks.
     * @param[in] sendlen An array with size equal to the number of processes containing the number
     *                    of elements to scatter to process i at position i, i.e. the number that
     *                    is passed as recvlen argument to this function in process i.
     * @param[in] displ An array with size equal to the number of processes. Data scattered to
     *                  process i will be read starting at send+displ[i] on the root process.
     * @param[out] recv The buffer to store the received data in. Upon completion each task
     *                  holds the portion of the root's send buffer designated for it.
     * @param[in] recvlen The number of elements in the recv buffer.
     * @param[in] root The root task that scatters the data.
     */
    template<typename T>
    int scatterv (T* send, int* sendlen, int* displ, T* recv, int recvlen, int root) const
    {
      for (int i=*displ; i<*sendlen; i++)
        recv[i] = send[i];
      return 0;
    }
    /**
     * @brief Gathers data from all tasks and distributes it to all.
     *
...
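The scatterv counterpart reverses the direction: the root prepares per-rank counts and displacements, and every rank passes the length it expects to receive. Again a hypothetical sketch, not part of the commit, using the same illustrative cc object as above:

    // Hypothetical usage sketch, not part of the commit: rank 0 hands out
    // r+1 integers to each rank r.
    int recvlen = cc.rank() + 1;
    std::vector<int> recv(recvlen);

    std::vector<int> counts, displ, send;
    if (cc.rank() == 0) {
      counts.resize(cc.size());
      displ.assign(cc.size(), 0);
      for (int r = 0; r < cc.size(); ++r) {
        counts[r] = r + 1;
        if (r > 0) displ[r] = displ[r-1] + counts[r-1];
        for (int j = 0; j < counts[r]; ++j)
          send.push_back(r);                           // payload destined for rank r
      }
    }
    cc.scatterv(send.data(), counts.data(), displ.data(), recv.data(), recvlen, 0);
    // Every rank now holds recvlen copies of its own rank number.

On the non-root ranks send, counts, and displ stay empty, which the documentation above allows since those arguments are only read on the root.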
@@ -238,6 +289,29 @@ namespace Dune
      return 0;
    }

    /**
     * @brief Gathers data of variable length from all tasks and distributes it to all.
     *
     * The block of data sent from the jth process is received by every
     * process and placed in the jth block of the buffer out.
     *
     * @param[in] in The send buffer with the data to send.
     * @param[in] sendlen The number of elements to send on each task.
     * @param[out] out The buffer to store the received data in.
     * @param[in] recvlen An array with size equal to the number of processes containing the number
     *                    of elements to receive from process i at position i, i.e. the number that
     *                    is passed as sendlen argument to this function in process i.
     * @param[in] displ An array with size equal to the number of processes. Data received from
     *                  process i will be written starting at out+displ[i].
     */
    template<typename T>
    int allgatherv (T* in, int sendlen, T* out, int* recvlen, int* displ) const
    {
      for (int i=*displ; i<sendlen; i++)
        out[i] = in[i];
      return 0;
    }
    /**
     * @brief Compute something over all processes
     * for each component of an array and return the result
...
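For allgatherv the bookkeeping is the same as for gatherv, except that every rank needs the counts and displacements and receives the full result, so the lengths are exchanged with the pre-existing allgather() first (its (in, len, out) signature is assumed here). A hypothetical sketch, not part of the commit:

    // Hypothetical usage sketch, not part of the commit.
    std::vector<double> mine(cc.rank() + 1, double(cc.rank()));
    int sendlen = static_cast<int>(mine.size());

    std::vector<int> counts(cc.size()), displ(cc.size(), 0);
    cc.allgather(&sendlen, 1, counts.data());          // every rank learns all lengths
    for (int i = 1; i < cc.size(); ++i)
      displ[i] = displ[i-1] + counts[i-1];

    std::vector<double> all(displ.back() + counts.back());
    cc.allgatherv(mine.data(), sendlen, all.data(), counts.data(), displ.data());
    // Every rank now holds the concatenation of all contributions in rank order.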
dune/common/parallel/mpicollectivecommunication.hh (+28 −0)
...
@@ -258,6 +258,15 @@ namespace Dune
                        root, communicator);
    }

    //! @copydoc CollectiveCommunication::gatherv()
    template<typename T>
    int gatherv (T* in, int sendlen, T* out, int* recvlen, int* displ, int root) const
    {
      return MPI_Gatherv(in, sendlen, MPITraits<T>::getType(),
                         out, recvlen, displ, MPITraits<T>::getType(),
                         root, communicator);
    }
    //! @copydoc CollectiveCommunication::scatter()
    //! @note out must have space for P*len elements
    template<typename T>
...
@@ -268,6 +277,16 @@ namespace Dune
                         root, communicator);
    }

    //! @copydoc CollectiveCommunication::scatterv()
    template<typename T>
    int scatterv (T* send, int* sendlen, int* displ, T* recv, int recvlen, int root) const
    {
      return MPI_Scatterv(send, sendlen, displ, MPITraits<T>::getType(),
                          recv, recvlen, MPITraits<T>::getType(),
                          root, communicator);
    }
    operator MPI_Comm () const
    {
      return communicator;
...
@@ -282,6 +301,15 @@ namespace Dune
                           communicator);
    }

    //! @copydoc CollectiveCommunication::allgatherv()
    template<typename T>
    int allgatherv (T* in, int sendlen, T* out, int* recvlen, int* displ) const
    {
      return MPI_Allgatherv(in, sendlen, MPITraits<T>::getType(),
                            out, recvlen, displ, MPITraits<T>::getType(),
                            communicator);
    }
    //! @copydoc CollectiveCommunication::allreduce(Type* inout,int len) const
    template<typename BinaryFunction, typename Type>
    int allreduce (Type* inout, int len) const
...
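Putting the MPI specialisation to work only requires the usual MPIHelper setup. Below is a compilable end-to-end sketch that is not part of the commit; the header path and the MPIHelper calls are assumed to be the standard dune-common ones of the time, not something introduced here.

    // Hypothetical end-to-end sketch, not part of the commit.
    #include <iostream>
    #include <vector>
    #include <dune/common/parallel/mpihelper.hh>

    int main(int argc, char** argv)
    {
      Dune::MPIHelper::instance(argc, argv);           // initialises MPI if available

      // Communication-agnostic collective communication object.
      Dune::CollectiveCommunication<Dune::MPIHelper::MPICommunicator> cc =
        Dune::MPIHelper::getCollectiveCommunication();

      // Each rank contributes rank+1 values; everyone receives the concatenation.
      std::vector<int> mine(cc.rank() + 1, cc.rank());
      int sendlen = static_cast<int>(mine.size());

      std::vector<int> counts(cc.size()), displ(cc.size(), 0);
      cc.allgather(&sendlen, 1, counts.data());
      for (int i = 1; i < cc.size(); ++i)
        displ[i] = displ[i-1] + counts[i-1];

      std::vector<int> all(displ.back() + counts.back());
      cc.allgatherv(mine.data(), sendlen, all.data(), counts.data(), displ.data());

      if (cc.rank() == 0)
        std::cout << "gathered " << all.size() << " values in total" << std::endl;
      return 0;
    }

Run under mpirun the variable-length blocks are concatenated in rank order; without MPI, MPIHelper provides a dummy communicator and the calls fall back to the single-process copy loops added in collectivecommunication.hh above.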