PETSc version 3.16.6
DMDAGetProcessorSubsets
Returns communicators consisting only of the processes in a DMDA adjacent in a particular dimension, corresponding to a logical plane in a 3D grid or a line in a 2D grid.
Synopsis
#include "petscdmda.h"
PetscErrorCode DMDAGetProcessorSubsets(DM da, DMDirection dir, MPI_Comm *subcomm)
Collective on da
Input Parameters
  da      - the distributed array
  dir     - Cartesian direction, either DM_X, DM_Y, or DM_Z
Output Parameter
  subcomm - new communicator
Notes
This routine is useful for distributing one-dimensional data in a tensor product grid.
After use, subcomm should be freed with MPI_Comm_free().
Not supported from Fortran
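A minimal usage sketch, following the synopsis above. The grid dimensions, boundary types, and stencil options passed to DMDACreate2d are illustrative choices, not taken from this page:

```c
#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM             da;
  MPI_Comm       subcomm;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);
  if (ierr) return ierr;

  /* Create an 8x8 2D DMDA with 1 dof and a star stencil of width 1;
     the process decomposition is left to PETSc (PETSC_DECIDE). */
  ierr = DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                      DMDA_STENCIL_STAR, 8, 8, PETSC_DECIDE, PETSC_DECIDE,
                      1, 1, NULL, NULL, &da);CHKERRQ(ierr);
  ierr = DMSetUp(da);CHKERRQ(ierr);

  /* Obtain the communicator containing the processes that share
     this process's line in the x direction. */
  ierr = DMDAGetProcessorSubsets(da, DM_X, &subcomm);CHKERRQ(ierr);

  /* ... perform line-wise collective operations on subcomm ... */

  /* Per the Notes above, the caller must free the communicator. */
  MPI_Comm_free(&subcomm);
  ierr = DMDestroy(&da);CHKERRQ(ierr);
  return PetscFinalize();
}
```

Run with, e.g., `mpiexec -n 4 ./ex`; each process then belongs to a subcommunicator spanning the ranks in its row of the logical process grid.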
Level
advanced
Location
src/dm/impls/da/dasub.c
Examples
src/dm/tutorials/ex51.c