#include "petscdmplex.h" #include "petscdmplex.h" PetscErrorCode DMPlexBuildFromCellListParallel(DM dm, PetscInt numCells, PetscInt numVertices, PetscInt NVertices, PetscInt numCorners, const PetscInt cells[], PetscSF *vertexSF)
Input Parameters:
  dm          - The DM
  numCells    - The number of cells owned by this process
  numVertices - The number of vertices owned by this process, or PETSC_DECIDE
  NVertices   - The global number of vertices, or PETSC_DECIDE
  numCorners  - The number of vertices for each cell
  cells       - An array of numCells*numCorners numbers, the global vertex numbers for each cell

Output Parameter:
  vertexSF    - (Optional) SF describing complete vertex ownership
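Below is a minimal sketch of where this routine typically sits in DMPlex setup; the helper name, the fixed 2D dimension, and the immediate PetscSFDestroy() are illustrative assumptions, not part of this manual page.

  #include <petscdmplex.h>

  /* Illustrative helper (name is an assumption): create a DMPLEX DM and build its
     topology from a parallel cell list. The mesh is assumed to be 2D here. */
  static PetscErrorCode BuildPlexFromCells(MPI_Comm comm, PetscInt numCells, PetscInt numVertices,
                                           PetscInt NVertices, PetscInt numCorners,
                                           const PetscInt cells[], DM *dm)
  {
    PetscSF        vertexSF;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = DMCreate(comm, dm);CHKERRQ(ierr);
    ierr = DMSetType(*dm, DMPLEX);CHKERRQ(ierr);
    ierr = DMSetDimension(*dm, 2);CHKERRQ(ierr);     /* topological dimension of the cells */
    ierr = DMPlexBuildFromCellListParallel(*dm, numCells, numVertices, NVertices, numCorners, cells, &vertexSF);CHKERRQ(ierr);
    ierr = PetscSFDestroy(&vertexSF);CHKERRQ(ierr);  /* keep vertexSF instead if vertex ownership is needed later */
    PetscFunctionReturn(0);
  }

Coordinates and interpolation (faces and edges) are typically added afterwards, for example with DMPlexBuildCoordinatesFromCellListParallel() and DMPlexInterpolate().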
Two triangles sharing a face

        2
      / | \
     /  |  \
    /   |   \
   0  0 | 1  3
    \   |   /
     \  |  /
      \ | /
        1

would have input
numCells = 2, numVertices = 4
cells = [0 1 2 1 3 2]
which would result in the DMPlex
        4
      / | \
     /  |  \
    /   |   \
   2  0 | 1  5
    \   |   /
     \  |  /
      \ | /
        3
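For illustration only (this particular distribution is an assumption, not taken from this manual page), the mesh above could be supplied from exactly two ranks, each passing one triangle and owning two of the four vertices, using the BuildPlexFromCells helper sketched earlier:

  #include <petscdmplex.h>

  int main(int argc, char **argv)
  {
    DM             dm;
    PetscMPIInt    rank;
    PetscErrorCode ierr;
    const PetscInt cells0[] = {0, 1, 2};   /* triangle passed by rank 0 */
    const PetscInt cells1[] = {1, 3, 2};   /* triangle passed by rank 1 */

    ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
    ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
    /* Per rank: numCells = 1, numVertices = 2; globally: NVertices = 4, numCorners = 3 */
    ierr = BuildPlexFromCells(PETSC_COMM_WORLD, 1, 2, 4, 3, rank ? cells1 : cells0, &dm);CHKERRQ(ierr);
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }

Run with exactly two MPI ranks; rank 0 then owns vertices {0,1} and rank 1 owns {2,3}, while each rank's cell also references a vertex owned by the other rank, which is what the optional vertexSF records.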
Vertices are implicitly numbered consecutively 0, ..., NVertices-1. Each rank owns a chunk of numVertices consecutive vertices. If numVertices is PETSC_DECIDE, PETSc distributes them as evenly as possible using PetscLayout. If both NVertices and numVertices are PETSC_DECIDE, NVertices is computed by PETSc as the maximum vertex index appearing in cells plus 1. If only NVertices is PETSC_DECIDE, it is computed as the sum of numVertices over all ranks.
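As a hedged fragment (assuming dm, numCells, numCorners, cells, vertexSF, and ierr are set up as in the sketches above), both vertex counts can be left to PETSc:

  /* numVertices = PETSC_DECIDE: the global vertices are split as evenly as possible
     via PetscLayout; with NVertices = PETSC_DECIDE as well, the global count is
     inferred as the maximum vertex index appearing in cells plus 1. */
  ierr = DMPlexBuildFromCellListParallel(dm, numCells, PETSC_DECIDE, PETSC_DECIDE, numCorners, cells, &vertexSF);CHKERRQ(ierr);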
The cell distribution is arbitrary and non-overlapping, and is independent of the vertex distribution.
Not currently supported in Fortran.