#include "petscksp.h" PetscErrorCode PCMGSetLevels(PC pc,PetscInt levels,MPI_Comm *comms)Logically Collective on PC
Input Parameters:
pc     - the preconditioner context
levels - the number of levels
comms  - optional communicators for each level; this allows solving the coarser problems on smaller sets of processes. For processes that are not included in the computation at a given level you must pass MPI_COMM_NULL.
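For the common case where every level runs on the full set of processes, comms can simply be NULL. A minimal sketch of such a call, assuming a recent PETSc with PetscCall() (older versions use CHKERRQ()):

#include <petscksp.h>

int main(int argc, char **argv)
{
  KSP ksp;
  PC  pc;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCMG));
  /* three multigrid levels, all on pc's communicator (comms = NULL) */
  PetscCall(PCMGSetLevels(pc, 3, NULL));
  PetscCall(KSPDestroy(&ksp));
  PetscCall(PetscFinalize());
  return 0;
}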
You can free the communicators in comms after this routine is called.
The array of MPI communicators must contain MPI_COMM_NULL for those ranks that do not participate in the solve at a given level. For example, with 2 levels and 1 and 2 ranks on the coarse and fine levels respectively, rank 0 in the original communicator will pass in an array of 2 communicators of sizes 1 and 2, while rank 1 in the original communicator will pass in an array of 2 communicators whose first entry is MPI_COMM_NULL, since rank 1 does not participate in the coarse grid solve, and whose second entry is a communicator of size 2. A sketch of how to build such an array follows.
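One way to build the array for this 2-level, 2-rank example is MPI_Comm_split() with color MPI_UNDEFINED on the non-participating rank, which returns MPI_COMM_NULL there. This sketch assumes exactly 2 ranks in PETSC_COMM_WORLD and an already-configured PC pc of type PCMG; recall that level 0 is the coarsest level:

MPI_Comm    comms[2];
PetscMPIInt rank;

PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));
/* level 0 (coarse): only rank 0 participates; MPI_UNDEFINED yields MPI_COMM_NULL on rank 1 */
PetscCallMPI(MPI_Comm_split(PETSC_COMM_WORLD, rank == 0 ? 0 : MPI_UNDEFINED, 0, &comms[0]));
/* level 1 (fine): all ranks participate */
PetscCallMPI(MPI_Comm_dup(PETSC_COMM_WORLD, &comms[1]));
PetscCall(PCMGSetLevels(pc, 2, comms));
/* the communicators may be freed now (see the note above) */
if (comms[0] != MPI_COMM_NULL) PetscCallMPI(MPI_Comm_free(&comms[0]));
PetscCallMPI(MPI_Comm_free(&comms[1]));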
Since each coarser level may have a new MPI_Comm with fewer ranks than the previous one, special care must be taken in providing the restriction and interpolation operations. We recommend providing these as two-step operations: first perform a standard restriction or interpolation on the full number of ranks for that level, then use an MPI call to copy the resulting vector array entries (obtained with calls to VecGetArray()) to the smaller or larger number of ranks; note that in both cases the MPI calls must be made on the larger of the two communicators. Traditional MPI sends and receives, or MPI_Alltoallv(), could be used to do the reshuffling of the vector entries.
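As an illustration of the second (reshuffling) step, here is a hypothetical helper for the 2-level, 2-rank layout above. It moves the entries of an already-restricted vector rfull, which lives on the 2-rank communicator, into rcoarse, which lives on rank 0's 1-rank communicator (rank 1 may pass NULL for rcoarse). All names are illustrative, not part of the PETSc API, and PetscFunctionReturn(PETSC_SUCCESS) assumes PETSc 3.19 or later:

#include <petscvec.h>

static PetscErrorCode ShuffleToCoarse(Vec rfull, Vec rcoarse)
{
  const PetscScalar *src;
  PetscMPIInt        rank;
  PetscInt           nlocal;
  MPI_Comm           comm;

  PetscFunctionBeginUser;
  /* the MPI calls are made on the LARGER communicator, the one rfull lives on */
  comm = PetscObjectComm((PetscObject)rfull);
  PetscCallMPI(MPI_Comm_rank(comm, &rank));
  PetscCall(VecGetLocalSize(rfull, &nlocal));
  PetscCall(VecGetArrayRead(rfull, &src));
  if (rank == 1) {
    /* rank 1 owns part of rfull but none of rcoarse: ship its entries to rank 0 */
    PetscCallMPI(MPI_Send(src, (PetscMPIInt)nlocal, MPIU_SCALAR, 0, 0, comm));
  } else {
    PetscScalar *dst;
    PetscInt     ncoarse;

    PetscCall(VecGetLocalSize(rcoarse, &ncoarse));
    PetscCall(VecGetArray(rcoarse, &dst));
    /* copy the locally owned piece, then receive rank 1's piece */
    PetscCall(PetscArraycpy(dst, src, nlocal));
    PetscCallMPI(MPI_Recv(dst + nlocal, (PetscMPIInt)(ncoarse - nlocal), MPIU_SCALAR, 1, 0, comm, MPI_STATUS_IGNORE));
    PetscCall(VecRestoreArray(rcoarse, &dst));
  }
  PetscCall(VecRestoreArrayRead(rfull, &src));
  PetscFunctionReturn(PETSC_SUCCESS);
}

With more ranks per level, the same pattern generalizes by replacing the send/receive pair with MPI_Alltoallv() on the larger communicator.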