:orphan:

# PCMPIServerEnd

ends the server that runs on the rank != 0 MPI processes, which waits to process requests for parallel `KSP` solves and to manage parallel `KSP` objects.

## Synopsis

```
PetscErrorCode PCMPIServerEnd(void)
```

Logically Collective on all MPI ranks except rank 0

## Note

The server is normally ended automatically in `PetscFinalize()` when the corresponding option is provided.

## See Also

`PCMPIServerBegin()`, `PCMPI`

## Level

developer

## Location

src/ksp/pc/impls/mpi/pcmpi.c
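
## Example Usage (sketch)

A minimal sketch of bracketing user code with the server calls when they are not handled automatically by `PetscInitialize()`/`PetscFinalize()`. The control flow assumed here, that rank 0 returns from `PCMPIServerBegin()` and continues running the user code while the other ranks act as the server until rank 0 calls `PCMPIServerEnd()`, is inferred from this page and from `PCMPIServerBegin()`; it is not a verbatim excerpt from the PETSc sources.

```
#include <petscksp.h>

int main(int argc, char **argv)
{
  PetscFunctionBeginUser;
  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* Assumed pattern: ranks != 0 become the server here, waiting to process
     requests for parallel KSP solves issued from rank 0. */
  PetscCall(PCMPIServerBegin());

  /* Assumed: only rank 0 continues past this point; it creates KSP/PC
     objects (for example of type PCMPI) and performs the solves, with the
     server ranks participating in the parallel work. */

  /* End the server; required here because it is not being ended
     automatically in PetscFinalize(). */
  PetscCall(PCMPIServerEnd());

  PetscCall(PetscFinalize());
  return 0;
}
```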