:orphan:

# MatCreateMPIAIJSumSeqAIJ

Creates a `MATMPIAIJ` matrix by adding together the sequential matrices from each MPI process

## Synopsis

```
#include "petscmat.h"
PetscErrorCode MatCreateMPIAIJSumSeqAIJ(MPI_Comm comm, Mat seqmat, PetscInt m, PetscInt n, MatReuse scall, Mat *mpimat)
```

Collective

## Input Parameters

- ***comm -*** the communicator the parallel matrix will live on
- ***seqmat -*** the input sequential matrix on each MPI process
- ***m -*** number of local rows (or `PETSC_DECIDE`)
- ***n -*** number of local columns (or `PETSC_DECIDE`)
- ***scall -**** either `MAT_INITIAL_MATRIX` or `MAT_REUSE_MATRIX`

## Output Parameter

- ***mpimat -*** the generated parallel matrix

## Note

The dimensions of the sequential matrix MUST be the same on every MPI process.

The input `seqmat` is placed in the container "Mat_Merge_SeqsToMPI" and will be destroyed when `mpimat` is destroyed. Call `PetscObjectQuery()` to access `seqmat`.

## See Also

[](ch_matrices), `Mat`, `MatCreateAIJ()`

## Level

advanced

## Location

src/mat/impls/aij/mpi/mpiaij.c
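
## Example Usage

A minimal sketch of how this routine might be called, assuming every rank builds an identical 3x3 diagonal `MATSEQAIJ` matrix whose per-rank copies are then summed into one parallel matrix. The matrix size, values, and use of `PETSC_DECIDE` are illustrative assumptions, not part of this manual page.

```
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat      seqmat, mpimat;
  PetscInt i, n = 3; /* illustrative size; must be identical on every rank */

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* Each rank builds its own sequential AIJ matrix on PETSC_COMM_SELF */
  PetscCall(MatCreateSeqAIJ(PETSC_COMM_SELF, n, n, 1, NULL, &seqmat));
  for (i = 0; i < n; i++) PetscCall(MatSetValue(seqmat, i, i, 1.0, INSERT_VALUES));
  PetscCall(MatAssemblyBegin(seqmat, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(seqmat, MAT_FINAL_ASSEMBLY));

  /* Sum the per-rank sequential matrices into one parallel MATMPIAIJ matrix */
  PetscCall(MatCreateMPIAIJSumSeqAIJ(PETSC_COMM_WORLD, seqmat, PETSC_DECIDE, PETSC_DECIDE, MAT_INITIAL_MATRIX, &mpimat));
  PetscCall(MatView(mpimat, PETSC_VIEWER_STDOUT_WORLD));

  /* Per the Note above, seqmat is destroyed together with mpimat,
     so it is not destroyed separately here */
  PetscCall(MatDestroy(&mpimat));
  PetscCall(PetscFinalize());
  return 0;
}
```

If the same parallel matrix is regenerated repeatedly with an unchanged nonzero pattern, passing `MAT_REUSE_MATRIX` for `scall` reuses the previously created `mpimat` instead of building a new one.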