:orphan:
# MatCreateMPIMatConcatenateSeqMat
Creates a single large parallel PETSc matrix by concatenating the sequential matrices supplied by each MPI process.
## Synopsis
```
#include "petscmat.h"
PetscErrorCode MatCreateMPIMatConcatenateSeqMat(MPI_Comm comm, Mat seqmat, PetscInt n, MatReuse reuse, Mat *mpimat)
```
Collective
## Input Parameters
- ***comm -*** the communicator the parallel matrix will live on
- ***seqmat -*** the input sequential matrix on each MPI process
- ***n -*** number of local columns (or `PETSC_DECIDE`)
- ***reuse -*** either `MAT_INITIAL_MATRIX` or `MAT_REUSE_MATRIX`
## Output Parameter
- ***mpimat -*** the parallel matrix generated
## Note
The number of columns of the sequential matrix MUST be the same on every MPI process.
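The usage pattern can be sketched as follows: each process assembles its own sequential matrix (here a small `SeqDense` matrix, chosen for illustration) and the routine stacks them row-wise into one parallel matrix. This is a minimal sketch assuming a PETSc build with MPI; the matrix sizes and values are arbitrary, and only the per-process column count (3 here) must agree across ranks.

```c
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat         seqmat, mpimat;
  PetscMPIInt rank;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));

  /* Each process builds a 2x3 sequential dense matrix;
     the column count (3) must match on every process */
  PetscCall(MatCreateSeqDense(PETSC_COMM_SELF, 2, 3, NULL, &seqmat));
  PetscCall(MatSetValue(seqmat, 0, 0, (PetscScalar)(rank + 1), INSERT_VALUES));
  PetscCall(MatAssemblyBegin(seqmat, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(seqmat, MAT_FINAL_ASSEMBLY));

  /* Concatenate the per-process matrices into one parallel matrix;
     PETSC_DECIDE lets PETSc choose the local column layout */
  PetscCall(MatCreateMPIMatConcatenateSeqMat(PETSC_COMM_WORLD, seqmat, PETSC_DECIDE, MAT_INITIAL_MATRIX, &mpimat));
  PetscCall(MatView(mpimat, PETSC_VIEWER_STDOUT_WORLD));

  PetscCall(MatDestroy(&seqmat));
  PetscCall(MatDestroy(&mpimat));
  PetscCall(PetscFinalize());
  return 0;
}
```

Passing `MAT_REUSE_MATRIX` instead of `MAT_INITIAL_MATRIX` on subsequent calls reuses the already-created `mpimat` rather than allocating a new one.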
## See Also
[](ch_matrices), `Mat`
## Level
developer
## Location
src/mat/interface/matrix.c
## Implementations
MatCreateMPIMatConcatenateSeqMat_MPIAIJ in src/mat/impls/aij/mpi/mpiaij.c
MatCreateMPIMatConcatenateSeqMat_SeqAIJ in src/mat/impls/aij/seq/aij.c
MatCreateMPIMatConcatenateSeqMat_MPIBAIJ in src/mat/impls/baij/mpi/mpibaij.c
MatCreateMPIMatConcatenateSeqMat_SeqBAIJ in src/mat/impls/baij/seq/baij.c
MatCreateMPIMatConcatenateSeqMat_MPIDense in src/mat/impls/dense/mpi/mpidense.c
MatCreateMPIMatConcatenateSeqMat_SeqDense in src/mat/impls/dense/seq/dense.c
MatCreateMPIMatConcatenateSeqMat_MPISBAIJ in src/mat/impls/sbaij/mpi/mpisbaij.c
MatCreateMPIMatConcatenateSeqMat_SeqSBAIJ in src/mat/impls/sbaij/seq/sbaij.c
---
[Index of all Mat routines](index.md)
[Table of Contents for all manual pages](/manualpages/index.md)
[Index of all manual pages](/manualpages/singleindex.md)