MatCreateMPIMatConcatenateSeqMat#

Creates a single large parallel PETSc matrix by concatenating the sequential matrices supplied by each MPI process

Synopsis#

#include "petscmat.h" 
PetscErrorCode MatCreateMPIMatConcatenateSeqMat(MPI_Comm comm, Mat seqmat, PetscInt n, MatReuse reuse, Mat *mpimat)

Collective

Input Parameters#

  • comm - the communicator the parallel matrix will live on
  • seqmat - the input sequential matrix on each process
  • n - number of local columns (or PETSC_DECIDE)
  • reuse - either MAT_INITIAL_MATRIX or MAT_REUSE_MATRIX

Output Parameter#

  • mpimat - the parallel matrix generated

Note#

The number of columns of the sequential matrix MUST be the same on every MPI process; the resulting parallel matrix stacks the sequential matrices row-wise in rank order.
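
A minimal usage sketch (assuming a working PETSc/MPI installation; the 2 x 3 dense matrix and the single inserted value are illustrative only). Each rank builds its own sequential matrix with the same column count, and the call stacks them into one parallel matrix:

```c
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat         seqmat, mpimat;
  PetscMPIInt rank;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));

  /* Each rank owns a 2 x 3 sequential dense matrix; the column
     count (3) must be identical on every rank */
  PetscCall(MatCreateSeqDense(PETSC_COMM_SELF, 2, 3, NULL, &seqmat));
  PetscCall(MatSetValue(seqmat, 0, 0, (PetscScalar)(rank + 1), INSERT_VALUES));
  PetscCall(MatAssemblyBegin(seqmat, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(seqmat, MAT_FINAL_ASSEMBLY));

  /* Concatenate: on P ranks the result has 2*P rows; PETSC_DECIDE
     lets PETSc choose the local column layout */
  PetscCall(MatCreateMPIMatConcatenateSeqMat(PETSC_COMM_WORLD, seqmat, PETSC_DECIDE, MAT_INITIAL_MATRIX, &mpimat));
  PetscCall(MatView(mpimat, PETSC_VIEWER_STDOUT_WORLD));

  PetscCall(MatDestroy(&seqmat));
  PetscCall(MatDestroy(&mpimat));
  PetscCall(PetscFinalize());
  return 0;
}
```

Passing MAT_REUSE_MATRIX instead of MAT_INITIAL_MATRIX on later calls reuses the already-created mpimat rather than allocating a new one.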

See Also#

Matrices, Mat

Level#

developer

Location#

src/mat/interface/matrix.c

Implementations#

MatCreateMPIMatConcatenateSeqMat_MPIAIJ in src/mat/impls/aij/mpi/mpiaij.c
MatCreateMPIMatConcatenateSeqMat_SeqAIJ in src/mat/impls/aij/seq/aij.c
MatCreateMPIMatConcatenateSeqMat_MPIBAIJ in src/mat/impls/baij/mpi/mpibaij.c
MatCreateMPIMatConcatenateSeqMat_SeqBAIJ in src/mat/impls/baij/seq/baij.c
MatCreateMPIMatConcatenateSeqMat_MPIDense in src/mat/impls/dense/mpi/mpidense.c
MatCreateMPIMatConcatenateSeqMat_SeqDense in src/mat/impls/dense/seq/dense.c
MatCreateMPIMatConcatenateSeqMat_MPISBAIJ in src/mat/impls/sbaij/mpi/mpisbaij.c
MatCreateMPIMatConcatenateSeqMat_SeqSBAIJ in src/mat/impls/sbaij/seq/sbaij.c
