MatCreateMPIMatConcatenateSeqMat
Creates a single large PETSc matrix by concatenating sequential matrices from each processor
Synopsis
#include "petscmat.h"
PetscErrorCode MatCreateMPIMatConcatenateSeqMat(MPI_Comm comm, Mat seqmat, PetscInt n, MatReuse reuse, Mat *mpimat)
Collective
Input Parameters
comm - the communicator the parallel matrix will live on
seqmat - the input sequential matrix (one per MPI process)
n - number of local columns (or PETSC_DECIDE)
reuse - either MAT_INITIAL_MATRIX or MAT_REUSE_MATRIX
Output Parameter
mpimat - the parallel matrix generated
Note
The number of columns of the sequential matrix on EACH processor MUST be the same.
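Example Usage

A minimal sketch, not taken from the PETSc documentation: each rank builds its own small SeqAIJ matrix with an identical column count, and the per-rank matrices are stacked into one parallel matrix. The sizes, values, and choice of SeqAIJ here are illustrative assumptions.

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat         seqmat, mpimat;
  PetscMPIInt rank;

  PetscFunctionBeginUser;
  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));

  /* Each rank builds its own sequential matrix; the column count (4 here)
     must be identical on every rank, while the row counts may differ */
  PetscCall(MatCreateSeqAIJ(PETSC_COMM_SELF, 2, 4, 4, NULL, &seqmat));
  for (PetscInt i = 0; i < 2; i++) {
    for (PetscInt j = 0; j < 4; j++) PetscCall(MatSetValue(seqmat, i, j, (PetscScalar)(rank + 1), INSERT_VALUES));
  }
  PetscCall(MatAssemblyBegin(seqmat, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(seqmat, MAT_FINAL_ASSEMBLY));

  /* Stack the per-rank sequential matrices into one parallel matrix */
  PetscCall(MatCreateMPIMatConcatenateSeqMat(PETSC_COMM_WORLD, seqmat, PETSC_DECIDE, MAT_INITIAL_MATRIX, &mpimat));
  PetscCall(MatView(mpimat, PETSC_VIEWER_STDOUT_WORLD));

  PetscCall(MatDestroy(&seqmat));
  PetscCall(MatDestroy(&mpimat));
  PetscCall(PetscFinalize());
  return 0;
}

With MAT_REUSE_MATRIX, the standard MatReuse convention applies: pass a matrix previously generated by a MAT_INITIAL_MATRIX call in the final argument so it is refilled rather than recreated.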
See Also
Level
developer
Location
Implementations
MatCreateMPIMatConcatenateSeqMat_MPIAIJ in src/mat/impls/aij/mpi/mpiaij.c
MatCreateMPIMatConcatenateSeqMat_SeqAIJ in src/mat/impls/aij/seq/aij.c
MatCreateMPIMatConcatenateSeqMat_MPIBAIJ in src/mat/impls/baij/mpi/mpibaij.c
MatCreateMPIMatConcatenateSeqMat_SeqBAIJ in src/mat/impls/baij/seq/baij.c
MatCreateMPIMatConcatenateSeqMat_MPIDense in src/mat/impls/dense/mpi/mpidense.c
MatCreateMPIMatConcatenateSeqMat_SeqDense in src/mat/impls/dense/seq/dense.c
MatCreateMPIMatConcatenateSeqMat_MPISBAIJ in src/mat/impls/sbaij/mpi/mpisbaij.c
MatCreateMPIMatConcatenateSeqMat_SeqSBAIJ in src/mat/impls/sbaij/seq/sbaij.c