#include "petscmat.h" PetscErrorCode MatCreateBlockMat(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt bs,PetscInt nz,PetscInt *nnz, Mat *A)Collective on MPI_Comm
Input Parameters
comm - MPI communicator
m    - number of rows
n    - number of columns
bs   - size of each submatrix
nz   - expected maximum number of nonzero blocks in a row (use PETSC_DEFAULT if not known)
nnz  - expected number of nonzero blocks per block row if known (use PETSC_NULL otherwise)

Output Parameter
A    - the matrix
Notes
PETSc requires that matrices and vectors being used for certain operations are partitioned accordingly. For example, when creating a bmat matrix A that supports parallel matrix-vector products using MatMult(A,x,y), the user should set the number of local matrix rows to be the number of local elements of the corresponding result vector y. Note that this information is required for use of the matrix interface routines, even though the bmat matrix may not actually be physically partitioned.
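For example, the following is a minimal sketch (not taken from the PETSc sources) of creating a bmat matrix whose sizes are consistent with the vectors later used in MatMult(A,x,y). The sizes m, n, and bs, the nnz values, and the use of PETSC_COMM_SELF and VecCreateSeq() are illustrative assumptions.

#include <petscmat.h>

int main(int argc,char **argv)
{
  Mat            A;
  Vec            x,y;
  PetscInt       bs = 2;            /* each block is 2 x 2 (illustrative choice) */
  PetscInt       m  = 6,n = 6;      /* 6 rows and 6 columns, i.e. 3 block rows/columns */
  PetscInt       nnz[3] = {2,3,2};  /* assumed nonzero blocks in each block row */
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc,&argv,PETSC_NULL,PETSC_NULL);CHKERRQ(ierr);

  /* Create the block matrix; nz is PETSC_DEFAULT since a per-block-row count (nnz) is supplied */
  ierr = MatCreateBlockMat(PETSC_COMM_SELF,m,n,bs,PETSC_DEFAULT,nnz,&A);CHKERRQ(ierr);

  /* Vectors used with MatMult(A,x,y) must match the matrix sizes:
     x has n entries (columns of A) and y has m entries (rows of A). */
  ierr = VecCreateSeq(PETSC_COMM_SELF,n,&x);CHKERRQ(ierr);
  ierr = VecCreateSeq(PETSC_COMM_SELF,m,&y);CHKERRQ(ierr);

  /* ... values would normally be inserted with MatSetValues() and the matrix
     assembled with MatAssemblyBegin()/MatAssemblyEnd() before calling MatMult(A,x,y) ... */

  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&y);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}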
Level: intermediate

Location: src/mat/impls/blockmat/seq/blockmat.c