#include "petscmat.h" PetscErrorCode MatCreateMPIBAIJWithArrays(MPI_Comm comm,PetscInt bs,PetscInt m,PetscInt n,PetscInt M,PetscInt N,const PetscInt i[],const PetscInt j[],const PetscScalar a[],Mat *mat)Collective on MPI_Comm
comm - MPI communicator
bs   - the block size; only a block size of 1 is supported
m    - number of local rows (cannot be PETSC_DECIDE)
n    - this value should be the same as the local size used in creating the x vector for the matrix-vector product y = Ax (or PETSC_DECIDE to have it calculated if N is given); for square matrices n is almost always m
M    - number of global rows (or PETSC_DETERMINE to have it calculated if m is given)
N    - number of global columns (or PETSC_DETERMINE to have it calculated if n is given)
i    - row indices; that is, i[0] = 0 and i[row] = i[row-1] + number of nonzero blocks in the previous block row of the matrix
j    - column indices
a    - matrix values
The order of the entries in a is the same as in the block compressed sparse row storage format; that is, it is the same as a three-dimensional Fortran array values(bs,bs,nnz) that contains the first column of the first block, followed by the second column of the first block, and so on. That is, the blocks are contiguous in memory, with column-major ordering within each block.
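As a purely illustrative sketch of that ordering (using bs = 2, even though this routine itself accepts only a block size of 1, and hypothetical numerical values), one 2 x 2 block

    B = [ 1.0  3.0 ]
        [ 2.0  4.0 ]

would occupy four consecutive positions of a, first column then second column:

    PetscScalar a_one_block[] = {1.0, 2.0, 3.0, 4.0};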
The i and j indices are 0-based; the i indices are offsets into the local j array.
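A minimal usage sketch follows, assuming a single MPI rank, a block size of 1 (so the layout reduces to ordinary CSR), PETSC_DETERMINE for the global sizes, and hypothetical 2 x 2 data; with more ranks, each rank would pass the arrays describing its own block rows.

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            A;
  PetscErrorCode ierr;
  PetscInt       m   = 2, n = 2;          /* local row and column sizes (n matches the local size of x in y = Ax) */
  PetscInt       i[] = {0, 2, 3};         /* i[0] = 0; i[row] = i[row-1] + nonzero blocks in the previous block row */
  PetscInt       j[] = {0, 1, 1};         /* column indices of the nonzero blocks, listed block row by block row */
  PetscScalar    a[] = {2.0, -1.0, 3.0};  /* one bs x bs block (here 1 x 1) per entry of j */

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = MatCreateMPIBAIJWithArrays(PETSC_COMM_WORLD, 1, m, n, PETSC_DETERMINE, PETSC_DETERMINE, i, j, a, &A);CHKERRQ(ierr);
  ierr = MatView(A, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Here PETSC_DETERMINE lets PETSc compute M and N from the local sizes, as described for the M and N parameters above.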