PetscErrorCode MatCreateBAIJMKL(MPI_Comm comm,PetscInt bs,PetscInt m,PetscInt n,PetscInt M,PetscInt N,PetscInt d_nz,const PetscInt d_nnz[],PetscInt o_nz,const PetscInt o_nnz[],Mat *A)

Collective on MPI_Comm
comm  - MPI communicator
bs    - size of block; the blocks are ALWAYS square. One can use MatSetBlockSizes() to set a different row and column blocksize, but the row blocksize always defines the size of the blocks. The column blocksize sets the blocksize of the vectors obtained with MatCreateVecs().
m     - number of local rows (or PETSC_DECIDE to have it calculated if M is given). This value should be the same as the local size used in creating the y vector for the matrix-vector product y = Ax.
n     - number of local columns (or PETSC_DECIDE to have it calculated if N is given). This value should be the same as the local size used in creating the x vector for the matrix-vector product y = Ax.
M     - number of global rows (or PETSC_DETERMINE to have it calculated if m is given)
N     - number of global columns (or PETSC_DETERMINE to have it calculated if n is given)
d_nz  - number of nonzero blocks per block row in the diagonal portion of the local submatrix (same for all local block rows)
d_nnz - array containing the number of nonzero blocks in the various block rows of the diagonal portion of the local submatrix (possibly different for each block row), or NULL. If you plan to factor the matrix you must leave room for the diagonal entry and set it even if it is zero.
o_nz  - number of nonzero blocks per block row in the off-diagonal portion of the local submatrix (same for all local block rows)
o_nnz - array containing the number of nonzero blocks in the various block rows of the off-diagonal portion of the local submatrix (possibly different for each block row), or NULL.
-mat_block_size - size of the blocks to use
-mat_use_hash_table <fact> -

It is recommended that one use the MatCreate(), MatSetType() and/or MatSetFromOptions(), MatXXXXSetPreallocation() paradigm instead of this routine directly. [MatXXXXSetPreallocation() is, for example, MatSeqAIJSetPreallocation()]
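For illustration only, a minimal sketch of that paradigm for this matrix type; it assumes the MATBAIJMKL type name, and the sizes, block size, and preallocation counts below are placeholders. Error checking (CHKERRQ()/PetscCall()) is omitted for brevity.

   #include <petscmat.h>

   int main(int argc,char **argv)
   {
     Mat      A;
     PetscInt bs = 2;                                   /* placeholder block size   */

     PetscInitialize(&argc,&argv,NULL,NULL);
     MatCreate(PETSC_COMM_WORLD,&A);
     MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,100,100);  /* placeholder global sizes */
     MatSetType(A,MATBAIJMKL);          /* sequential or parallel BAIJMKL, depending
                                           on the size of the communicator          */
     MatSetFromOptions(A);
     MatSeqBAIJSetPreallocation(A,bs,5,NULL);           /* used on one process      */
     MatMPIBAIJSetPreallocation(A,bs,5,NULL,2,NULL);    /* used on several processes */
     /* ... MatSetValuesBlocked(), MatAssemblyBegin()/MatAssemblyEnd() ... */
     MatDestroy(&A);
     PetscFinalize();
     return 0;
   }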
A nonzero block is any block that has 1 or more nonzeros in it
The user MUST specify either the local or global matrix dimensions (possibly both).
If PETSC_DECIDE or PETSC_DETERMINE is used for a particular argument on one processor then it must be used on all processors that share the object for that argument.
The user can specify preallocated storage for the diagonal part of the local submatrix with either d_nz or d_nnz (not both). Set d_nz=PETSC_DEFAULT and d_nnz=NULL for PETSc to control dynamic memory allocation. Likewise, specify preallocated storage for the off-diagonal part of the local submatrix with o_nz or o_nnz (not both).
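For example, a direct call using the scalar preallocation form might look like the following sketch; the block size, local sizes, and nonzero-block counts are placeholders, and error checking is omitted.

   Mat A;
   /* bs = 3, 6 local rows and columns, global sizes determined by PETSc;
      at most 2 nonzero blocks per block row in the diagonal part and 1 in the
      off-diagonal part; NULL is passed for d_nnz/o_nnz with the scalar form */
   MatCreateBAIJMKL(PETSC_COMM_WORLD,3,6,6,PETSC_DETERMINE,PETSC_DETERMINE,
                    2,NULL,1,NULL,&A);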
Consider a processor that owns rows 3, 4 and 5 of a parallel matrix. In the figure below we depict these three local rows and all columns (0-11).
            0 1 2 3 4 5 6 7 8 9 10 11
           --------------------------
   row 3  |o o o d d d o o o o o  o
   row 4  |o o o d d d o o o o o  o
   row 5  |o o o d d d o o o o o  o
           --------------------------
Thus, any entries in the d locations are stored in the d (diagonal) submatrix, and any entries in the o locations are stored in the o (off-diagonal) submatrix. Note that the d and the o submatrices are stored simply in the MATSEQBAIJMKL format for compressed row storage.
Now d_nz should indicate the number of block nonzeros per row in the d matrix, and o_nz should indicate the number of block nonzeros per row in the o matrix. In general, for PDE problems in which most nonzeros are near the diagonal, one expects d_nz >> o_nz. For large problems you MUST preallocate memory or you will get TERRIBLE performance; see the users' manual chapter on matrices.
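As a hedged illustration only: if the layout above is read with bs = 3, the three depicted rows form a single local block row whose diagonal portion holds 1 block column and whose off-diagonal portion holds 3. Assuming every block in that row contains at least one nonzero, the process owning rows 3-5 could preallocate as in the sketch below (other processes would pass their own local sizes and arrays; error checking omitted).

   Mat      A;
   PetscInt d_nnz[] = {1};   /* 1 nonzero block in the d part of the only local block row */
   PetscInt o_nnz[] = {3};   /* 3 nonzero blocks in the o part of that block row          */

   /* bs = 3, 3 local rows and columns, 12 global rows and columns; the scalar
      d_nz/o_nz values are ignored here because d_nnz/o_nnz are supplied */
   MatCreateBAIJMKL(PETSC_COMM_WORLD,3,3,3,12,12,0,d_nnz,0,o_nnz,&A);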
Level: intermediate
Location: src/mat/impls/baij/mpi/baijmkl/mpibaijmkl.c