#include "petscmat.h" PetscErrorCode MatCreateShell(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt M,PetscInt N,void *ctx,Mat *A)Collective on MPI_Comm
Input Parameters:
comm - MPI communicator
m    - number of local rows (must be given)
n    - number of local columns (must be given)
M    - number of global rows (may be PETSC_DETERMINE)
N    - number of global columns (may be PETSC_DETERMINE)
ctx  - pointer to data needed by the shell matrix routines

Output Parameter:
A    - the shell matrix
Usage:
extern int mult(Mat,Vec,Vec);
MatCreateShell(comm,m,n,M,N,ctx,&mat);
MatShellSetOperation(mat,MATOP_MULT,(void(*)(void))mult);
[ Use matrix for operations that have been set ]
MatDestroy(mat);
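
The routines registered with MatShellSetOperation() receive the shell matrix as their first argument and can recover the ctx pointer with MatShellGetContext(). Below is a minimal sketch of such a multiplication routine; the context struct ShellCtx, its alpha field, and the scaled-identity action y = alpha*x are illustrative assumptions, not part of this interface.

#include "petscmat.h"

typedef struct {            /* hypothetical user data attached as ctx */
  PetscScalar alpha;
} ShellCtx;

/* Registered via MatShellSetOperation(mat,MATOP_MULT,...); here the shell
   matrix acts as alpha times the identity, i.e. y = alpha*x */
int mult(Mat A,Vec x,Vec y)
{
  ShellCtx *user;

  MatShellGetContext(A,(void**)&user);  /* recover the ctx given to MatCreateShell() */
  VecCopy(x,y);                         /* y = x       */
  VecScale(y,user->alpha);              /* y = alpha*x */
  return 0;
}

After MatShellSetOperation(mat,MATOP_MULT,(void(*)(void))mult), a call to MatMult(mat,x,y) dispatches to this routine, so the operator never needs to be stored explicitly.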
Fortran Notes: To use this routine from Fortran with a ctx, you must write an interface definition for this function, and for MatShellGetContext(), that tells Fortran the Fortran derived data type being passed as the ctx argument.
Notes: PETSc requires that matrices and vectors being used for certain operations are partitioned accordingly. For example, when creating a shell matrix, A, that supports parallel matrix-vector products using MatMult(A,x,y), the user should set the number of local matrix rows to be the number of local elements of the corresponding result vector, y. Note that this information is required for use of the matrix interface routines, even though the shell matrix may not actually be physically partitioned. For example,
Vec x, y;
extern int mult(Mat,Vec,Vec);
Mat A;
PetscInt m, n;

VecCreateMPI(comm,PETSC_DECIDE,M,&y);
VecCreateMPI(comm,PETSC_DECIDE,N,&x);
VecGetLocalSize(y,&m);
VecGetLocalSize(x,&n);
MatCreateShell(comm,m,n,M,N,ctx,&A);
MatShellSetOperation(A,MATOP_MULT,(void(*)(void))mult);
MatMult(A,x,y);
MatDestroy(A);
VecDestroy(y); VecDestroy(x);
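
Once the shell matrix has been created with consistent local sizes, compatible vectors can also be obtained directly from it. The following is a minimal sketch assuming MatGetVecs() (renamed MatCreateVecs() in later PETSc releases) and the same user-defined mult() routine as above; it follows the calling sequences used elsewhere on this page.

Mat A;
Vec x, y;

MatCreateShell(comm,m,n,M,N,ctx,&A);
MatShellSetOperation(A,MATOP_MULT,(void(*)(void))mult);
MatGetVecs(A,&x,&y);      /* x conforms to the columns of A, y to its rows */
/* ... fill x with the input values ... */
MatMult(A,x,y);
VecDestroy(x); VecDestroy(y);
MatDestroy(A);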
Level: advanced
Location: src/mat/impls/shell/shell.c