petsc-3.10.5 2019-03-28

MatLoad

Loads a matrix that has been stored in binary format with MatView(). The matrix format is determined from the options database. Generates a parallel MPI matrix if the communicator has more than one processor. The default matrix type is AIJ.

Synopsis

#include "petscmat.h" 
PetscErrorCode MatLoad(Mat newmat,PetscViewer viewer)
Collective on PetscViewer

Input Parameters

newmat - the newly loaded matrix; it must have been created with MatCreate() or some related function before calling MatLoad()
viewer - binary file viewer, created with PetscViewerBinaryOpen()

Options Database Keys

-matload_block_size <bs> - used with block matrix formats (MATSEQBAIJ, ...) to specify the block size
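
For example, a program that calls MatLoad() could be run as follows (the executable name and block size are illustrative; the matrix type could equally be set in the code before MatLoad()):

   ./prog -mat_type baij -matload_block_size 3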

Notes

If the Mat type has not yet been given, MATAIJ is used; call MatSetFromOptions() on the Mat before calling this routine if you wish to set the type from the options database.

MatLoad() automatically loads into the options database any options given in the file filename.info, where filename is the name of the file passed to PetscViewerBinaryOpen(). The options in the info file are ignored if you use the -viewer_binary_skip_info option.

If the type or sizes of newmat are not set before the call to MatLoad(), PETSc uses the default matrix type AIJ and determines the local and global sizes from the file. If the type and/or sizes are already set, they are used as given.

In parallel, each processor can load a subset of rows (or the entire matrix). This routine is especially useful when a large matrix is stored on disk and only part of it is desired on each processor. For example, a parallel solver may access only some of the rows from each processor. The algorithm used here reads relatively small blocks of data rather than reading the entire matrix and then subsetting it.
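
A minimal usage sketch (the file name matrix.dat is illustrative; error checking follows the CHKERRQ convention used in the PETSc examples):

   #include <petscmat.h>

   int main(int argc,char **argv)
   {
     Mat            A;
     PetscViewer    viewer;
     PetscErrorCode ierr;

     ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;
     /* open the binary file previously written with MatView() */
     ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"matrix.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
     /* create an empty matrix; its type and sizes may be set from the options database */
     ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
     ierr = MatSetFromOptions(A);CHKERRQ(ierr);
     /* fill the matrix from the file */
     ierr = MatLoad(A,viewer);CHKERRQ(ierr);
     ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
     /* ... use A ... */
     ierr = MatDestroy(&A);CHKERRQ(ierr);
     ierr = PetscFinalize();
     return ierr;
   }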

Notes for advanced users

Most users should not need to know the details of the binary storage format, since MatLoad() and MatView() completely hide these details. But for anyone who's interested, the standard binary matrix storage format is

   int    MAT_FILE_CLASSID
   int    number of rows
   int    number of columns
   int    total number of nonzeros
   int    *number nonzeros in each row
   int    *column indices of all nonzeros (starting index is zero)
   PetscScalar *values of all nonzeros

PETSc automatically does the byte swapping on machines that store the bytes in reversed order, e.g. DEC Alpha, FreeBSD, Linux, Windows, and the Paragon; thus, if you write your own binary read/write routines, you must swap the bytes yourself; see PetscBinaryRead() and PetscBinaryWrite() for how this may be done.
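
As a sketch of what such a routine involves (the file name is illustrative, the code assumes the sequential AIJ layout shown above, and MatLoad() normally performs all of these steps for you, including the byte swapping):

   #include <petscmat.h>

   int main(int argc,char **argv)
   {
     int            fd;
     PetscInt       header[4],nz,*rowlens,*cols;
     PetscScalar    *vals;
     PetscErrorCode ierr;

     ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;
     ierr = PetscBinaryOpen("matrix.dat",FILE_MODE_READ,&fd);CHKERRQ(ierr);
     /* header: MAT_FILE_CLASSID, number of rows, number of columns, total nonzeros */
     ierr = PetscBinaryRead(fd,header,4,PETSC_INT);CHKERRQ(ierr);
     if (header[0] != MAT_FILE_CLASSID) SETERRQ(PETSC_COMM_SELF,PETSC_ERR_FILE_UNEXPECTED,"Not a PETSc matrix file");
     nz   = header[3];
     ierr = PetscMalloc3(header[1],&rowlens,nz,&cols,nz,&vals);CHKERRQ(ierr);
     ierr = PetscBinaryRead(fd,rowlens,header[1],PETSC_INT);CHKERRQ(ierr); /* nonzeros in each row */
     ierr = PetscBinaryRead(fd,cols,nz,PETSC_INT);CHKERRQ(ierr);           /* column indices */
     ierr = PetscBinaryRead(fd,vals,nz,PETSC_SCALAR);CHKERRQ(ierr);        /* nonzero values */
     ierr = PetscBinaryClose(fd);CHKERRQ(ierr);
     /* ... use the raw arrays ... */
     ierr = PetscFree3(rowlens,cols,vals);CHKERRQ(ierr);
     ierr = PetscFinalize();
     return ierr;
   }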

Keywords

matrix, load, binary, input

See Also

PetscViewerBinaryOpen(), MatView(), VecLoad()

Level

beginner

Location

src/mat/interface/matrix.c

Examples

src/vec/vec/examples/tutorials/ex6.c.html
src/mat/examples/tutorials/ex1.c.html
src/mat/examples/tutorials/ex9.c.html
src/mat/examples/tutorials/ex10.c.html
src/mat/examples/tutorials/ex12.c.html
src/mat/examples/tutorials/ex16.c.html
src/ksp/ksp/examples/tutorials/ex10.c.html
src/ksp/ksp/examples/tutorials/ex27.c.html
src/ksp/ksp/examples/tutorials/ex41.c.html
src/ksp/ksp/examples/tutorials/ex63.cxx.html
src/ksp/ksp/examples/tutorials/ex72.c.html

Implementations

MatLoad_MPI_DA in src/dm/impls/da/fdda.c
MatLoad_MPIAIJ in src/mat/impls/aij/mpi/mpiaij.c
MatLoad_SeqAIJ in src/mat/impls/aij/seq/aij.c
MatLoad_MPIBAIJ in src/mat/impls/baij/mpi/mpibaij.c
MatLoad_SeqBAIJ in src/mat/impls/baij/seq/baij.c
MatLoad_BlockMat in src/mat/impls/blockmat/seq/blockmat.c
MatLoad_MPIDense_DenseInFile in src/mat/impls/dense/mpi/mpidense.c
MatLoad_MPIDense in src/mat/impls/dense/mpi/mpidense.c
MatLoad_SeqDense in src/mat/impls/dense/seq/dense.c
MatLoad_Elemental in src/mat/impls/elemental/matelem.cxx
MatLoad_MPISBAIJ in src/mat/impls/sbaij/mpi/mpisbaij.c
MatLoad_SeqSBAIJ in src/mat/impls/sbaij/seq/sbaij.c
