MatLoad#
Loads a matrix that has been stored in binary/HDF5 format with MatView(). The matrix format is determined from the options database. Generates a parallel MPI matrix if the communicator has more than one processor. The default matrix type is MATAIJ.
Synopsis#
```c
#include "petscmat.h"
PetscErrorCode MatLoad(Mat mat, PetscViewer viewer)
```
Collective
Input Parameters#
- mat - the newly loaded matrix; it must have been created with MatCreate() or some related function before a call to MatLoad()
- viewer - PETSCVIEWERBINARY/PETSCVIEWERHDF5 file viewer
Options Database Key#
- -matload_block_size - set block size (see the sketch below)
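As a hedged sketch of how this key is consumed, the block size takes effect while MatLoad() fills a blocked matrix type; the file name "matrix.dat" and the choice of MATBAIJ here are assumptions for illustration:

```c
/* run with: ./app -matload_block_size 3
   loads a blocked (BAIJ) matrix; "matrix.dat" is a hypothetical file name */
Mat         A;
PetscViewer viewer;

PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrix.dat", FILE_MODE_READ, &viewer));
PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
PetscCall(MatSetType(A, MATBAIJ));
PetscCall(MatLoad(A, viewer)); /* block size is taken from -matload_block_size */
PetscCall(PetscViewerDestroy(&viewer));
```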
Notes#
If the Mat type has not yet been given, then MATAIJ is used; call MatSetFromOptions() on the Mat before calling this routine if you wish to set it from the options database.
MatLoad() automatically loads into the options database any options given in the file filename.info, where filename is the name of the file that was passed to PetscViewerBinaryOpen(). The options in the info file will be ignored if you use the -viewer_binary_skip_info option.
If the type or size of mat is not set before a call to MatLoad(), PETSc sets the default matrix type AIJ and sets the local and global sizes. If the type and/or sizes are already set, then they are used.
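Putting these notes together, a minimal sketch of a complete load (the file name "matrix.dat" is a hypothetical placeholder):

```c
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat         A;
  PetscViewer viewer;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* open the binary file for reading */
  PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrix.dat", FILE_MODE_READ, &viewer));
  /* create the matrix; type and sizes may be set here or left for MatLoad() to default */
  PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
  PetscCall(MatSetFromOptions(A)); /* lets -mat_type override the MATAIJ default */
  PetscCall(MatLoad(A, viewer));
  PetscCall(PetscViewerDestroy(&viewer));
  PetscCall(MatDestroy(&A));
  PetscCall(PetscFinalize());
  return 0;
}
```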
In parallel, each processor can load a subset of rows (or the entire matrix). This routine is especially useful when a large matrix is stored on disk and only part of it is desired on each processor. For example, a parallel solver may access only some of the rows from each processor. The algorithm used here reads relatively small blocks of data rather than reading the entire matrix and then subsetting it.
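To control that distribution explicitly, one can prescribe the local sizes before loading; the following is a sketch, where the per-rank row count m is a hypothetical value chosen by the caller:

```c
/* prescribe this rank's number of local rows before loading, so MatLoad()
   reads only the locally owned rows on each MPI process */
Mat      A;
PetscInt m = 1000; /* hypothetical local row count; must sum to the global row count */

PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
PetscCall(MatSetSizes(A, m, PETSC_DECIDE, PETSC_DETERMINE, PETSC_DETERMINE));
PetscCall(MatLoad(A, viewer)); /* viewer created as shown below */
```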
The viewer's PetscViewerType must be either PETSCVIEWERBINARY or PETSCVIEWERHDF5. Such a viewer can be created using PetscViewerBinaryOpen() or PetscViewerHDF5Open(), or with a sequence like
```c
PetscViewer v;
PetscViewerCreate(PETSC_COMM_WORLD, &v);
PetscViewerSetType(v, PETSCVIEWERBINARY);
PetscViewerSetFromOptions(v);
PetscViewerFileSetMode(v, FILE_MODE_READ);
PetscViewerFileSetName(v, "datafile");
```
The optional PetscViewerSetFromOptions() call allows overriding PetscViewerSetType() using the option -viewer_type {binary, hdf5}.
See the example src/ksp/ksp/tutorials/ex27.c, which uses the first approach, and src/mat/tutorials/ex10.c, which uses the second.
In the case of PETSCVIEWERBINARY, a native PETSc binary format is used. Each of the blocks is read onto MPI rank 0 and then shipped to its destination MPI rank, one after another. Multiple objects, both matrices and vectors, can be stored within the same file. Their PetscObject name is ignored; they are loaded in the order of their storage.
Most users should not need to know the details of the binary storage format, since MatLoad() and MatView() completely hide these details. But for anyone who is interested, the standard binary matrix storage format is:
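```
PetscInt    MAT_FILE_CLASSID
PetscInt    number of rows
PetscInt    number of columns
PetscInt    total number of nonzeros
PetscInt    *number nonzeros in each row
PetscInt    *column indices of all nonzeros (starting index is zero)
PetscScalar *values of all nonzeros
```

(The layout above follows the AIJ ordering of the PETSc binary format; the starred entries denote arrays.)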
If PETSc was not configured with --with-64-bit-indices, then only MATMPIAIJ matrices with more than PETSC_INT_MAX non-zeros can be stored or loaded (each MPI process's part of the matrix must have less than PETSC_INT_MAX nonzeros). Since the total nonzero count in this case will not fit in a (32-bit) PetscInt, the value PETSC_INT_MAX is used for the header entry total number of nonzeros.
PETSc automatically does the byte swapping for machines that store the bytes reversed. Thus if you write your own binary read/write routines you have to swap the bytes; see PetscBinaryRead() and PetscBinaryWrite() to see how this may be done.
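For illustration, a sketch of reading the four-integer header with these routines; the file name "matrix.dat" is a hypothetical placeholder, and PetscBinaryRead() performs any needed byte swapping internally:

```c
/* read the 4-entry header (classid, rows, cols, nnz) from a PETSc binary file */
int      fd;
PetscInt header[4];

PetscCall(PetscBinaryOpen("matrix.dat", FILE_MODE_READ, &fd));
PetscCall(PetscBinaryRead(fd, header, 4, NULL, PETSC_INT));
PetscCheck(header[0] == MAT_FILE_CLASSID, PETSC_COMM_SELF, PETSC_ERR_FILE_UNEXPECTED, "Not a matrix file");
PetscCall(PetscBinaryClose(fd));
```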
In the case of PETSCVIEWERHDF5, a parallel HDF5 reader is used. Each processor's chunk is loaded independently by its owning MPI process. Multiple objects, both matrices and vectors, can be stored within the same file. They are looked up by their PetscObject name.
As the MATLAB MAT-File Version 7.3 format is also an HDF5 flavor, we decided to use by default the same structure and naming of the AIJ arrays and column count within the HDF5 file. This means that a MAT file saved with the -v7.3 flag, e.g.

```
save example.mat A b -v7.3
```

can be directly read by this routine (see Reference 1 for details). Depending on your MATLAB version, this format might be the default; otherwise you can set it as the default in Preferences. Unless the -nocompression flag is used when saving the file in MATLAB, PETSc must be configured with the ZLIB package. See also the examples src/mat/tutorials/ex10.c and src/ksp/ksp/tutorials/ex27.c.
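As a sketch of loading the matrix A saved by the MATLAB command above (the call sequence below is an assumption of typical usage, not taken from those examples):

```c
Mat         A;
PetscViewer viewer;

PetscCall(PetscViewerHDF5Open(PETSC_COMM_WORLD, "example.mat", FILE_MODE_READ, &viewer));
PetscCall(PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_MAT)); /* interpret the MATLAB v7.3 layout */
PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
PetscCall(PetscObjectSetName((PetscObject)A, "A")); /* the dataset is looked up by this name */
PetscCall(MatLoad(A, viewer));
PetscCall(PetscViewerPopFormat(viewer));
PetscCall(PetscViewerDestroy(&viewer));
```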
For PETSCVIEWERHDF5, this reader currently supports only real MATSEQAIJ, MATMPIAIJ, MATSEQDENSE and MATMPIDENSE matrices; the corresponding MatView() is not yet implemented.
The loaded matrix is actually a transpose of the original one in MATLAB, unless you push the PETSC_VIEWER_HDF5_MAT format (see the examples above). With this format, the matrix is automatically transposed by PETSc, unless the matrix is marked as SPD or symmetric (see MatSetOption(), MAT_SPD, MAT_SYMMETRIC).
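For example, a brief sketch of marking the matrix symmetric before loading, so that the automatic transposition is skipped (A and viewer as in the previous sketch):

```c
PetscCall(MatSetOption(A, MAT_SYMMETRIC, PETSC_TRUE)); /* declare symmetry: the transpose is then unnecessary */
PetscCall(MatLoad(A, viewer));
```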
See the MATLAB documentation on save(), https://www.mathworks.com/help/matlab/ref/save.html#btox10b-1-version (Reference 1).
See Also#
Matrices, Mat, PetscViewerBinaryOpen(), PetscViewerSetType(), MatView(), VecLoad()
Level#
beginner
Location#
src/mat/interface/matrix.c
Examples#
src/ksp/ksp/tutorials/ex10.c
src/ksp/ksp/tutorials/ex75.c
src/ksp/ksp/tutorials/ex87.c
src/ksp/ksp/tutorials/ex27.c
src/ksp/ksp/tutorials/ex77.c
src/ksp/ksp/tutorials/ex72.c
src/mat/tutorials/ex10.c
src/ksp/ksp/tutorials/ex76.c
src/ksp/ksp/tutorials/ex41.c
src/mat/tutorials/ex1.c
Implementations#
MatLoad_MPIAIJ() in src/mat/impls/aij/mpi/mpiaij.c
MatLoad_SeqAIJ() in src/mat/impls/aij/seq/aij.c
MatLoad_MPIBAIJ() in src/mat/impls/baij/mpi/mpibaij.c
MatLoad_SeqBAIJ() in src/mat/impls/baij/seq/baij.c
MatLoad_BlockMat() in src/mat/impls/blockmat/seq/blockmat.c
MatLoad_MPIDense() in src/mat/impls/dense/mpi/mpidense.c
MatLoad_SeqDense() in src/mat/impls/dense/seq/dense.c
MatLoad_Elemental() in src/mat/impls/elemental/matelem.cxx
MatLoad_MPISBAIJ() in src/mat/impls/sbaij/mpi/mpisbaij.c
MatLoad_SeqSBAIJ() in src/mat/impls/sbaij/seq/sbaij.c
MatLoad_ScaLAPACK() in src/mat/impls/scalapack/matscalapack.c