#include "petscksp.h" PetscErrorCode MatCreateLMVMBFGS(MPI_Comm comm, PetscInt n, PetscInt N, Mat *B)The provided local and global sizes must match the solution and function vectors used with MatLMVMUpdate() and MatSolve(). The resulting L-BFGS matrix will have storage vectors allocated with VecCreateSeq() in serial and VecCreateMPI() in parallel. To use the L-BFGS matrix with other vector types, the matrix must be created using MatCreate() and MatSetType(), followed by MatLMVMAllocate(). This ensures that the internal storage and work vectors are duplicated from the correct type of vector.
Collective
Input Parameters

comm - MPI communicator
n    - number of local rows for the storage vectors
N    - global size of the storage vectors

Output Parameter

B - the matrix
Notes

It is recommended that one use the MatCreate(), MatSetType() and/or MatSetFromOptions() paradigm instead of calling this routine directly.

The provided local and global sizes must match the solution and function vectors used with MatLMVMUpdate() and MatSolve(). The resulting L-BFGS matrix will have storage vectors allocated with VecCreateSeq() in serial and VecCreateMPI() in parallel. To use the L-BFGS matrix with other vector types, the matrix must be created using MatCreate() and MatSetType(), followed by MatLMVMAllocate(). This ensures that the internal storage and work vectors are duplicated from the correct type of vector.
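A minimal sketch of the two creation paths described above, assuming ordinary MPI vectors; the helper names CreateDirect() and CreateForVecs() and the error-checking style are illustrative, not part of the PETSc API:

#include <petscksp.h>

/* Path 1: direct creation. Storage vectors are allocated internally
   with VecCreateSeq() in serial and VecCreateMPI() in parallel. */
static PetscErrorCode CreateDirect(MPI_Comm comm, PetscInt n, PetscInt N, Mat *B)
{
  PetscErrorCode ierr;

  ierr = MatCreateLMVMBFGS(comm, n, N, B);CHKERRQ(ierr);
  return 0;
}

/* Path 2: the recommended paradigm. Internal storage and work vectors
   are duplicated from X and F, so any vector type can be used. */
static PetscErrorCode CreateForVecs(MPI_Comm comm, Vec X, Vec F, Mat *B)
{
  PetscErrorCode ierr;

  ierr = MatCreate(comm, B);CHKERRQ(ierr);
  ierr = MatSetType(*B, MATLMVMBFGS);CHKERRQ(ierr);
  ierr = MatSetFromOptions(*B);CHKERRQ(ierr);     /* honors the -mat_lmvm_* keys below */
  ierr = MatLMVMAllocate(*B, X, F);CHKERRQ(ierr); /* sizes and types taken from X and F */
  return 0;
}

Either way, the vectors subsequently passed to MatLMVMUpdate() and MatSolve() must match the sizes (or the vector type) supplied at creation.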
Options Database Keys

-mat_lmvm_num_vecs   - maximum number of correction vectors (i.e., updates) stored
-mat_lmvm_scale_type - (developer) type of scaling applied to J0 (none, scalar, diagonal)
-mat_lmvm_theta      - (developer) convex ratio between the BFGS and DFP components of the diagonal J0 scaling
-mat_lmvm_rho        - (developer) update limiter for the J0 scaling
-mat_lmvm_alpha      - (developer) coefficient factor for the quadratic subproblem in J0 scaling
-mat_lmvm_beta       - (developer) exponential factor for the diagonal J0 scaling
-mat_lmvm_sigma_hist - (developer) number of past updates to use in J0 scaling
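These keys are typically supplied on the command line (e.g. -mat_lmvm_num_vecs 10) and take effect when MatSetFromOptions() is called. As a sketch, the same entries can also be preloaded programmatically; this assumes a PETSc version in which PetscOptionsSetValue() takes a PetscOptions argument (NULL selects the global database), and the chosen values are illustrative:

#include <petscksp.h>

/* Illustrative only: preload two of the keys above into the global
   options database so a later MatSetFromOptions() picks them up. */
static PetscErrorCode PreloadLMVMOptions(void)
{
  PetscErrorCode ierr;

  ierr = PetscOptionsSetValue(NULL, "-mat_lmvm_num_vecs", "10");CHKERRQ(ierr);
  ierr = PetscOptionsSetValue(NULL, "-mat_lmvm_scale_type", "diagonal");CHKERRQ(ierr);
  return 0;
}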