#ifndef __TAOLINESEARCH_ARMIJO_H
#define __TAOLINESEARCH_ARMIJO_H
/* Context for an Armijo (nonmonotone) linesearch for unconstrained
   minimization.

   Given a function f, the current iterate x, and a descent direction d:
   Find the smallest i in 0, 1, 2, ..., such that:

      f(x + (beta**i)d) <= f(x) + (sigma*beta**i)<grad f(x),d>

   The nonmonotone modification of this linesearch replaces the f(x) term
   with a reference value, R, and seeks to find the smallest i such that:

      f(x + (beta**i)d) <= R + (sigma*beta**i)<grad f(x),d>
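
   As an illustrative sketch (not the actual implementation; among other
   things it omits the minimum step size check), the search is a simple
   backtracking loop:

      step = 1.0;
      while (f(x + step*d) > R + sigma*step*<grad f(x),d>)
        step = beta*step;
      accept x + step*d;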

   When R is chosen appropriately, the nonmonotone modification affects
   neither the convergence of an algorithm nor its rate of convergence.
   Essentially, R must decrease on average in some sense.  The benefit of
   a nonmonotone linesearch is that local minimizers can be avoided (by
   allowing increase in function value), and typically, fewer iterations
   are performed in the main code.

   The reference value is chosen based upon some historical information
   consisting of function values for previous iterates.  The amount of
   historical information used is determined by the memory size; the
   memory stores the function values of previous iterates.  The memory is
   initialized to alpha*f(x^0) for some alpha >= 1, with alpha = 1
   signifying that we always force decrease from the initial point.
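
   As a sketch, with x0 denoting the initial point x^0, this
   initialization amounts to:

      for (i = 0; i < memorySize; ++i) memory[i] = alpha*f(x0);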

   The reference value can be the maximum value in the memory or can be
   chosen to provide some mean descent.  Elements are removed from the
   memory with a replacement policy that either removes the oldest
   value in the memory (FIFO) or the largest value in the memory (MRU).
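
   For example, with the maximum-value rule and FIFO replacement, a
   sketch of the reference computation and the memory update for a newly
   accepted function value fnew (a hypothetical name) is:

      R = max(memory[0], ..., memory[memorySize-1]);
      memory[current] = fnew;
      current = (current + 1) % memorySize;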

   Additionally, we can add a watchdog strategy to the search, which
   essentially accepts small steps and only checks the nonmonotone
   descent criterion every m steps.  This strategy is NOT implemented in
   the code.

   Finally, care must be taken when steepest descent directions are used,
   for example, when the Newton direction does not satisfy a sufficient
   descent criterion.  The code will apply the same test regardless of
   the direction.  This type of search may not be appropriate for all
   algorithms.  For example, when a gradient direction is used, we may
   want to revert to the best point found and reset the memory so that
   we stay in an appropriate level set after using a gradient step.
   This type of search is currently NOT supported by the code.

   References:
   Armijo, "Minimization of Functions Having Lipschitz Continuous
   First-Partial Derivatives," Pacific Journal of Mathematics, volume 16,
   pages 1-3, 1966.
   Ferris and Lucidi, "Nonmonotone Stabilization Methods for Nonlinear
   Equations," Journal of Optimization Theory and Applications, volume 81,
   pages 53-71, 1994.
   Grippo, Lampariello, and Lucidi, "A Nonmonotone Line Search Technique
   for Newton's Method," SIAM Journal on Numerical Analysis, volume 23,
   pages 707-716, 1986.
   Grippo, Lampariello, and Lucidi, "A Class of Nonmonotone Stabilization
   Methods in Unconstrained Optimization," Numerische Mathematik, volume 59,
   pages 779-805, 1991. */
#include <petsc/private/taolinesearchimpl.h>

typedef struct {
  PetscReal *memory;           /* Memory of function values for previous iterates */

  PetscReal alpha;             /* Initial reference factor >= 1 */
  PetscReal beta;              /* Steplength determination < 1 */
  PetscReal beta_inf;          /* Steplength determination < 1 */
  PetscReal sigma;             /* Acceptance criterion < 1 */
  PetscReal minimumStep;       /* Minimum step size */
  PetscReal lastReference;     /* Reference value of last iteration */

  PetscInt memorySize;         /* Number of function values kept in memory */
  PetscInt current;            /* Current element for FIFO */
  PetscInt referencePolicy;    /* Integer for reference calculation rule */
  PetscInt replacementPolicy;  /* Policy for replacing values in memory */

  PetscBool nondescending;     /* Use the nondescending variant of the acceptance test */
  PetscBool memorySetup;       /* Memory has been allocated and initialized */

  Vec x;    /* Maintain reference to variable vector to check for changes */
  Vec work; /* Work vector for trial points */
} TaoLineSearch_ARMIJO;
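
/* As an illustrative sketch only (a hypothetical helper, not part of the
   PETSc API): computing the reference value R with the maximum-value rule
   over the memory described above.  Disabled from compilation. */
#if 0
static PetscReal ArmijoReferenceMax_Sketch(const TaoLineSearch_ARMIJO *armP)
{
  PetscReal ref = armP->memory[0];
  PetscInt  i;

  /* R is the largest stored function value from previous iterates */
  for (i = 1; i < armP->memorySize; ++i) {
    if (armP->memory[i] > ref) ref = armP->memory[i];
  }
  return ref;
}
#endif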

#endif