Data Assimilation (Inversion)#

This subpackage provides implementations for solving time-dependent data assimilation (DA) problems only. In this case, the simulation model, the observation operator, and the observational data are generally time dependent. One can, of course, use the same observation operator at all observation time points, which is common when the observational configuration does not change over time. The DA algorithm assimilates multiple observations at once to correct the model trajectory over a given assimilation timespan (the assimilation window), as sketched below.
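For reference, this setting can be written generically as follows (standard notation; the symbols are illustrative and not tied to any particular PyOED class). A model operator advances the state across the window, and time-dependent observation operators map states to noisy observations:

\[
  x_{k+1} = \mathcal{M}_{k}(x_{k}), \quad k = 0, \ldots, K-1,
  \qquad
  y_{k} = \mathcal{H}_{k}(x_{k}) + \epsilon_{k}, \quad k = 1, \ldots, K,
\]

where all observation times t_1, ..., t_K fall inside the assimilation window [t_0, t_f]. The smoothing algorithms below use the observations y_1, ..., y_K jointly to correct the model trajectory (for example, through its initial condition or parameters).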

DA Smoothing Algorithms#

DA Smoothing Base Classes#

Abstract classes for smoothing algorithms (Bayesian and variational approaches). A HybridSmoother class is also included, but without clear functionality yet. Unlike filtering algorithms, smoothers are given multiple observations (e.g., for time-dependent problems). The output of a smoothing algorithm is a point estimate of the truth; an uncertainty measure, e.g., the posterior covariance, is provided if a Bayesian approach is followed.

class VariationalSmootherConfigs(*, debug=False, verbose=False, output_dir='./_PYOED_RESULTS_', name=None, model=None, prior=None, observation_error_model=None, observation_operator=None, observations=None, optimizer=None, optimizer_configs=None, window=None)[source]#

Bases: SmootherConfigs

Configurations class for the VariationalSmoother abstract base class.

Parameters:
  • verbose (bool) – a boolean flag to control verbosity of the object.

  • debug (bool) – a boolean flag that enables extra functionality in debug mode

  • output_dir (str | Path) – the base directory where the output files will be saved.

  • name (str | None) – name of the DA (inverse problem solver) approach/method.

  • model (None | SimulationModel) – the simulation model.

  • prior (None | ErrorModel) – Background/Prior model (e.g., GaussianErrorModel)

  • observation_operator (None | ObservationOperator) – operator to map model state to observation space

  • observation_error_model (None | ErrorModel) – Observation error model (e.g., GaussianErrorModel)

  • observations (None | Any) – Observational data (the data type is very much dependent on the DA method)

  • optimizer (None | Optimizer) –

    the optimization routine (optimizer) to be registered and later used for solving the DA (inverse) problem; see the sketch after this class entry. This can be one of the following:

    • None: In this case, no optimizer is registered, and the solve() won’t be functional until an optimization routine is registered.

    • An optimizer instance (an object that inherits from Optimizer). In this case, the optimizer is registered as is and is updated with the passed configurations, if available.

    • The class (subclass of Optimizer) to be used to instantiate the optimizer.

  • optimizer_configs (None | dict | OptimizerConfigs) –

    the configurations of the optimization routine. This can be one of the following:

    • None: in this case, configurations are discarded, and the default configurations of the selected/passed optimizer are employed.

    • A dict holding full/partial configurations of the selected optimizer. These are either used to instantiate or update the optimizer configurations based on the type of the passed optimizer.

    • A class providing implementations of the configurations (this must be a subclass of OptimizerConfigs).

    • An instance of a subclass of OptimizerConfigs which is used to set/update the optimizer configurations.

    Note

    Not all DA (inverse problem) objects are optimization-based. For example, particle-based methods (EnKF, PF, etc.) employ a sample to estimate the flow of the distribution through the model dynamics (prior -> posterior); thus, in this case the optimizer (and its configurations) are set to None by default. For optimization-based methods such as 3DVar, 4DVar, etc., an optimizer must be registered for the inverse problem to be solved.

  • window (None | Iterable) – the assimilation window (t0, tf)

__init__(*, debug=False, verbose=False, output_dir='./_PYOED_RESULTS_', name=None, model=None, prior=None, observation_error_model=None, observation_operator=None, observations=None, optimizer=None, optimizer_configs=None, window=None)#
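A minimal sketch of assembling such a configurations object is given below. All objects on the right-hand side (the model, prior, observation operator, observation error model, observations, the optimizer class MyOptimizer, and the "maxiter" key) are hypothetical placeholders used only to illustrate the documented keyword arguments; in practice one would pass concrete PyOED instances and typically use the configurations class of a concrete smoother rather than this abstract base.

# Hypothetical sketch: every right-hand-side object below is a placeholder,
# not an actual PyOED call; only the keyword names come from the signature above.
configs = VariationalSmootherConfigs(
    model=my_model,                        # a SimulationModel instance (placeholder)
    prior=my_prior,                        # an ErrorModel, e.g., a Gaussian prior (placeholder)
    observation_operator=my_obs_operator,  # maps model states to observation space (placeholder)
    observation_error_model=my_obs_error,  # observation ErrorModel (placeholder)
    observations=my_observations,          # observational data over the window (placeholder)
    optimizer=MyOptimizer,                 # an Optimizer subclass; an instance or None also works
    optimizer_configs={"maxiter": 100},    # partial configurations as a dict (assumed key)
    window=(0.0, 1.0),                     # assimilation window (t0, tf)
)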
class VariationalSmoother(configs=None)[source]#

Bases: Smoother

Base class for variational smoothers. In this case, a single point estimate is obtained by solving a weighted least-squares optimization problem that minimizes the mismatch between model predictions and observations (in the appropriate projected space, e.g., the observation space). The mismatch is usually regularized by a penalty term (typically imposed by the prior).
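For example, in the common Gaussian setting the (strong-constraint) objective takes the following standard form; this is a generic statement for orientation, not a description of a specific PyOED attribute:

\[
  \mathcal{J}(x_0)
  = \tfrac{1}{2}\,(x_0 - x_b)^{\top} B^{-1} (x_0 - x_b)
  + \tfrac{1}{2} \sum_{k=1}^{K}
    \big(\mathcal{H}_k(x_k) - y_k\big)^{\top} R_k^{-1} \big(\mathcal{H}_k(x_k) - y_k\big),
  \qquad x_k = \mathcal{M}_{0 \to k}(x_0),
\]

where x_b and B are the prior (background) mean and covariance, y_k are the observations with error covariances R_k, and \mathcal{M}_{0 \to k} denotes model propagation from t_0 to t_k. The first term is the prior/regularization penalty; the second is the data misfit.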

__init__(configs=None)[source]#
objective(init_guess, data_misfit_only=False)[source]#

Evaluate the objective function and the associated gradient.

Parameters:
  • init_guess – the model parameter/state at which to evaluate the objective function and the associated gradient

  • data_misfit_only (bool) – discard the prior/regularization term if True. This is added for flexibility

Returns:

the objective function value and the gradient (the objective value is a scalar, and the gradient is a one-dimensional NumPy array)
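The following standalone toy example illustrates this value-and-gradient convention (the quadratic objective and the use of scipy.optimize.minimize are illustrative assumptions, not the PyOED implementation):

import numpy as np
from scipy.optimize import minimize

def objective(x, data_misfit_only=False):
    """Toy quadratic objective returning (value, gradient), mimicking the convention above."""
    y = np.array([1.0, 2.0, 3.0])        # synthetic "observations"
    value = 0.5 * np.sum((x - y) ** 2)   # data-misfit term
    grad = x - y                         # gradient of the data-misfit term
    if not data_misfit_only:
        value += 0.5 * np.sum(x ** 2)    # prior/regularization term
        grad = grad + x                  # add its gradient
    return value, grad                   # scalar value, one-dimensional NumPy gradient

# Any gradient-based optimizer can consume such a callable; with SciPy,
# jac=True signals that the callable returns (value, gradient).
result = minimize(objective, x0=np.zeros(3), jac=True, method="L-BFGS-B")
print(result.x)  # minimizer of the regularized least-squares problem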

abstract objective_function_value(state, data_misfit_only=False)[source]#

A method to evaluate the variational objective function.

objective_function_gradient(state, data_misfit_only=False)[source]#

A method to evaluate the gradient of the variational objective function. By default, this implementation approximates the gradient using finite differences. For efficient evaluation of the gradient, the derived class needs to provide an implementation of the analytical gradient.
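The idea behind such a default finite-difference approximation is sketched below (a generic forward-difference sketch; the function name, step size, and differencing scheme are illustrative assumptions, not the actual PyOED implementation):

import numpy as np

def finite_difference_gradient(fun, x, eps=1e-6):
    """Approximate the gradient of the scalar function `fun` at `x`
    by perturbing one component at a time (forward differences)."""
    x = np.asarray(x, dtype=float)
    f0 = fun(x)
    grad = np.empty_like(x)
    for i in range(x.size):
        x_pert = x.copy()
        x_pert[i] += eps                     # perturb the i-th component only
        grad[i] = (fun(x_pert) - f0) / eps   # one-sided difference quotient
    return grad

# Example: the gradient of f(x) = ||x||^2 / 2 is x itself.
print(finite_difference_gradient(lambda v: 0.5 * np.dot(v, v),
                                 np.array([1.0, -2.0, 3.0])))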

class VariationalSmootherResults(*, inverse_problem=None, optimization_results=None)[source]#

Bases: SmootherResults

Base class for objects holding the results of a Smoother.

Parameters:

inverse_problem (InverseProblem | None) – instance of a class derived from InverseProblem.

__init__(*, inverse_problem=None, optimization_results=None)#
class BayesianSmootherConfigs(*, debug=False, verbose=False, output_dir='./_PYOED_RESULTS_', name=None, model=None, prior=None, observation_error_model=None, observation_operator=None, observations=None, optimizer=None, optimizer_configs=None, window=None)[source]#

Bases: SmootherConfigs

Configurations class for the BayesianSmoother abstract base class.

Parameters:
  • verbose (bool) – a boolean flag to control verbosity of the object.

  • debug (bool) – a boolean flag that enables extra functionality in debug mode

  • output_dir (str | Path) – the base directory where the output files will be saved.

  • name (str | None) – name of the DA (inverse problem solver) approach/method.

  • model (None | SimulationModel) – the simulation model.

  • prior (None | ErrorModel) – Background/Prior model (e.g., GaussianErrorModel)

  • observation_operator (None | ObservationOperator) – operator to map model state to observation space

  • observation_error_model (None | ErrorModel) – Observation error model (e.g., GaussianErrorModel)

  • observations (None | Any) – Observational data (the data type is very much dependent on the DA method)

  • optimizer (None | Optimizer) –

    the optimization routine (optimizer) to be registered and later used for solving the DA (inverse) problem. This can be one of the following:

    • None: In this case, no optimizer is registered, and the solve() won’t be functional until an optimization routine is registered.

    • An optimizer instance (an object that inherits from Optimizer). In this case, the optimizer is registered as is and is updated with the passed configurations, if available.

    • The class (subclass of Optimizer) to be used to instantiate the optimizer.

  • optimizer_configs (None | dict | OptimizerConfigs) –

    the configurations of the optimization routine. This can be one of the following:

    • None: in this case, configurations are discarded, and the default configurations of the selected/passed optimizer are employed.

    • A dict holding full/partial configurations of the selected optimizer. These are either used to instantiate or update the optimizer configurations based on the type of the passed optimizer.

    • A class providing implementations of the configurations (this must be a subclass of OptimizerConfigs).

    • An instance of a subclass of OptimizerConfigs which is used to set/update the optimizer configurations.

    Note

    Not all DA (inverse problem) objects are optimization-based. For example, particle-based methods (EnKF, PF, etc.) employ a sample to estimate the flow of the distribution through the model dynamics (prior -> posterior); thus, in this case the optimizer (and its configurations) are set to None by default. For optimization-based methods such as 3DVar, 4DVar, etc., an optimizer must be registered for the inverse problem to be solved.

  • window (None | Iterable) – the assimilation window (t0, tf)

__init__(*, debug=False, verbose=False, output_dir='./_PYOED_RESULTS_', name=None, model=None, prior=None, observation_error_model=None, observation_operator=None, observations=None, optimizer=None, optimizer_configs=None, window=None)#
class BayesianSmoother(configs=None)[source]#

Bases: Smoother

Base class for Bayesian smoothing algorithms. In this case, the probability distribution (or an estimate thereof) of the model state/parameter is considered. The goal is to apply Bayes’ theorem and retrieve the exact/approximate posterior, or samples from the posterior.
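Concretely, using the generic notation introduced earlier (a standard statement of Bayes’ theorem for the smoothing setting, not tied to a specific PyOED attribute), the smoothing posterior over the window is

\[
  p(x_0 \mid y_1, \ldots, y_K)
  \;\propto\;
  p(x_0)\, \prod_{k=1}^{K} p\big(y_k \mid \mathcal{M}_{0 \to k}(x_0)\big),
\]

where p(x_0) is the prior and each likelihood factor is determined by the corresponding observation operator and the observation error model.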

__init__(configs=None)[source]#
class BayesianSmootherResults(*, inverse_problem=None, optimization_results=None)[source]#

Bases: SmootherResults

Base class for objects holding the results of a Smoother.

Parameters:

inverse_problem (InverseProblem | None) – instance of a class derived from InverseProblem.

__init__(*, inverse_problem=None, optimization_results=None)#
class HybridSmootherConfigs(*, debug=False, verbose=False, output_dir='./_PYOED_RESULTS_', name=None, model=None, prior=None, observation_error_model=None, observation_operator=None, observations=None, optimizer=None, optimizer_configs=None, window=None)[source]#

Bases: SmootherConfigs

Configurations class for the HybridSmoother abstract base class.

Parameters:
  • verbose (bool) – a boolean flag to control verbosity of the object.

  • debug (bool) – a boolean flag that enables extra functionality in debug mode

  • output_dir (str | Path) – the base directory where the output files will be saved.

  • name (str | None) – name of the DA (inverse problem solver) approach/method.

  • model (None | SimulationModel) – the simulation model.

  • prior (None | ErrorModel) – Background/Prior model (e.g., GaussianErrorModel)

  • observation_operator (None | ObservationOperator) – operator to map model state to observation space

  • observation_error_model (None | ErrorModel) – Observation error model (e.g., GaussianErrorModel)

  • observations (None | Any) – Observational data (the data type is very much dependent on the DA method)

  • optimizer (None | Optimizer) –

    the optimization routine (optimizer) to be registered and later used for solving the DA (inverse) problem. This can be one of the following:

    • None: In this case, no optimizer is registered, and the solve() won’t be functional until an optimization routine is registered.

    • An optimizer instance (an object that inherits from Optimizer). In this case, the optimizer is registered as is and is updated with the passed configurations, if available.

    • The class (subclass of Optimizer) to be used to instantiate the optimizer.

  • optimizer_configs (None | dict | OptimizerConfigs) –

    the configurations of the optimization routine. This can be one of the following:

    • None: in this case, configurations are discarded, and the default configurations of the selected/passed optimizer are employed.

    • A dict holding full/partial configurations of the selected optimizer. These are either used to instantiate or update the optimizer configurations based on the type of the passed optimizer.

    • A class providing implementations of the configurations (this must be a subclass of OptimizerConfigs).

    • An instance of a subclass of OptimizerConfigs which is used to set/update the optimizer configurations.

    Note

    Not all DA (inverse problem) objects are optimization-based. For example, particle-based methods (EnKF, PF, etc.) employ a sample to estimate the flow of the distribution through the model dynamics (prior -> posterior); thus, in this case the optimizer (and its configurations) are set to None by default. For optimization-based methods such as 3DVar, 4DVar, etc., an optimizer must be registered for the inverse problem to be solved.

  • window (None | Iterable) – the assimilation window (t0, tf)

__init__(*, debug=False, verbose=False, output_dir='./_PYOED_RESULTS_', name=None, model=None, prior=None, observation_error_model=None, observation_operator=None, observations=None, optimizer=None, optimizer_configs=None, window=None)#
class HybridSmoother(configs=None)[source]#

Bases: Smoother

Base class for hybrid (Bayesian-variational) smoothers.

__init__(configs=None)[source]#
class HybridSmootherResults(*, inverse_problem=None, optimization_results=None)[source]#

Bases: SmootherResults

Base class for objects holding the results of a Smoother.

Parameters:

inverse_problem (InverseProblem | None) – instance of a class derived from InverseProblem.

__init__(*, inverse_problem=None, optimization_results=None)#