Optimizers

ESCAPE provides optimization algorithms for curve fitting, the iterative process of finding the parameters of mathematical functions that best fit experimental data points. The optimizer_obj class wraps several algorithms:

  • Levenberg-Marquardt [1], [2] - A robust non-linear least-squares algorithm that combines gradient descent and Gauss-Newton methods

  • Differential Evolution [3] - A stochastic evolutionary algorithm that maintains a population of candidate solutions and iteratively improves them through mutation and crossover operations

The optimizers support:

  • Asynchronous optimization

  • Custom initialization, iteration and finalization callbacks

  • Progress monitoring and early stopping

  • Parameter constraints and bounds

  • Multiple optimization strategies

  • Automatic convergence detection

escape.core.optimizer.levmar(name: str, stack: Union[modelstack_obj, List[model_obj], model_obj], ftol: float = 1e-10, xtol: float = 1e-10, gtol: float = 1e-10, maxiter: int = 300, maxfev: int = 5000, nupdate: int = 1, status_exc: bool = False) → optimizer_obj

Creates a Levenberg-Marquardt optimizer.

Args:

  name: Optimizer instance name
  stack: Models to optimize (modelstack_obj, list of models, or a single model)
  ftol: Relative error tolerance for the cost function (default: 1e-10)
  xtol: Relative error tolerance for parameter values (default: 1e-10)
  gtol: Orthogonality tolerance between the cost function and the Jacobian (default: 1e-10)
  maxiter: Maximum iterations allowed (default: 300)
  maxfev: Maximum function evaluations allowed (default: 5000)
  nupdate: Iteration callback frequency (default: 1)
  status_exc: Raise an exception on a non-zero status code (default: False)

Returns:

optimizer_obj instance
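
A minimal construction sketch based on the signature above. Here `model` stands in for a model_obj built elsewhere, and the call that actually launches the fit belongs to the wider API and is not shown in this section:

   from escape.core.optimizer import levmar

   # `model` is assumed to be a model_obj or modelstack_obj built elsewhere.
   opt = levmar(
       "lm_fit",          # optimizer instance name
       model,             # single model, list of models, or modelstack
       ftol=1e-12,        # tighter cost-function tolerance than the default 1e-10
       maxiter=500,       # allow more iterations than the default 300
       status_exc=True,   # raise on a non-zero status code
   )

   # Once the fit has run, the documented attributes are available:
   #   opt.best_cost, opt.cost_history, opt.status_code, opt.status_msg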

escape.core.optimizer.diffevol(name: str, objective: Union[modelstack_obj, List[model_obj], model_obj, functor_obj], popsize: int = 15, strategy: str = 'best1bin', init_strategy: Union[str, np.ndarray] = 'random', maxiter: int = 1000, ftol: float = 0.001, mutation: float = 0.5, crossover: float = 0.7, polish_candidate_maxiter: int = 0, polish_final_maxiter: int = 300, nupdate: int = 1, status_exc: bool = False) → Union[optimizer_obj, functor_obj]

Creates a Differential Evolution optimizer.

The algorithm maintains a population of candidate solutions and evolves them using mutation and crossover operations to find the global minimum.

Args:

  name: Optimizer instance name
  objective: Models or functor to optimize
  popsize: Population size multiplier (the population holds popsize * num_params individuals)
  strategy: Evolution strategy (default: 'best1bin'). Supported strategies:
    • 'best1bin' / 'best1exp'
    • 'rand1bin' / 'rand1exp'
    • 'randtobest1bin' / 'randtobest1exp'
    • 'best2bin' / 'best2exp'
    • 'rand2bin' / 'rand2exp'
    • 'sqg2bin' / 'sqg2exp' through 'sqg5bin' / 'sqg5exp'

  init_strategy: Initialization method (default: 'random')
    • 'random': Uniform random initialization
    • 'lhs': Latin hypercube sampling
    • numpy array: Custom initial population

  maxiter: Maximum generations (default: 1000)
  ftol: Convergence tolerance (default: 1e-3)
  mutation: Mutation constant F (default: 0.5). Controls the search radius: larger values explore more of the parameter space but converge more slowly.
  crossover: Crossover probability CR (default: 0.7). Controls population diversity: larger values allow more mutations to propagate.
  polish_candidate_maxiter: Levenberg-Marquardt polish iterations per candidate (default: 0)
  polish_final_maxiter: Final Levenberg-Marquardt polish iterations (default: 300)
  nupdate: Iteration callback frequency (default: 1)
  status_exc: Raise an exception on a non-zero status code (default: False)

Returns:

optimizer_obj for models, functor_obj for functors
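
A hedged construction sketch with a custom initial population. The population shape (popsize * num_params rows, num_params columns) is inferred from the popsize description above and should be verified against the full API; `model` again stands in for an objective built elsewhere:

   import numpy as np
   from escape.core.optimizer import diffevol

   num_params = 4   # assumed number of free parameters in `model`
   popsize = 15
   # Custom initial population: one row per individual, one column per parameter.
   init_pop = np.random.uniform(-1.0, 1.0, size=(popsize * num_params, num_params))

   opt = diffevol(
       "de_fit",
       model,                     # model(s), modelstack, or functor
       popsize=popsize,
       strategy="rand1bin",       # one of the supported strategies listed above
       init_strategy=init_pop,    # numpy array instead of 'random' or 'lhs'
       mutation=0.8,              # widen the search radius
       crossover=0.9,             # let more mutations propagate
       polish_final_maxiter=300,  # final Levenberg-Marquardt polish
   )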

class escape.core.optimizer.optimizer_obj

Wrapper class for ESCAPE optimizers.

This class provides a Pythonic interface to the C++ optimization algorithms. It handles parameter management, optimization control, and progress monitoring.

best_cost

Best cost value found. For Levenberg-Marquardt this is the final cost. For stochastic methods this is the best cost across all iterations.

cost_history

History of cost values across iterations.

finalization_method

Custom finalization callback.

initial_parameters

Initial parameter values.

initialization_method

Custom initialization callback.

iteration_method

Custom iteration callback.

modelstack

The modelstack being optimized.

name

Optimizer instance name.

num_of_evaluations

Number of objective function evaluations performed.

num_of_iterations

Number of completed optimization iterations.

on_finalized() None

Called when optimization completes.

on_initialized() None

Called when optimization starts.

on_iteration() None

Called after each iteration (frequency controlled by the nupdate setting).
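
Together, iteration_method, progress, best_cost, and stop() cover progress monitoring and early stopping. A minimal sketch, assuming iteration_method accepts a plain zero-argument callable that closes over the optimizer:

   from escape.core.optimizer import levmar

   opt = levmar("lm_fit", model)   # `model` assumed to be built elsewhere

   def on_step():
       # Documented attributes readable during the fit:
       print(f"iteration {opt.num_of_iterations}: "
             f"cost = {opt.best_cost:.3e}, progress = {opt.progress}")
       if opt.best_cost < 1e-14:
           opt.stop()   # early stopping once the fit is good enough

   opt.iteration_method = on_step   # invoked every `nupdate` iterations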

parameters

List of optimization parameters.

progress

Optimization progress, computed as completed iterations over maxiter.

reset_to_initial() None

Reset parameters to their initial values.

shake() None

Randomizes all non-fixed independent parameters.
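
reset_to_initial() and shake() combine naturally into a simple multi-start pattern. A sketch, where `opt` is an optimizer_obj created as above and run_fit is a hypothetical stand-in for whichever call actually launches the optimizer:

   best = None
   for _ in range(10):
       opt.reset_to_initial()   # return parameters to their initial values
       opt.shake()              # randomize all non-fixed independent parameters
       run_fit(opt)             # hypothetical: start the fit (not documented here)
       if best is None or opt.best_cost < best:
           best = opt.best_cost
   print(f"best cost over 10 starts: {best:.3e}")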

show(model_configs: Optional[Any] = None, config: Optional[Any] = None, **kwargs) Any

Show optimizer object in a widget.

Args:

  model_configs: Model configurations to display; a single ModelConfig instance or a list of ModelConfig instances
  config: Layout configuration; a LayoutConfig instance
  **kwargs: Additional keyword arguments

Returns:

OptimizerObjectLayout instance

status_code

Numeric status code indicating optimization outcome.

status_msg

Status message describing the optimization state.

stop() None

Interrupts the optimization process.

wait() None

Blocks until the asynchronous optimization completes. Only applicable when the optimizer was started with asynchr=True.
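
A sketch of the asynchronous pattern. The name of the call that accepts asynchr=True is not documented in this section, so start below is a hypothetical stand-in; `opt` is an optimizer_obj created as above:

   opt.start(asynchr=True)   # hypothetical entry point; only asynchr=True is documented here
   # ... do other work while the fit runs in the background ...
   print(opt.progress)       # attributes remain readable while the fit runs
   opt.wait()                # block until the asynchronous optimization completes
   print(opt.status_msg, opt.best_cost)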

escape.core.optimizer.optimization_status(func: functor_obj) → dict

Get optimization status for a functor.

Args:

func: Functor object with a minimizer handler

Returns:

Dictionary with status code and function evaluation count
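
A usage sketch, assuming `objective` is a functor_obj so that diffevol returns a functor (the functor path noted in its Returns section):

   from escape.core.optimizer import diffevol, optimization_status

   func = diffevol("de_func", objective)   # returns a functor_obj for functor objectives
   status = optimization_status(func)      # dict with status code and evaluation count
   print(status)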