4.8.6. Optimizer Base Class

BaseOptimizer API

class BaseOptimizer

Base class of parameter optimizers in ParAMS. Classes representing specific optimizers must derive from this abstract base class.

minimize(function: scm.params.core.opt_components._Step, x0: Sequence[float], bounds: numpy.ndarray, workers: int = 1) → scm.params.optimizers.base.MinimizeResult

Abstract method. Minimizes function, given an initial list of variable values x0, and possibly a list of bounds on the variable values.

In the context of ParAMS, function is a wrapper around all scm.params.core.opt_components._LossEvaluator instances and behaves like a regular callable, such that fx = function(x) returns the loss function value fx for the parameter set x.

In addition, all optimizers can make use of the parallel parameter set evaluation in ParAMS by passing the workers argument to the call: fxlist = function(Xlist, workers=N) evaluates N parameter sets at a time in parallel, until all sets in Xlist are evaluated. The workers argument will be passed down from the Optimization class.
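The batch calling convention can be illustrated with a toy stand-in for the ParAMS function wrapper (the real _Step object dispatches work to its _LossEvaluator instances; the loss and thread pool below are assumptions for illustration only):

```python
from concurrent.futures import ThreadPoolExecutor

# Toy loss standing in for a real ParAMS loss evaluation.
def loss(x):
    return sum(v * v for v in x)

# Mimics fxlist = function(Xlist, workers=N): evaluate every parameter
# set in X, using up to `workers` parallel evaluators.
def function(X, workers=1):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(loss, X))

Xlist = [[0.0, 0.0], [1.0, 2.0], [3.0, 4.0]]
fxlist = function(Xlist, workers=2)
print(fxlist)  # [0.0, 5.0, 25.0]
```

The result list preserves the order of Xlist, so fxlist[i] is the loss of Xlist[i] regardless of which worker evaluated it.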

Callbacks previously passed to the top level Optimization instance are automatically evaluated at every call to function.
Whether any callback has signalled the optimizer to stop can be checked with function.callback(), which returns True in that case. Internally, once a stopping callback has fired, consecutive calls to function() will skip the evaluation of further jobs and instead return a constant stored in function.ret (inf by default). Make sure to change it to a different value if your optimizer does not support infinite loss values.

This method should return an instance of MinimizeResult (see example implementation below).

function : callable
The objective function to be minimized. In the most basic setup it returns the scalar fx value for every candidate evaluation function(x). Multiple fx values can be evaluated efficiently by passing the workers keyword to the function call.
x0 : List[float]
Initial parameter vector, required by some optimization algorithms
bounds : 2d array

Lower (bounds[:,0]) and upper (bounds[:,1]) parameter bounds.


Even if your optimizer does not support lower and upper parameter bounds, ParAMS internally enforces the constraints as defined through the Parameter Interface by returning inf whenever a candidate is outside of the bounded space. A more elaborate way to deal with such constraints can be implemented by using the bounds parameter.
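A sketch of what this internal rejection implies (the helper below is an assumption for illustration, not the actual ParAMS code path):

```python
import numpy as np

def evaluate_with_bounds(loss, x, bounds):
    """Return inf for out-of-bounds candidates, otherwise the loss."""
    x = np.asarray(x)
    # bounds[:, 0] holds the lower bounds, bounds[:, 1] the upper bounds.
    if np.any(x < bounds[:, 0]) or np.any(x > bounds[:, 1]):
        return float("inf")
    return loss(x)

bounds = np.array([[0.0, 1.0], [0.0, 1.0]])  # two parameters, each in [0, 1]
print(evaluate_with_bounds(lambda x: float(np.sum(x)), [0.5, 0.5], bounds))  # 1.0
print(evaluate_with_bounds(lambda x: float(np.sum(x)), [1.5, 0.5], bounds))  # inf
```

An optimizer that receives inf for every out-of-bounds trial will simply never accept such a candidate as the best point, which is why bound support in the optimizer itself is optional.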


By default, every optimizer receives parameters x0 and bounds that have previously been scaled to be in the range of [0,1]. You can choose to alter the scaling by defining a self._scaler class variable which can either be set to the string ‘std’ for a [-1,1] scaling or ‘none’ to avoid scaling altogether.
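The default [0, 1] scaling means the optimizer works on the unit cube while the real parameter values are recovered by an affine map. A minimal sketch of that mapping (the helper names and example bounds are assumptions, not ParAMS internals):

```python
import numpy as np

# Hypothetical real-space bounds: (min, max) per parameter.
real_bounds = np.array([[0.5, 2.0], [10.0, 50.0]])

def to_scaled(x_real, bounds):
    """Map real parameter values into the [0, 1] unit cube."""
    return (x_real - bounds[:, 0]) / (bounds[:, 1] - bounds[:, 0])

def to_real(x_scaled, bounds):
    """Map unit-cube values back to real parameter values."""
    return bounds[:, 0] + x_scaled * (bounds[:, 1] - bounds[:, 0])

x_real = np.array([1.25, 30.0])
x_scaled = to_scaled(x_real, real_bounds)
print(x_scaled)                         # [0.5 0.5]
print(to_real(x_scaled, real_bounds))   # [ 1.25 30.  ]
```

With the 'std' setting the unit cube is replaced by [-1, 1]; with 'none' the optimizer sees the raw parameter values and bounds directly.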

workers : int
The number of workers to be used for parallel execution. In most cases, this should be passed along to the function parameter (see above), as the evaluation will most likely be the slowest step in the optimization process. If your algorithm supports a higher level of parallelization, you can choose to use this parameter elsewhere.
Example implementation:
result = MinimizeResult()
function.ret = 1e30 # Assuming our optimizer can not handle infs, the return value will be a finite (very large) scalar

# Optimization loop:
while not function.callback(): # returns `function.stop`
    new_x  = self.ask() # Ask the optimizer for a set of new candidate solutions
    new_fx = function(new_x, workers=workers) # Evaluate multiple candidates at once
    self.tell(new_fx) # In the most basic scenario, an optimizer only needs the fitness function values

result.x  = self.best_x
result.fx = self.best_fx
result.success = self.best_fx < function.ret
return result
Returns: An instance of MinimizeResult
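Putting the pieces together, a complete (toy) optimizer could look like the sketch below. The MinimizeResult stand-in and the random-search strategy are assumptions made so the example runs outside ParAMS; in real use you would derive from BaseOptimizer and import MinimizeResult from scm.params.optimizers.base, and you would also check function.callback() each iteration as shown above.

```python
import random
from math import inf

# Stand-in mirroring the documented MinimizeResult fields (illustration only).
class MinimizeResult:
    def __init__(self, success=False, x=None, fx=inf):
        self.success, self.x, self.fx = success, x, fx

class RandomSearch:
    """Toy optimizer: samples candidates uniformly inside the bounds."""

    def __init__(self, max_evals=100):
        self.max_evals = max_evals

    def minimize(self, function, x0, bounds, workers=1):
        result = MinimizeResult()
        best_x, best_fx = list(x0), function(x0)
        for _ in range(self.max_evals):
            # Draw one candidate per parameter from [lower, upper].
            x = [random.uniform(lo, hi) for lo, hi in bounds]
            fx = function(x)
            if fx < best_fx:
                best_x, best_fx = x, fx
        result.x, result.fx = best_x, best_fx
        result.success = best_fx < inf
        return result

# Usage with a plain callable standing in for the ParAMS function wrapper:
res = RandomSearch(max_evals=200).minimize(
    lambda x: sum(v * v for v in x),
    x0=[0.5, 0.5],
    bounds=[(0.0, 1.0), (0.0, 1.0)],
)
print(res.success, res.x, res.fx)
```

Since the search starts from x0, the returned fx can never be worse than the loss of the initial vector.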

This method is called when the Optimization() class is initialized and should reset a previously used optimizer instance.

MinimizeResult API

class MinimizeResult(success=False, x=None, fx=inf)
This class is the return value of BaseOptimizer.minimize().

The results of an optimization can be accessed through the following attributes:


success : bool
Whether the optimization was successful or not
x : List[float]
The optimized parameters
fx : float
The corresponding DataSet.evaluate() value of x