Optimizer Return Type

class MinimizeResult(success=False, x=None, fx=inf, stats=None, origin=None)[source]

The return value of BaseOptimizer.minimize() calls. The results of an optimization can be accessed through its attributes:
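For example, the fields can be accessed as plain attributes. A minimal stand-in class mirroring the fields listed below is defined here so the snippet is self-contained; in real use the MinimizeResult object is returned by the optimizer itself:

```python
from dataclasses import dataclass
from typing import Any, Dict, Optional, Sequence

# Minimal stand-in mirroring the documented fields; illustrative only.
@dataclass
class MinimizeResult:
    success: bool = False
    x: Optional[Sequence[float]] = None
    fx: float = float('inf')
    stats: Optional[Dict[str, Any]] = None
    origin: Optional[Dict[str, Any]] = None

result = MinimizeResult(success=True, x=[0.21, 0.83], fx=1.5)
if result.success:
    print(f"Best point {result.x} with value {result.fx}")
```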



success : bool

Whether the optimization was successful or not.


x : Sequence[float]

The optimized parameters.


fx : float

The corresponding function value of x.

stats : Dict[str, Any]

Dictionary of various statistics related to the optimization.

origin : Dict[str, Any]

Dictionary with configurations details of the optimizer which produced the result.

Abstract Base Optimizer

class BaseOptimizer(_opt_id=None, _signal_pipe=None, _results_queue=None, _pause_flag=None, _is_log_detailed=False, _workers=1, _backend='threads', **kwargs)[source]

Abstract base class for optimizers used within the GloMPO framework. It cannot be used directly; it must be subclassed by child classes which implement a specific optimization algorithm.


To Ensure GloMPO Functionality:

  1. Messages to the GloMPO manager must be sent via message_manager().

  2. Messages from the manager must be read by check_messages(), which executes the BaseOptimizer methods corresponding to the signals. The defaults provided in the BaseOptimizer class are generally suitable and should not need to be overwritten! The only methods which must be implemented by the user are:

    1. minimize() which is the algorithm specific optimization loop;

    2. callstop() which interrupts the optimization loop.

  3. The statement self._pause_signal.wait() must appear somewhere in the body of the iterative loop to allow the optimizer to be paused by the manager as needed.

  4. Optional: the class should be able to handle resuming an optimization from any point using checkpoint_save() and checkpoint_load().


The TestSubclassGlompoCompatible test in test_optimizers.py can be used to test that an optimizer meets these criteria and is GloMPO compatible. Simply add your optimizer to AVAILABLE_CLASSES there.
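The four requirements above can be sketched with a toy optimizer. This is an illustrative sketch only: the manager plumbing (pipe, queue) is stubbed out with no-ops, and the algorithm itself, a naive random search, is a placeholder for a real method:

```python
import random
import threading

class RandomSearchOptimizer:
    """Toy optimizer obeying the GloMPO loop requirements; names mirror the text above."""

    def __init__(self):
        self._pause_signal = threading.Event()
        self._pause_signal.set()           # cleared by the manager to pause the optimizer
        self._stop = False

    def message_manager(self, key, message=None):
        pass                               # real version sends (key, message) down the pipe

    def check_messages(self):
        pass                               # real version reads the pipe and dispatches signals

    def callstop(self, reason):
        self._stop = True                  # requirement: interrupt the minimize() loop

    def minimize(self, function, x0, bounds):
        best_x, best_fx = list(x0), function(x0)
        for _ in range(500):
            if self._stop:
                break
            self._pause_signal.wait()      # requirement 3: honour manager pause requests
            self.check_messages()          # requirement 2: act on manager signals
            x = [random.uniform(lo, hi) for lo, hi in bounds]
            fx = function(x)
            if fx < best_fx:
                best_x, best_fx = x, fx
        self.message_manager(0)            # signal 0: terminated by own convergence criteria
        return best_x, best_fx
```

Note how callstop() only sets a flag which the loop checks on every iteration; the loop itself is responsible for exiting cleanly.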



_opt_id : int

Unique optimizer identifier.


_signal_pipe : multiprocessing.connection.Connection

Bidirectional pipe used to exchange management messages between the manager and the optimizer.


_results_queue : queue.Queue

Threading queue into which optimizer iteration results are centralised across all optimizers and sent to the manager.


_pause_flag : threading.Event

Event flag which can be used to pause the optimizer between iterations.


_is_log_detailed : bool

See is_log_detailed.


_workers : int

The number of concurrent calculations used by the optimizer. Defaults to one. The manager will only start the optimizer if there are sufficient slots available for it.


_backend : str

The type of concurrency used by the optimizers (processes or threads). This is not necessarily applicable to all optimizers. This will default to 'threads' unless forced to use 'processes' (see GloMPOManager.setup() and Parallelism).


**kwargs

Optimizer specific initialization arguments.


The user need not concern themselves with the particulars of the _opt_id, _signal_pipe, _results_queue, _pause_flag, _workers and _backend parameters. These are automatically generated by the manager.


Make sure to call the superclass initialization method as the first statement when creating your own optimizers:
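A sketch of the expected pattern is shown below. "MyOptimizer" and its "tolerance" option are invented for illustration, and a trivial stub stands in for the real BaseOptimizer so the snippet runs standalone:

```python
class BaseOptimizer:                       # stand-in for the real base class
    def __init__(self, _opt_id=None, _signal_pipe=None, _results_queue=None,
                 _pause_flag=None, _is_log_detailed=False, _workers=1,
                 _backend='threads', **kwargs):
        self._opt_id = _opt_id
        self.workers = _workers

class MyOptimizer(BaseOptimizer):
    def __init__(self, _opt_id=None, _signal_pipe=None, _results_queue=None,
                 _pause_flag=None, _is_log_detailed=False, _workers=1,
                 _backend='threads', tolerance=1e-6, **kwargs):
        # Superclass initialization first, forwarding the manager-generated arguments:
        super().__init__(_opt_id, _signal_pipe, _results_queue, _pause_flag,
                         _is_log_detailed, _workers, _backend, **kwargs)
        self.tolerance = tolerance         # optimizer-specific option

opt = MyOptimizer(tolerance=1e-3)
```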


incumbent : Dict[str, Any]

Dictionary with keys 'x' and 'fx' which contain the lowest function value and associated parameter vector seen thus far by the optimizer.


is_log_detailed : bool

If True:

  1. When the task’s __call__() method is called, its detailed_call() method will actually be evaluated.

  2. All the return values from detailed_call() will be added to the log history of the optimizer.

  3. The function itself will only return the function value (as if the __call__() method had been used).


This will not result in a doubling of the computational time as the original call will be intercepted. This setting is useful for cases where optimizers do not need, or cannot handle, the extra information generated by a detailed call, but one would still like the iteration details logged for analysis.


logger : logging.Logger

Logger instance into which status messages may be added.


workers : int

Maximum number of threads/processes the optimizer may use for evaluating the objective function.


Optional class attribute flagging that the optimizer is only compatible with a specific scaling:

  • 'linear': All parameters scaled [0, 1];

  • 'std': All parameters scaled [-1, 1];

  • 'none': No scaling must be used.

  • If not set, then any scaling may be used.

property is_restart

True if the optimizer is loaded from a checkpoint.

property opt_id

The unique GloMPO generated identification number of the optimizer.

classmethod checkpoint_load(path, **kwargs)[source]

Recreates an optimizer from a saved snapshot.



path : str

Path to the checkpoint file from which to build. It must be a file produced by the corresponding checkpoint_save() method.


**kwargs

See __init__.


This is a basic implementation which should suit most optimizers; may need to be overwritten.

abstract minimize(function, x0, bounds)[source]

Run the optimization algorithm to minimize a function.



function

Function to be minimized. See BaseFunction for an API guide. In the context of ParAMS, function is automatically produced by the Optimization instance, and behaves like a regular callable such that fx = function(x), returning the loss function value fx for the parameter set x.


x0

The initial optimizer starting point. In the context of GloMPO, it is provided by BaseGenerator objects.


bounds

Min/max boundary limit pairs for each element of the input vector to the minimisation function.


Even though your optimizer might not support lower and upper parameter bounds, ParAMS internally enforces the constraints as defined through the Parameter Interface by returning float('inf') whenever a candidate is outside of the bounded space.
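The bound-enforcement idea can be illustrated with a small wrapper. This is a sketch of the behaviour described above, not the actual ParAMS internals:

```python
def bounded(function, bounds):
    """Return inf for candidates outside the box, the true loss otherwise."""
    def wrapped(x):
        if any(not (lo <= xi <= hi) for xi, (lo, hi) in zip(x, bounds)):
            return float('inf')            # candidate violates a bound
        return function(x)
    return wrapped

# A 1-D quadratic restricted to the interval [-1, 1]:
f = bounded(lambda x: x[0] ** 2, [(-1.0, 1.0)])
```

An unbounded optimizer minimizing f will simply see infinitely bad values outside the feasible box and be driven back inside it.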


check_messages()[source]

Processes and executes signals received from the manager.


This implementation has been very carefully structured to operate as expected by the manager. Should be suitable for all optimizers. Should not be overwritten.



Signal keys received from the manager during the call.

message_manager(key, message=None)[source]

Sends arguments to the manager.


Should not be overwritten.



key : int

Indicates the type of signal sent. The manager recognises the following keys:

0: The optimizer has terminated normally according to its own internal convergence conditions.

1: Confirm that a pause signal has been received from the manager and the optimizer has complied with the request.

9: General message to be appended to the optimizer’s log.


message

Message to be appended when sending signal 9.
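The signalling convention can be illustrated with a bare multiprocessing pipe standing in for the real manager connection; the message text is of course arbitrary:

```python
from multiprocessing import Pipe

manager_end, optimizer_end = Pipe()

def message_manager(key, message=None):    # mimics BaseOptimizer.message_manager
    optimizer_end.send((key, message))

message_manager(9, "Reset trust-region radius")   # signal 9 carries a log message
message_manager(0)                                # signal 0: normal termination

key, msg = manager_end.recv()              # manager side reads (9, "Reset trust-region radius")
```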

abstract callstop(reason)[source]

Breaks out of the minimize() minimization loop.

checkpoint_save(path, force=None, block=None)[source]

Save current state, suitable for restarting.



path

Path to file into which the object will be dumped. Typically supplied by the manager.


force : Set[str]

Set of variable names which will be forced into the dumped file. Convenient shortcut for overwriting the method when it fails for a particular optimizer because a certain variable is filtered out of the data dump.


block : Set[str]

Set of variable names which are typically caught in the construction of the checkpoint but should be excluded. Useful for excluding some properties.


  1. Only the absolutely critical aspects of the state of the optimizer need to be saved. The manager will resupply multiprocessing parameters when the optimizer is reconstructed.

  2. This method will almost never be called directly by the user. Rather it will be called (via signals) by the manager.

  3. This is a basic implementation which should suit most optimizers; may need to be overwritten. Typically it is sufficient to call the super method and use the force and block parameters to get a working implementation.
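The interplay of force and block can be sketched with a highly simplified stand-in base class; the real checkpointing logic is more involved, and the attribute names here are invented for illustration:

```python
import os
import pickle
import tempfile

class Base:                                # simplified stand-in for BaseOptimizer
    def checkpoint_save(self, path, force=None, block=None):
        force, block = force or set(), block or set()
        state = {}
        for name, value in vars(self).items():
            if name in block and name not in force:
                continue                   # explicitly excluded from the dump
            try:
                pickle.dumps(value)        # drop anything unpicklable
            except Exception:
                if name not in force:
                    continue
            state[name] = value
        with open(path, 'wb') as f:
            pickle.dump(state, f)

class MyOpt(Base):
    def __init__(self):
        self.best_fx = 3.5                 # critical state: must survive a restart
        self._scratch = lambda x: x        # transient helper: exclude it

    def checkpoint_save(self, path, force=None, block=None):
        # Note 3 above: call the super method with suitable block/force sets.
        super().checkpoint_save(path, block={'_scratch'})

path = os.path.join(tempfile.gettempdir(), 'ckpt_demo.pkl')
MyOpt().checkpoint_save(path)
with open(path, 'rb') as f:
    restored = pickle.load(f)              # only the critical state remains
```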

inject(x, fx)[source]

Updates the incumbent with a better solution from the manager.