Helpers

Useful static functions and classes used throughout the GloMPO package.

class BoundGroup(iterable=(), /)[source]

Used to better represent the parameter bounds in a human-readable but reusable way in YAML.

exception CheckpointingError[source]

Error raised during creation of a checkpoint which would result in an incomplete checkpoint.

class FlowList(iterable=(), /)[source]

Used to wrap lists which should appear in YAML flow style rather than default block style.

class LiteralWrapper[source]

Used by YAML to save some block strings as literals.

class SplitOptimizerLogs(filepath='', propagate=False, formatter=None)[source]

Splits print statements from child processes and threads to text files. If this filter is applied to a Handler on the 'glompo.optimizers' logger it will automatically separate the single 'glompo.optimizers' logging stream into separate ones for each individual optimizer.

Parameters

filepath

Directory in which new log files will be located.

propagate

If True, the filter allows the message to pass through, so that all 'glompo.optimizers' logging can also be recorded together in a single stream.

formatter

Formatting to be applied in the new logs. If not supplied, the logging default is used.

Examples

>>> frmttr = logging.Formatter("%(levelname)s : %(name)s : %(processName)s :: %(message)s")

Adds individual handlers for each optimizer created. The format of the new handlers is set by frmttr. propagate=True sends the message on to opt_handler, which in this case writes to sys.stdout.

>>> opt_filter = SplitOptimizerLogs("diverted_logs", propagate=True, formatter=frmttr)
>>> opt_handler = logging.StreamHandler(sys.stdout)
>>> opt_handler.addFilter(opt_filter)
>>> opt_handler.setFormatter(frmttr)

Messages of the 'INFO' level will propagate to sys.stdout.

>>> opt_handler.setLevel('INFO')
>>> logging.getLogger("glompo.optimizers").addHandler(opt_handler)

The level for the handlers created by SplitOptimizerLogs is taken from the logger itself. Here, 'DEBUG' level messages will be logged to the files even though only 'INFO' level messages propagate to the console.

>>> logging.getLogger("glompo.optimizers").setLevel('DEBUG')

Initialize a filter.

Initialize with the name of the logger which, together with its children, will have its events allowed through the filter. If no name is specified, allow every event.

filter(record)[source]

Determine if the specified record is to be logged.

Returns True if the record should be logged, or False otherwise. If deemed appropriate, the record may be modified in-place.

exception StopInterrupt[source]

Raised if a file called EXIT is found in the working directory.

class WorkInDirectory(path)[source]

Context manager to manage the creation of new files in a different directory from the working one.

Parameters

path

A directory to which the working directory will be changed on entering the context manager. If the directory does not exist, it will be created. The working directory is changed back on exiting the context manager.

bound_group_presenter(dumper, bound_group)[source]

Unique YAML presenter for Bound.

deepsizeof(obj)[source]

Recursively determines the byte size of an object. Any initialised objects (excluding Python primitives) must correctly implement __sizeof__ for this function to return accurate results.
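The idea behind such a recursive size calculation can be sketched as follows for built-in containers. This is an illustrative simplification, not GloMPO's actual implementation:

```python
import sys

def deepsizeof_sketch(obj, seen=None):
    """Illustrative recursive byte-size estimate (not GloMPO's implementation)."""
    if seen is None:
        seen = set()
    if id(obj) in seen:  # avoid double-counting shared or cyclic references
        return 0
    seen.add(id(obj))
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(deepsizeof_sketch(k, seen) + deepsizeof_sketch(v, seen)
                    for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(deepsizeof_sketch(item, seen) for item in obj)
    return size
```

Because sys.getsizeof only reports an object's shallow size, the recursion over container contents is what makes the estimate "deep".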

distance(pt1, pt2)[source]

Calculate the Euclidean distance between two points.

Examples

>>> distance([0,0,0], [1,1,1])
1.7320508075688772
flow_presenter(dumper, lst)[source]

YAML Presenter for a FlowList style list.

generator_presenter(dumper, generator)[source]

Unique YAML presenter for the BaseGenerator.

glompo_colors(opt_id=None)[source]

Returns a matplotlib.colors.ListedColormap containing the custom GloMPO color cycle. If opt_id is provided, then the specific color at that index is returned instead.

infer_headers(infer_from)[source]

Infers tables.Col types based on a function evaluation return. Only used if BaseFunction.headers() is not defined. The produced headings are required by FileLogger. Default header names will be used: 'result_0', ..., 'result_N'.

Parameters

infer_from

A function evaluation result

Returns

Dict[str, tables.Col]

Mapping of column names to type

Examples

>>> import tables
>>> infer_headers((1232.1431423, [213, 345, 675], False, "valid"))
{'result_0': tables.Float64Col(pos=0),
 'result_1': tables.Int64Col((1, 3), pos=1),
 'result_2': tables.BoolCol(pos=2),
 'result_3': tables.StringCol(280, pos=3)}
is_bounds_valid(bounds, raise_invalid=True)[source]

Checks if provided parameter bounds are valid. ‘Valid’ is defined as meaning that every lower bound is less than the upper bound and every bound is finite.

Parameters

bounds

Sequence of min/max pairs indicating the interval in which the optimizer must search for each parameter.

raise_invalid

If True, an error is raised if the bounds are invalid; otherwise a bool is returned.

Returns

bool

True if the bounds are all valid, False otherwise.

Raises

ValueError

If raise_invalid is True and bounds are invalid.

Examples

>>> is_bounds_valid([(0, 1), (-1, 0)])
True
>>> is_bounds_valid([(0, 0), (0, float('inf'))], False)
False
literal_presenter(dumper, data)[source]

Presenter saving wrapped strings in YAML literal block style for better readability.

nested_string_formatting(nested_str)[source]

Reformat strings produced by the ._CombiCore class. BaseStoppers and BaseExitConditions produce strings detailing their last evaluation. This method parses and indents each nested level of the string to make it more human-readable.

Parameters

nested_str

Return produced by BaseStopper.__str__() or BaseStopper.str_with_result().

Returns

str

String with added nesting and indenting.

Examples

>>> nested_string_formatting("[TrueStopper() AND\n"
...                          "[[TrueStopper() OR\n"
...                          "[FalseStopper() AND\n"
...                          "[TrueStopper() OR\n"
...                          "FalseStopper()]]]\n"
...                          "OR\n"
...                          "FalseStopper()]]")
"TrueStopper() AND\n" \
"[\n" \
" [\n" \
"  TrueStopper() OR\n" \
"  [\n" \
"   FalseStopper() AND\n" \
"   [\n" \
"    TrueStopper() OR\n" \
"    FalseStopper()\n" \
"   ]\n" \
"  ]\n" \
" ]\n" \
" OR\n" \
" FalseStopper()\n" \
"]"
number_available_cores()[source]

Provides a consistent way to determine the number of CPUs available to a particular process. First, returns the environment variable NSCM if set. Second, tries to measure CPU affinity. Third, tries to return the number of physical cores present. Finally, falls back to the number of logical cores.
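The fallback chain described above might look like the following sketch. The function name is hypothetical, and the physical-core step (which would need a third-party library such as psutil) is collapsed into the logical-core fallback here:

```python
import os

def available_cores_sketch():
    """Illustrative fallback chain for counting available CPUs (not GloMPO's code)."""
    # 1. Respect the NSCM environment variable if set.
    nscm = os.environ.get("NSCM")
    if nscm is not None:
        return int(nscm)
    # 2. CPU affinity of the current process, where the platform supports it.
    try:
        return len(os.sched_getaffinity(0))
    except AttributeError:
        pass
    # 3/4. Physical-core detection is omitted in this sketch; fall back to
    # the logical core count.
    return os.cpu_count()
```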

numpy_array_presenter(dumper, numpy_arr)[source]

Unique YAML presenter for numpy.ndarray.

numpy_dtype_presenter(dumper, numpy_type)[source]

Unique YAML presenter for numpy.dtype.

optimizer_selector_presenter(dumper, opt_selector)[source]

Unique YAML presenter for the BaseSelector.

present_memory(n_bytes, digits=2)[source]

Accepts an integer number of bytes and returns a string formatted to the most appropriate units.

Parameters

n_bytes

Number of bytes to write in human readable format

digits

Number of decimal places to include in the result

Returns

str

Converted data quantity and units

Examples

>>> present_memory(123456789, 1)
'117.7MB'
rolling_min(x)[source]

Returns a vector of the same shape as x, where each element has been replaced by the smallest value seen thus far when reading the sequence from left to right.

Examples

>>> rolling_min([3, 4, 5, 6, 2, 3, 4, 1, 2, 3])
[3, 3, 3, 3, 2, 2, 2, 1, 1, 1]
unknown_object_presenter(dumper, unknown_class)[source]

Parses all remaining classes into strings and Python primitives for YAML files, ensuring the YAML file is human-readable and can be loaded with only native Python types. This presenter parses all unknown objects into a dictionary containing their name and instance variables or, if uninitialised, just the class name.
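The reduction described above could be sketched as below. The helper name and example class are hypothetical, and the recursion into instance variables performed by the real presenter is omitted:

```python
def to_native_sketch(obj):
    """Illustrative reduction of an arbitrary object to native Python types."""
    if isinstance(obj, type):
        return obj.__name__        # uninitialised class: just its name
    try:
        state = vars(obj)          # instance variables of an initialised object
    except TypeError:
        return repr(obj)           # no __dict__: fall back to a string
    # The real presenter would recurse into the values; this sketch does not.
    return {type(obj).__name__: dict(state)}

class Point:
    """Hypothetical example class."""
    def __init__(self):
        self.x = 1.0
        self.label = "a"
```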

unravel(seq)[source]

From a nested sequence of items of any type, returns a flattened sequence of items.

Examples

>>> unravel([0, [1], [2, 3, [4, 5, [6], 7], 8, [9]]])
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]