simplex — Simplex Search Algorithms

This module contains simplex search algorithms.

class dfoalgos.simplex.SimplexSearch(problem, initial_simplex, max_iterations=inf, xtol=None, ftol=None, shrink_factor=0.5, individual_factory=None, sort_key=None, verbosity=1, **kwargs)

Abstract base class for simplex search algorithms.

__init__(problem, initial_simplex, max_iterations=inf, xtol=None, ftol=None, shrink_factor=0.5, individual_factory=None, sort_key=None, verbosity=1, **kwargs)

Constructor.

Parameters:
  • problem (optproblems.Problem) – An optimization problem.
  • initial_simplex (list) – The points or individuals of the initial simplex.
  • max_iterations (int, optional) – A potential budget restriction on the number of iterations. Default is unlimited.
  • xtol (float, optional) – The algorithm stops when the maximal absolute deviation in all dimensions of the search space between the best and all other points in the simplex drops below this value. By default, this criterion is off.
  • ftol (float, optional) – The algorithm stops when the absolute difference of objective values between the best and worst individual in the simplex drops below this value. This criterion can only be used with scalar objective values. By default, this criterion is off.
  • shrink_factor (float, optional) – The parameter for the shrink operation.
  • individual_factory (callable, optional) – A callable taking a point as argument. It must return an object with two member attributes phenome and objective_values. The phenome contains the solution, while objective_values is set to None. By default, optproblems.base.Individual is used.
  • sort_key (callable, optional) – A sort key for ranking the individuals in the simplex. By default, dfoalgos.base.lexicographic_sort_key() is used.
  • verbosity (int, optional) – A value of 0 means quiet, 1 means some information is printed to standard out on start and termination of this algorithm.
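The individual_factory contract above is small; a minimal sketch of a custom factory (the class and function names here are illustrative, by default optproblems.base.Individual plays this role):

```python
class MyIndividual:
    """Minimal object satisfying the individual_factory contract."""
    def __init__(self, phenome):
        self.phenome = phenome        # the solution itself
        self.objective_values = None  # filled in after evaluation

def my_factory(point):
    # a callable taking a point and returning such an object
    return MyIndividual(point)

ind = my_factory([0.5, 1.0])
print(ind.phenome, ind.objective_values)  # [0.5, 1.0] None
```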
__str__()

Return the algorithm’s name.

static calc_mean_point(individuals)

Calculate the mean point for reflection.
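Assuming the conventional centroid definition, the mean point is the component-wise average of the given individuals' phenomes; a minimal sketch (not the library's actual code):

```python
class Individual:
    # stand-in for optproblems.base.Individual
    def __init__(self, phenome):
        self.phenome = phenome

def calc_mean_point(individuals):
    # component-wise average of the individuals' phenomes
    points = [ind.phenome for ind in individuals]
    dim = len(points[0])
    return [sum(p[i] for p in points) / len(points) for i in range(dim)]

mean = calc_mean_point([Individual([0.0, 0.0]), Individual([2.0, 4.0])])
print(mean)  # [1.0, 2.0]
```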

displace_operation(reference_sort_key, mean_point, worst_point, factor)

Carry out a displacement operation of a single point.

Parameters:
  • reference_sort_key (float or iterable) – Comparison to this reference sort key determines if the operation was successful.
  • mean_point (iterable) – Normally the mean of all points in the simplex except for the worst one.
  • worst_point (iterable) – The worst point in the simplex, which is to be displaced.
  • factor (float) – Determines how far the worst point is displaced.
Returns:

fail – True if the operation was unsuccessful, False otherwise.

Return type:

bool
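Geometrically, a displacement moves the worst point along the line through the mean point. Assuming the conventional formula x_new = mean + factor * (mean - worst), a factor of 1 gives a reflection, 2 an expansion, and -0.5 an inside contraction; a sketch on plain points:

```python
def displace(mean_point, worst_point, factor):
    # x_new = mean + factor * (mean - worst)
    return [m + factor * (m - w) for m, w in zip(mean_point, worst_point)]

print(displace([1.0, 1.0], [2.0, 0.0], 1.0))   # reflection: [0.0, 2.0]
print(displace([1.0, 1.0], [2.0, 0.0], -0.5))  # inside contraction: [1.5, 0.5]
```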

classmethod minimize(fun, x0, args=(), bounds=None, callback=None, options=None, **kwargs)

Minimization of scalar function of one or more variables.

This function mimics the SciPy interface for optimizers.

Parameters:
  • fun (callable) – Objective function.
  • x0 (sequence) – Initial guess.
  • args (tuple, optional) – Extra arguments passed to the objective function.
  • bounds (sequence, optional) – Bounds for variables. (min, max) pairs for each element in x, defining the bounds on that parameter. Use None for one of min or max when there is no bound in that direction.
  • callback (callable, optional) – Called after each iteration, as callback(xk), where xk is the current parameter vector.
  • options (dict, optional) –

    A dictionary of solver options. The following generic options are accepted:

    maxiter : int
    Maximum number of iterations to perform.
    disp : bool
    Set to True to print convergence messages.
Returns:

result – The optimization result, represented as an OptimizeResult object. Important attributes are: x, the solution array; success, a Boolean flag indicating if the optimizer exited successfully; and message, which describes the cause of the termination.

Return type:

OptimizeResult

run()

Run the algorithm.

The step function is called in a loop. The algorithm stops when a StopIteration exception is caught or when the stopping criterion evaluates to True.

Returns:

best_individual – The best individual of the simplex according to the sort key.

Return type:

Individual
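The control flow described above can be sketched as follows (a hypothetical re-implementation on a toy object, not the library's code; the library's stopping criterion may also raise termination exceptions such as ResourcesExhausted or Stalled, which are handled separately):

```python
def run(algo):
    # call step() in a loop until the stopping criterion fires
    # or a StopIteration exception is caught
    while True:
        try:
            if algo.stopping_criterion():
                break
            algo.step()
        except StopIteration:
            break
    return min(algo.simplex, key=algo.sort_key)

class CountdownAlgo:
    """Toy stand-in whose 'simplex' is a list of plain numbers."""
    def __init__(self):
        self.iteration = 0
        self.simplex = [3, 1, 2]
        self.sort_key = lambda value: value
    def stopping_criterion(self):
        return self.iteration >= 3
    def step(self):
        self.iteration += 1

print(run(CountdownAlgo()))  # 1
```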
scale(individuals, fixed_ind, factor)

Scale individuals by a given factor.

This method unwraps the points from the individuals, delegates to the scale() helper function, and then wraps the scaled points in new individuals using the individual factory.

Parameters:
  • individuals (iterable of Individual) – The individuals containing the points to scale.
  • fixed_ind (Individual) – The individual containing the fixed point of the transformation.
  • factor (float) – The scale factor.
Returns:

scaled_individuals – List containing scaled individuals.

Return type:

list of Individual
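Assuming the usual affine transformation, scaling moves each point relative to the fixed point: x_new = fixed + factor * (x - fixed). The shrink operation is the special case where the best individual serves as the fixed point and the factor is the shrink factor. A sketch on plain points (illustrative, not the library's code):

```python
def scale(points, fixed, factor):
    # x_new = fixed + factor * (x - fixed), applied to every point
    return [[fx + factor * (x - fx) for x, fx in zip(p, fixed)]
            for p in points]

print(scale([[2.0, 2.0]], [0.0, 0.0], 0.5))  # [[1.0, 1.0]]
```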

shrink_operation()

Carry out the shrink operation.

Returns:

fail – True if the operation was unsuccessful, False otherwise.

Return type:
Return type:bool
step()

Perform one optimization step.

This is an abstract method. Overwrite in your own simplex search implementation.

stopping_criterion()

Check whether the optimization should stop.

The algorithm halts when this method returns True or raises an exception.

Raises:
  • ResourcesExhausted – when the number of iterations reaches the maximum
  • Stalled – when the xtol or ftol criterion triggers or the simplex contains a duplicate point
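The xtol and ftol checks documented in __init__() can be sketched as follows (illustrative code, not the library's implementation):

```python
def xtol_reached(best_point, other_points, xtol):
    # maximal absolute deviation in any dimension between the best
    # point and all other points in the simplex
    return max(abs(x - b)
               for p in other_points
               for x, b in zip(p, best_point)) < xtol

def ftol_reached(best_value, worst_value, ftol):
    # absolute difference of scalar objective values between
    # the best and worst individual
    return abs(worst_value - best_value) < ftol

print(xtol_reached([0.0, 0.0], [[1e-5, 0.0], [0.0, 2e-5]], 1e-4))  # True
print(ftol_reached(1.0, 1.1, 1e-3))  # False
```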
update_simplex(new_simplex)

Overwrite the current simplex with a new one.

Parameters:
  • new_simplex (list of Individual) – The new solutions.
class dfoalgos.simplex.SpendleySimplexSearch(problem, initial_simplex, max_age=None, **kwargs)

Fixed-shape, variable-size simplex search.

This algorithm is regarded as the oldest simplex search. It was introduced in [Spendley1962], whose authors were also the first to propose the basic reflection operation. This implementation follows the description in [Gurson2000] (pp. 13-23), which extends the original algorithm with a shrink operation and generalizes it to more than two dimensions.

References

[Spendley1962] W. Spendley, G. R. Hext, and F. R. Himsworth (1962). Sequential Application of Simplex Designs in Optimisation and Evolutionary Operation. Technometrics, Vol. 4, No. 4, pp. 441-461.
[Gurson2000] Adam P. Gurson (2000). Simplex Search Behavior in Nonlinear Optimization. Honors thesis. http://www.cs.wm.edu/~va/CS495/gurson.pdf
__init__(problem, initial_simplex, max_age=None, **kwargs)

Constructor.

Parameters:
  • problem (optproblems.Problem) – An optimization problem.
  • initial_simplex (list) – The points of the initial simplex.
  • max_age (int, optional) – A shrink operation is triggered when the best individual’s age reaches max_age. The default is len(initial_simplex).
  • kwargs – Arbitrary keyword arguments, passed to the super class.
step()

Perform one optimization step.

class dfoalgos.simplex.NelderMeadSimplexSearch(problem, initial_simplex, reflection_factor=1.0, expansion_factor=2.0, contraction_factor=0.5, **kwargs)

The simplex search by Nelder and Mead.

This algorithm was proposed by [Nelder1965]. It adapts the simplex shape to the landscape by expansion and contraction operations. This algorithm is fast even on ill-conditioned problems, but may converge to a non-stationary point.

References

[Nelder1965] J. A. Nelder and R. Mead (1965). A Simplex Method for Function Minimization. The Computer Journal, Vol. 7, No. 4, pp. 308-313. https://dx.doi.org/10.1093/comjnl/7.4.308
__init__(problem, initial_simplex, reflection_factor=1.0, expansion_factor=2.0, contraction_factor=0.5, **kwargs)

Constructor.

Parameters:
  • problem (optproblems.Problem) – An optimization problem.
  • initial_simplex (list) – The points of the initial simplex.
  • reflection_factor (float, optional) – The parameter used for reflection.
  • expansion_factor (float, optional) – The parameter used for expansion.
  • contraction_factor (float, optional) – The parameter used for inside and outside contraction.
  • kwargs – Arbitrary keyword arguments, passed to the super class.
step()

Perform one optimization step.
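The textbook Nelder-Mead iteration behind this step can be sketched on plain points as follows. This is an illustrative re-implementation with the conventional default parameters (alpha, gamma, beta, sigma correspond to the reflection, expansion, contraction, and shrink factors), not the class's actual code, which operates on Individual objects and an optproblems.Problem:

```python
def nelder_mead_step(points, f, alpha=1.0, gamma=2.0, beta=0.5, sigma=0.5):
    """One textbook Nelder-Mead iteration on a list of points (minimization)."""
    points = sorted(points, key=f)
    best, second_worst, worst = points[0], points[-2], points[-1]
    n = len(points) - 1
    # centroid of all points except the worst
    centroid = [sum(p[i] for p in points[:-1]) / n for i in range(len(best))]
    reflected = [c + alpha * (c - w) for c, w in zip(centroid, worst)]
    if f(reflected) < f(best):
        # try to expand further in the promising direction
        expanded = [c + gamma * (c - w) for c, w in zip(centroid, worst)]
        points[-1] = expanded if f(expanded) < f(reflected) else reflected
    elif f(reflected) < f(second_worst):
        points[-1] = reflected
    else:
        # contract toward the better of worst/reflected
        pivot = reflected if f(reflected) < f(worst) else worst
        contracted = [c + beta * (p - c) for c, p in zip(centroid, pivot)]
        if f(contracted) < f(pivot):
            points[-1] = contracted
        else:
            # shrink the whole simplex toward the best point
            points = [[b + sigma * (x - b) for x, b in zip(p, best)]
                      for p in points]
    return points

def sphere(x):
    return sum(xi * xi for xi in x)

simplex = [[1.0, 1.0], [1.5, 1.0], [1.0, 1.5]]
for _ in range(100):
    simplex = nelder_mead_step(simplex, sphere)
print(min(sphere(p) for p in simplex))  # close to 0
```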

class dfoalgos.simplex.NMKSimplexSearch(problem, initial_simplex, max_num_reorientations=None, recalc_alpha=True, alpha0=0.0001, **kwargs)

Nelder-Mead simplex search with reorientations by Kelley.

This algorithm employs a sufficient decrease condition to detect stagnation and convergence to a non-stationary point. If the condition is not fulfilled, the simplex is reinitialized based on information from the simplex gradient. As a consequence, this algorithm is unfortunately not rank-based. For details see [Kelley1999].

References

[Kelley1999] C. T. Kelley (1999). Detection and Remediation of Stagnation in the Nelder-Mead Algorithm Using a Sufficient Decrease Condition. SIAM Journal on Optimization, Vol. 10, No. 1, pp. 43-55. https://doi.org/10.1137/S1052623497315203
__init__(problem, initial_simplex, max_num_reorientations=None, recalc_alpha=True, alpha0=0.0001, **kwargs)

Constructor.

Parameters:
  • problem (optproblems.Problem) – An optimization problem.
  • initial_simplex (list) – The points of the initial simplex.
  • max_num_reorientations (int, optional) – The maximal number of allowed reorientations before giving up.
  • recalc_alpha (bool, optional) – This flag decides if alpha is recalculated after reorientations.
  • alpha0 (float, optional) – A coefficient for the calculation of alpha.
  • kwargs – Arbitrary keyword arguments, passed to the super class.
step()

Perform one optimization step.

Usage Example

The easiest way to use the optimization algorithms is via the minimize() classmethod that mimics the interface of SciPy optimizers.

from dfoalgos.simplex import NelderMeadSimplexSearch

def sphere(phenome):
    return sum(x * x for x in phenome)

result = NelderMeadSimplexSearch.minimize(sphere, [1.333, 1.999])

Additional keyword arguments will be passed to the constructor of the algorithm. If bounds are provided (recommended), the search space is normalized internally and violations of the bound constraints are repaired automatically by reflection.

result = NelderMeadSimplexSearch.minimize(sphere, [1.333, 1.999], xtol=1e-4, bounds=[(-1.5, 1.5), (-1.5, 1.5)])

Full control over the setup can be obtained by using the internal API:

from optproblems import Problem, ScalingPreprocessor, BoundConstraintsRepair
from dfoalgos.base import create_tilted_regular_simplex
from dfoalgos.simplex import NelderMeadSimplexSearch

def sphere(phenome):
    return sum(x * x for x in phenome)

x0 = [1.333, 1.999]
dim = 2
min_bounds = [-1.5, -1.5]
max_bounds = [1.5, 1.5]
unit_cube = ([0.0] * dim, [1.0] * dim)
scale_to_orig = ScalingPreprocessor(from_cuboid=unit_cube,
                                    to_cuboid=(min_bounds, max_bounds))
scale_to_unit = ScalingPreprocessor(from_cuboid=(min_bounds, max_bounds),
                                    to_cuboid=unit_cube)
repair = BoundConstraintsRepair((min_bounds, max_bounds),
                                ["reflection"] * dim,
                                previous_preprocessor=scale_to_orig)
problem = Problem(sphere, phenome_preprocessor=repair)
initial_simplex = create_tilted_regular_simplex(scale_to_unit(x0),
                                                size_param=1.0)
algo = NelderMeadSimplexSearch(problem, initial_simplex)
best_individual = algo.run()
best_individual = min(algo.simplex, key=algo.sort_key) # alternatively
print(algo.last_termination)
print(best_individual.objective_values)
print(problem.consumed_evaluations)
print(algo.iteration)