Clever Algorithms: Nature-Inspired Programming Recipes

By Jason Brownlee PhD


Extremal Optimization

Extremal Optimization, EO.


Taxonomy

Extremal Optimization is a stochastic search technique that has the properties of both a local and a global search method. It is generally related to hill-climbing algorithms and provides the basis for extensions such as Generalized Extremal Optimization.


Inspiration

Extremal Optimization is inspired by the Bak-Sneppen self-organized criticality model of co-evolution from the field of statistical physics. The self-organized criticality model suggests that some dynamical systems have a critical point as an attractor, whereby the systems exhibit periods of slow movement or accumulation followed by short periods of avalanche or instability. Examples of such systems include land formation, earthquakes, and the dynamics of sand piles. The Bak-Sneppen model considers these dynamics in co-evolutionary systems and in the punctuated equilibrium model, which is described as long periods of stasis followed by short periods of extinction and large evolutionary change.
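The Bak-Sneppen dynamics described above can be sketched in a few lines of Ruby. This is not part of the book's listing; it is a minimal illustration assuming species arranged on a ring, each with a random fitness, where each step replaces the least-fit species and its two neighbours with fresh random values:

```ruby
# Minimal Bak-Sneppen sketch: species on a ring each hold a random
# fitness; each step replaces the least-fit species and its two
# neighbours with new random values, producing avalanche dynamics.
def bak_sneppen(num_species, steps, rng=Random.new(1))
  fitness = Array.new(num_species) { rng.rand }
  min_history = []
  steps.times do
    worst = (0...num_species).min_by { |i| fitness[i] }
    min_history << fitness[worst]
    # index -1 wraps to the last element, giving the ring topology
    [worst-1, worst, (worst+1) % num_species].each do |i|
      fitness[i] = rng.rand
    end
  end
  min_history
end

history = bak_sneppen(50, 2000)
puts "final minimum fitness sample: #{history.last.round(3)}"
```

Over many steps the minimum fitness tends to hover just below a critical threshold, with occasional avalanches of replacements; EO borrows exactly this "always attack the worst component" rule.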


Metaphor

The dynamics of the system result in the steady improvement of a candidate solution with sudden and large crashes in the quality of the candidate solution. These dynamics allow two main phases of activity in the system: 1) the exploitation of higher quality solutions in a local search-like manner, and 2) the escape from possible local optima via a population crash, exploring the search space for a new area of high quality solutions.


Strategy

The objective of the information processing strategy is to iteratively identify the worst performing components of a given solution and replace or swap them with other components. This is achieved through the allocation of cost to the components of the solution based on their contribution to the overall cost of the solution in the problem domain. Once components are assessed they can be ranked and the weaker components replaced or switched with a randomly selected component.


Procedure

Algorithm (below) provides a pseudocode listing of the Extremal Optimization algorithm for minimizing a cost function. The deterministic selection of the worst component in the SelectWeakComponent function and replacement in the SelectReplacementComponent function is classical EO. If these decisions are probabilistic, making use of the $\tau$ parameter, the approach is referred to as $\tau$-Extremal Optimization.

Input: ProblemSize, $iterations_{max}$, $\tau$
Output: $S_{best}$
$S_{current}$ $\leftarrow$ CreateInitialSolution(ProblemSize)
$S_{best}$ $\leftarrow$ $S_{current}$
For ($i=1$ To $iterations_{max}$)
    For ($Component_{i}$ $\in$ $S_{current}$)
        $Component_{i}^{cost}$ $\leftarrow$ Cost($Component_{i}$, $S_{current}$)
    End
    RankedComponents $\leftarrow$ Rank($S_{current}^{components}$)
    $Component_{i}$ $\leftarrow$ SelectWeakComponent(RankedComponents, $\tau$)
    $Component_{j}$ $\leftarrow$ SelectReplacementComponent(RankedComponents, $\tau$)
    $S_{candidate}$ $\leftarrow$ Replace($S_{current}$, $Component_{i}$, $Component_{j}$)
    $S_{current}$ $\leftarrow$ $S_{candidate}$
    If (Cost($S_{candidate}$) $\leq$ Cost($S_{best}$))
        $S_{best}$ $\leftarrow$ $S_{candidate}$
    End
End
Return ($S_{best}$)
Pseudocode for Extremal Optimization.


Heuristics

  • Extremal Optimization was designed for combinatorial optimization problems, although variations have been applied to continuous function optimization.
  • The selection of the worst component and the replacement component each iteration can be deterministic or probabilistic, the latter of which is referred to as $\tau$-Extremal Optimization given the use of a $\tau$ parameter.
  • The selection of an appropriate scoring function for the components of a solution is the most difficult part in the application of the technique.
  • For $\tau$-Extremal Optimization, low $\tau$ values (such as $\tau \in [1.2,1.6]$) have been found to be effective for the TSP.
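To see what the $\tau$ parameter controls, the rank-based power-law selection can be sketched as follows (a minimal illustration, not part of the book's listing: the component at rank $k$ is chosen with probability proportional to $k^{-\tau}$):

```ruby
# Sketch of tau-based rank selection: the component at rank k
# (rank 1 = worst) is chosen with probability proportional to k**(-tau).
def rank_probabilities(n, tau)
  weights = (1..n).map { |k| k.to_f**(-tau) }
  total = weights.sum
  weights.map { |w| w / total }
end

probs = rank_probabilities(10, 1.4)
puts probs.map { |p| p.round(3) }.inspect
```

Lower $\tau$ values flatten the distribution, letting better-ranked components be attacked occasionally; higher values concentrate almost all selection on the single worst-ranked component, recovering near-deterministic EO.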

Code Listing

Listing (below) provides an example of the Extremal Optimization algorithm implemented in the Ruby Programming Language. The algorithm is applied to the Berlin52 instance of the Traveling Salesman Problem (TSP), taken from the TSPLIB. The problem seeks a permutation of the order to visit cities (called a tour) that minimizes the total distance traveled. The optimal tour distance for the Berlin52 instance is 7542 units.

The algorithm implementation is based on the seminal work by Boettcher and Percus [Boettcher1999]. A solution is comprised of a permutation of city components. Each city can potentially form a connection to any other city, and the connections to other cities ordered by distance may be considered its neighborhood. For a given candidate solution, the city components of a solution are scored based on the neighborhood rank of the cities to which they are connected: $fitness_k \leftarrow \frac{3}{r_i + r_j}$, where $r_i$ and $r_j$ are the neighborhood ranks of cities $i$ and $j$ against city $k$. A city is selected for modification probabilistically, where the probability of selecting a given city is proportional to $n_i^{-\tau}$, where $n_i$ is the rank of city $i$. The longest connection is broken, and the city is connected with another neighboring city that is also probabilistically selected.
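As a quick worked example of the scoring rule before the full listing (using hypothetical neighbourhood ranks for illustration): a city whose two tour connections go to its 1st- and 3rd-nearest neighbours scores $3/(1+3) = 0.75$, while one connected to its two nearest neighbours scores the maximum of $3/(1+1) = 1.5$:

```ruby
# Worked example of the city scoring rule fitness_k = 3/(r_i + r_j),
# with hypothetical neighbourhood ranks. Higher scores mean better
# connected cities; low scores mark the weak components EO attacks.
def city_fitness(rank_i, rank_j)
  3.0 / (rank_i + rank_j)
end

puts city_fitness(1, 1)  # both edges to nearest neighbours => 1.5
puts city_fitness(1, 3)  # one edge to the 3rd-nearest neighbour => 0.75
```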

def euc_2d(c1, c2)
  Math.sqrt((c1[0] - c2[0])**2.0 + (c1[1] - c2[1])**2.0).round
end

def cost(permutation, cities)
  distance = 0
  permutation.each_with_index do |c1, i|
    c2 = (i==permutation.size-1) ? permutation[0] : permutation[i+1]
    distance += euc_2d(cities[c1], cities[c2])
  end
  return distance
end

def random_permutation(cities)
  perm = Array.new(cities.size){|i| i}
  perm.each_index do |i|
    r = rand(perm.size-i) + i
    perm[r], perm[i] = perm[i], perm[r]
  end
  return perm
end

# Rank all other cities by distance from city_number, nearest first.
def calculate_neighbor_rank(city_number, cities, ignore=[])
  neighbors = []
  cities.each_with_index do |city, i|
    next if i==city_number or ignore.include?(i)
    neighbor = {:number=>i}
    neighbor[:distance] = euc_2d(cities[city_number], city)
    neighbors << neighbor
  end
  return neighbors.sort!{|x,y| x[:distance] <=> y[:distance]}
end

# Return the two cities connected to city_number in the tour.
def get_edges_for_city(city_number, permutation)
  c1, c2 = nil, nil
  permutation.each_with_index do |c, i|
    if c == city_number
      c1 = (i==0) ? permutation.last : permutation[i-1]
      c2 = (i==permutation.size-1) ? permutation.first : permutation[i+1]
    end
  end
  return [c1, c2]
end

# Score a city as 3/(n1+n2) using the neighborhood ranks of its edges.
def calculate_city_fitness(permutation, city_number, cities)
  c1, c2 = get_edges_for_city(city_number, permutation)
  neighbors = calculate_neighbor_rank(city_number, cities)
  n1, n2 = -1, -1
  neighbors.each_with_index do |neighbor,i|
    n1 = i+1 if neighbor[:number] == c1
    n2 = i+1 if neighbor[:number] == c2
    break if n1!=-1 and n2!=-1
  end
  return 3.0 / (n1.to_f + n2.to_f)
end

def calculate_city_fitnesses(cities, permutation)
  city_fitnesses = []
  cities.each_with_index do |city, i|
    city_fitness = {:number=>i}
    city_fitness[:fitness] = calculate_city_fitness(permutation, i, cities)
    city_fitnesses << city_fitness
  end
  return city_fitnesses.sort!{|x,y| y[:fitness] <=> x[:fitness]}
end

# Assign each ranked component a probability proportional to (rank)^-tau.
def calculate_component_probabilities(ordered_components, tau)
  sum = 0.0
  ordered_components.each_with_index do |component, i|
    component[:prob] = (i+1.0)**(-tau)
    sum += component[:prob]
  end
  return sum
end

def make_selection(components, sum_probability)
  selection = rand()
  components.each_with_index do |component, i|
    selection -= (component[:prob] / sum_probability)
    return component[:number] if selection <= 0.0
  end
  return components.last[:number]
end

def probabilistic_selection(ordered_components, tau, exclude=[])
  sum = calculate_component_probabilities(ordered_components, tau)
  selected_city = nil
  begin
    selected_city = make_selection(ordered_components, sum)
  end while exclude.include?(selected_city)
  return selected_city
end

# Reconnect the tour so that 'selected' and 'new' become adjacent,
# reversing the segment that removes the long edge.
def vary_permutation(permutation, selected, new, long_edge)
  perm = Array.new(permutation)
  c1, c2 = perm.rindex(selected), perm.rindex(new)
  p1,p2 = (c1<c2) ? [c1,c2] : [c2,c1]
  right = (c1==perm.size-1) ? 0 : c1+1
  if perm[right] == long_edge
    perm[p1+1..p2] = perm[p1+1..p2].reverse
  else
    perm[p1...p2] = perm[p1...p2].reverse
  end
  return perm
end

def get_long_edge(edges, neighbor_distances)
  n1 = neighbor_distances.find {|x| x[:number]==edges[0]}
  n2 = neighbor_distances.find {|x| x[:number]==edges[1]}
  return (n1[:distance] > n2[:distance]) ? n1[:number] : n2[:number]
end

def create_new_perm(cities, tau, perm)
  city_fitnesses = calculate_city_fitnesses(cities, perm)
  selected_city = probabilistic_selection(city_fitnesses.reverse, tau)
  edges = get_edges_for_city(selected_city, perm)
  neighbors = calculate_neighbor_rank(selected_city, cities)
  new_neighbor = probabilistic_selection(neighbors, tau, edges)
  long_edge = get_long_edge(edges, neighbors)
  return vary_permutation(perm, selected_city, new_neighbor, long_edge)
end

def search(cities, max_iterations, tau)
  current = {:vector=>random_permutation(cities)}
  current[:cost] = cost(current[:vector], cities)
  best = current
  max_iterations.times do |iter|
    candidate = {}
    candidate[:vector] = create_new_perm(cities, tau, current[:vector])
    candidate[:cost] = cost(candidate[:vector], cities)
    current = candidate
    best = candidate if candidate[:cost] < best[:cost]
    puts " > iter #{(iter+1)}, curr=#{current[:cost]}, best=#{best[:cost]}"
  end
  return best
end
if __FILE__ == $0
  # problem configuration
  berlin52 = [[565,575],[25,185],[345,750],[945,685],[845,655],
   [880,660],[25,230],[525,1000],[580,1175],[650,1130],[1605,620],
   [1220,580],[1465,200],[1530,5],[845,680],[725,370],[145,665],
   [415,635],[510,875],[560,365],[300,465],[520,585],[480,415],
   [835,625],[975,580],[1215,245],[1320,315],[1250,400],[660,180],
   [410,250],[420,555],[575,665],[1150,1160],[700,580],[685,595],
   [685,610],[770,610],[795,645],[720,635],[760,650],[475,960],
   [95,260],[875,920],[700,500],[555,815],[830,485],[1170,65],
   [830,610],[605,625],[595,360],[1340,725],[1740,245]]
  # algorithm configuration
  max_iterations = 250
  tau = 1.8
  # execute the algorithm
  best = search(berlin52, max_iterations, tau)
  puts "Done. Best Solution: c=#{best[:cost]}, v=#{best[:vector].inspect}"
end
Extremal Optimization in Ruby


References

Primary Sources

Extremal Optimization was proposed as an optimization heuristic by Boettcher and Percus applied to graph partitioning and the Traveling Salesman Problem [Boettcher1999]. The approach was inspired by the Bak-Sneppen self-organized criticality model of co-evolution [Bak1987] [Bak1993].

Learn More

A number of detailed reviews of Extremal Optimization have been presented, including a review and studies by Boettcher and Percus [Boettcher2000], an accessible review by Boettcher [Boettcher2000a], and a focused study on the Spin Glass problem by Boettcher and Percus [Boettcher2001].


[Bak1987] P. Bak and C. Tang and K. Wiesenfeld, "Self-organized criticality: An explanation of the 1/f noise", Physical Review Letters, 1987.
[Bak1993] P. Bak and K. Sneppen, "Punctuated equilibrium and criticality in a simple model of evolution", Physical Review Letters, 1993.
[Boettcher1999] S. Boettcher and A. G. Percus, "Extremal Optimization: Methods derived from Co-Evolution", in Proceedings of the Genetic and Evolutionary Computation Conference, 1999.
[Boettcher2000] S. Boettcher and A. Percus, "Nature’s Way of Optimizing", Artificial Intelligence, 2000.
[Boettcher2000a] S. Boettcher, "Extremal optimization: heuristics via coevolutionary avalanches", Computing in Science & Engineering, 2000.
[Boettcher2001] S. Boettcher and A. G. Percus, "Optimization with extremal dynamics", Physical Review Letters, 2001.