Clever Algorithms: Nature-Inspired Programming Recipes

By Jason Brownlee PhD


Iterated Local Search

Iterated Local Search, ILS.

Taxonomy

Iterated Local Search is a Metaheuristic and a Global Optimization technique. It is an extension of Multi-Restart Search and may be considered a parent of many two-phase search approaches such as the Greedy Randomized Adaptive Search Procedure and Variable Neighborhood Search.

Strategy

The objective of Iterated Local Search is to improve upon stochastic Multi-Restart Search by sampling in the broader neighborhood of candidate solutions and using a Local Search technique to refine solutions to their local optima. Iterated Local Search explores a sequence of solutions created as perturbations of the current best solution, the result of which is refined using an embedded heuristic.

Procedure

Algorithm (below) provides a pseudocode listing of the Iterated Local Search algorithm for minimizing a cost function.

Output: $S_{best}$
$S_{best}$ $\leftarrow$ ConstructInitialSolution()
$S_{best}$ $\leftarrow$ LocalSearch($S_{best}$)
SearchHistory $\leftarrow$ $S_{best}$
While ($\neg$ StopCondition())
    $S_{candidate}$ $\leftarrow$ Perturbation($S_{best}$, SearchHistory)
    $S_{candidate}$ $\leftarrow$ LocalSearch($S_{candidate}$)
    SearchHistory $\leftarrow$ $S_{candidate}$
    If (AcceptanceCriterion($S_{best}$, $S_{candidate}$, SearchHistory))
        $S_{best}$ $\leftarrow$ $S_{candidate}$
    End
End
Return ($S_{best}$)
Pseudocode for Iterated Local Search.

Heuristics

  • Iterated Local Search was designed for and has been predominantly applied to discrete domains, such as combinatorial optimization problems.
  • The perturbation of the current best solution should be in a neighborhood beyond the reach of the embedded heuristic and should not be easily undone.
  • Perturbations that are too small make the algorithm too greedy; perturbations that are too large make the algorithm too stochastic.
  • The embedded heuristic is most commonly a problem-specific local search technique.
  • The starting point for the search may be a randomly constructed candidate solution, or constructed using a problem-specific heuristic (such as nearest neighbor; a minimal sketch of such a constructor follows this list).
  • Perturbations can be made deterministically, although stochastic and probabilistic perturbations (adaptive based on search history) are the most common.
  • The procedure may store as much or as little history as needed to be used during perturbation and acceptance criteria. No history represents a random walk in a larger neighborhood of the best solution and is the most common implementation of the approach.
  • The simplest and most common acceptance criterion is an improvement in the cost of constructed candidate solutions (a relaxed variation is sketched after the code listing below).
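
For example, the random starting point used in the listing below can be replaced with a problem-specific constructor. The following is a minimal illustrative sketch of a greedy nearest neighbor tour builder; the name nearest_neighbor_tour is not part of the book's listing, and the sketch assumes the euc_2d distance function defined in the code listing that follows.

def nearest_neighbor_tour(cities)
  # assumes euc_2d(c1, c2) from the main listing is available
  remaining = (0...cities.size).to_a
  tour = [remaining.delete_at(rand(remaining.size))] # start from a random city
  until remaining.empty?
    last = tour.last
    # greedily append the closest unvisited city
    nearest = remaining.min_by {|c| euc_2d(cities[last], cities[c])}
    tour << remaining.delete(nearest)
  end
  return tour
end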

Code Listing

Listing (below) provides an example of the Iterated Local Search algorithm implemented in the Ruby Programming Language. The algorithm is applied to the Berlin52 instance of the Traveling Salesman Problem (TSP), taken from the TSPLIB. The problem seeks a permutation of the order to visit cities (called a tour) that minimizes the total distance traveled. The optimal tour distance for the Berlin52 instance is 7542 units.

The Iterated Local Search runs for a fixed number of iterations. The implementation is based on a common algorithm configuration for the TSP, where a 'double-bridge move' (4-opt) is used as the perturbation technique, and a stochastic 2-opt is used as the embedded Local Search heuristic. The double-bridge move involves partitioning a permutation into 4 pieces (a,b,c,d) and putting it back together in a specific and jumbled ordering (a,d,c,b). For example, a tour partitioned as (1,2 | 3,4 | 5,6 | 7,8) is reassembled as (1,2, 7,8, 5,6, 3,4).

def euc_2d(c1, c2)
  Math.sqrt((c1[0] - c2[0])**2.0 + (c1[1] - c2[1])**2.0).round
end

def cost(permutation, cities)
  distance = 0
  permutation.each_with_index do |c1, i|
    c2 = (i==permutation.size-1) ? permutation[0] : permutation[i+1]
    distance += euc_2d(cities[c1], cities[c2])
  end
  return distance
end

def random_permutation(cities)
  perm = Array.new(cities.size){|i| i}
  perm.each_index do |i|
    r = rand(perm.size-i) + i
    perm[r], perm[i] = perm[i], perm[r]
  end
  return perm
end

def stochastic_two_opt(permutation)
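  # reverse a random segment of the tour; the second cut point is kept away
  # from the first and its immediate neighbours so the move is never trivial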
  perm = Array.new(permutation)
  c1, c2 = rand(perm.size), rand(perm.size)
  exclude = [c1]
  exclude << ((c1==0) ? perm.size-1 : c1-1)
  exclude << ((c1==perm.size-1) ? 0 : c1+1)
  c2 = rand(perm.size) while exclude.include?(c2)
  c1, c2 = c2, c1 if c2 < c1
  perm[c1...c2] = perm[c1...c2].reverse
  return perm
end

def local_search(best, cities, max_no_improv)
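  # repeatedly apply stochastic 2-opt, keeping improvements, until
  # max_no_improv consecutive candidates fail to improve the current best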
  count = 0
  begin
    candidate = {:vector=>stochastic_two_opt(best[:vector])}
    candidate[:cost] = cost(candidate[:vector], cities)
    count = (candidate[:cost] < best[:cost]) ? 0 : count+1
    best = candidate if candidate[:cost] < best[:cost]
  end until count >= max_no_improv
  return best
end

def double_bridge_move(perm)
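  # double-bridge (4-opt) move: cut the tour into four pieces (a,b,c,d)
  # at three random points and reconnect them in the order (a,d,c,b)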
  pos1 = 1 + rand(perm.size / 4)
  pos2 = pos1 + 1 + rand(perm.size / 4)
  pos3 = pos2 + 1 + rand(perm.size / 4)
  p1 = perm[0...pos1] + perm[pos3..perm.size]
  p2 = perm[pos2...pos3] + perm[pos1...pos2]
  return p1 + p2
end

def perturbation(cities, best)
  candidate = {}
  candidate[:vector] = double_bridge_move(best[:vector])
  candidate[:cost] = cost(candidate[:vector], cities)
  return candidate
end

def search(cities, max_iterations, max_no_improv)
  best = {}
  best[:vector] = random_permutation(cities)
  best[:cost] = cost(best[:vector], cities)
  best = local_search(best, cities, max_no_improv)
  max_iterations.times do |iter|
    candidate = perturbation(cities, best)
    candidate = local_search(candidate, cities, max_no_improv)
    best = candidate if candidate[:cost] < best[:cost]
    puts " > iteration #{(iter+1)}, best=#{best[:cost]}"
  end
  return best
end

if __FILE__ == $0
  # problem configuration
  berlin52 = [[565,575],[25,185],[345,750],[945,685],[845,655],
   [880,660],[25,230],[525,1000],[580,1175],[650,1130],[1605,620],
   [1220,580],[1465,200],[1530,5],[845,680],[725,370],[145,665],
   [415,635],[510,875],[560,365],[300,465],[520,585],[480,415],
   [835,625],[975,580],[1215,245],[1320,315],[1250,400],[660,180],
   [410,250],[420,555],[575,665],[1150,1160],[700,580],[685,595],
   [685,610],[770,610],[795,645],[720,635],[760,650],[475,960],
   [95,260],[875,920],[700,500],[555,815],[830,485],[1170,65],
   [830,610],[605,625],[595,360],[1340,725],[1740,245]]
  # algorithm configuration
  max_iterations = 100
  max_no_improv = 50
  # execute the algorithm
  best = search(berlin52, max_iterations, max_no_improv)
  puts "Done. Best Solution: c=#{best[:cost]}, v=#{best[:vector].inspect}"
end
Iterated Local Search in Ruby
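
The listing uses the simplest acceptance rule: a perturbed and refined candidate replaces the best solution only if it is cheaper. As noted in the heuristics above, acceptance can be relaxed so that the search may drift through slightly worse solutions. The following is an illustrative sketch only (the accept? helper, the tolerance parameter, and the search_with_relaxed_acceptance name are assumptions, not part of the book's listing); it keeps a separate current solution and accepts candidates that are at most a small fraction worse than it, while still tracking the best solution found.

# illustrative variation: relaxed acceptance that tolerates slightly worse candidates
def accept?(candidate, current, tolerance=0.02)
  candidate[:cost] <= current[:cost] * (1.0 + tolerance)
end

def search_with_relaxed_acceptance(cities, max_iterations, max_no_improv)
  current = {:vector=>random_permutation(cities)}
  current[:cost] = cost(current[:vector], cities)
  current = local_search(current, cities, max_no_improv)
  best = current
  max_iterations.times do |iter|
    # perturb and refine the current solution rather than the best
    candidate = perturbation(cities, current)
    candidate = local_search(candidate, cities, max_no_improv)
    current = candidate if accept?(candidate, current)
    best = candidate if candidate[:cost] < best[:cost]
    puts " > iteration #{(iter+1)}, best=#{best[:cost]}"
  end
  return best
end

Setting the tolerance to zero makes the acceptance essentially greedy again, closely matching the behavior of the original listing.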

References

Primary Sources

The definition and framework for Iterated Local Search were described by Stützle in his PhD dissertation [Stutzle1998]. Specifically, he proposed constraints on what constitutes an Iterated Local Search algorithm: 1) a single chain of candidate solutions is maintained, and 2) candidate solutions are improved within a reduced space by a black-box heuristic. Stützle does not take credit for the approach, instead highlighting specific instances of Iterated Local Search from the literature, such as 'iterated descent' [Baum1986], 'large-step Markov chains' [Martin1991], 'iterated Lin-Kernighan' [Johnson1990], and 'chained local optimization' [Martin1996], as well as [Baxter1981] that introduces the principle, and [Johnson1997] that summarized it (list taken from [Ramalhinho-Lourenco2003]).

Learn More

Two early technical reports by Stützle present applications of Iterated Local Search: one on the Quadratic Assignment Problem [Stuetzle1999], and another on the permutation flow shop problem [Stutzle1998a]. Stützle and Hoos also published an early paper studying Iterated Local Search applied to the TSP [Stutzle1999]. Lourenco, Martin, and Stützle provide a concise presentation of the technique, related techniques, and the framework, much as it is presented in Stützle's dissertation [Lourenco2001]. The same authors also present an authoritative summary of the approach and its applications as a book chapter [Ramalhinho-Lourenco2003].

Bibliography

[Baum1986] E. B. Baum, "Towards practical "neural" computation for combinatorial optimization problems", in AIP conference proceedings: Neural Networks for Computing, 1986.
[Baxter1981] J. Baxter, "Local optima avoidance in depot location", Journal of the Operational Research Society, 1981.
[Johnson1990] D. S. Johnson, "Local optimization and the travelling salesman problem", in Proceedings of the 17th Colloquium on Automata, Languages, and Programming, 1990.
[Johnson1997] D. S. Johnson and L. A. McGeoch, "The travelling salesman problem: A case study in local optimization", in Local Search in Combinatorial Optimization, pages 215–310, John Wiley & Sons, 1997.
[Lourenco2001] H. R. Lourenco and O. Martin and T. Stützle, "A Beginner’s Introduction to Iterated Local Search", in Proceedings 4th Metaheuristics International Conference (MIC’2001), 2001.
[Martin1991] O. Martin and S. W. Otto and E. W. Felten, "Large-step Markov chains for the traveling salesman problem", Complex Systems, 1991.
[Martin1996] O. Martin and S. W. Otto, "Combining simulated annealing with local search heuristics", Annals of Operations Research, 1996.
[Ramalhinho-Lourenco2003] H. Ramalhinho-Lourenco and O. C. Martin and T. Stützle, "Iterated Local Search", in Handbook of Metaheuristics, pages 320–353, Springer, 2003.
[Stuetzle1999] T. Stützle, "Iterated local search for the quadratic assignment problem", Technical Report AIDA-99-03, FG Intellektik, FB Informatik, TU Darmstadt, 1999.
[Stutzle1998] T. G. Stützle, "Local Search Algorithms for Combinatorial Problems: Analysis, Improvements, and New Applications", [PhD Thesis] Darmstadt University of Technology, Department of Computer Science, 1998.
[Stutzle1998a] T. Stützle, "Applying iterated local search to the permutation flow shop problem", Technical Report AIDA-98-04, FG Intellektik, TU Darmstadt, 1998.
[Stutzle1999] T. Stützle and H. H. Hoos, "Analyzing the run-time behaviour of iterated local search for the TSP", in Proceedings III Metaheuristics International Conference, 1999.