Strength Pareto Evolutionary Algorithm

Strength Pareto Evolutionary Algorithm, SPEA, SPEA2.

Taxonomy

The Strength Pareto Evolutionary Algorithm is a Multiple Objective Optimization (MOO) algorithm and an Evolutionary Algorithm from the field of Evolutionary Computation. It belongs to the field of Evolutionary Multiple Objective (EMO) algorithms. Refer to the references at the end of this section for more information on Multiple Objective Optimization. The Strength Pareto Evolutionary Algorithm is an extension of the Genetic Algorithm for multiple objective optimization problems. It is related to sibling Evolutionary Algorithms such as the Nondominated Sorting Genetic Algorithm (NSGA), the Vector-Evaluated Genetic Algorithm (VEGA), and the Pareto Archived Evolution Strategy (PAES). There are two versions of SPEA: the original SPEA algorithm and the extension SPEA2. Additional extensions include SPEA+ and iSPEA.

Strategy

The objective of the algorithm is to locate and maintain a front of nondominated solutions, ideally a set of Pareto optimal solutions. This is achieved by using an evolutionary process (with surrogate procedures for genetic recombination and mutation) to explore the search space, and a selection process that uses a combination of the degree to which a candidate solution is dominated (strength) and an estimate of the density of the Pareto front as an assigned fitness. An archive of the nondominated set is maintained separate from the population of candidate solutions used in the evolutionary process, providing a form of elitism.

Procedure
Algorithm (below) provides a pseudocode listing of the Strength Pareto Evolutionary Algorithm 2 (SPEA2) for minimizing a cost function.
Input: $Population_{size}$, $Archive_{size}$, ProblemSize, $P_{crossover}$, $P_{mutation}$
Output: Archive

    Population $\leftarrow$ InitializePopulation($Population_{size}$, ProblemSize)
    Archive $\leftarrow \emptyset$
    While ($\neg$StopCondition())
        For ($S_i \in$ Population)
            $Si_{objectives}$ $\leftarrow$ CalculateObjectives($S_i$)
        End
        Union $\leftarrow$ Population + Archive
        For ($S_i \in$ Union)
            $Si_{raw}$ $\leftarrow$ CalculateRawFitness($S_i$, Union)
            $Si_{density}$ $\leftarrow$ CalculateSolutionDensity($S_i$, Union)
            $Si_{fitness}$ $\leftarrow$ $Si_{raw}$ + $Si_{density}$
        End
        Archive $\leftarrow$ GetNonDominated(Union)
        If (Size(Archive) < $Archive_{size}$)
            PopulateWithRemainingBest(Union, Archive, $Archive_{size}$)
        ElseIf (Size(Archive) > $Archive_{size}$)
            RemoveMostSimilar(Archive, $Archive_{size}$)
        End
        Selected $\leftarrow$ SelectParents(Archive, $Population_{size}$)
        Population $\leftarrow$ CrossoverAndMutation(Selected, $P_{crossover}$, $P_{mutation}$)
    End
    Return (GetNonDominated(Archive))
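The GetNonDominated step in the pseudocode keeps only solutions that no other solution dominates. As a minimal, standalone sketch of that idea on raw objective vectors (the names `strictly_dominates?` and `get_nondominated` are illustrative, not part of the listing; this uses the common strict definition of Pareto dominance for minimization):

```ruby
# Minimization: a strictly dominates b if a is no worse in every
# objective and strictly better in at least one.
def strictly_dominates?(a, b)
  a.zip(b).all? { |x, y| x <= y } && a.zip(b).any? { |x, y| x < y }
end

# Keep only the objective vectors not dominated by any other member.
def get_nondominated(objectives)
  objectives.reject do |p|
    objectives.any? { |q| !q.equal?(p) && strictly_dominates?(q, p) }
  end
end

front = get_nondominated([[1.0, 4.0], [2.0, 2.0], [3.0, 3.0], [4.0, 1.0]])
# [3.0, 3.0] is dominated by [2.0, 2.0]; the other three trade off
# one objective against the other, so none dominates another.
```

The same check underlies the strength (raw fitness) calculation: each solution's strength counts how many members of the union it dominates.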
Code Listing

Listing (below) provides an example of the Strength Pareto Evolutionary Algorithm 2 (SPEA2) implemented in the Ruby Programming Language. The demonstration problem is an instance of continuous multiple objective function optimization called SCH (problem one in [Deb2002]). The problem seeks the minimum of two functions: $f1=\sum_{i=1}^n x_{i}^2$ and $f2=\sum_{i=1}^n (x_{i}-2)^2$, with $-10 \leq x_i \leq 10$ and $n=1$. The optimal solutions for this function are $x \in [0,2]$. The algorithm is an implementation of SPEA2 based on the presentation by Zitzler, Laumanns, and Thiele [Zitzler2002]. The algorithm uses a binary string representation (16 bits per objective function parameter) that is decoded and rescaled to the function domain. The implementation uses a uniform crossover operator and point mutations with a fixed mutation rate of $\frac{1}{L}$, where $L$ is the number of bits in a solution's binary string.

```ruby
def objective1(vector)
  return vector.inject(0.0) {|sum, x| sum + (x**2.0)}
end

def objective2(vector)
  return vector.inject(0.0) {|sum, x| sum + ((x-2.0)**2.0)}
end

def decode(bitstring, search_space, bits_per_param)
  vector = []
  search_space.each_with_index do |bounds, i|
    off, sum = i*bits_per_param, 0.0
    param = bitstring[off...(off+bits_per_param)].reverse
    param.size.times do |j|
      sum += ((param[j].chr=='1') ? 1.0 : 0.0) * (2.0 ** j.to_f)
    end
    min, max = bounds
    vector << min + ((max-min)/((2.0**bits_per_param.to_f)-1.0)) * sum
  end
  return vector
end

def point_mutation(bitstring, rate=1.0/bitstring.size)
  child = ""
  bitstring.size.times do |i|
    bit = bitstring[i].chr
    child << ((rand()<rate) ? ((bit=='1') ? "0" : "1") : bit)
  end
  return child
end

def binary_tournament(pop)
  i, j = rand(pop.size), rand(pop.size)
  j = rand(pop.size) while j==i
  return (pop[i][:fitness] < pop[j][:fitness]) ? pop[i] : pop[j]
end

def crossover(parent1, parent2, rate)
  return ""+parent1 if rand()>=rate
  child = ""
  parent1.size.times do |i|
    child << ((rand()<0.5) ? parent1[i].chr : parent2[i].chr)
  end
  return child
end

def reproduce(selected, pop_size, p_cross)
  children = []
  selected.each_with_index do |p1, i|
    p2 = (i.modulo(2)==0) ? selected[i+1] : selected[i-1]
    p2 = selected[0] if i == selected.size-1
    child = {}
    child[:bitstring] = crossover(p1[:bitstring], p2[:bitstring], p_cross)
    child[:bitstring] = point_mutation(child[:bitstring])
    children << child
    break if children.size >= pop_size
  end
  return children
end

def random_bitstring(num_bits)
  return (0...num_bits).inject(""){|s,i| s<<((rand<0.5) ? "1" : "0")}
end

def calculate_objectives(pop, search_space, bits_per_param)
  pop.each do |p|
    p[:vector] = decode(p[:bitstring], search_space, bits_per_param)
    p[:objectives] = []
    p[:objectives] << objective1(p[:vector])
    p[:objectives] << objective2(p[:vector])
  end
end

def dominates?(p1, p2)
  p1[:objectives].each_index do |i|
    return false if p1[:objectives][i] > p2[:objectives][i]
  end
  return true
end

def weighted_sum(x)
  return x[:objectives].inject(0.0) {|sum, x| sum+x}
end

def euclidean_distance(c1, c2)
  sum = 0.0
  c1.each_index {|i| sum += (c1[i]-c2[i])**2.0}
  return Math.sqrt(sum)
end

def calculate_dominated(pop)
  pop.each do |p1|
    p1[:dom_set] = pop.select {|p2| p1!=p2 and dominates?(p1, p2) }
  end
end

def calculate_raw_fitness(p1, pop)
  return pop.inject(0.0) do |sum, p2|
    (dominates?(p2, p1)) ? sum + p2[:dom_set].size.to_f : sum
  end
end

def calculate_density(p1, pop)
  pop.each do |p2|
    p2[:dist] = euclidean_distance(p1[:objectives], p2[:objectives])
  end
  list = pop.sort{|x,y| x[:dist]<=>y[:dist]}
  k = Math.sqrt(pop.size).to_i
  return 1.0 / (list[k][:dist] + 2.0)
end

def calculate_fitness(pop, archive, search_space, bits_per_param)
  calculate_objectives(pop, search_space, bits_per_param)
  union = archive + pop
  calculate_dominated(union)
  union.each do |p|
    p[:raw_fitness] = calculate_raw_fitness(p, union)
    p[:density] = calculate_density(p, union)
    p[:fitness] = p[:raw_fitness] + p[:density]
  end
end

def environmental_selection(pop, archive, archive_size)
  union = archive + pop
  environment = union.select {|p| p[:fitness]<1.0}
  if environment.size < archive_size
    union.sort!{|x,y| x[:fitness]<=>y[:fitness]}
    union.each do |p|
      environment << p if p[:fitness] >= 1.0
      break if environment.size >= archive_size
    end
  elsif environment.size > archive_size
    begin
      k = Math.sqrt(environment.size).to_i
      environment.each do |p1|
        environment.each do |p2|
          p2[:dist] = euclidean_distance(p1[:objectives], p2[:objectives])
        end
        list = environment.sort{|x,y| x[:dist]<=>y[:dist]}
        p1[:density] = list[k][:dist]
      end
      environment.sort!{|x,y| x[:density]<=>y[:density]}
      environment.shift
    end until environment.size <= archive_size
  end
  return environment
end

def search(search_space, max_gens, pop_size, archive_size, p_cross, bits_per_param=16)
  pop = Array.new(pop_size) do |i|
    {:bitstring=>random_bitstring(search_space.size*bits_per_param)}
  end
  gen, archive = 0, []
  begin
    calculate_fitness(pop, archive, search_space, bits_per_param)
    archive = environmental_selection(pop, archive, archive_size)
    best = archive.sort{|x,y| weighted_sum(x)<=>weighted_sum(y)}.first
    puts ">gen=#{gen}, objs=#{best[:objectives].join(', ')}"
    break if gen >= max_gens
    selected = Array.new(pop_size){binary_tournament(archive)}
    pop = reproduce(selected, pop_size, p_cross)
    gen += 1
  end while true
  return archive
end

if __FILE__ == $0
  # problem configuration
  problem_size = 1
  search_space = Array.new(problem_size) {|i| [-10, 10]}
  # algorithm configuration
  max_gens = 50
  pop_size = 80
  archive_size = 40
  p_cross = 0.90
  # execute the algorithm
  pop = search(search_space, max_gens, pop_size, archive_size, p_cross)
  puts "done!"
end
```

Download: spea2.rb.
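As a quick sanity check of the SCH problem the listing optimizes, the two objectives can be evaluated directly for $n=1$: moving $x$ from 0 to 2 worsens $f1$ while improving $f2$, so neither endpoint dominates the other and every point in $[0,2]$ is a trade-off. This small standalone sketch (independent of the listing above) makes the trade-off concrete:

```ruby
# SCH objectives for n=1 (both minimized).
f1 = ->(x) { x**2.0 }
f2 = ->(x) { (x - 2.0)**2.0 }

# Sample across the Pareto optimal range [0, 2].
[0.0, 0.5, 1.0, 1.5, 2.0].each do |x|
  puts "x=#{x}: f1=#{f1.call(x)}, f2=#{f2.call(x)}"
end
# As x increases from 0 to 2, f1 rises from 0 to 4 while f2 falls
# from 4 to 0: improving one objective necessarily worsens the other.
```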
References

Primary Sources

Zitzler and Thiele introduced the Strength Pareto Evolutionary Algorithm in a technical report on a multiple objective optimization algorithm with elitism and clustering along the Pareto front [Zitzler1998]. The technical report was later published [Zitzler1999]. The Strength Pareto Evolutionary Algorithm was developed as a part of Zitzler's PhD thesis [Zitzler1999a]. Zitzler, Laumanns, and Thiele later extended SPEA to address some inefficiencies of the approach; the resulting algorithm, called SPEA2, was released as a technical report [Zitzler2001] and later published [Zitzler2002]. SPEA2 provides fine-grained fitness assignment, density estimation of the Pareto front, and an archive truncation operator.

Learn More

Zitzler, Laumanns, and Bleuler provide a tutorial on SPEA2 as a book chapter that covers the basics of multiple objective optimization and the differences from SPEA and other related Multiple Objective Evolutionary Algorithms [Zitzler2004].