Evolutionary Programming

Evolutionary Programming, EP.

Taxonomy

Evolutionary Programming is a Global Optimization algorithm and an instance of an Evolutionary Algorithm from the field of Evolutionary Computation. The approach is a sibling of other Evolutionary Algorithms such as the Genetic Algorithm and Learning Classifier Systems. It is sometimes confused with Genetic Programming given the similarity in name, and more recently it shows a strong functional similarity to Evolution Strategies.

Inspiration

Evolutionary Programming is inspired by the theory of evolution by means of natural selection. Specifically, the technique is inspired by the macro-level or species-level process of evolution (phenotype, heredity, variation) and is not concerned with the genetic mechanisms of evolution (genome, chromosomes, genes, alleles).

Metaphor

A population of a species reproduces, creating progeny with small phenotypical variation. The progeny and the parents compete based on their suitability to the environment, where the generally more fit members constitute the subsequent generation and are provided with the opportunity to reproduce themselves. This process repeats, improving the adaptive fit between the species and the environment.

Strategy

The objective of the Evolutionary Programming algorithm is to maximize the suitability of a collection of candidate solutions in the context of an objective function from the domain. This objective is pursued by using an adaptive model with surrogates for the processes of evolution, specifically heredity (reproduction with variation) under competition. The representation used for candidate solutions is directly assessable by a cost or objective function from the domain.

Procedure

Algorithm (below) provides a pseudocode listing of the Evolutionary Programming algorithm for minimizing a cost function.
Input: $Population_{size}$, ProblemSize, BoutSize
Output: $S_{best}$

Population $\leftarrow$ InitializePopulation($Population_{size}$, ProblemSize)
EvaluatePopulation(Population)
$S_{best}$ $\leftarrow$ GetBestSolution(Population)
While ($\neg$ StopCondition())
    Children $\leftarrow$ $\emptyset$
    For ($Parent_{i}$ $\in$ Population)
        $Child_{i}$ $\leftarrow$ Mutate($Parent_{i}$)
        Children $\leftarrow$ Children $\cup$ $Child_{i}$
    End
    EvaluatePopulation(Children)
    $S_{best}$ $\leftarrow$ GetBestSolution(Children, $S_{best}$)
    Union $\leftarrow$ Population + Children
    For ($S_{i}$ $\in$ Union)
        $S_{i_{wins}}$ $\leftarrow$ 0
        For ($1$ To BoutSize)
            $S_{j}$ $\leftarrow$ RandomSelection(Union)
            If (Cost($S_{i}$) < Cost($S_{j}$))
                $S_{i_{wins}}$ $\leftarrow$ $S_{i_{wins}}$ + 1
            End
        End
    End
    Population $\leftarrow$ SelectBestByWins(Union, $Population_{size}$)
End
Return ($S_{best}$)
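The pairwise-bout selection at the heart of the procedure can be isolated as a small sketch. The method name `select_by_wins` and the `:cost` key here are illustrative choices for this sketch, not names from the original algorithm:

```ruby
# Sketch of the bout-based selection step: every candidate in the union
# of parents and children scores a win for each randomly drawn opponent
# with a strictly worse (higher) cost; the candidates with the most wins
# survive into the next generation.
def select_by_wins(union, pop_size, bout_size)
  union.each do |candidate|
    candidate[:wins] = 0
    bout_size.times do
      other = union[rand(union.size)]
      candidate[:wins] += 1 if candidate[:cost] < other[:cost]
    end
  end
  union.sort_by {|c| -c[:wins] }.first(pop_size)
end

union = (1..10).map {|cost| {:cost => cost.to_f} }
survivors = select_by_wins(union, 5, 3)
puts "kept #{survivors.size} of #{union.size} candidates" # prints "kept 5 of 10 candidates"
```

Note that an opponent drawn at random may be the candidate itself; a self-bout can never score a win because the cost comparison is strict. The selection is stochastic, so a good-but-not-best candidate can occasionally be displaced by a luckier one, which keeps some diversity in the surviving population.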
Code Listing

Listing (below) provides an example of the Evolutionary Programming algorithm implemented in the Ruby Programming Language. The demonstration problem is an instance of continuous function optimization that seeks $\min f(x)$ where $f=\sum_{i=1}^n x_{i}^2$, $-5.0 \leq x_i \leq 5.0$ and $n=2$. The optimal solution for this basin function is $(v_0,\ldots,v_{n-1}) = 0.0$. The algorithm is an implementation of Evolutionary Programming based on the classical implementation for continuous function optimization by Fogel et al. [Fogel1991a], with per-variable adaptive variance based on Fogel's description of a self-adaptive variation on page 160 of his 1995 book [Fogel1995].

```ruby
def objective_function(vector)
  return vector.inject(0.0) {|sum, x| sum + (x ** 2.0)}
end

def random_vector(minmax)
  return Array.new(minmax.size) do |i|
    minmax[i][0] + ((minmax[i][1] - minmax[i][0]) * rand())
  end
end

# Polar form of the Box-Muller transform.
def random_gaussian(mean=0.0, stdev=1.0)
  u1 = u2 = w = 0
  begin
    u1 = 2 * rand() - 1
    u2 = 2 * rand() - 1
    w = u1 * u1 + u2 * u2
  end while w >= 1
  w = Math.sqrt((-2.0 * Math.log(w)) / w)
  return mean + (u2 * w) * stdev
end

def mutate(candidate, search_space)
  child = {:vector=>[], :strategy=>[]}
  candidate[:vector].each_with_index do |v_old, i|
    s_old = candidate[:strategy][i]
    v = v_old + s_old * random_gaussian()
    v = search_space[i][0] if v < search_space[i][0]
    v = search_space[i][1] if v > search_space[i][1]
    child[:vector] << v
    child[:strategy] << s_old + random_gaussian() * s_old.abs**0.5
  end
  return child
end

def tournament(candidate, population, bout_size)
  candidate[:wins] = 0
  bout_size.times do
    other = population[rand(population.size)]
    candidate[:wins] += 1 if candidate[:fitness] < other[:fitness]
  end
end

def init_population(minmax, pop_size)
  strategy = Array.new(minmax.size) do |i|
    [0, (minmax[i][1]-minmax[i][0]) * 0.05]
  end
  # use the block form so each element is a distinct hash
  pop = Array.new(pop_size) { {} }
  pop.each_index do |i|
    pop[i][:vector] = random_vector(minmax)
    pop[i][:strategy] = random_vector(strategy)
  end
  pop.each{|c| c[:fitness] = objective_function(c[:vector])}
  return pop
end

def search(max_gens, search_space, pop_size, bout_size)
  population = init_population(search_space, pop_size)
  best = population.sort{|x,y| x[:fitness] <=> y[:fitness]}.first
  max_gens.times do |gen|
    children = Array.new(pop_size) {|i| mutate(population[i], search_space)}
    children.each{|c| c[:fitness] = objective_function(c[:vector])}
    children.sort!{|x,y| x[:fitness] <=> y[:fitness]}
    best = children.first if children.first[:fitness] < best[:fitness]
    union = children + population
    union.each{|c| tournament(c, union, bout_size)}
    union.sort!{|x,y| y[:wins] <=> x[:wins]}
    population = union.first(pop_size)
    puts " > gen #{gen}, fitness=#{best[:fitness]}"
  end
  return best
end

if __FILE__ == $0
  # problem configuration
  problem_size = 2
  search_space = Array.new(problem_size) {|i| [-5, +5]}
  # algorithm configuration
  max_gens = 200
  pop_size = 100
  bout_size = 5
  # execute the algorithm
  best = search(max_gens, search_space, pop_size, bout_size)
  puts "done! Solution: f=#{best[:fitness]}, s=#{best[:vector].inspect}"
end
```

Download: evolutionary_programming.rb.
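The `random_gaussian` helper in the listing uses the polar form of the Box-Muller transform to draw normally distributed variates from uniform ones. A standalone sanity check (restating the helper so the snippet runs on its own) confirms the sample statistics match the requested mean and standard deviation:

```ruby
# Polar Box-Muller: rejection-sample a point in the unit disc, then
# transform it into a normally distributed variate.
def random_gaussian(mean=0.0, stdev=1.0)
  u1 = u2 = w = 0
  begin
    u1 = 2 * rand() - 1
    u2 = 2 * rand() - 1
    w = u1 * u1 + u2 * u2
  end while w >= 1
  w = Math.sqrt((-2.0 * Math.log(w)) / w)
  mean + (u2 * w) * stdev
end

# Statistical sanity check: with 100,000 samples, the sample mean and
# standard deviation should land close to the requested parameters.
samples = Array.new(100_000) { random_gaussian(3.0, 2.0) }
mean = samples.sum / samples.size
stdev = Math.sqrt(samples.sum {|x| (x - mean)**2 } / samples.size)
puts "mean=#{mean.round(2)} stdev=#{stdev.round(2)}"
```

Because the check is statistical, the printed values will vary slightly from run to run; with this many samples they should agree with the parameters to roughly two decimal places.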
References

Primary Sources

Evolutionary Programming was developed by Lawrence Fogel, outlined in early papers (such as [Fogel1962]) and later the focus of his PhD dissertation [Fogel1964]. Fogel focused on the use of an evolutionary process for the development of control systems using Finite State Machine (FSM) representations. His early work on Evolutionary Programming culminated in a book (co-authored with Owens and Walsh) that elaborated the approach, focusing on the evolution of state machines for the prediction of symbols in time series data [Fogel1966].

Learn More

The field of Evolutionary Programming lay relatively dormant for 30 years until it was revived by Fogel's son, David. Early works considered the application of Evolutionary Programming to control systems [Sebald1990], and later to function optimization (system identification), culminating in a book on the approach [Fogel1991] and David Fogel's PhD dissertation [Fogel1992]. Lawrence Fogel collaborated in the revival of the technique, including reviews [Fogel1990] [Fogel1994] and extensions to what became the focus of the approach: function optimization [Fogel1991a]. Yao et al. provide a seminal study of Evolutionary Programming, proposing an extension and racing it against the classical approach on a large number of test problems [Yao1999]. Finally, Porto provides an excellent contemporary overview of the field and the technique [Porto2000].
