Clever Algorithms: Nature-Inspired Programming Recipes
By Jason Brownlee PhD

Compact Genetic Algorithm

Compact Genetic Algorithm, CGA, cGA.

Taxonomy
The Compact Genetic Algorithm is an Estimation of Distribution Algorithm (EDA), also referred to as a Probabilistic Model-Building Genetic Algorithm (PMBGA), an extension to the field of Evolutionary Computation. The Compact Genetic Algorithm is the basis for extensions such as the Extended Compact Genetic Algorithm (ECGA). It is related to other EDAs such as the Univariate Marginal Distribution Algorithm, the Population-Based Incremental Learning algorithm, and the Bayesian Optimization Algorithm.

Inspiration
The Compact Genetic Algorithm is a probabilistic technique without a direct inspiration. It is related to the Genetic Algorithm and other Evolutionary Algorithms that are inspired by the biological theory of evolution by means of natural selection.

Strategy
The information-processing objective of the algorithm is to simulate the behavior of a Genetic Algorithm with a much smaller memory footprint (without requiring a population to be maintained). This is achieved by maintaining a vector that specifies the probability of each component being included in new candidate solutions. Candidate solutions are probabilistically generated from the vector, and the components of the better solution are used to make small adjustments to the probabilities in the vector.

Procedure
The Compact Genetic Algorithm maintains a real-valued prototype vector that represents the probability of each component being expressed in a candidate solution. The algorithm below provides a pseudocode listing of the Compact Genetic Algorithm for maximizing a cost function. The parameter $n$ indicates the amount by which probabilities are updated for conflicting bits in each algorithm iteration.

Input:
$Bits_{num}$, $n$
Output:
$S_{best}$

$V \leftarrow$ InitializeVector($Bits_{num}$, $0.5$)
$S_{best} \leftarrow \emptyset$
While ($\neg$StopCondition())
    $S_{1} \leftarrow$ GenerateSamples($V$)
    $S_{2} \leftarrow$ GenerateSamples($V$)
    $S_{winner}$, $S_{loser}$ $\leftarrow$ SelectWinnerAndLoser($S_{1}$, $S_{2}$)
    If (Cost($S_{winner}$) $\geq$ Cost($S_{best}$))
        $S_{best} \leftarrow S_{winner}$
    End
    For ($i$ To $Bits_{num}$)
        If ($S_{winner}^i \neq S_{loser}^i$)
            If ($S_{winner}^i \equiv 1$)
                $V_{i} \leftarrow V_{i} + \frac{1}{n}$
            Else
                $V_{i} \leftarrow V_{i} - \frac{1}{n}$
            End
        End
    End
End
Return ($S_{best}$)
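To make the probability update concrete, the following is a minimal sketch of a single vector update for a 4-bit problem with $n = 10$; the helper name and the sample values are illustrative, not part of the pseudocode.

```ruby
# Shift each conflicting bit's probability by 1/n toward the winner's value.
# Illustrative sketch only; names and values are not from the pseudocode above.
def update_vector!(vector, winner, loser, n)
  vector.each_index do |i|
    next if winner[i] == loser[i]   # only conflicting bits are updated
    if winner[i] == 1
      vector[i] += 1.0 / n          # pull probability toward 1
    else
      vector[i] -= 1.0 / n          # pull probability toward 0
    end
  end
end

v = [0.5, 0.5, 0.5, 0.5]
update_vector!(v, [1, 1, 0, 0], [1, 0, 1, 0], 10)
# bits 1 and 2 conflict, so v becomes approximately [0.5, 0.6, 0.4, 0.5]
```

Bits where winner and loser agree carry no selection signal in this comparison, so their probabilities are left untouched.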
Code Listing
The listing below provides an example of the Compact Genetic Algorithm implemented in the Ruby Programming Language. The demonstration problem is a maximizing binary optimization problem called OneMax that seeks a binary string of unity (all '1' bits). The objective function only reports the number of correct bits in a candidate string, not the positions of the correct bits. The implementation uses integer values to represent the 1 and 0 bits of a binary string representation.

def onemax(vector)
  return vector.inject(0){|sum, value| sum + value}
end

def generate_candidate(vector)
  candidate = {}
  candidate[:bitstring] = Array.new(vector.size)
  vector.each_with_index do |p, i|
    candidate[:bitstring][i] = (rand()<p) ? 1 : 0
  end
  candidate[:cost] = onemax(candidate[:bitstring])
  return candidate
end

def update_vector(vector, winner, loser, pop_size)
  vector.size.times do |i|
    if winner[:bitstring][i] != loser[:bitstring][i]
      if winner[:bitstring][i] == 1
        vector[i] += 1.0/pop_size.to_f
      else
        vector[i] -= 1.0/pop_size.to_f
      end
    end
  end
end

def search(num_bits, max_iterations, pop_size)
  vector = Array.new(num_bits){0.5}
  best = nil
  max_iterations.times do |iter|
    c1 = generate_candidate(vector)
    c2 = generate_candidate(vector)
    winner, loser = (c1[:cost] > c2[:cost]) ? [c1,c2] : [c2,c1]
    best = winner if best.nil? or winner[:cost]>best[:cost]
    update_vector(vector, winner, loser, pop_size)
    puts " >iteration=#{iter}, f=#{best[:cost]}, s=#{best[:bitstring]}"
    break if best[:cost] == num_bits
  end
  return best
end

if __FILE__ == $0
  # problem configuration
  num_bits = 32
  # algorithm configuration
  max_iterations = 200
  pop_size = 20
  # execute the algorithm
  best = search(num_bits, max_iterations, pop_size)
  puts "done! Solution: f=#{best[:cost]}/#{num_bits}, s=#{best[:bitstring]}"
end

Download: compact_genetic_algorithm.rb. Unit test available in the github project.
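The search touches the problem only through the cost values, so other binary objectives drop in directly. As a hedged sketch, the condensed, self-contained variant below maximizes agreement with an arbitrary target bit pattern instead of all ones; the names (match_cost, sample, cga) are illustrative and not from the listing.

```ruby
# A condensed, self-contained Compact Genetic Algorithm variant that seeks an
# arbitrary target bit pattern. Illustrative sketch; names are not from the book.
def match_cost(bits, target)
  bits.zip(target).count { |b, t| b == t }  # number of matching positions
end

def sample(vector)
  vector.map { |p| rand < p ? 1 : 0 }       # draw a candidate from the vector
end

def cga(target, pop_size, max_iter)
  vector = Array.new(target.size) { 0.5 }   # prototype vector, all bits at 0.5
  best, best_cost = nil, -1
  max_iter.times do
    c1, c2 = sample(vector), sample(vector)
    winner, loser = [c1, c2].sort_by { |c| -match_cost(c, target) }
    cost = match_cost(winner, target)
    best, best_cost = winner, cost if cost > best_cost
    target.size.times do |i|                # shift probabilities toward winner
      next if winner[i] == loser[i]
      vector[i] += (winner[i] == 1 ? 1.0 : -1.0) / pop_size
    end
    break if best_cost == target.size
  end
  best
end

srand(1)                                    # fixed seed for repeatability
target = [1, 0, 1, 1, 0, 0, 1, 0]
best = cga(target, 50, 5000)
puts "matched #{match_cost(best, target)}/#{target.size} bits"
```

A larger simulated population size (here 50) makes each probability step smaller, which reduces the chance of a bit's probability drifting to the wrong absorbing value of 0 or 1 before enough selection signal accumulates.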
References

Primary Sources
The Compact Genetic Algorithm was proposed by Harik, Lobo, and Goldberg in 1999 [Harik1999], based on a random walk model previously introduced by Harik et al. [Harik1997]. In the introductory paper, the cGA is demonstrated to be comparable to the Genetic Algorithm on standard binary string optimization problems.

Learn More
Harik et al. extended the Compact Genetic Algorithm (calling it the Extended Compact Genetic Algorithm) to generate populations of candidate solutions and perform selection (much like the Univariate Marginal Distribution Algorithm), although it uses Marginal Product Models [Harik1999a] [Harik2006]. Sastry and Goldberg performed further analysis of the Extended Compact Genetic Algorithm, applying the method to a complex optimization problem [Sastry2000].

© Copyright 2013. All Rights Reserved.