Clonal Selection Algorithm

Clonal Selection Algorithm, CSA, CLONALG.

Taxonomy

The Clonal Selection Algorithm (CLONALG) belongs to the field of Artificial Immune Systems. It is related to other Clonal Selection Algorithms such as the Artificial Immune Recognition System, the B-Cell Algorithm (BCA), and the Multi-objective Immune System Algorithm (MISA). There are numerous extensions to CLONALG, including tweaks such as the CLONALG1 and CLONALG2 approaches, a version for classification called CLONCLAS, and an adaptive version called Adaptive Clonal Selection (ACS).

Inspiration

The Clonal Selection algorithm is inspired by the Clonal Selection theory of acquired immunity. The clonal selection theory, credited to Burnet, was proposed to account for the behavior and capabilities of antibodies in the acquired immune system [Burnet1957] [Burnet1959]. Itself inspired by the principles of Darwinian natural selection, the theory proposes that antigens select for lymphocytes (both B- and T-cells). When a lymphocyte is selected and binds to an antigenic determinant, the cell proliferates, making many thousands more copies of itself, and differentiates into different cell types (plasma and memory cells). Plasma cells have a short lifespan and produce vast quantities of antibody molecules, whereas memory cells live for an extended period in the host, anticipating future recognition of the same determinant. The important feature of the theory is that when a cell is selected and proliferates, it is subjected to small copying errors (changes to the genome called somatic hypermutation) that change the shape of the expressed receptors, and hence the determinant-recognition capabilities of both the antibodies bound to the lymphocyte's cell surface and the antibodies that plasma cells produce.
Metaphor

The theory suggests that, starting with an initial repertoire of general immune cells, the system is able to change itself (the composition and density of cells and their receptors) in response to experience with the environment. Through a blind process of selection and accumulated variation on the large scale of many billions of cells, the acquired immune system is capable of acquiring the information necessary to protect the host organism from the specific pathogenic dangers of the environment. It also suggests that the system must anticipate (guess at) the pathogens to which it will be exposed, and requires exposure to a pathogen that may harm the host before it can acquire the information needed to provide a defense.

Strategy

The information processing principles of the clonal selection theory describe a general learning strategy. This strategy involves a population of adaptive information units (each representing a problem-solution or component) subjected to a competitive process of selection, which, together with the resultant duplication and variation, ultimately improves the adaptive fit of the information units to their environment.

Procedure

The algorithm below provides a pseudocode listing of the Clonal Selection Algorithm (CLONALG) for minimizing a cost function. The general CLONALG model involves the selection of antibodies (candidate solutions) based on affinity, either by matching against an antigen pattern or via evaluation of a pattern by a cost function. Selected antibodies are subjected to cloning proportional to affinity, and to hypermutation of clones inversely proportional to clone affinity. The resultant clonal set competes with the existing antibody population for membership in the next generation. In addition, low-affinity population members are replaced by randomly generated antibodies.
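The affinity-based mechanics described above can be sketched in Ruby. This is a minimal illustration, not the chapter's listing: the helper names are hypothetical, and the min-max affinity normalization and the decay factor of 2.5 are assumptions consistent with common CLONALG configurations.

```ruby
# Sketch: map raw costs to affinities in [0, 1] (best cost -> 1.0), then
# derive a hypermutation rate that decays exponentially with affinity.
def normalized_affinity(costs)
  min, max = costs.min, costs.max
  return costs.map { 1.0 } if max == min
  costs.map { |c| 1.0 - (c - min) / (max - min) }
end

def mutation_rate(affinity, rho = 2.5)
  Math.exp(-rho * affinity) # high affinity -> small mutation rate
end

costs = [0.5, 2.0, 8.0]
affinities = normalized_affinity(costs)
rates = affinities.map { |a| mutation_rate(a) }
puts affinities.inspect # best solution has affinity 1.0
puts rates.inspect      # ...and therefore the smallest mutation rate
```

The exponential schedule is what makes hypermutation inversely proportional to affinity: the best antibodies are perturbed only slightly, while poor antibodies are heavily mutated.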
The pattern recognition variation of the algorithm includes the maintenance of a memory solution set which, in its entirety, represents a solution to the problem. A binary-encoding scheme is employed for the binary-pattern recognition and continuous function optimization examples, and an integer permutation scheme is employed for the Traveling Salesman Problem (TSP).

Input:
$Population_{size}$, $Selection_{size}$, $Problem_{size}$, $RandomCells_{num}$, $Clone_{rate}$, $Mutation_{rate}$
Output:
Population
Population $\leftarrow$ CreateRandomCells($Population_{size}$, $Problem_{size}$)
While ($\neg$StopCondition())
    For ($p_i \in$ Population)
        Affinity($p_i$)
    End
    $Population_{select} \leftarrow$ Select(Population, $Selection_{size}$)
    $Population_{clones} \leftarrow \emptyset$
    For ($p_i \in Population_{select}$)
        $Population_{clones} \leftarrow$ Clone($p_i$, $Clone_{rate}$)
    End
    For ($p_i \in Population_{clones}$)
        Hypermutate($p_i$, $Mutation_{rate}$)
        Affinity($p_i$)
    End
    Population $\leftarrow$ Select(Population, $Population_{clones}$, $Population_{size}$)
    $Population_{rand} \leftarrow$ CreateRandomCells($RandomCells_{num}$)
    Replace(Population, $Population_{rand}$)
End
Return (Population)

Heuristics
Code Listing

Listing (below) provides an example of the Clonal Selection Algorithm (CLONALG) implemented in the Ruby Programming Language. The demonstration problem is an instance of continuous function optimization that seeks $\min f(x)$ where $f=\sum_{i=1}^n x_{i}^2$, $-5.0\leq x_i \leq 5.0$ and $n=2$. The optimal solution for this basin function is $(v_0,\ldots,v_{n-1})=0.0$. The algorithm is implemented as described by de Castro and Von Zuben for function optimization [Castro2002a].

def objective_function(vector)
  return vector.inject(0.0) {|sum, x| sum + (x**2.0)}
end

def decode(bitstring, search_space, bits_per_param)
  vector = []
  search_space.each_with_index do |bounds, i|
    off, sum = i*bits_per_param, 0.0
    param = bitstring[off...(off+bits_per_param)].reverse
    param.size.times do |j|
      sum += ((param[j].chr=='1') ? 1.0 : 0.0) * (2.0 ** j.to_f)
    end
    min, max = bounds
    vector << min + ((max-min)/((2.0**bits_per_param.to_f)-1.0)) * sum
  end
  return vector
end

def evaluate(pop, search_space, bits_per_param)
  pop.each do |p|
    p[:vector] = decode(p[:bitstring], search_space, bits_per_param)
    p[:cost] = objective_function(p[:vector])
  end
end

def random_bitstring(num_bits)
  return (0...num_bits).inject(""){|s,i| s<<((rand<0.5) ? "1" : "0")}
end

def point_mutation(bitstring, rate)
  child = ""
  bitstring.size.times do |i|
    bit = bitstring[i].chr
    child << ((rand()<rate) ? ((bit=='1') ? "0" : "1") : bit)
  end
  return child
end

def calculate_mutation_rate(antibody, mutate_factor=-2.5)
  return Math.exp(mutate_factor * antibody[:affinity])
end

def num_clones(pop_size, clone_factor)
  return (pop_size * clone_factor).floor
end

def calculate_affinity(pop)
  pop.sort!{|x,y| x[:cost]<=>y[:cost]}
  range = pop.last[:cost] - pop.first[:cost]
  if range == 0.0
    pop.each {|p| p[:affinity] = 1.0}
  else
    pop.each {|p| p[:affinity] = 1.0-(p[:cost]/range)}
  end
end

def clone_and_hypermutate(pop, clone_factor)
  clones = []
  num_clones = num_clones(pop.size, clone_factor)
  calculate_affinity(pop)
  pop.each do |antibody|
    m_rate = calculate_mutation_rate(antibody)
    num_clones.times do
      clone = {}
      clone[:bitstring] = point_mutation(antibody[:bitstring], m_rate)
      clones << clone
    end
  end
  return clones
end

def random_insertion(search_space, pop, num_rand, bits_per_param)
  return pop if num_rand == 0
  rands = Array.new(num_rand) do |i|
    {:bitstring=>random_bitstring(search_space.size*bits_per_param)}
  end
  evaluate(rands, search_space, bits_per_param)
  return (pop+rands).sort{|x,y| x[:cost]<=>y[:cost]}.first(pop.size)
end

def search(search_space, max_gens, pop_size, clone_factor, num_rand, bits_per_param=16)
  pop = Array.new(pop_size) do |i|
    {:bitstring=>random_bitstring(search_space.size*bits_per_param)}
  end
  evaluate(pop, search_space, bits_per_param)
  best = pop.min{|x,y| x[:cost]<=>y[:cost]}
  max_gens.times do |gen|
    clones = clone_and_hypermutate(pop, clone_factor)
    evaluate(clones, search_space, bits_per_param)
    pop = (pop+clones).sort{|x,y| x[:cost]<=>y[:cost]}.first(pop_size)
    pop = random_insertion(search_space, pop, num_rand, bits_per_param)
    best = (pop + [best]).min{|x,y| x[:cost]<=>y[:cost]}
    puts " > gen #{gen+1}, f=#{best[:cost]}, s=#{best[:vector].inspect}"
  end
  return best
end

if __FILE__ == $0
  # problem configuration
  problem_size = 2
  search_space = Array.new(problem_size) {|i| [-5, +5]}
  # algorithm configuration
  max_gens = 100
  pop_size = 100
  clone_factor = 0.1
  num_rand = 2
  # execute the algorithm
  best = search(search_space, max_gens, pop_size, clone_factor, num_rand)
  puts "done! Solution: f=#{best[:cost]}, s=#{best[:vector].inspect}"
end

Download: clonal_selection_algorithm.rb.
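To make the binary encoding concrete, the per-parameter decoding used by the listing (reverse the bits, apply standard binary weighting, then rescale the integer into the parameter bounds) can be exercised in isolation. The function below, `decode_param`, is a hypothetical standalone reimplementation for illustration, not part of the listing.

```ruby
# Standalone sketch of the listing's per-parameter decode: reverse the bits,
# sum 2**j for each '1' at index j, then rescale the sum into [min, max].
def decode_param(bits, min, max)
  sum = bits.reverse.chars.each_with_index.inject(0.0) do |s, (bit, j)|
    s + ((bit == '1') ? 2.0**j : 0.0)
  end
  min + ((max - min) / (2.0**bits.size - 1.0)) * sum
end

puts decode_param('0' * 16, -5.0, 5.0) # all-zeros maps to the lower bound
puts decode_param('1' * 16, -5.0, 5.0) # all-ones maps to the upper bound
```

With 16 bits per parameter, the search space for each dimension is discretized into $2^{16}$ evenly spaced values between the bounds.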
References

Primary Sources

Hidden at the back of a technical report on the applications of Artificial Immune Systems, de Castro and Von Zuben [Castro1999] proposed the Clonal Selection Algorithm (CSA) as a computational realization of the clonal selection principle for pattern matching and optimization. The algorithm was later published [Castro2000] and further investigated, at which point it was renamed CLONALG (CLONal selection ALGorithm) [Castro2002a].

Learn More

Watkins et al. proposed to exploit the inherently distributed nature of CLONALG and presented a parallel version of the pattern recognition form of the algorithm [Watkins2003]. White and Garrett also investigated the pattern recognition version of CLONALG and generalized the approach for the task of binary pattern classification, renaming it Clonal Classification (CLONCLAS); their approach was compared to a number of simple Hamming distance based heuristics [White2003]. In an attempt to address concerns of algorithm efficiency, parameterization, and representation selection for continuous function optimization, Garrett proposed an updated version of CLONALG called Adaptive Clonal Selection (ACS) [Garrett2004]. In their book, de Castro and Timmis provide a detailed treatment of CLONALG, including a description of the approach (starting page 79) and a step-through of the algorithm (starting page 99) [Castro2002b]. Cutello and Nicosia provide a study of the clonal selection principle and algorithms inspired by the theory [Cutello2005]. Brownlee provides a review of Clonal Selection algorithms, including a taxonomy, algorithm reviews, and a broader bibliography [Brownlee2007b].



