procedure for this task, and many different options exist to decide on the threshold and its reduction. Often, the threshold is reduced by a certain value after a given number of iterations. The following Algorithm 1 presents one example of the basic functionality of TA.
Algorithm 1. Threshold Accepting
 1: choose initial threshold T > 0
 2: choose threshold reduction step size r
 3: choose maximum number i_max of iterations between improvements
 4: choose maximum number t_max of iterations between threshold reductions
 5: create initial solution s with fitness value f(s)
 6: i = 0, t = 0
 7: repeat
 8:   t = t + 1
 9:   create neighboring solution s*
10:   calculate new fitness value f(s*)
11:   Δf = f(s) − f(s*)
12:   if Δf < T then
13:     s = s*
14:     i = 0
15:   else
16:     i = i + 1
17:   end if
18:   if i > i_max or t > t_max then
19:     T = T − r
20:     i = 0
21:     t = 0
22:   end if
23: until termination
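As an illustration only, the following Python sketch mirrors Algorithm 1 for a maximization problem (which the acceptance rule Δf = f(s) − f(s*) < T suggests). The callables create_initial_solution, create_neighbor, and fitness as well as the chosen termination rule are hypothetical placeholders that would have to be supplied for a concrete problem.

def threshold_accepting(create_initial_solution, create_neighbor, fitness,
                        T=10.0, r=1.0, i_max=50, t_max=200, max_evals=10000):
    """Threshold Accepting along the lines of Algorithm 1 (maximization sketch)."""
    s = create_initial_solution()          # initial solution
    f_s = fitness(s)
    i = 0                                  # iterations since the last improvement
    t = 0                                  # iterations since the last threshold reduction
    evals = 1
    while T > 0 and evals < max_evals:     # one possible termination criterion (assumption)
        t += 1
        s_new = create_neighbor(s)         # neighboring solution s*
        f_new = fitness(s_new)
        evals += 1
        delta_f = f_s - f_new              # deterioration of the candidate
        if delta_f < T:                    # accept improvements and small deteriorations
            s, f_s = s_new, f_new
            i = 0
        else:
            i += 1
        if i > i_max or t > t_max:         # reduce the threshold
            T -= r
            i = 0
            t = 0
    return s, f_s

Since Algorithm 1 leaves the termination condition open, the sketch simply stops once the threshold reaches zero or an evaluation budget is exhausted.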
3.4.2 Recombination-Based Search: Genetic Algorithms
Genetic algorithms (GA) are recombination-based metaheuristics and belong to the class of evolutionary algorithms (EA). EA were introduced by Holland (1975) and Rechenberg (1973b) and have been applied to a variety of problems from different domains.3 These optimization techniques are inspired by evolutionary principles and imitate basic biological operators of the modern evolutionary synthesis. This theory of evolution is based on the findings of Darwin (1859) and Mendel (1866) and identifies selection, recombination, and mutation as the basic mechanisms of nature to propagate advantageous properties of creatures throughout populations (survival of the fittest). In EA, these mechanisms are formulated at an abstract level to solve optimization problems.

3 Some applications of EA to airline-related problems can be found, for example, in Levine (1996), Langerman and Ehlers (1997), Christou et al. (1999), Gu and Chung (1999), Ozdemir and Mohan (1999), Ozdemir and Mohan (2001), Pulugurtha and Nambisan (2001b), Pulugurtha and Nambisan (2001a), Chang (2002), Pulugurtha and Nambisan (2003), Caprì and Ignaccolo (2004), Lee et al. (2007).
Many different variants of EA have been developed since their beginnings. They differ in the design elements presented in Sect. 3.3, in the emphasis placed on certain operators, and in their intended use (for example mathematical optimization vs. machine learning). However, each technique relies on the basic evolutionary principles, and the differences between the variants have become much smaller in more recent algorithms, especially when applied to real-world problems. Besides GA, two major kinds of EA can be identified and are subject to current research: evolution strategies (ES) and genetic programming (GP). Basic information regarding these techniques can be found, for example, in Rechenberg (1973a) and Rechenberg (1973b) for ES and in Koza (1992) for GP.
3.4.2.1 Simple Genetic Algorithm
The simple GA denotes a GA in its basic form and thus describes the functionality of a GA well (Goldberg, 1989). The algorithm uses a population S of n solutions s during the search process. A solution is denoted as an individual and usually consists of a string of fixed length l incorporating the problem parameters. New solutions (a new generation) are created by applying recombination-based operators (crossover or recombination) to the existing solutions. Usually, a crossover operator creates a new offspring from two parental solutions by exchanging substrings of the parents.
The crossover represents the main variation mechanism of the simple GA. In addition, a local search (mutation) is performed on individual solutions and serves as a background operator. This operator has the additional function of inserting new properties into a solution that might not be present in the current population. A selection operator decides which solutions are removed from the population and are no longer available to the search operators. The decision on the removal of solutions might be controlled deterministically or stochastically. On average, low-quality solutions have to be removed from the population to guide the search towards promising regions in the solution space. All operators are applied iteratively; each iteration is denoted as a generation.
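To make these operators concrete, the following Python sketch shows a one-point crossover and a bit-flip mutation for fixed-length bit strings; the bit-string representation and the particular operator variants are illustrative assumptions rather than requirements of the simple GA.

import random

def one_point_crossover(parent_a, parent_b):
    """Create one offspring by exchanging substrings of two parents."""
    point = random.randint(1, len(parent_a) - 1)   # cut point inside the string
    return parent_a[:point] + parent_b[point:]

def bit_flip_mutation(individual, p_m):
    """Flip each bit independently with mutation probability p_m."""
    return [1 - bit if random.random() < p_m else bit for bit in individual]

The mutation operator can reintroduce a bit value that is no longer present at some position anywhere in the population, which is exactly its role as a background operator.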
The following Algorithm 2 presents one example of the basic functionality of the simple GA, including its parameters.
Often, a maximum number of generations or the convergence of the population is used as a criterion for termination. A convergence criterion could be the difference between the fitness of the best solution in the population and the average fitness of the population (or the worst solution in the population): the smaller this difference, the higher the convergence. Another option would be to stop the GA if the effort for obtaining new solutions is more expensive than the possible fitness gain (Wendt, 1995).
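A convergence-based termination test of this kind might, for instance, look as follows in Python; the tolerance value is an arbitrary assumption.

def has_converged(fitness_values, tol=1e-3):
    """Stop when best and average fitness differ by less than tol (maximization)."""
    best = max(fitness_values)
    avg = sum(fitness_values) / len(fitness_values)
    return best - avg < tol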
Algorithm 2. Simple Genetic Algorithm
 1: choose population size n
 2: choose recombination probability p_r and mutation probability p_m
 3: create initial population S with n solutions s = (s_0, ..., s_l)
 4: calculate fitness value f(s) for each s ∈ S
 5: repeat
 6:   insert n solutions from S into a mating pool M using a selection scheme depending on fitness values
 7:   S* = {}
 8:   while |S*| < n do
 9:     if random(0,1) < p_r then
10:       create a new solution s* by recombination of two randomly chosen s ∈ M
11:       include s* in S*
12:     else
13:       copy one randomly chosen s ∈ M as s* into S*
14:     end if
15:   end while
16:   for all s ∈ S* do
17:     for i = 0 to i = l do
18:       if random(0,1) < p_m then
19:         mutate s_i
20:       end if
21:     end for
22:   end for
23:   calculate fitness values f(s) for all s ∈ S*
24:   S = S*
25: until termination
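Putting the pieces together, the following Python sketch follows Algorithm 2 for bit-string individuals and a maximization problem. The fitness-proportional (roulette-wheel) selection into the mating pool, the one-point crossover, and the fixed number of generations are illustrative choices, since Algorithm 2 leaves the concrete selection scheme and the termination criterion open.

import random

def roulette_selection(S, F, n):
    """Fill the mating pool with n solutions, chosen proportionally to their
    (assumed non-negative) fitness values."""
    total = sum(F)
    if total == 0:
        return [random.choice(S) for _ in range(n)]
    return random.choices(S, weights=F, k=n)

def one_point_crossover(a, b):
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:]

def simple_ga(fitness, l=20, n=30, p_r=0.7, p_m=0.01, generations=100):
    """Simple GA along the lines of Algorithm 2; fitness maps a bit list to a number."""
    S = [[random.randint(0, 1) for _ in range(l)] for _ in range(n)]   # initial population
    F = [fitness(s) for s in S]
    for _ in range(generations):                 # termination: fixed number of generations
        M = roulette_selection(S, F, n)          # mating pool
        S_new = []
        while len(S_new) < n:
            if random.random() < p_r:            # recombination with probability p_r
                a, b = random.choice(M), random.choice(M)
                S_new.append(one_point_crossover(a, b))
            else:                                # otherwise copy a solution unchanged
                S_new.append(list(random.choice(M)))
        for s in S_new:                          # mutation as background operator
            for i in range(l):
                if random.random() < p_m:
                    s[i] = 1 - s[i]
        F = [fitness(s) for s in S_new]
        S = S_new
    best = max(range(n), key=lambda k: F[k])
    return S[best], F[best]

Called, for example, as simple_ga(fitness=sum), the sketch maximizes the number of ones in the bit string (the OneMax toy problem).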
3.4.2.2 Steady-State Genetic Algorithm
The simple GA represents a generational GA. A new generation of solutions is created by applying genetic operators to the current generation until enough new solutions are obtained. Thus, there is an explicit distinction between the parental and offspring generation. One drawback of this approach is the risk that high-quality solutions get lost if their offspring have a lower fitness value than themselves (Reeves & Rowe, 2003). To prevent these situations, concepts like elitism and population overlaps were developed (De Jong, 1975). In elitism, the best individual is not allowed to be replaced. Population overlaps describe the situation when only a fraction (the generation gap) of the population is replaced by offspring.
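As a small illustration, the following Python sketch combines elitism with a generation gap in the replacement step, assuming a maximization problem; the parameter gap and the simple ranking-based choice of survivors are assumptions made for this example.

import random

def replace_with_overlap(population, offspring, fitness, gap=0.5):
    """Replace only a fraction gap of the population by offspring and keep
    the best individual in any case (elitism). Assumes enough offspring."""
    n = len(population)
    n_replace = int(gap * n)                                  # generation gap
    ranked = sorted(population, key=fitness, reverse=True)    # best first
    survivors = ranked[:n - n_replace]                        # keep the better part
    new_population = survivors + random.sample(offspring, n_replace)
    if ranked[0] not in new_population:                       # enforce elitism if gap = 1.0
        new_population[-1] = ranked[0]
    return new_population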
Using this principle in a GA and replacing only one solution at a time per iteration