Theoretical Analysis of the enhanced Best Performance Algorithm
2.2 Local Search Metaheuristic Algorithms
Local Search (LS) metaheuristic algorithms find solutions to computationally difficult optimization problems. They search through a solution space 𝑋 of an objective function 𝑓 by repeatedly making slight adjustments (or local moves) from one solution 𝑥 to another 𝑥′, where 𝑥′ is chosen from a set of candidate solutions associated with 𝑥; the intent is to direct the search towards the global optimum point. A key element of modern metaheuristic algorithms is to accept both improved and dis-improved solutions. Accepting dis-improved solutions is a strategic way of escaping local entrapment.
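The sketch below illustrates this generic LS loop. It is a minimal illustration and not the eBPA itself; the helper functions neighbors(x) and accept(...) are assumed placeholders, and minimization of 𝑓 is assumed.

```python
import random

def local_search(x0, f, neighbors, accept, max_iters=1000):
    """Generic LS loop: repeatedly move from x to a candidate x'
    drawn from the neighborhood of x. The accept() rule may admit
    dis-improved solutions to escape local entrapment."""
    x = best = x0
    for _ in range(max_iters):
        x_prime = random.choice(neighbors(x))  # a local move
        if accept(f(x), f(x_prime)):           # improved or dis-improved
            x = x_prime
        if f(x) < f(best):                     # minimization assumed
            best = x
    return best
```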
A local move is an adjustment to the design variables of solution vector 𝑥. This could include: inverting binary digits; adding, deleting, or swapping elements within the solution vector; and altering real-valued variables. The set of candidate solutions associated with solution 𝑥 is called the
neighborhood of 𝑥. The neighborhood of 𝑥 is denoted as 𝑁(𝑥). It is defined as follows (Blum and Roli, 2003):
Definition 2.1: Let 𝑁: 𝑋 → 2^𝑋 be a function that assigns to every feasible solution 𝑥 ∈ 𝑋 a subset of feasible solutions 𝑁(𝑥) ⊆ 𝑋. 𝑁(𝑥) is called the neighborhood of solution 𝑥 if each neighbor 𝑗 ∈ 𝑁(𝑥) is in some way close to 𝑥 within the domains of the solution space 𝑋.
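As a concrete illustration of Definition 2.1 (an assumed example for a binary-encoded problem, not a construction from this thesis), the single bit-flip neighborhood assigns to each solution the set of vectors differing from it in exactly one position:

```python
def bit_flip_neighborhood(x):
    """N(x): every binary vector that differs from x in exactly one
    position, so each neighbor j is 'close' to x."""
    neighborhood = []
    for i in range(len(x)):
        neighbor = list(x)
        neighbor[i] = 1 - neighbor[i]  # invert one binary digit
        neighborhood.append(neighbor)
    return neighborhood

# Example: x = [0, 1, 0] yields N(x) = [[1,1,0], [0,0,0], [0,1,1]].
```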
The best solution found within the neighborhood structure 𝑁(𝑥) is called a local optimum. Local optima are defined as follows (Hancock, 2005):
Definition 2.2: A point 𝑥∗ ∈ ℝ is a local optimum if there exists some error value 𝜀 > 0 such that, for a minimization problem, 𝑓(𝑥∗) ≤ 𝑓(𝑥), and for a maximization problem, 𝑓(𝑥∗) ≥ 𝑓(𝑥), for all 𝑥 ∈ ℝ with |𝑥 − 𝑥∗| < 𝜀. Here, 𝑓 represents the objective function, ℝ represents a solution space of real numbers, and |𝑥 − 𝑥∗| is the absolute value of the difference between 𝑥 and 𝑥∗.
Within solution space 𝑋, several local optimum points may exist. The best of these local optimum points is called the global optimum point. Global optimum points are local optimum points, but not necessarily vice versa. A global optimum point is defined as follows (Snyman, 2005):
Definition 2.3: A point 𝑥∗ ∈ ℝ is a global optimum for a minimization problem if 𝑓(𝑥∗) ≤ 𝑓(𝑥), and for a maximization problem if 𝑓(𝑥∗) ≥ 𝑓(𝑥), ∀ 𝑥 ∈ ℝ.
Local and global optimum points are illustrated in Figure 2.1.
LS metaheuristic algorithms search for local optimum points in trying to determine the global optimum point. Consequently, LS metaheuristic algorithms are considered improvement techniques (Liberti, 2008). An obvious way to determine all local optimum points is to perform an exhaustive search of the solution space. However, for large or complex solution spaces this may be impractical, because of the computational time taken to examine every possible solution; ultimately, this may prove too expensive. In this scenario, examining subsets of feasible solutions within the neighborhood regions 𝑁(𝑥) is the alternative. This alternative is computationally more acceptable for exponentially complex optimization problems. The challenge
then is to intelligently examine subsets of the most attractive solutions within these neighborhood regions, within the bounds of polynomial time complexity.
Figure 2.1: The global optimum point is the extreme local optimum point. A local optimum point is the best point within its neighborhood region. The figure depicts a maximization problem.
The techniques employed by metaheuristic algorithms strategically refine and explore subsets 𝑆 ⊆ 𝑁(𝑥) in ways that are efficient and computationally feasible. Metaheuristic algorithms thus employ a level of intelligence in searching for the best solution 𝑥∗ ∈ 𝑆 ⊆ 𝑁(𝑥). In doing so, these algorithms typically make use of the knowledge acquired from examining other neighboring solutions in trying to determine 𝑥∗.
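One simple way of realising this idea is sketched below, assuming a minimization problem; the subset size k is an illustrative tuning parameter, not a value prescribed by this thesis. Instead of scanning all of 𝑁(𝑥), a subset 𝑆 ⊆ 𝑁(𝑥) is sampled and its best member kept:

```python
import random

def best_of_subset(x, f, neighbors, k=10):
    """Examine only a sampled subset S of N(x); computationally
    cheaper than an exhaustive scan of the whole neighborhood."""
    N_x = neighbors(x)
    S = random.sample(N_x, min(k, len(N_x)))  # S is a subset of N(x)
    return min(S, key=f)                      # best solution within S
```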
Hence, the goal of metaheuristic algorithms is to narrow the visited regions to subsets of solutions that are more representative of the local optimum points (Glover, 1993). For this reason, metaheuristic algorithms are justifiably more advanced than standard heuristic techniques (Yagiura and Ibaraki, 2001).
In addition to effectively and intelligently guiding the search, metaheuristic algorithms also need to be intelligent enough to escape premature convergence. Premature convergence occurs when the algorithm behaves as if it has found the global optimum point, when in fact the global optimum point is not within the vicinity of the local neighborhood region being searched.
To escape premature convergence, LS metaheuristic algorithms strategically allow dis-improved solutions to be accepted. The accepted dis-improved solutions are also used to traverse the solution space. Accepting a dis-improved solution means that solution 𝑥′, which is determined from solution 𝑥, is accepted even though 𝑓(𝑥′) is worse than 𝑓(𝑥). This strategy allows
for a re-direction of the search path. This re-direction may break the search out of possible local entrapment; it could also lead the search towards other neighborhood regions that potentially contain higher-quality solutions.
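One widely used rule for accepting dis-improved solutions is the Metropolis criterion from simulated annealing. The sketch below is a standard textbook formulation, not the acceptance rule of this thesis's algorithm; it assumes minimization and a positive temperature parameter:

```python
import math
import random

def metropolis_accept(f_x, f_x_prime, temperature):
    """Always accept an improved solution; accept a dis-improved one
    with probability exp(-delta / T), re-directing the search path."""
    delta = f_x_prime - f_x       # positive delta means x' is worse
    if delta <= 0:
        return True
    return random.random() < math.exp(-delta / temperature)
```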
However, accepting dis-improved solutions carries a risk: an effect called cycling (Glover, 1990). Cycling occurs when one solution repeatedly leads back to another in a closed loop. Cycling does not necessarily mean a repetition after a single move transition; it may also arise after some number of intermediate steps. Metaheuristic algorithms need to guard against the effect of cycling.
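A common safeguard against cycling, associated with Glover's Tabu Search, is a short-term memory that forbids recently visited solutions. The sketch below is illustrative only: the tenure of 7 is an assumed value, and solutions are assumed to be sequences that can be converted to tuples.

```python
from collections import deque

def make_tabu_filter(tenure=7):
    """Short-term memory against cycling: recently visited solutions
    are tabu and are filtered out of the candidate set."""
    memory = deque(maxlen=tenure)  # forgets entries older than 'tenure'

    def allowed(candidates):
        fresh = [c for c in candidates if tuple(c) not in memory]
        return fresh if fresh else list(candidates)  # fall back if all tabu

    def record(x):
        memory.append(tuple(x))

    return allowed, record
```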
The stopping criteria of metaheuristic algorithms include:
1. Stop when the optimal solution is found.
2. Stop when a solution is found that falls within an acceptable degree of error.
3. Stop when the number of iterations exceeds the upper bound.
4. Stop when the number of iterations since the best solution was last updated exceeds a certain value.
Once a stopping criterion is satisfied, the metaheuristic algorithm returns the final result; one way of combining the four criteria into a single test is sketched below.
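This is a minimal sketch; target, epsilon, max_iters, and patience are assumed parameters supplied by the practitioner, and the target value must be known for criterion 1 to apply.

```python
def should_stop(best_value, target, epsilon, iteration, max_iters,
                iters_since_improvement, patience):
    """True once any of the four stopping criteria is satisfied."""
    return (best_value == target                     # 1. optimum found
            or abs(best_value - target) <= epsilon   # 2. acceptable error
            or iteration >= max_iters                # 3. iteration bound
            or iters_since_improvement >= patience)  # 4. no recent update
```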