From the results presented in Table 7-4, one can also see that overtime is used in all periods but the last, with most of it adopted in Week 4 and Month 2.
Although this heuristic does not include any optimization principle, the reader can already begin to see the complexity of an MPS process, especially when the production scenario contains a large number of products, periods, and resources. The real difficulty in this MPS creation heuristic therefore lies at three main points of the logic: product selection, resource selection, and inventory building.
5. ARTIFICIAL INTELLIGENCE TECHNIQUES
In practice, linear programming has been applied to several fields, such as transport routing, production planning and scheduling, agriculture, mining, and industrial layout problems (Prado, 1999). One of the main disadvantages of this method is the computational time required during the solution search, due to the combinatorial explosion that may occur (Tsang, 1995).
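For concreteness, a tiny production-planning instance can be written as a linear program. The sketch below is only illustrative: the two hypothetical products, their profits, and the machine and labor capacities are assumptions, and scipy's linprog is used merely as one convenient solver.

from scipy.optimize import linprog

# Maximize profit 40*x1 + 30*x2 for two hypothetical products, subject to
# capacity limits; linprog minimizes, so the profit coefficients are negated.
c = [-40, -30]                      # negative profit per unit of products 1 and 2
A_ub = [[2, 1],                     # machine hours used per unit
        [1, 1]]                     # labor hours used per unit
b_ub = [100, 80]                    # available machine and labor hours
result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(result.x, -result.fun)        # optimal production quantities and total profit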
5.2 Hill-climbing search
Hill-climbing algorithms try to continuously improve a solution initially generated by a constructive heuristic. The main limitation of this method is the possibility of becoming trapped in a local optimum. Consider Figure 7-5: when minimization is the objective, the optimal solution is at point C; however, the hill-climbing search can get trapped at A or B and propose either of them as the optimal solution.
The algorithm can also end up "spinning in circles" if the search process iterates over a flat part of the solution curve (or space). To overcome the problems of local minima and maxima, new approaches have appeared, such as tabu search, genetic algorithms (GA), and simulated annealing (SA).
Figure 7-5. Illustrating the hill-climbing algorithm
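A minimal sketch of hill climbing, assuming a simple one-dimensional objective and a fixed-step neighborhood (both purely illustrative, not an MPS model), shows how the search stops at whichever valley it happens to descend into:

import random

def hill_climb(objective, start, neighbors, max_iters=1000):
    """Greedy descent: move to a better neighbor until none exists.

    objective -- function to minimize
    start     -- initial solution (e.g., produced by a constructive heuristic)
    neighbors -- function returning candidate solutions near the current one
    """
    current = start
    for _ in range(max_iters):
        best_neighbor = min(neighbors(current), key=objective)
        if objective(best_neighbor) >= objective(current):
            return current  # local minimum: no neighbor improves the solution
        current = best_neighbor
    return current

# Illustrative objective with several valleys: depending on the starting
# point, the search may stop at a local minimum (point A or B in Figure 7-5)
# instead of the global one (point C).
f = lambda x: 0.05 * x**4 - 0.5 * x**2 + 0.1 * x
step = lambda x: [x - 0.1, x + 0.1]
print(hill_climb(f, start=random.uniform(-4, 4), neighbors=step))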
5.3 Tabu search
Tabu search is a control strategy for local search algorithms. A tabu list imposes restrictions that guide the search process, avoiding cycles and allowing the exploration of other regions of the solution space. In tabu search, the best neighboring solution is chosen among those that are not prohibited (those not in the tabu list). The list of prohibited moves is built from the inverses of the moves most recently performed. A move stays in the tabu list for a limited number of steps and is then removed, meaning it returns to the "allowed moves" list. Therefore, in each iteration, moves to a certain set of neighbors are not permitted (Brochonski, 1999).
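As a rough illustration of these ideas, the sketch below applies a tabu list to bit-flip moves on a toy binary problem; the problem, the move definition, and the tabu tenure are assumptions made only for this example.

import random

def tabu_search(objective, start, tenure=5, max_iters=200):
    """Best-neighbor local search with a short-term memory (tabu list)."""
    current = list(start)
    best, best_cost = list(current), objective(current)
    tabu = {}  # move (bit index) -> iteration until which it stays forbidden

    for it in range(max_iters):
        candidates = []
        for i in range(len(current)):
            if tabu.get(i, -1) >= it:      # move i is still tabu
                continue
            neighbor = list(current)
            neighbor[i] = 1 - neighbor[i]  # flip one bit
            candidates.append((objective(neighbor), i, neighbor))
        if not candidates:
            break
        cost, move, neighbor = min(candidates)  # best non-tabu neighbor
        current = neighbor
        tabu[move] = it + tenure                # forbid the reverse move for a while
        if cost < best_cost:
            best, best_cost = neighbor, cost
    return best, best_cost

# Toy objective: number of positions differing from a hidden target vector.
target = [random.randint(0, 1) for _ in range(20)]
obj = lambda x: sum(a != b for a, b in zip(x, target))
print(tabu_search(obj, start=[0] * 20))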
5.4 Genetic algorithms
Genetic algorithms (GAs) are an artificial intelligence search method based on natural evolution. In a GA, a population of possible solutions (individuals) evolves according to probabilistic operators conceived as metaphors of biological processes, namely genetic crossover, mutation, and survival of the fittest. As the evolution process progresses, the fittest individuals, which represent better solutions to the problem, survive, while the least fit individuals disappear (Tanomaro, 1995).
The decision variables to be optimized are coded in a vector (or object) called a chromosome. A chromosome is composed of a finite number of genes, which can be represented by a binary alphabet or an alphabet with greater cardinality. Selection, crossover, and mutation operators are applied to the individuals of a population. This mimics the natural process, guaranteeing the survival of the fittest individuals (solutions) in successive generations (Brochonski, 1999). In general, genetic algorithms have the following characteristics:
• Operate in a population (set) of points and not on an isolated point;
• Operate in a space of coded solutions and not directly in the search space;
• Only need information about the value of an objective function for each individual and do not require derivatives;
• Use probabilistic transitions and not deterministic rules (Tanomaro, 1995).
Generally, genetic algorithms have better chances of finding solutions close to the optimal one than other search algorithms such as hill-climbing, since they explore a larger portion of the search space. The cost of this improvement, however, is an increase in processing time (Tsang, 1995).
The execution of a genetic algorithm can be generically represented by the flowchart in Figure 7-6 (Wall, 1996). Starting from any good heuristic, an initial population of individuals is created. A fitness function measures how good (fit) a solution (an individual) is. The process simulating natural selection starts as each individual has its fitness calculated. At this phase, individuals with low fitness are removed and replaced by individuals with higher fitness, so that the population always remains the same size.
The next step consists of mating among the individuals. It is important to highlight that individuals with higher fitness have greater chances of being selected for crossover; consequently, there is a tendency for their genes to propagate to the next generations.
(Flowchart steps: initialize population; select individuals for mating; mate individuals to produce offspring; mutate offspring; insert offspring into population; finish.)
Figure 7-6. Generic genetic algorithm flowchart (Wall, 1996)
Figure 7-7. Crossover operation
Figure 7-8. Mutation operation
After the crossover (represented in Figure 7-7), a mutation operation can be applied to the chromosomes according to some probability.
Mutation consists of randomly altering a gene of any individual of the population to another value from the allowed alphabet. In the example shown in Figure 7-8, only two values are possible: 1 and 2.
Analogous to nature, genetic mutation has a very small probability of occurring and will affect only a small number of individuals, altering a minimal number of genes. In this way, the selection process forms a population in which the fittest survive, mimicking natural selection. The crossover combines genetic material from two individuals in the search for individuals with better genetic characteristics than their parents.
If selection were not applied, GAs would behave similarly to random search algorithms. The combination of mutation, crossover, and selection provides a local search in the proximity of the fittest individuals of the population. Without mutation, the search would be restricted to the information contained in a single population, and genetic material eventually lost in a selection step could never be recovered (Tanomaro, 1995).
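Tying the operators together, the sketch below follows the cycle of Figure 7-6 on the two-symbol alphabet {1, 2} of Figures 7-7 and 7-8; the chromosome length, tournament selection rule, toy fitness function, and rates are assumptions chosen only for illustration.

import random

GENES = (1, 2)          # alphabet used in Figures 7-7 and 7-8
LENGTH = 8              # chromosome size (assumed)
POP_SIZE = 30
CROSSOVER_RATE = 0.9
MUTATION_RATE = 0.05    # small, as in natural mutation

def fitness(chrom):
    # Toy fitness: number of genes equal to 2 (stands in for a real MPS objective).
    return chrom.count(2)

def select(pop):
    # Tournament selection: fitter individuals have a higher chance of mating.
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    # Single-point crossover (Figure 7-7): offspring combine genes of both parents.
    if random.random() < CROSSOVER_RATE:
        point = random.randint(1, LENGTH - 1)
        return p1[:point] + p2[point:], p2[:point] + p1[point:]
    return p1[:], p2[:]

def mutate(chrom):
    # Mutation (Figure 7-8): each gene may be replaced by another allowed value.
    return [random.choice(GENES) if random.random() < MUTATION_RATE else g
            for g in chrom]

population = [[random.choice(GENES) for _ in range(LENGTH)] for _ in range(POP_SIZE)]
for generation in range(50):
    offspring = []
    while len(offspring) < POP_SIZE:
        child1, child2 = crossover(select(population), select(population))
        offspring += [mutate(child1), mutate(child2)]
    population = offspring[:POP_SIZE]   # population size stays constant

print(max(population, key=fitness), max(fitness(c) for c in population))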
5.5 Simulated annealing
The physical annealing phenomenon works through gradual temperature cooling of a high-temperature metal, where each temperature level represents an energy level. Cooling finishes only when the material reaches the solidification point, which corresponds to the minimum energy state. If cooling occurs too rapidly, the material will present imperfections when it reaches solidification, which compromise its resistance and mean that it did not reach the minimum energy state.
In simulated annealing, the energy state (level) is represented by an objective function to be minimized. Therefore, the minimum energy level represents the optimal solution and the temperature is a control parameter that helps the system to reach this minimum energy.
SA works similarly to a local search or hill-climbing method: it looks for neighboring solutions and accepts them if they are better than the current one. However, contrary to local search, which easily gets trapped in a local minimum, SA tends to escape from such minima by accepting worse solutions. The probability of accepting a worse solution depends on the temperature (the higher the temperature, the greater the probability) and on the increase in the objective function caused by the solution being evaluated (the smaller the increase, the greater the probability).
Metropolis & Rosenbluth (1953) introduced an algorithm that simulates the probability of atom movements based on energy changes at a given temperature.
Kirkpatrick et al. (1983) applied Metropolis' concepts to optimization problems and are considered the precursors of a series of studies on simulated annealing. The problems solved in that work were the traveling salesman problem and printed circuit board layout. Bonomi & Lutton (1984) also applied this algorithm to the traveling salesman problem.
McLaughlin (1989) successfully compared simulated annealing with other meta-heuristics in a card game, and Connolly (1990) improved the algorithm by introducing innovations to the temperature change procedure and to the choice of the initial temperature in a quadratic assignment problem.
More recently, several other researchers have used simulated annealing in manufacturing problems. Radhakrishnan & Ventura (2000), for instance, have applied simulated annealing in production scheduling problems, with earliness and tardiness penalties and sequence-dependent setup times.
Moccellin et al. (2001) used a hybrid algorithm combining simulated annealing and tabu search for the permutation flow shop problem.
Zolfaghari & Liang (2002) made a comparative study of simulated annealing, genetic algorithms, and tabu search applied to the grouping of products and machines, with simulated annealing giving the best results of the three techniques considered.
The simulated annealing meta-heuristic tries to minimize an objective function, incorporating into the hill-climbing approach concepts from the physical metal annealing process. To get out of a local optimum, SA allows the acceptance of a worse solution according to a certain probability given by P(ΔE) = e^(-ΔE/(kT)), where P(.) is the acceptance probability, ΔE is the increase in the objective function, T is the current temperature, and k is a system constant.
In the real metal annealing process, temperature must decrease gradually to avoid defects (cracks) in the metal surface. In simulated annealing, such defects will correspond to reaching a poor solution.
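A minimal sketch of this acceptance rule with a geometric cooling schedule is given below; the objective function, neighborhood, initial temperature, and cooling factor are assumptions chosen for the example, and the constant k is absorbed into the temperature.

import math
import random

def simulated_annealing(objective, start, t0=10.0, cooling=0.95,
                        steps_per_temp=100, t_min=1e-3):
    """Accept worse solutions with probability exp(-dE / T), cooling T gradually."""
    current, temp = start, t0
    best = current
    while temp > t_min:
        for _ in range(steps_per_temp):
            candidate = current + random.uniform(-0.5, 0.5)    # neighboring solution
            delta = objective(candidate) - objective(current)  # objective increase
            # Better solutions are always accepted; worse ones with P = exp(-dE/T),
            # so escapes from local minima are frequent early (high T) and rare later.
            if delta <= 0 or random.random() < math.exp(-delta / temp):
                current = candidate
            if objective(current) < objective(best):
                best = current
        temp *= cooling   # gradual cooling, mimicking slow metal annealing
    return best

# Same multi-valley toy function used in the hill-climbing sketch: SA usually
# escapes the shallow local minimum and reaches the deeper one.
f = lambda x: 0.05 * x**4 - 0.5 * x**2 + 0.1 * x
print(simulated_annealing(f, start=random.uniform(-4, 4)))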