University of Nottingham, School of Computer Science
PA184 – Heuristic Search Methods
Dr Dario Landa-Silva

Lecture 5 – Evolutionary Algorithms (EAs)

• Basics of Evolutionary Algorithms
• Design of Evolutionary Algorithms
• Examples of Practical Evolutionary Algorithms

Learning outcomes:
− Understand the key components and working principle of EAs
− Describe the role of each component in an EA
− Appreciate the different ways to implement EAs
− Understand the basics of some representative EAs

Basics of Evolutionary Algorithms

The rationale for evolutionary algorithms (EAs) is to maintain a population of solutions during the search. The solutions (individuals) compete among themselves and are subject to selection and reproduction operators over a number of generations.

Exploitation vs. Exploration
• Having many solutions instead of only one.
• Survival-of-the-fittest principle.
• Pass on good solution components through recombination.
• Explore and discover new components through self-adaptation.
• Solutions are modified from generation to generation by means of reproduction operators (recombination and self-adaptation).

Darwinian Evolutionary Algorithm
• Competing population
• Dynamic population
• Fitness
• Dynamic genetic inheritance

Lamarckian Evolutionary Algorithm
• Competing population
• Dynamic population
• Fitness
• Dynamic cultural inheritance

Baldwinian EAs use a kind of intermediate between genetics and culture, suggesting the existence of a strong interaction between learning and evolution. Dawkins proposed the concept of memetics to refer to cultural inheritance.

Typical Evolutionary Algorithm Cycle
1. Create a population of solutions.
2. Evaluate the fitness of each individual in the population.
3. Select individuals and apply variation operators to generate new individuals.
4. Select the surviving individuals for the next generation (evaluate fitness).
5. If the stopping condition is true, output the best solution; otherwise, return to step 3.
(A small illustrative sketch of this cycle appears after the Solution Representation slide below.)

Design of Evolutionary Algorithms

Typical Components of an EA
• Solution representation
• Population size
• Initialisation strategy
• Evaluation function
• Reproduction operators
• Selection mechanism
• Stopping condition to detect convergence

A diverse range of evolutionary procedures can be designed for the subject problem by tuning the above components. It is crucial to design components and tune parameters while considering the choices made on the various parts of the algorithm.

Solution Representation (encoding)
A good representation ensures that variation operators maintain a functional link between current and new solutions.
Some common representations:
• Binary vectors
• Permutations
• Symbolic (e.g. parse trees)
The representation can be genotypic or phenotypic, and then the Hamming or Euclidean distance can be used to measure the distance between solutions.
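To make the cycle above concrete, the following is a minimal Python sketch of the typical EA loop using a binary-vector representation. It is an illustration only, not part of the lecture material: the fitness function (one_max), the operator choices and all parameter values are hypothetical.

import random

def one_max(bits):                      # hypothetical fitness: number of 1s in the vector
    return sum(bits)

def tournament(pop, fitness, k=2):      # selection for reproduction: best of k random individuals
    return max(random.sample(pop, k), key=fitness)

def one_point_crossover(a, b):          # recombination: combine components of two parents
    cut = random.randint(1, len(a) - 1)
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.05):            # mutation: flip each bit with a small probability
    return [1 - b if random.random() < rate else b for b in bits]

def evolve(n_bits=30, pop_size=20, generations=100):
    # create an initial population uniformly at random
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):        # stopping condition: fixed number of generations
        # select parents and apply variation operators to generate new individuals
        offspring = []
        for _ in range(pop_size):
            p1, p2 = tournament(pop, one_max), tournament(pop, one_max)
            offspring.append(mutate(one_point_crossover(p1, p2)))
        # selection for survival: keep the best pop_size of parents plus offspring
        pop = sorted(pop + offspring, key=one_max, reverse=True)[:pop_size]
    return max(pop, key=one_max)

print(one_max(evolve()))

Each design decision in this sketch (representation, initialisation, evaluation, reproduction operators, selection, stopping condition) corresponds to one of the typical EA components listed above.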
Population Size and Initialisation Strategy
Commonly, μ denotes the number of parents and λ the number of offspring; the new population is obtained from the μ + λ individuals.
Common initialisation strategies:
• Uniformly at random
• Incorporating problem domain knowledge
• Incorporating some elite solutions
The number of parents μ can be seen as the degree of parallelism of the evolutionary search. The number of offspring λ can be seen as the degree to which the parents are used as the basis to generate new individuals. The difficulty of the fitness landscape must be taken into account when setting the population size parameters μ and λ.

Evaluation Function
A good evaluation function assesses the quality of evolved solutions and helps to assess the effectiveness of the reproduction operators.
Usually, evaluating fitness is a time-consuming process in EAs, so delta and/or approximate evaluation might be useful. Moreover, it might be enough to have a way to discriminate between individuals without conducting an exact fitness evaluation.

Reproduction Operators
The design of reproduction operators must be done according to the solution representation and the fitness landscape.
Some common operators (recombination and mutation):
• Changing solution component values
• Perturbing solution component values
• Combining complete solutions
• Blending solution component values
When designing reproduction operators: should infeasibility be allowed?
Mutation: some variation of the parent after cloning.
Recombination: components of parents are cloned and then combined.
Note: reproduction operators within an EA can be applied to produce one or more offspring at the same time.

Selection Mechanism
Selection is applied in two contexts: for deciding which individuals will reproduce to generate new solutions (selection for reproduction) and for deciding which individuals will survive to the next generation (selection for survival).
Some issues to consider in the selection mechanism:
• Sampling with or without replacement?
• Generational or steady-state?
• Elitism strategy?
• Deterministic or stochastic?
• In stochastic selection: proportional, tournament or ranking?
Common notation: μ denotes the number of parents and λ the number of offspring. Some selection strategies: (μ+λ), (μ,λ) (a small sketch of these two schemes appears after the list of example algorithms below).
In terms of selection pressure, deterministic selection is the most stringent while proportional selection is the least.

Examples of Evolutionary Algorithms (and other population-based algorithms)
• Genetic Algorithms (GA)
• Evolutionary Strategies (ES)
• Genetic Programming (GP)
• Ant Colony Optimisation (ACO)
• Memetic Algorithms (MA)
• Particle Swarm Optimisation (PSO)
• Differential Evolution (DE)
• Estimation of Distribution Algorithms (EDA)
• Cultural Algorithms (CA)
Since this is a very active research area, new variants of EAs and other population-based meta-heuristics are often proposed in the specialised literature.
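Before turning to the multi-objective example, here is a short Python sketch of the (μ+λ) and (μ,λ) survivor-selection schemes referred to in the Selection Mechanism slide above. The real-valued fitness function and the way offspring are produced are hypothetical placeholders used only for illustration.

import random

def plus_selection(parents, offspring, mu, fitness):
    # (mu + lambda): parents and offspring compete together for survival (elitist)
    return sorted(parents + offspring, key=fitness, reverse=True)[:mu]

def comma_selection(parents, offspring, mu, fitness):
    # (mu, lambda): only offspring compete for survival; requires lambda >= mu
    assert len(offspring) >= mu
    return sorted(offspring, key=fitness, reverse=True)[:mu]

# Tiny illustration with a hypothetical fitness (maximise -x^2, i.e. prefer x near 0)
fitness = lambda x: -x * x
parents = [random.uniform(-5, 5) for _ in range(4)]                        # mu = 4
offspring = [p + random.gauss(0, 0.5) for p in parents for _ in range(3)]  # lambda = 12

print(plus_selection(parents, offspring, 4, fitness))
print(comma_selection(parents, offspring, 4, fitness))

The (μ+λ) scheme never loses the best solution found so far, whereas the (μ,λ) scheme can forget good parents, which lowers the selection pressure and can help escape local optima.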
Evolutionary Multi-objective Simulated Annealing

Multiobjective Optimisation Problem
Minimize {f1(x), …, fm(x)} s.t. x in Ω

Dominance
Solution x is said to dominate solution y if and only if x is not worse than y in all objectives and there exists at least one objective in which x is better than y.

Non-dominance
Solution x is said to be nondominated with respect to solution y if y does not dominate x.

Pareto Optimality
The Pareto-optimal set of a multi-objective optimisation problem is the set of all solutions that are nondominated with respect to every other solution in the search space.
(A small illustrative sketch of the dominance test appears at the end of these notes.)
[Figure: example points A, B, C and D in objective space illustrating the dominance relations.]

Search Directions in Serafini's MOSA
Idea: change the search direction randomly and slightly to explore the whole Pareto-optimal front.
Weakness: easy to get stuck in a local part of the Pareto front.

Search Directions in Ulungu's MOSA
Idea: maintain multiple fixed search directions and approximate the corresponding Pareto-optimal solutions separately.
Weakness: no ability to fill the gaps between the solutions in the sparse part of the nondominated front.

EMOSA: Two-stage Strategy of Tuning Search Directions
Stage 1: all search directions are fixed at each early temperature level, when T is bigger than Tc.
Stage 2: the search direction of each solution is tuned at each late temperature level, when T is smaller than Tc.

EMOSA: Main Steps
Step 0: Initialisation
Produce pop (population size) well-distributed weight vectors. For each weight vector, an initial solution is generated randomly. Update the external population (EP) with the nondominated solutions in the initial population. Set T := T0.
Step 1: Local Search and Competition
For each individual x in the current population, repeat the following steps K times:
1.1 find a neighbouring solution y;
1.2 update the EP with y if it is not dominated by x;
1.3 replace x by y with probability P(w, x, y, T), and let the new solution compete with the other solutions whose weight vectors are similar to that of x.
Step 2: Temperature Change
Lower the temperature value by using T := T – alpha. If T < Tc, adapt the weight vector of each individual in the population (Step 3); otherwise, go to Step 4.
Step 3: Direction Adaptation
Modify each weight vector to move the current solution away from its nearest nondominated neighbours in the population.
Step 4: Stopping Criteria
If T
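As a small illustration of the dominance and non-dominance definitions above (assuming, as in the problem statement, that all objectives are minimised), the following Python sketch checks dominance between objective vectors and filters the nondominated ones from a set. The example objective vectors are hypothetical and not part of the lecture.

def dominates(x, y):
    # x dominates y: x is not worse in every objective and strictly better in at least one
    return all(a <= b for a, b in zip(x, y)) and any(a < b for a, b in zip(x, y))

def nondominated(points):
    # keep the points not dominated by any other point (an approximation of a Pareto front)
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical objective vectors (f1, f2), both to be minimised
points = [(1.0, 4.0), (2.0, 2.0), (3.0, 3.0), (4.0, 1.0)]
print(nondominated(points))   # (3.0, 3.0) is dominated by (2.0, 2.0) and is filtered out

The same test is what an archive such as EMOSA's external population (EP) would use when deciding whether a newly generated neighbouring solution should be kept.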