This chapter presents the basic principles of evolutionary algorithms as an introduction to the subsequent chapters. After a brief history of the domain in section 1.1, a generic evolutionary model is described in section 1.2. Sections 1.3 to 1.5 detail widespread variants of the operators composing evolutionary algorithms, with a particular emphasis on binary representation. The chapter ends with a short presentation in section 1.6 of the famous genetic algorithms, whose originality lies in favoring a binary representation together with a genotype-phenotype transcription.
1.1. From natural evolution to engineering
According to Charles Darwin [DAR 59], the evolution of living beings rests on several facts:
- the variations of individual characteristics between parents and offspring;
- the heritability of many of these characteristics;
- a competition for survival and reproduction that selects the individuals of a population best adapted to their environment.
From these facts, Darwin deduced that competition favors the transmission of beneficial hereditary variations among individuals, which accumulate from generation to generation.
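The three Darwinian facts listed above map directly onto the loop that every evolutionary algorithm implements: heritable copies, random variation, and selection of the fittest. The following is a minimal sketch of that loop, not any particular algorithm from this book; the bit-string representation and the "count the 1-bits" fitness (the classic OneMax toy problem) are illustrative assumptions.

```python
import random

random.seed(0)  # fixed seed so the toy run is reproducible

def fitness(genome):
    # Toy objective (assumed for illustration): number of 1-bits.
    return sum(genome)

def mutate(genome, rate=0.05):
    # Variation: each bit flips independently with a small probability.
    return [1 - g if random.random() < rate else g for g in genome]

def evolve(pop_size=20, genome_len=32, generations=100):
    # Random initial population of bit strings.
    population = [[random.randint(0, 1) for _ in range(genome_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Heritability: offspring are (mutated) copies of their parents.
        offspring = [mutate(random.choice(population))
                     for _ in range(pop_size)]
        # Selection: only the fittest of parents + offspring survive.
        population = sorted(population + offspring, key=fitness,
                            reverse=True)[:pop_size]
    return max(population, key=fitness)

best = evolve()
```

After a few dozen generations, the accumulated beneficial mutations drive the best genome close to the all-ones optimum, illustrating Darwin's deduction in miniature.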
In the 1950s, the development of the electronic computer made it possible to simulate this theory, and some researchers wished to apply it to engineering problems. These early works were not convincing, however, because of the weak performance of the computers available at that time. Furthermore, the extreme slowness of natural evolution made a useful simulation of the process seem prohibitive.
During the 1960s and 1970s, as soon as computers of adequate capacity became accessible, many attempts to model the process of evolution were undertaken. Among those, three approaches emerged and progressed independently until the beginning of the 1990s:
- the evolution strategies (ESs) of H. P. Schwefel and I. Rechenberg [REC 65, BEY 01], which are derived from an experimental optimization method to solve fluid dynamics problems;
- the evolutionary programming (EP) of L. J. Fogel et al. [FOG 66], which aimed, in the mid-1960s, to evolve the structure of finite-state automata with iterated selections and mutations; it was intended as an alternative to the artificial intelligence of the time;
- the genetic algorithms (GAs), presented in 1975 by J. H. Holland [HOL 92], with the objective of understanding the underlying mechanisms of systems able to self-adapt to their environment.
Thereafter, these approaches underwent many modifications according to the variety of the problems faced by their founders and their students. Genetic algorithms became extremely popular after the publication of the book "Genetic Algorithms in Search, Optimization and Machine Learning" by D. E. Goldberg in 1989 [GOL 89]. This book, distributed worldwide, resulted in exponential growth of interest in this field. While only a few hundred publications appeared in this area during the 20-year period before the book was published, there are several hundred thousand references related to evolutionary computation available today, according to the website "Google Scholar"1. Researchers in this field have organized common international conferences for presenting and combining their different approaches.
The widespread term Evolutionary Computation appeared in 1993 as the title of a new journal published by the MIT Press. It has since been widely used to designate all methods based on the metaphor of biological evolution, as well as many others. For example, although it is inspired by a simplified model of social behavior, "Particle Swarm Optimization" is commonly considered an evolutionary approach.