Lecture summary 1BM120
Lecture 1: evolutionary computation
Computational intelligence (CI) is the theory, design, application and development of biologically and
linguistically motivated computational paradigms. CI consists of three pillars: neural networks, fuzzy
systems and evolutionary computation.
CI tends to focus on bio-inspired algorithms (genetic programming, artificial immune systems). AI is
about deductive and symbolic reasoning aiming at replicating animal (human) behavior (logic
programming, Hodgkin-Huxley neuronal models). The main areas of overlap between CI and AI are
machine learning and neural networks.
Bio-inspired meta-heuristics are population-based, iterative, stochastic algorithms for global
optimization.
Any problem with an objective function can be re-stated as an optimization problem. Real-world
problems are often non-convex, non-linear, multi-modal, etc., so computational intelligence
optimization meta-heuristics can be employed.
- Create a random set of candidate solutions to a given optimization problem and simulate
Darwinian processes to evolve the population towards optimal solutions.
- A candidate solution is encoded as a fixed-length vector representing a feasible solution; its
quality can be evaluated by means of an objective function f (the fitness function).
Genetic algorithms:
A set of randomly generated candidate solutions evolves iteratively and converges to the optimal
solution of a given problem.
1. A population of N random individuals is created
2. The fitness value of all N individuals is calculated
3. Survival of the fittest: a selection mechanism is used to choose pairs of individuals with a
probability proportional to their fitness values
4. Each pair of selected individuals (the parents) undergoes genetic crossover: their
chromosomes are randomly exchanged to produce new individuals (the offspring)
5. The offspring undergo genetic mutation: some symbols of the individuals are randomly
changed
6. When N offspring have been created, they replace the previous population
7. If the termination criterion is met, the algorithm returns the fittest individual as the
solution; else, perform a new generation by iterating the process from step 2 (a minimal
sketch of this loop follows the termination criteria below).
Termination criterion:
1. Fitness value threshold
2. Fixed amount of generations
3. Loss of diversity in the population
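A minimal Python sketch of this loop, assuming a toy OneMax problem (maximize the number of 1-bits in a fixed-length bit string). The population size, chromosome length, probabilities and generation count are illustrative choices, not values from the lecture:

```python
import random

N = 20            # population size
L = 16            # chromosome length
P_CROSS = 0.9     # crossover probability (assumed)
P_MUT = 1.0 / L   # per-bit mutation probability (assumed)
GENERATIONS = 50  # fixed number of generations as termination criterion

def fitness(ind):
    # Toy fitness function: count of 1-bits.
    return sum(ind)

def tournament(pop, k=2):
    # Tournament selection: best of k randomly chosen individuals wins.
    return max(random.sample(pop, k), key=fitness)

def crossover(p1, p2):
    # Single-point crossover, applied with probability P_CROSS.
    if random.random() < P_CROSS:
        point = random.randint(1, L - 1)
        return p1[:point] + p2[point:], p2[:point] + p1[point:]
    return p1[:], p2[:]

def mutate(ind):
    # Uniform (bit-flip) mutation.
    return [1 - b if random.random() < P_MUT else b for b in ind]

# Step 1: random initial population
population = [[random.randint(0, 1) for _ in range(L)] for _ in range(N)]

for gen in range(GENERATIONS):
    offspring = []
    while len(offspring) < N:
        # Steps 3-5: selection, crossover, mutation
        c1, c2 = crossover(tournament(population), tournament(population))
        offspring += [mutate(c1), mutate(c2)]
    # Step 6: the offspring replace the previous population
    population = offspring[:N]

best = max(population, key=fitness)
print("best individual:", best, "fitness:", fitness(best))
```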
Selection methods:
- Roulette wheel: the probability of selecting an individual is proportional to its fitness value:
\(P_{sel}(x_i) = \dfrac{f(x_i)}{\sum_{n=1}^{N} f(x_n)}\)
- Ranking: rank the solutions according to their fitness value; the probability of selecting an
individual is proportional to its ranking: \(P_{sel}(x_i) = \dfrac{1}{r_i + 1}\)
- Tournament: a number of individuals is chosen from the population to compete in a
tournament. The best individual wins the tournament and is selected. (All three schemes are
sketched after this list.)
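A possible Python sketch of the three selection schemes, assuming maximization and strictly positive fitness values (which the roulette-wheel weights require); the tournament size is an illustrative choice:

```python
import random

def roulette_wheel(pop, fitness):
    # P_sel(x_i) = f(x_i) / sum_n f(x_n)
    total = sum(fitness(ind) for ind in pop)
    return random.choices(pop, weights=[fitness(ind) / total for ind in pop])[0]

def ranking(pop, fitness):
    # Rank 0 = best individual; selection weight proportional to 1 / (r_i + 1).
    ranked = sorted(pop, key=fitness, reverse=True)
    weights = [1.0 / (r + 1) for r in range(len(ranked))]
    return random.choices(ranked, weights=weights)[0]

def tournament(pop, fitness, k=3):
    # Best of k randomly chosen individuals wins the tournament.
    return max(random.sample(pop, k), key=fitness)
```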
Crossover:
For each pair of parents, a random number is drawn. If this number is smaller than the crossover
probability, the parents undergo crossover (a sketch follows this list).
- Single-point crossover: select a random crossover point and exchange the parts of the
parents after this point.
- Uniform crossover: randomly generate a bit-mask. The mask denotes, per position, which bits
are copied from parent 1 to offspring 1 and which are swapped from parent 1 to offspring 2.
- Partially matched crossover: a special type of crossover that preserves relative order.
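A sketch of single-point and uniform crossover on bit-string parents; the crossover probability is an illustrative assumption:

```python
import random

P_CROSS = 0.9  # assumed crossover probability

def single_point(p1, p2):
    # Exchange the tails of the parents after a random cut point.
    if random.random() >= P_CROSS:
        return p1[:], p2[:]
    point = random.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def uniform(p1, p2):
    # A random bit-mask decides, per position, which parent contributes to which offspring.
    mask = [random.randint(0, 1) for _ in p1]
    o1 = [a if m else b for a, b, m in zip(p1, p2, mask)]
    o2 = [b if m else a for a, b, m in zip(p1, p2, mask)]
    return o1, o2
```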
Mutation:
Mutation introduces new genetic material into the population.
- Uniform mutation: bit flip (a 1 becomes a 0 and vice versa). A high mutation probability
corresponds to a random search (see the sketch below).
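A minimal sketch of bit-flip mutation, assuming bit-string individuals and a per-bit mutation probability chosen by the caller (a value around 1/L keeps the search local; a value close to 1 degenerates into random search):

```python
import random

def bit_flip(ind, p_mut):
    # Flip each bit independently with probability p_mut.
    return [1 - b if random.random() < p_mut else b for b in ind]
```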
Elitism: during the evolution, an excellent individual might be "destroyed" by the genetic operators.
Elitism preserves such an individual by copying the best individual to the next generation.
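One way to implement elitism, assuming the best individual of the current population replaces the worst offspring (this replacement rule is an assumption; other variants simply append the elite individual):

```python
def apply_elitism(population, offspring, fitness):
    # Copy the best current individual into the next generation unchanged.
    best = max(population, key=fitness)
    worst = min(offspring, key=fitness)
    offspring[offspring.index(worst)] = best
    return offspring
```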
Premature convergence: when a GA converges too fast to a suboptimal population.
Loss of diversity: when the individuals of a GA population are too similar, so that the crossover is no
longer effective.
Handling constraints:
- Set the fitness of unfeasible solutions to extreme values,
- Penalize the fitness function,
- Fix wrong solutions,
- Use special encodings,
- Manipulate the search space.
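As an illustration of the second option, a sketch of penalty-based fitness, assuming minimization; the penalty weight and the `violations` helper are hypothetical:

```python
PENALTY_WEIGHT = 1000.0  # assumed penalty weight

def penalized_fitness(objective, violations, x):
    # violations(x) returns a list of non-negative violation amounts,
    # one per constraint (0 means the constraint is satisfied).
    return objective(x) + PENALTY_WEIGHT * sum(violations(x))
```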
Lecture 2: evolutionary and swarm computation
Differential evolution (DE) is a parallel direct search method based on parameter vectors for real-valued
global optimization. Evolutionary computation approach: a population of solutions evolves.
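Since the summary is cut off here, the following is only a sketch of one generation of the commonly used DE/rand/1/bin scheme; the scale factor F and crossover rate CR are standard textbook defaults, not values taken from the lecture:

```python
import random

F, CR = 0.8, 0.9  # assumed scale factor and crossover rate

def de_generation(pop, objective):
    # pop: list of real-valued parameter vectors (at least 4); minimization assumed.
    new_pop = []
    for i, target in enumerate(pop):
        r1, r2, r3 = random.sample([v for j, v in enumerate(pop) if j != i], 3)
        # Mutation: v = r1 + F * (r2 - r3)
        mutant = [a + F * (b - c) for a, b, c in zip(r1, r2, r3)]
        # Binomial crossover between target and mutant vector
        j_rand = random.randrange(len(target))
        trial = [m if (random.random() < CR or j == j_rand) else t
                 for j, (t, m) in enumerate(zip(target, mutant))]
        # Greedy selection: keep the better of target and trial
        new_pop.append(trial if objective(trial) <= objective(target) else target)
    return new_pop
```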