Jasper's Dev Blog

An Exercise in Intuition with Genetic Algorithms

In this blog post I will be further exploring GAs (genetic algorithms), evaluating the evolutionary approach to computational problem solving; in particular, the crossover and mutation operators and their effects on succeeding generations (no pun intended)!

The Scenario:

Assume that a GA has chromosomes in the following structure:

ch = (x0, x1, x2, x3, x4, x5, x6, x7)

Each of x0–x7 can be any digit from zero to nine.
The fitness of each chromosome is calculated using the following formula:

f(ch) = (x0 + x1) − (x2 + x3) + (x4 + x5) − (x6 + x7)

This problem is a maximisation problem.

In this scenario we initially have a population of four chromosomes as shown below:

Individual fitnesses, in order of best to worst:

  1. ch2 = (8, 7, 1, 2, 6, 6, 0, 1) | fitness = 23
  2. ch1 = (6, 5, 4, 1, 3, 5, 3, 2) | fitness = 9
  3. ch3 = (2, 3, 9, 2, 1, 2, 8, 5) | fitness = -16
  4. ch4 = (4, 1, 8, 5, 2, 0, 9, 4) | fitness = -19
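As a sanity check on those numbers, here is a minimal Python sketch (my own illustration, not code from the linked repo) that evaluates the fitness formula and ranks the initial population:

    def fitness(ch):
        # f(ch) = (x0 + x1) - (x2 + x3) + (x4 + x5) - (x6 + x7)
        return (ch[0] + ch[1]) - (ch[2] + ch[3]) + (ch[4] + ch[5]) - (ch[6] + ch[7])

    population = {
        "ch1": (6, 5, 4, 1, 3, 5, 3, 2),
        "ch2": (8, 7, 1, 2, 6, 6, 0, 1),
        "ch3": (2, 3, 9, 2, 1, 2, 8, 5),
        "ch4": (4, 1, 8, 5, 2, 0, 9, 4),
    }

    # Maximisation problem: sort best to worst by fitness.
    for name, ch in sorted(population.items(), key=lambda kv: fitness(kv[1]), reverse=True):
        print(name, ch, fitness(ch))  # ch2 23, ch1 9, ch3 -16, ch4 -19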

Crossover

Chromosomal crossover occurs when two DNA helices break, swap a section, and then rejoin.[1]

For this exercise, the crossovers for the subsequent generations are as follows (the three operators are sketched in code after this list):

1st gen. Use one-point crossover (at the middle) to cross the best two chromosomes. [This creates the first two children.]

2nd gen. Use two-point crossover (after x1 and after x5) to cross the 2nd and 3rd best individuals. [This creates two more children.]

3rd gen. Use uniform crossover (for a single gene) to cross the 1st and 3rd best chromosomes. [This yields two more children, for six in total.]
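
Each of these operators is only a few lines of Python. The sketch below is illustrative (the function names, default cut points, and parent choice in the example are my own, not necessarily how the linked repo implements them); each function returns the two children of a single cross:

    import random

    def one_point(p1, p2, point=4):
        # One-point crossover: swap everything after the cut (point=4 is the middle).
        return p1[:point] + p2[point:], p2[:point] + p1[point:]

    def two_point(p1, p2, a=2, b=6):
        # Two-point crossover: swap the middle segment (a=2, b=6 means "after x1" and "after x5").
        return p1[:a] + p2[a:b] + p1[b:], p2[:a] + p1[a:b] + p2[b:]

    def uniform_single_gene(p1, p2):
        # Uniform crossover restricted to a single, randomly chosen gene.
        i = random.randrange(len(p1))
        c1, c2 = list(p1), list(p2)
        c1[i], c2[i] = p2[i], p1[i]
        return tuple(c1), tuple(c2)

    # 1st generation: one-point crossover of the two best chromosomes, ch2 and ch1.
    ch1 = (6, 5, 4, 1, 3, 5, 3, 2)
    ch2 = (8, 7, 1, 2, 6, 6, 0, 1)
    print(one_point(ch2, ch1))  # ((8, 7, 1, 2, 3, 5, 3, 2), (6, 5, 4, 1, 6, 6, 0, 1)), i.e. c2 and c1 below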

New child chromosomes over all generations:

c1 = [6 5 4 1 6 6 0 1] | fitness = 17
c2 = [8 7 1 2 3 5 3 2] | fitness = 15
c3 = [8 5 4 1 6 5 3 2] | fitness = 14
c4 = [6 7 1 2 3 6 0 1] | fitness = 18

Our optimal chromosome in this case is: (9, 9, 0, 0, 9, 9, 0, 0) | optimal fitness = 36

Is it possible to reach the optimal state without the mutation operator?

Yes, it is possible with a particular uniform-crossover implementation that takes a gene and places it in a random location on each splice. However, this method has only a small probability of success, depending on the number of single-gene uniform crossovers used in each generation. It may also take much longer to converge to the global optimum than if we were to use a mutation operator, because the genetic diversity is limited to the digits already present in the population: with standard, position-preserving crossover alone, no child can ever acquire a 9 in a position where no parent has one. It would be easier if the initial population already contained some optimal segments, so that we converge on the optimum in less time.
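
For contrast, a mutation operator for this digit encoding is tiny. The sketch below (again my own, with an arbitrary per-gene rate) re-randomises genes independently, which is exactly what lets it inject values, such as a 9 at x0, that no parent currently carries:

    import random

    def mutate(ch, rate=0.1):
        # With probability `rate` per gene, replace that digit with a fresh random value 0-9.
        return tuple(random.randint(0, 9) if random.random() < rate else g for g in ch)

    print(mutate((8, 7, 1, 2, 6, 6, 0, 1)))  # e.g. (8, 7, 1, 9, 6, 6, 0, 1)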

See the experiments: github.com/JasperKirton/GA_exercise

Thanks for reading! This is the eighth part in a series of blog posts for the Natural Computing module at Goldsmiths, University of London.
