Genetic Algorithms (GAs) are optimization and search techniques inspired by the principles of natural selection and genetics in biological systems. They are widely used in Artificial Intelligence (AI) to find approximate solutions to complex problems that are hard to solve with traditional methods.
Population
A group of candidate solutions (called individuals or chromosomes) to the problem.
Chromosome / Individual
A representation of a potential solution, usually encoded as a string (binary, real numbers, etc.).
Gene
A part of the chromosome, representing a specific trait or variable in the solution.
Fitness Function
A function that evaluates how good each individual is with respect to the problem.
Selection
The process of choosing individuals based on their fitness to reproduce and pass genes to the next generation.
Crossover (Recombination)
A genetic operator used to combine the genetic information of two parents to generate new offspring.
Mutation
A genetic operator that introduces random changes to individual genes to maintain diversity in the population.
Generations
The algorithm is run over multiple iterations (generations), and in each generation, the process of selection, crossover, and mutation is applied.
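To make these terms concrete, here is a minimal sketch in Python (the chromosome value and the x^2 fitness function are illustrative choices, matching the worked example later in this text):

```python
# A chromosome: one candidate solution, encoded as a 5-bit binary string
chromosome = "11010"

# Its genes: the individual bits, each a trait/variable of the solution
genes = list(chromosome)          # ['1', '1', '0', '1', '0']

# A fitness function: decode the bits to an integer x and score it as x^2
def fitness(individual):
    x = int(individual, 2)        # "11010" -> 26
    return x ** 2

print(fitness(chromosome))        # 676

# A population: a group of such candidate solutions
population = ["11010", "00101", "11111"]
```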
1. Initialization: Generate an initial population of random individuals.
2. Evaluation: Calculate the fitness of each individual using the fitness function.
3. Selection: Select pairs of individuals to breed based on their fitness (e.g., roulette wheel, tournament selection).
4. Crossover: Apply crossover to selected parents to produce offspring.
5. Mutation: Mutate some genes in the offspring with a small probability.
6. Replacement: Form the new generation by replacing some or all of the old population.
7. Termination: Repeat steps 2–6 until a stopping criterion is met (e.g., number of generations, convergence, or satisfactory fitness level).
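The selection step above mentions tournament selection as an alternative to roulette-wheel selection. A minimal sketch (the tournament size k and the bit-counting fitness used here are illustrative choices, not part of the worked example below):

```python
import random

def tournament_selection(population, fitnesses, k=3):
    # Pick k individuals at random and return the fittest of them
    contenders = random.sample(list(zip(population, fitnesses)), k)
    return max(contenders, key=lambda pair: pair[1])[0]

# Example: strings scored by the number of '1' bits they contain
pop = ["11010", "00101", "11111", "00000"]
fits = [s.count("1") for s in pop]
winner = tournament_selection(pop, fits, k=3)
print(winner)  # one of the sampled strings; fitter strings win more often
```

Tournament selection is often preferred when fitness values span very different scales, since only the ranking within each tournament matters, not the absolute fitness proportions.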
Applications
Optimization Problems (e.g., TSP, scheduling, resource allocation)
Machine Learning (e.g., feature selection, hyperparameter tuning)
Robotics (e.g., path planning, behavior optimization)
Game Playing (e.g., strategy evolution)
Neural Network Training (weights and architecture tuning)
Advantages
Can handle complex, nonlinear, and multi-modal problems
Works with both continuous and discrete variables
Doesn't require gradient information
Capable of global search, reducing the risk of getting stuck in local optima
Limitations
Computationally expensive
May require careful tuning of parameters (mutation rate, crossover rate, etc.)
No guarantee of finding the optimal solution
Let’s say we want to maximize the function:
f(x) = x^2, where x ∈ [0, 31] (i.e., integers from 0 to 31).
We want to find the x in that range which gives the highest x^2.
Best answer: x = 31, with f(x) = 961.
We’ll encode x as a 5-bit binary string.
import random

# Parameters
POP_SIZE = 6
GENES = 5  # Bits to represent numbers from 0 to 31
MAX_GENERATIONS = 20
MUTATION_RATE = 0.1

# Fitness function: f(x) = x^2
def fitness(individual):
    x = int(individual, 2)
    return x ** 2

# Generate a random individual
def random_individual():
    return ''.join(random.choice('01') for _ in range(GENES))

# Selection: Roulette Wheel
def selection(population, fitnesses):
    total_fitness = sum(fitnesses)
    pick = random.uniform(0, total_fitness)
    current = 0
    for individual, fit in zip(population, fitnesses):
        current += fit
        if current > pick:
            return individual
    return population[-1]

# Crossover: Single-point
def crossover(parent1, parent2):
    point = random.randint(1, GENES - 1)
    return parent1[:point] + parent2[point:], parent2[:point] + parent1[point:]

# Mutation: Flip a bit
def mutate(individual):
    result = ''
    for bit in individual:
        if random.random() < MUTATION_RATE:
            result += '1' if bit == '0' else '0'
        else:
            result += bit
    return result

# Genetic Algorithm
def genetic_algorithm():
    population = [random_individual() for _ in range(POP_SIZE)]
    for generation in range(MAX_GENERATIONS):
        fitnesses = [fitness(ind) for ind in population]
        print(f"Generation {generation}:")
        for i, ind in enumerate(population):
            print(f"  {ind} (x={int(ind, 2)}): fitness={fitnesses[i]}")
        new_population = []
        for _ in range(POP_SIZE // 2):
            parent1 = selection(population, fitnesses)
            parent2 = selection(population, fitnesses)
            child1, child2 = crossover(parent1, parent2)
            new_population.append(mutate(child1))
            new_population.append(mutate(child2))
        population = new_population
    # Best result
    best = max(population, key=fitness)
    print(f"\nBest solution: {best} (x={int(best, 2)}), fitness={fitness(best)}")

genetic_algorithm()
Sample Output (will vary each time):
Generation 0:
11010 (x=26): fitness=676
00101 (x=5): fitness=25
...
Best solution: 11111 (x=31), fitness=961
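One caveat with the replacement step in this implementation: because each new population fully replaces the old one, the best individual found so far can be lost to crossover or mutation. A common remedy is elitism, i.e., carrying the fittest individual(s) of the old generation unchanged into the new one. A minimal sketch (the helper name apply_elitism is a hypothetical addition, not part of the code above):

```python
def apply_elitism(old_population, new_population, fitness, n_elites=1):
    # Carry the n_elites fittest individuals of the old generation over,
    # dropping the weakest members of the new one to keep the size fixed
    elites = sorted(old_population, key=fitness, reverse=True)[:n_elites]
    keep = len(new_population) - n_elites
    survivors = sorted(new_population, key=fitness, reverse=True)[:keep]
    return elites + survivors

# Example with fitness(x) = x^2 on 5-bit strings
fit = lambda s: int(s, 2) ** 2
old = ["11111", "00001"]
new = ["00010", "00100"]
print(apply_elitism(old, new, fit))  # ['11111', '00100']
```

Inserting a call like this between generations guarantees the best fitness seen so far never decreases, usually at a small cost in population diversity.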