Resuming an optimization with pyevolve
I ran an optimization with Pyevolve and, after looking at the results, I wanted to add a few more generations to get better convergence. As each evaluation is quite long, I was wondering if I can resume my optimization from the last generation and add, say, 20 more. Everything should already be stored in the DB, so I hope this is possible.
Here are my GA settings (similar to the first example, but with a more complicated evaluation function):
# Genome instance, 1D List of 6 elements
genome = G1DList.G1DList(6)
# Sets the range max and min of the 1D List
genome.setParams(rangemin=1, rangemax=15)
# The evaluator function (evaluation function)
genome.evaluator.set(eval_func)
# Genetic Algorithm Instance
ga = GSimpleGA.GSimpleGA(genome)
# Set the Roulette Wheel selector method, the number of generations and
# the termination criteria
ga.selector.set(Selectors.GRouletteWheel)
ga.setGenerations(50)
ga.setPopulationSize(10)
ga.terminationCriteria.set(GSimpleGA.ConvergenceCriteria)
# Sets the DB Adapter, the resetDB flag will make the Adapter recreate
# the database and erase all data every run, you should use this flag
# just in the first time, after the pyevolve.db was created, you can
# omit it.
sqlite_adapter = DBAdapters.DBSQLite(identify="F-Beam-Optimization", resetDB=True)
ga.setDBAdapter(sqlite_adapter)
# Do the evolution, with stats dump
# frequency of 2 generations
ga.evolve(freq_stats=2)
Anyone with an idea?
Hi, after reviewing the Pyevolve documentation, there doesn't seem to be any way to resume an evolution based on what is stored in the database (strange behaviour).
If you want this kind of mechanism, you could pickle your population once in a while and implement the resume logic yourself on top of Pyevolve.
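A minimal sketch of that pickling idea. The helper names and file path here are hypothetical, and `population` stands in for whatever picklable list of individuals your Pyevolve run gives you; this is only the save/load half, not the full resume logic:

```python
import pickle

def save_population(population, path):
    # Serialize the current population to disk so a later
    # run can pick up where this one left off.
    with open(path, "wb") as f:
        pickle.dump(population, f)

def load_population(path):
    # Read a previously saved population back into memory.
    with open(path, "rb") as f:
        return pickle.load(f)

# Example usage with a dummy population of 1D lists:
pop = [[1, 2, 3, 4, 5, 6], [7, 8, 9, 10, 11, 12]]
save_population(pop, "population.pkl")
restored = save_and_restored = load_population("population.pkl")
```

You would call `save_population` every N generations (e.g. from a step callback), then on a later run call `load_population` and seed the GA with the restored individuals.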
Or, you could try DEAP, a very open framework that lets you see and manipulate every aspect of an evolutionary algorithm transparently. It already has a checkpointing mechanism implemented.
Here is what your code would look like in DEAP.
import random
from deap import algorithms, base, creator, tools
# Create the needed types
creator.create("FitnessMax", base.Fitness, weights=(1.0,))
creator.create("Individual", list, fitness=creator.FitnessMax)
# Container for the evolutionary tools
toolbox = base.Toolbox()
toolbox.register("attr", random.randint, 1, 15)  # random.random takes no bounds; randint matches rangemin/rangemax
toolbox.register("individual", tools.initRepeat, creator.Individual, toolbox.attr, 6)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)
# Operator registering
toolbox.register("evaluate", eval_func)
toolbox.register("mate", tools.cxTwoPoints)
toolbox.register("mutate", tools.mutGaussian, mu=0, sigma=1, indpb=0.05)
toolbox.register("select", tools.selTournament, tournsize=3)
population = toolbox.population(n=10)
stats = tools.Statistics(key=lambda ind: ind.fitness.values)
stats.register("Max", max)
checkpoint = tools.Checkpoint(population=population)
GEN, CXPB, MUTPB = 0, 0.5, 0.1
while stats.Max() < CONDITION:
    # Apply standard variation (crossover followed by mutation)
    offspring = algorithms.varSimple(toolbox, population, cxpb=CXPB, mutpb=MUTPB)
    # Evaluate the individuals
    fits = toolbox.map(toolbox.evaluate, offspring)
    for fit, ind in zip(fits, offspring):
        ind.fitness.values = fit
    # Select the fittest individuals
    offspring = [toolbox.clone(ind) for ind in toolbox.select(offspring, len(offspring))]
    # The "[:]" is important to not replace the label but what it contains
    population[:] = offspring
    stats.update(population)
    if GEN % 20 == 0:
        checkpoint.dump("my_checkpoint")
    GEN += 1
Note that the above code has not been tested, but it does everything you asked for. Now, how to load a checkpoint and restart an evolution:
checkpoint = tools.Checkpoint()
checkpoint.load("my_checkpoint.ems")
population = checkpoint["population"]
# Continue the evolution as before
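To tie this back to the original question (running roughly 20 more generations after reloading), here is a hedged, self-contained sketch that does not depend on DEAP: it reloads a pickled population and runs extra generations of a toy mutate-and-select loop. `toy_eval`, `resume`, and the variation step are hypothetical stand-ins for your real `eval_func` and operators:

```python
import pickle
import random

def toy_eval(ind):
    # Hypothetical stand-in for the real evaluation function:
    # maximize the sum of the genes.
    return sum(ind)

def resume(path, extra_gens=20, seed=0):
    rng = random.Random(seed)
    with open(path, "rb") as f:
        population = pickle.load(f)
    for _ in range(extra_gens):
        # Mutate copies of each individual (stand-in for the real
        # crossover/mutation operators).
        offspring = [[g + rng.randint(-1, 1) for g in ind] for ind in population]
        # Keep the best individuals among parents and offspring
        # (stand-in for the real selection operator).
        merged = sorted(population + offspring, key=toy_eval, reverse=True)
        population = merged[:len(population)]
    return population
```

Because parents are kept in the merged pool, the best fitness never decreases across the extra generations, which is the behaviour you want when extending a finished run.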
Moreover, DEAP is very well documented and has over 25 diversified examples that help new users ramp up very quickly. I have also heard that the developers answer questions very quickly.