Distributed Evolutionary Algorithms in Python

Overview

DEAP


DEAP is a novel evolutionary computation framework for rapid prototyping and testing of ideas. It seeks to make algorithms explicit and data structures transparent. It works in perfect harmony with parallelisation mechanisms such as multiprocessing and SCOOP.
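
For instance, distributing the evaluations over several processes only requires swapping the map function registered in the toolbox. A minimal sketch, assuming the default pool size (the SCOOP variant shown in comments is analogous):

import multiprocessing

from deap import base

toolbox = base.Toolbox()
# ... creator.create and the other toolbox.register calls go here ...

if __name__ == "__main__":
    # Evaluations go through toolbox.map, so registering a parallel map is enough.
    pool = multiprocessing.Pool()
    toolbox.register("map", pool.map)

    # With SCOOP the registration is analogous:
    # from scoop import futures
    # toolbox.register("map", futures.map)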

DEAP includes the following features:

  • Genetic algorithm using any imaginable representation
    • List, Array, Set, Dictionary, Tree, Numpy Array, etc.
  • Genetic programming using prefix trees (a short sketch follows this list)
    • Loosely typed, Strongly typed
    • Automatically defined functions
  • Evolution strategies (including CMA-ES)
  • Multi-objective optimisation (NSGA-II, NSGA-III, SPEA2, MO-CMA-ES)
  • Co-evolution (cooperative and competitive) of multiple populations
  • Parallelization of the evaluations (and more)
  • Hall of Fame of the best individuals that lived in the population
  • Checkpoints that take snapshots of a system regularly
  • Benchmarks module containing the most common test functions
  • Genealogy of an evolution (that is compatible with NetworkX)
  • Examples of alternative algorithms: Particle Swarm Optimization, Differential Evolution, Estimation of Distribution Algorithm
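
As an illustration of the genetic programming support, here is a minimal, loosely typed primitive-set sketch; the primitives and tree depths are arbitrary choices, not a prescribed setup:

import operator

from deap import gp

# A loosely typed primitive set with one input argument.
pset = gp.PrimitiveSet("MAIN", arity=1)
pset.addPrimitive(operator.add, 2)
pset.addPrimitive(operator.mul, 2)
pset.addTerminal(1)
pset.renameArguments(ARG0="x")

# Generate a random full tree and compile it into a callable.
expr = gp.genFull(pset, min_=1, max_=3)
tree = gp.PrimitiveTree(expr)
func = gp.compile(tree, pset)

print(tree)      # prefix notation, e.g. mul(add(x, 1), x)
print(func(2))   # evaluate the evolved expression at x = 2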

Downloads

Following the acceptance of PEP 438 by the Python community, we have moved DEAP's source releases to PyPI.

You can find the most recent releases at: https://pypi.python.org/pypi/deap/.

Documentation

See the DEAP User's Guide for DEAP documentation.

To build the documentation for the development tip, change directory to the doc subfolder and type make html; the documentation will then be available under _build/html. You will need Sphinx to build the documentation.

Notebooks

Also check out our new notebook examples. Using Jupyter notebooks you'll be able to navigate and execute each block of code individually and see what every line is doing. Either look at the notebooks online using the notebook viewer links at the bottom of the page, or download the notebooks, navigate to your download directory and run

jupyter notebook

Installation

We encourage you to use easy_install or pip to install DEAP on your system. Other installation procedures, like apt-get or yum, usually provide an outdated version.

pip install deap

The latest version can be installed with

pip install git+https://github.com/DEAP/deap@master

If you wish to build from sources, download or clone the repository and type

python setup.py install

Build Status

The DEAP build status is available on Travis-CI: https://travis-ci.org/DEAP/deap.

Requirements

The most basic features of DEAP require Python 2.6. In order to combine the toolbox and the multiprocessing module, Python 2.7 is needed for its support of pickling partial functions. CMA-ES requires NumPy, and we recommend matplotlib for visualization of results, as it is fully compatible with DEAP's API.

Since version 0.8, DEAP is compatible out of the box with Python 3. The installation procedure automatically translates the source to Python 3 with 2to3.

Example

The following code gives a quick overview of how simple it is to implement the OneMax problem optimization with a genetic algorithm using DEAP. More examples are provided here.

import random
from deap import creator, base, tools, algorithms

# Fitness to maximise and individuals represented as plain lists.
creator.create("FitnessMax", base.Fitness, weights=(1.0,))
creator.create("Individual", list, fitness=creator.FitnessMax)

toolbox = base.Toolbox()

# Attribute generator and structure initializers.
toolbox.register("attr_bool", random.randint, 0, 1)
toolbox.register("individual", tools.initRepeat, creator.Individual, toolbox.attr_bool, n=100)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)

def evalOneMax(individual):
    # The fitness is a tuple, hence the trailing comma.
    return sum(individual),

# Evolutionary operators.
toolbox.register("evaluate", evalOneMax)
toolbox.register("mate", tools.cxTwoPoint)
toolbox.register("mutate", tools.mutFlipBit, indpb=0.05)
toolbox.register("select", tools.selTournament, tournsize=3)

population = toolbox.population(n=300)

NGEN = 40
for gen in range(NGEN):
    # Variation, evaluation and selection for one generation.
    offspring = algorithms.varAnd(population, toolbox, cxpb=0.5, mutpb=0.1)
    fits = toolbox.map(toolbox.evaluate, offspring)
    for fit, ind in zip(fits, offspring):
        ind.fitness.values = fit
    population = toolbox.select(offspring, k=len(population))
top10 = tools.selBest(population, k=10)
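
The same loop can also be delegated to one of the prebuilt algorithms. A short sketch building on the toolbox defined above and using eaSimple with a hall of fame and statistics (the statistics choice is illustrative):

import numpy

hof = tools.HallOfFame(1)
stats = tools.Statistics(lambda ind: ind.fitness.values)
stats.register("avg", numpy.mean)
stats.register("max", numpy.max)

population = toolbox.population(n=300)
population, logbook = algorithms.eaSimple(population, toolbox, cxpb=0.5, mutpb=0.1,
                                          ngen=40, stats=stats, halloffame=hof,
                                          verbose=True)
print(hof[0], hof[0].fitness.values)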

How to cite DEAP

Authors of scientific papers including results generated using DEAP are encouraged to cite the following paper.

@article{DEAP_JMLR2012,
    author    = { F\'elix-Antoine Fortin and Fran\c{c}ois-Michel {De Rainville} and Marc-Andr\'e Gardner and Marc Parizeau and Christian Gagn\'e },
    title     = { {DEAP}: Evolutionary Algorithms Made Easy },
    pages     = { 2171--2175 },
    volume    = { 13 },
    month     = { jul },
    year      = { 2012 },
    journal   = { Journal of Machine Learning Research }
}

Publications on DEAP

  • François-Michel De Rainville, Félix-Antoine Fortin, Marc-André Gardner, Marc Parizeau and Christian Gagné, "DEAP -- Enabling Nimbler Evolutions", SIGEVOlution, vol. 6, no 2, pp. 17-26, February 2014. Paper
  • Félix-Antoine Fortin, François-Michel De Rainville, Marc-André Gardner, Marc Parizeau and Christian Gagné, "DEAP: Evolutionary Algorithms Made Easy", Journal of Machine Learning Research, vol. 13, pp. 2171-2175, jul 2012. Paper
  • François-Michel De Rainville, Félix-Antoine Fortin, Marc-André Gardner, Marc Parizeau and Christian Gagné, "DEAP: A Python Framework for Evolutionary Algorithms", in EvoSoft Workshop, Companion proc. of the Genetic and Evolutionary Computation Conference (GECCO 2012), July 07-11 2012. Paper

Projects using DEAP

  • Ribaric, T., & Houghten, S. (2017, June). Genetic programming for improved cryptanalysis of elliptic curve cryptosystems. In 2017 IEEE Congress on Evolutionary Computation (CEC) (pp. 419-426). IEEE.
  • Ellefsen, Kai Olav, Herman Augusto Lepikson, and Jan C. Albiez. "Multiobjective coverage path planning: Enabling automated inspection of complex, real-world structures." Applied Soft Computing 61 (2017): 264-282.
  • S. Chardon, B. Brangeon, E. Bozonnet, C. Inard (2016), Construction cost and energy performance of single family houses: From integrated design to automated optimization, Automation in Construction, Volume 70, p.1-13.
  • B. Brangeon, E. Bozonnet, C. Inard (2016), Integrated refurbishment of collective housing and optimization process with real products databases, Building Simulation Optimization, pp. 531–538 Newcastle, England.
  • Randal S. Olson, Ryan J. Urbanowicz, Peter C. Andrews, Nicole A. Lavender, La Creis Kidd, and Jason H. Moore (2016). Automating biomedical data science through tree-based pipeline optimization. Applications of Evolutionary Computation, pages 123-137.
  • Randal S. Olson, Nathan Bartley, Ryan J. Urbanowicz, and Jason H. Moore (2016). Evaluation of a Tree-based Pipeline Optimization Tool for Automating Data Science. Proceedings of GECCO 2016, pages 485-492.
  • Van Geit W, Gevaert M, Chindemi G, Rössert C, Courcol J, Muller EB, Schürmann F, Segev I and Markram H (2016). BluePyOpt: Leveraging open source software and cloud infrastructure to optimise model parameters in neuroscience. Front. Neuroinform. 10:17. doi: 10.3389/fninf.2016.00017 https://github.com/BlueBrain/BluePyOpt
  • Lara-Cabrera, R., Cotta, C. and Fernández-Leiva, A.J. (2014). Geometrical vs topological measures for the evolution of aesthetic maps in a rts game, Entertainment Computing.
  • Macret, M. and Pasquier, P. (2013). Automatic Tuning of the OP-1 Synthesizer Using a Multi-objective Genetic Algorithm. In Proceedings of the 10th Sound and Music Computing Conference (SMC). (pp 614-621).
  • Fortin, F. A., Grenier, S., & Parizeau, M. (2013, July). Generalizing the improved run-time complexity algorithm for non-dominated sorting. In Proceeding of the fifteenth annual conference on Genetic and evolutionary computation conference (pp. 615-622). ACM.
  • Fortin, F. A., & Parizeau, M. (2013, July). Revisiting the NSGA-II crowding-distance computation. In Proceeding of the fifteenth annual conference on Genetic and evolutionary computation conference (pp. 623-630). ACM.
  • Marc-André Gardner, Christian Gagné, and Marc Parizeau. Estimation of Distribution Algorithm based on Hidden Markov Models for Combinatorial Optimization. in Comp. Proc. Genetic and Evolutionary Computation Conference (GECCO 2013), July 2013.
  • J. T. Zhai, M. A. Bamakhrama, and T. Stefanov. "Exploiting Just-enough Parallelism when Mapping Streaming Applications in Hard Real-time Systems". Design Automation Conference (DAC 2013), 2013.
  • V. Akbarzadeh, C. Gagné, M. Parizeau, M. Argany, M. A Mostafavi, "Probabilistic Sensing Model for Sensor Placement Optimization Based on Line-of-Sight Coverage", Accepted in IEEE Transactions on Instrumentation and Measurement, 2012.
  • M. Reif, F. Shafait, and A. Dengel. "Dataset Generation for Meta-Learning". Proceedings of the German Conference on Artificial Intelligence (KI'12). 2012.
  • M. T. Ribeiro, A. Lacerda, A. Veloso, and N. Ziviani. "Pareto-Efficient Hybridization for Multi-Objective Recommender Systems". Proceedings of the Conference on Recommender Systems (RecSys'12). 2012.
  • M. Pérez-Ortiz, A. Arauzo-Azofra, C. Hervás-Martínez, L. García-Hernández and L. Salas-Morera. "A system learning user preferences for multiobjective optimization of facility layouts". Proceedings of the Int. Conference on Soft Computing Models in Industrial and Environmental Applications (SOCO'12). 2012.
  • Lévesque, J.C., Durand, A., Gagné, C., and Sabourin, R., Multi-Objective Evolutionary Optimization for Generating Ensembles of Classifiers in the ROC Space, Genetic and Evolutionary Computation Conference (GECCO 2012), 2012.
  • Marc-André Gardner, Christian Gagné, and Marc Parizeau, "Bloat Control in Genetic Programming with Histogram-based Accept-Reject Method", in Proc. Genetic and Evolutionary Computation Conference (GECCO 2011), 2011.
  • Vahab Akbarzadeh, Albert Ko, Christian Gagné, and Marc Parizeau, "Topography-Aware Sensor Deployment Optimization with CMA-ES", in Proc. of Parallel Problem Solving from Nature (PPSN 2010), Springer, 2010.
  • DEAP is used in TPOT, an open source tool that uses genetic programming to optimize machine learning pipelines.
  • DEAP is also used in ROS as an optimization package http://www.ros.org/wiki/deap.
  • DEAP is an optional dependency for PyXRD, a Python implementation of the matrix algorithm developed for the X-ray diffraction analysis of disordered lamellar structures.
  • DEAP is used in glyph, a library for symbolic regression with applications to MLC.

If you want your project listed here, send us a link and a brief description and we'll be glad to add it.

Comments
  • New Release and Hypervolume warning

    New Release and Hypervolume warning

    My Ubuntu 14.04, after upgrading to the new pip release (omg, thank you by the way!), has begun throwing the Hypervolume import warning. I am used to seeing this harmless(?) warning in my Windows setup using an older pip, but I don't believe I was seeing it before on Linux when I had installed directly from master via: pip install git+git://github.com/DEAP/deap/

    Is this expected behavior for the pip release?

    /usr/local/lib/python2.7/dist-packages/deap/tools/_hypervolume/pyhv.py:33: ImportWarning: Falling back to the python version of hypervolume module. Expect this to be very slow.
      "module. Expect this to be very slow.", ImportWarning)
    
    opened by DMTSource 34
  • Maximization instead of minimization

    Maximization instead of minimization

    Hi.

    I have a problem with minimization of the Ackley benchmark function. I can't achieve minimization.

    I'm using

    creator.create("FitnessMin", base.Fitness, weights=(-1.0,))
    creator.create("Individual", list, typecode='d', fitness=creator.FitnessMin)

    I'm looking for help on Stack Overflow. There is my code. Link:

    https://stackoverflow.com/questions/44576799/minimize-a-function-using-deap
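
    For reference, a minimal minimisation sketch using the built-in Ackley benchmark; the operators and parameters below are illustrative, not the original poster's setup:

    import random

    from deap import algorithms, base, benchmarks, creator, tools

    # Minimisation: negative weight, and every evaluation returns a tuple.
    creator.create("FitnessMin", base.Fitness, weights=(-1.0,))
    creator.create("Individual", list, fitness=creator.FitnessMin)

    toolbox = base.Toolbox()
    toolbox.register("attr_float", random.uniform, -5, 5)
    toolbox.register("individual", tools.initRepeat, creator.Individual,
                     toolbox.attr_float, n=10)
    toolbox.register("population", tools.initRepeat, list, toolbox.individual)

    toolbox.register("evaluate", benchmarks.ackley)  # already returns a tuple
    toolbox.register("mate", tools.cxBlend, alpha=0.5)
    toolbox.register("mutate", tools.mutGaussian, mu=0, sigma=0.3, indpb=0.2)
    toolbox.register("select", tools.selTournament, tournsize=3)

    pop = toolbox.population(n=100)
    pop, _ = algorithms.eaSimple(pop, toolbox, cxpb=0.5, mutpb=0.2, ngen=50,
                                 verbose=False)
    print(tools.selBest(pop, k=1)[0].fitness.values)  # should approach (0.0,)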

    opened by GrzegorzGiniewicz 14
  • Hypervolume compile error again.

    Hypervolume compile error again.

    My Windows 10 machine running Python 3.6 is having issues compiling hypervolume. I can't seem to find the error. Here is a copy of the verbose output when following the recommendation in https://github.com/DEAP/deap/issues/240. MinGW is in PATH, and gcc --version reads 6.3.0.

    running install
    running bdist_egg
    running egg_info
    writing deap.egg-info\PKG-INFO
    writing dependency_links to deap.egg-info\dependency_links.txt
    writing top-level names to deap.egg-info\top_level.txt
    reading manifest file 'deap.egg-info\SOURCES.txt'
    reading manifest template 'MANIFEST.in'
    warning: no files found matching '*.hpp' under directory 'deap'
    warning: no files found matching '*.h' under directory 'examples'
    no previously-included directories found matching 'doc\_build'
    warning: no previously-included files matching '.DS_Store' found anywhere in distribution
    warning: no previously-included files matching '*.pyc' found anywhere in distribution
    writing manifest file 'deap.egg-info\SOURCES.txt'
    installing library code to build\bdist.win-amd64\egg
    running install_lib
    running build_py
    creating build
    creating build\lib.win-amd64-3.6
    creating build\lib.win-amd64-3.6\deap
    copying deap\algorithms.py -> build\lib.win-amd64-3.6\deap
    copying deap\base.py -> build\lib.win-amd64-3.6\deap
    copying deap\cma.py -> build\lib.win-amd64-3.6\deap
    copying deap\creator.py -> build\lib.win-amd64-3.6\deap
    copying deap\gp.py -> build\lib.win-amd64-3.6\deap
    copying deap\__init__.py -> build\lib.win-amd64-3.6\deap
    creating build\lib.win-amd64-3.6\deap\benchmarks
    copying deap\benchmarks\binary.py -> build\lib.win-amd64-3.6\deap\benchmarks
    copying deap\benchmarks\gp.py -> build\lib.win-amd64-3.6\deap\benchmarks
    copying deap\benchmarks\movingpeaks.py -> build\lib.win-amd64-3.6\deap\benchmarks
    copying deap\benchmarks\tools.py -> build\lib.win-amd64-3.6\deap\benchmarks
    copying deap\benchmarks\__init__.py -> build\lib.win-amd64-3.6\deap\benchmarks
    creating build\lib.win-amd64-3.6\deap\tests
    copying deap\tests\test_algorithms.py -> build\lib.win-amd64-3.6\deap\tests
    copying deap\tests\test_benchmarks.py -> build\lib.win-amd64-3.6\deap\tests
    copying deap\tests\test_creator.py -> build\lib.win-amd64-3.6\deap\tests
    copying deap\tests\test_logbook.py -> build\lib.win-amd64-3.6\deap\tests
    copying deap\tests\test_pickle.py -> build\lib.win-amd64-3.6\deap\tests
    copying deap\tests\__init__.py -> build\lib.win-amd64-3.6\deap\tests
    creating build\lib.win-amd64-3.6\deap\tools
    copying deap\tools\constraint.py -> build\lib.win-amd64-3.6\deap\tools
    copying deap\tools\crossover.py -> build\lib.win-amd64-3.6\deap\tools
    copying deap\tools\emo.py -> build\lib.win-amd64-3.6\deap\tools
    copying deap\tools\indicator.py -> build\lib.win-amd64-3.6\deap\tools
    copying deap\tools\init.py -> build\lib.win-amd64-3.6\deap\tools
    copying deap\tools\migration.py -> build\lib.win-amd64-3.6\deap\tools
    copying deap\tools\mutation.py -> build\lib.win-amd64-3.6\deap\tools
    copying deap\tools\selection.py -> build\lib.win-amd64-3.6\deap\tools
    copying deap\tools\support.py -> build\lib.win-amd64-3.6\deap\tools
    copying deap\tools\__init__.py -> build\lib.win-amd64-3.6\deap\tools
    creating build\lib.win-amd64-3.6\deap\tools\_hypervolume
    copying deap\tools\_hypervolume\pyhv.py -> build\lib.win-amd64-3.6\deap\tools\_hypervolume
    copying deap\tools\_hypervolume\__init__.py -> build\lib.win-amd64-3.6\deap\tools\_hypervolume
    Fixing build\lib.win-amd64-3.6\deap\algorithms.py build\lib.win-amd64-3.6\deap\base.py build\lib.win-amd64-3.6\deap\cma.py build\lib.win-amd64-3.6\deap\creator.py build\lib.win-amd64-3.6\deap\gp.py build\lib.win-amd64-3.6\deap\__init__.py build\lib.win-amd64-3.6\deap\benchmarks\binary.py build\lib.win-amd64-3.6\deap\benchmarks\gp.py build\lib.win-amd64-3.6\deap\benchmarks\movingpeaks.py build\lib.win-amd64-3.6\deap\benchmarks\tools.py build\lib.win-amd64-3.6\deap\benchmarks\__init__.py build\lib.win-amd64-3.6\deap\tests\test_algorithms.py build\lib.win-amd64-3.6\deap\tests\test_benchmarks.py build\lib.win-amd64-3.6\deap\tests\test_creator.py build\lib.win-amd64-3.6\deap\tests\test_logbook.py build\lib.win-amd64-3.6\deap\tests\test_pickle.py build\lib.win-amd64-3.6\deap\tests\__init__.py build\lib.win-amd64-3.6\deap\tools\constraint.py build\lib.win-amd64-3.6\deap\tools\crossover.py build\lib.win-amd64-3.6\deap\tools\emo.py build\lib.win-amd64-3.6\deap\tools\indicator.py build\lib.win-amd64-3.6\deap\tools\init.py build\lib.win-amd64-3.6\deap\tools\migration.py build\lib.win-amd64-3.6\deap\tools\mutation.py build\lib.win-amd64-3.6\deap\tools\selection.py build\lib.win-amd64-3.6\deap\tools\support.py build\lib.win-amd64-3.6\deap\tools\__init__.py build\lib.win-amd64-3.6\deap\tools\_hypervolume\pyhv.py build\lib.win-amd64-3.6\deap\tools\_hypervolume\__init__.py
    Skipping optional fixer: buffer
    Skipping optional fixer: idioms
    Skipping optional fixer: set_literal
    Skipping optional fixer: ws_comma
    Fixing build\lib.win-amd64-3.6\deap\algorithms.py build\lib.win-amd64-3.6\deap\base.py build\lib.win-amd64-3.6\deap\cma.py build\lib.win-amd64-3.6\deap\creator.py build\lib.win-amd64-3.6\deap\gp.py build\lib.win-amd64-3.6\deap\__init__.py build\lib.win-amd64-3.6\deap\benchmarks\binary.py build\lib.win-amd64-3.6\deap\benchmarks\gp.py build\lib.win-amd64-3.6\deap\benchmarks\movingpeaks.py build\lib.win-amd64-3.6\deap\benchmarks\tools.py build\lib.win-amd64-3.6\deap\benchmarks\__init__.py build\lib.win-amd64-3.6\deap\tests\test_algorithms.py build\lib.win-amd64-3.6\deap\tests\test_benchmarks.py build\lib.win-amd64-3.6\deap\tests\test_creator.py build\lib.win-amd64-3.6\deap\tests\test_logbook.py build\lib.win-amd64-3.6\deap\tests\test_pickle.py build\lib.win-amd64-3.6\deap\tests\__init__.py build\lib.win-amd64-3.6\deap\tools\constraint.py build\lib.win-amd64-3.6\deap\tools\crossover.py build\lib.win-amd64-3.6\deap\tools\emo.py build\lib.win-amd64-3.6\deap\tools\indicator.py build\lib.win-amd64-3.6\deap\tools\init.py build\lib.win-amd64-3.6\deap\tools\migration.py build\lib.win-amd64-3.6\deap\tools\mutation.py build\lib.win-amd64-3.6\deap\tools\selection.py build\lib.win-amd64-3.6\deap\tools\support.py build\lib.win-amd64-3.6\deap\tools\__init__.py build\lib.win-amd64-3.6\deap\tools\_hypervolume\pyhv.py build\lib.win-amd64-3.6\deap\tools\_hypervolume\__init__.py
    Skipping optional fixer: buffer
    Skipping optional fixer: idioms
    Skipping optional fixer: set_literal
    Skipping optional fixer: ws_comma
    running build_ext
    building 'deap.tools._hypervolume.hv' extension
    ***************************************************************************
    WARNING: The C extensions could not be compiled, speedups won't be available.
    Now building without C extensions.
    ***************************************************************************
    running install
    running bdist_egg
    running egg_info
    writing deap.egg-info\PKG-INFO
    writing dependency_links to deap.egg-info\dependency_links.txt
    writing top-level names to deap.egg-info\top_level.txt
    reading manifest file 'deap.egg-info\SOURCES.txt'
    reading manifest template 'MANIFEST.in'
    warning: no files found matching '*.hpp' under directory 'deap'
    warning: no files found matching '*.h' under directory 'examples'
    no previously-included directories found matching 'doc\_build'
    warning: no previously-included files matching '.DS_Store' found anywhere in distribution
    warning: no previously-included files matching '*.pyc' found anywhere in distribution
    writing manifest file 'deap.egg-info\SOURCES.txt'
    installing library code to build\bdist.win-amd64\egg
    running install_lib
    running build_py
    creating build\lib
    creating build\lib\deap
    copying deap\algorithms.py -> build\lib\deap
    copying deap\base.py -> build\lib\deap
    copying deap\cma.py -> build\lib\deap
    copying deap\creator.py -> build\lib\deap
    copying deap\gp.py -> build\lib\deap
    copying deap\__init__.py -> build\lib\deap
    creating build\lib\deap\benchmarks
    copying deap\benchmarks\binary.py -> build\lib\deap\benchmarks
    copying deap\benchmarks\gp.py -> build\lib\deap\benchmarks
    copying deap\benchmarks\movingpeaks.py -> build\lib\deap\benchmarks
    copying deap\benchmarks\tools.py -> build\lib\deap\benchmarks
    copying deap\benchmarks\__init__.py -> build\lib\deap\benchmarks
    creating build\lib\deap\tests
    copying deap\tests\test_algorithms.py -> build\lib\deap\tests
    copying deap\tests\test_benchmarks.py -> build\lib\deap\tests
    copying deap\tests\test_creator.py -> build\lib\deap\tests
    copying deap\tests\test_logbook.py -> build\lib\deap\tests
    copying deap\tests\test_pickle.py -> build\lib\deap\tests
    copying deap\tests\__init__.py -> build\lib\deap\tests
    creating build\lib\deap\tools
    copying deap\tools\constraint.py -> build\lib\deap\tools
    copying deap\tools\crossover.py -> build\lib\deap\tools
    copying deap\tools\emo.py -> build\lib\deap\tools
    copying deap\tools\indicator.py -> build\lib\deap\tools
    copying deap\tools\init.py -> build\lib\deap\tools
    copying deap\tools\migration.py -> build\lib\deap\tools
    copying deap\tools\mutation.py -> build\lib\deap\tools
    copying deap\tools\selection.py -> build\lib\deap\tools
    copying deap\tools\support.py -> build\lib\deap\tools
    copying deap\tools\__init__.py -> build\lib\deap\tools
    creating build\lib\deap\tools\_hypervolume
    copying deap\tools\_hypervolume\pyhv.py -> build\lib\deap\tools\_hypervolume
    copying deap\tools\_hypervolume\__init__.py -> build\lib\deap\tools\_hypervolume
    Fixing build\lib\deap\algorithms.py build\lib\deap\base.py build\lib\deap\cma.py build\lib\deap\creator.py build\lib\deap\gp.py build\lib\deap\__init__.py build\lib\deap\benchmarks\binary.py build\lib\deap\benchmarks\gp.py build\lib\deap\benchmarks\movingpeaks.py build\lib\deap\benchmarks\tools.py build\lib\deap\benchmarks\__init__.py build\lib\deap\tests\test_algorithms.py build\lib\deap\tests\test_benchmarks.py build\lib\deap\tests\test_creator.py build\lib\deap\tests\test_logbook.py build\lib\deap\tests\test_pickle.py build\lib\deap\tests\__init__.py build\lib\deap\tools\constraint.py build\lib\deap\tools\crossover.py build\lib\deap\tools\emo.py build\lib\deap\tools\indicator.py build\lib\deap\tools\init.py build\lib\deap\tools\migration.py build\lib\deap\tools\mutation.py build\lib\deap\tools\selection.py build\lib\deap\tools\support.py build\lib\deap\tools\__init__.py build\lib\deap\tools\_hypervolume\pyhv.py build\lib\deap\tools\_hypervolume\__init__.py
    Skipping optional fixer: buffer
    Skipping optional fixer: idioms
    Skipping optional fixer: set_literal
    Skipping optional fixer: ws_comma
    Fixing build\lib\deap\algorithms.py build\lib\deap\base.py build\lib\deap\cma.py build\lib\deap\creator.py build\lib\deap\gp.py build\lib\deap\__init__.py build\lib\deap\benchmarks\binary.py build\lib\deap\benchmarks\gp.py build\lib\deap\benchmarks\movingpeaks.py build\lib\deap\benchmarks\tools.py build\lib\deap\benchmarks\__init__.py build\lib\deap\tests\test_algorithms.py build\lib\deap\tests\test_benchmarks.py build\lib\deap\tests\test_creator.py build\lib\deap\tests\test_logbook.py build\lib\deap\tests\test_pickle.py build\lib\deap\tests\__init__.py build\lib\deap\tools\constraint.py build\lib\deap\tools\crossover.py build\lib\deap\tools\emo.py build\lib\deap\tools\indicator.py build\lib\deap\tools\init.py build\lib\deap\tools\migration.py build\lib\deap\tools\mutation.py build\lib\deap\tools\selection.py build\lib\deap\tools\support.py build\lib\deap\tools\__init__.py build\lib\deap\tools\_hypervolume\pyhv.py build\lib\deap\tools\_hypervolume\__init__.py
    Skipping optional fixer: buffer
    Skipping optional fixer: idioms
    Skipping optional fixer: set_literal
    Skipping optional fixer: ws_comma
    creating build\bdist.win-amd64
    creating build\bdist.win-amd64\egg
    creating build\bdist.win-amd64\egg\deap
    copying build\lib\deap\algorithms.py -> build\bdist.win-amd64\egg\deap
    copying build\lib\deap\base.py -> build\bdist.win-amd64\egg\deap
    creating build\bdist.win-amd64\egg\deap\benchmarks
    copying build\lib\deap\benchmarks\binary.py -> build\bdist.win-amd64\egg\deap\benchmarks
    copying build\lib\deap\benchmarks\gp.py -> build\bdist.win-amd64\egg\deap\benchmarks
    copying build\lib\deap\benchmarks\movingpeaks.py -> build\bdist.win-amd64\egg\deap\benchmarks
    copying build\lib\deap\benchmarks\tools.py -> build\bdist.win-amd64\egg\deap\benchmarks
    copying build\lib\deap\benchmarks\__init__.py -> build\bdist.win-amd64\egg\deap\benchmarks
    copying build\lib\deap\cma.py -> build\bdist.win-amd64\egg\deap
    copying build\lib\deap\creator.py -> build\bdist.win-amd64\egg\deap
    copying build\lib\deap\gp.py -> build\bdist.win-amd64\egg\deap
    creating build\bdist.win-amd64\egg\deap\tests
    copying build\lib\deap\tests\test_algorithms.py -> build\bdist.win-amd64\egg\deap\tests
    copying build\lib\deap\tests\test_benchmarks.py -> build\bdist.win-amd64\egg\deap\tests
    copying build\lib\deap\tests\test_creator.py -> build\bdist.win-amd64\egg\deap\tests
    copying build\lib\deap\tests\test_logbook.py -> build\bdist.win-amd64\egg\deap\tests
    copying build\lib\deap\tests\test_pickle.py -> build\bdist.win-amd64\egg\deap\tests
    copying build\lib\deap\tests\__init__.py -> build\bdist.win-amd64\egg\deap\tests
    creating build\bdist.win-amd64\egg\deap\tools
    copying build\lib\deap\tools\constraint.py -> build\bdist.win-amd64\egg\deap\tools
    copying build\lib\deap\tools\crossover.py -> build\bdist.win-amd64\egg\deap\tools
    copying build\lib\deap\tools\emo.py -> build\bdist.win-amd64\egg\deap\tools
    copying build\lib\deap\tools\indicator.py -> build\bdist.win-amd64\egg\deap\tools
    copying build\lib\deap\tools\init.py -> build\bdist.win-amd64\egg\deap\tools
    copying build\lib\deap\tools\migration.py -> build\bdist.win-amd64\egg\deap\tools
    copying build\lib\deap\tools\mutation.py -> build\bdist.win-amd64\egg\deap\tools
    copying build\lib\deap\tools\selection.py -> build\bdist.win-amd64\egg\deap\tools
    copying build\lib\deap\tools\support.py -> build\bdist.win-amd64\egg\deap\tools
    creating build\bdist.win-amd64\egg\deap\tools\_hypervolume
    copying build\lib\deap\tools\_hypervolume\pyhv.py -> build\bdist.win-amd64\egg\deap\tools\_hypervolume
    copying build\lib\deap\tools\_hypervolume\__init__.py -> build\bdist.win-amd64\egg\deap\tools\_hypervolume
    copying build\lib\deap\tools\__init__.py -> build\bdist.win-amd64\egg\deap\tools
    copying build\lib\deap\__init__.py -> build\bdist.win-amd64\egg\deap
    byte-compiling build\bdist.win-amd64\egg\deap\algorithms.py to algorithms.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\base.py to base.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\benchmarks\binary.py to binary.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\benchmarks\gp.py to gp.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\benchmarks\movingpeaks.py to movingpeaks.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\benchmarks\tools.py to tools.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\benchmarks\__init__.py to __init__.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\cma.py to cma.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\creator.py to creator.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\gp.py to gp.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\tests\test_algorithms.py to test_algorithms.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\tests\test_benchmarks.py to test_benchmarks.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\tests\test_creator.py to test_creator.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\tests\test_logbook.py to test_logbook.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\tests\test_pickle.py to test_pickle.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\tests\__init__.py to __init__.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\tools\constraint.py to constraint.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\tools\crossover.py to crossover.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\tools\emo.py to emo.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\tools\indicator.py to indicator.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\tools\init.py to init.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\tools\migration.py to migration.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\tools\mutation.py to mutation.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\tools\selection.py to selection.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\tools\support.py to support.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\tools\_hypervolume\pyhv.py to pyhv.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\tools\_hypervolume\__init__.py to __init__.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\tools\__init__.py to __init__.cpython-36.pyc
    byte-compiling build\bdist.win-amd64\egg\deap\__init__.py to __init__.cpython-36.pyc
    creating build\bdist.win-amd64\egg\EGG-INFO
    copying deap.egg-info\PKG-INFO -> build\bdist.win-amd64\egg\EGG-INFO
    copying deap.egg-info\SOURCES.txt -> build\bdist.win-amd64\egg\EGG-INFO
    copying deap.egg-info\dependency_links.txt -> build\bdist.win-amd64\egg\EGG-INFO
    copying deap.egg-info\pbr.json -> build\bdist.win-amd64\egg\EGG-INFO
    copying deap.egg-info\top_level.txt -> build\bdist.win-amd64\egg\EGG-INFO
    zip_safe flag not set; analyzing archive contents...
    creating dist
    creating 'dist\deap-1.2.2-py3.6.egg' and adding 'build\bdist.win-amd64\egg' to it
    removing 'build\bdist.win-amd64\egg' (and everything under it)
    Processing deap-1.2.2-py3.6.egg
    Removing c:\program files\python36\lib\site-packages\deap-1.2.2-py3.6.egg
    Copying deap-1.2.2-py3.6.egg to c:\program files\python36\lib\site-packages
    deap 1.2.2 is already the active version in easy-install.pth
    
    Installed c:\program files\python36\lib\site-packages\deap-1.2.2-py3.6.egg
    Processing dependencies for deap==1.2.2
    Finished processing dependencies for deap==1.2.2
    ***************************************************************************
    WARNING: The C extensions could not be compiled, speedups won't be available.
    Plain-Python installation succeeded.
    ***************************************************************************
    
    opened by PaoloBenigni 13
  • How can I define my fitness functions.

    How can I define my fitness functions.

    Hi, I am running PSO and EDA for my fitness functions. So far, the default code is working on my computer, but I cannot find where I can define my own fitness functions. How does the default code create the objective functions? Furthermore, I need to control the range of the variables so that I can run it quickly for my situation. Thanks!
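
    For reference, DEAP never imposes an objective: you write it yourself and register it under the name "evaluate", and the variable range comes from the attribute generator. A minimal sketch with a hypothetical objective:

    import random

    from deap import base, creator, tools

    # Hypothetical objective: return a tuple matching the fitness weights.
    def my_objective(individual):
        return sum(x ** 2 for x in individual),

    creator.create("FitnessMin", base.Fitness, weights=(-1.0,))
    creator.create("Individual", list, fitness=creator.FitnessMin)

    toolbox = base.Toolbox()
    LOW, UP = -10.0, 10.0  # the range of every variable
    toolbox.register("attr_float", random.uniform, LOW, UP)
    toolbox.register("individual", tools.initRepeat, creator.Individual,
                     toolbox.attr_float, n=5)
    toolbox.register("evaluate", my_objective)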

    opened by langongjin 11
  • TypeError: Both weights and assigned values must be a sequence of numbers when assigning to values of <class 'deap.creator.FitnessMax'>

    TypeError: Both weights and assigned values must be a sequence of numbers when assigning to values of

    Traceback (most recent call last):
      File "oneMax.py", line 58, in <module>
        pop, log, hof = main()
      File "oneMax.py", line 53, in main
        pop, logbook = algorithms.eaSimple(pop, toolbox, cxpb=0.5, mutpb=0.2, ngen=10, stats=stats, halloffame=hof, verbose=True)
      File "/usr/local/lib/python3.6/site-packages/deap/algorithms.py", line 152, in eaSimple
        ind.fitness.values = fit
      File "/usr/local/lib/python3.6/site-packages/deap/base.py", line 193, in setValues
        self.weights)).with_traceback(traceback)
      File "/usr/local/lib/python3.6/site-packages/deap/base.py", line 185, in setValues
        self.wvalues = tuple(map(mul, values, self.weights))
    TypeError: Both weights and assigned values must be a sequence of numbers when assigning to values of <class 'deap.creator.FitnessMax'>. Currently assigning value(s) 4 of <class 'int'> to a fitness with weights (1.0,).
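
    For reference, this error typically means the evaluation function returned a bare number instead of a sequence; even a single-objective fitness expects a tuple:

    def evalOneMax(individual):
        return sum(individual),  # note the trailing comma: a 1-element tuple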

    opened by lemo2012 11
  • Enable Support for python >=3.10

    Enable Support for python >=3.10

    Hi, I created this issue because it seems that it is not possible to install deap 1.3.X on Python >= 3.10; the latest version that the setup can resolve is 1.0. It would be great if we could use the newest Python versions.


    opened by rodrigo-arenas 10
  • Can't run DEAP program with SCOOP

    Can't run DEAP program with SCOOP

    I'm trying to run a Python script that uses DEAP and SCOOP to process evaluations in parallel; however, I get the following traceback.

    [2015-02-12 15:24:25,021] brokerzmq (127.0.0.1:57627) DEBUG   SHUTDOWN command received.
    Traceback (most recent call last):
      File "/Applications/Canopy.app/appdata/canopy-1.1.0.1371.macosx-x86_64/Canopy.app/Contents/lib/python2.7/runpy.py", line 162, in _run_module_as_main
        "__main__", fname, loader, pkg_name)
      File "/Applications/Canopy.app/appdata/canopy-1.1.0.1371.macosx-x86_64/Canopy.app/Contents/lib/python2.7/runpy.py", line 72, in _run_code
        exec code in run_globals
      File "/Users/gvdb/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/scoop/bootstrap/__main__.py", line 302, in <module>
        b.main()
      File "/Users/gvdb/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/scoop/bootstrap/__main__.py", line 92, in main
        self.run()
      File "/Users/gvdb/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/scoop/bootstrap/__main__.py", line 290, in run
        futures_startup()
      File "/Users/gvdb/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/scoop/bootstrap/__main__.py", line 271, in futures_startup
        run_name="__main__"
      File "/Users/gvdb/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/scoop/futures.py", line 64, in _startup
        result = _controller.switch(rootFuture, *args, **kargs)
      File "/Users/gvdb/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/scoop/_control.py", line 253, in runController
        raise future.exceptionValue
    IndexError: string index out of range
    

    I've tried to run it through debugging, but I can never reach the code in _control.py because the program terminates before then with the following traceback.

    [2015-02-12 15:51:56,111] scoopzmq  (127.0.0.1:51799) ERROR   An instance could not find its base reference on a worker. Ensure that your objects have their definition available in the root scope of your program.
    'module' object has no attribute 'Individual'
    [2015-02-12 15:51:56,111] brokerzmq (127.0.0.1:51893) DEBUG   SHUTDOWN command received.
    [2015-02-12 15:51:56,113] scoopzmq  (127.0.0.1:52298) ERROR   A worker exited unexpectedly. Read the worker logs for more information. SCOOP pool will now shutdown.
    Traceback (most recent call last):
      File "/Applications/Canopy.app/appdata/canopy-1.1.0.1371.macosx-x86_64/Canopy.app/Contents/lib/python2.7/runpy.py", line 162, in _run_module_as_main
        "__main__", fname, loader, pkg_name)
      File "/Applications/Canopy.app/appdata/canopy-1.1.0.1371.macosx-x86_64/Canopy.app/Contents/lib/python2.7/runpy.py", line 72, in _run_code
        exec code in run_globals
      File "/Users/gvdb/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/scoop/bootstrap/__main__.py", line 302, in <module>
        b.main()
      File "/Users/gvdb/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/scoop/bootstrap/__main__.py", line 92, in main
        self.run()
      File "/Users/gvdb/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/scoop/bootstrap/__main__.py", line 290, in run
        futures_startup()
      File "/Users/gvdb/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/scoop/bootstrap/__main__.py", line 271, in futures_startup
        run_name="__main__"
      File "/Users/gvdb/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/scoop/futures.py", line 64, in _startup
        result = _controller.switch(rootFuture, *args, **kargs)
      File "/Users/gvdb/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/scoop/_control.py", line 207, in runController
        future = execQueue.pop()
      File "/Users/gvdb/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/scoop/_types.py", line 320, in pop
        self.updateQueue()
      File "/Users/gvdb/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/scoop/_types.py", line 343, in updateQueue
        for future in self.socket.recvFuture():
      File "/Users/gvdb/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/scoop/_comm/scoopzmq.py", line 279, in recvFuture
        received = self._recv()
      File "/Users/gvdb/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/scoop/_comm/scoopzmq.py", line 197, in _recv
        raise ReferenceBroken(e)
    scoop._comm.scoopexceptions.ReferenceBroken: 'module' object has no attribute 'Individual'
    

    I understand that this may not necessarily be an issue with DEAP, so I will also post this query to SCOOP developers for help. Any help or advice you can provide would be appreciated.
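
    For reference, the second traceback usually appears when the classes built by creator.create are not defined at the root scope of the script, so SCOOP workers cannot rebuild the pickled individuals. A minimal layout sketch, with the registrations abbreviated:

    from deap import base, creator, tools

    # creator.create calls belong at module scope so that every SCOOP worker
    # re-creates the classes when it imports the script.
    creator.create("FitnessMax", base.Fitness, weights=(1.0,))
    creator.create("Individual", list, fitness=creator.FitnessMax)

    toolbox = base.Toolbox()
    # ... register attributes, individuals, population and operators here ...

    if __name__ == "__main__":
        from scoop import futures
        toolbox.register("map", futures.map)
        # ... run the evolutionary loop here ...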

    invalid 
    opened by gvdb 10
  • pip install issue - README.md not found

    pip install issue - README.md not found

    I am getting this issue when trying to install the package in VS2013 (PTVS), and also from the command line. I am using Python 3.4.

    Installing 'deap'
    Collecting deap
      Using cached deap-1.0.2.post1.tar.gz
        warning: pypandoc module not found, could not convert Markdown to RST
        Traceback (most recent call last):
          File "<string>", line 20, in <module>
          File "C:\DOCUME~1\USER\LOCALS~1\Temp\pip-build-a1bo2jcu\deap\setup.py", line 22, in <module>
            long_description=read_md('README.md'),
          File "C:\DOCUME~1\USER\LOCALS~1\Temp\pip-build-a1bo2jcu\deap\setup.py", line 13, in <lambda>
            read_md = lambda f: open(f, 'r').read()
        FileNotFoundError: [Errno 2] No such file or directory: 'README.md'
        Complete output from command python setup.py egg_info:
        warning: pypandoc module not found, could not convert Markdown to RST
    
        Traceback (most recent call last):
    
          File "<string>", line 20, in <module>
    
          File "C:\DOCUME~1\USER\LOCALS~1\Temp\pip-build-a1bo2jcu\deap\setup.py", line 22, in <module>
    
            long_description=read_md('README.md'),
    
          File "C:\DOCUME~1\USER\LOCALS~1\Temp\pip-build-a1bo2jcu\deap\setup.py", line 13, in <lambda>
    
            read_md = lambda f: open(f, 'r').read()
    
        FileNotFoundError: [Errno 2] No such file or directory: 'README.md'
    
        ----------------------------------------
        Command "python setup.py egg_info" failed with error code 1 in C:\DOCUME~1\USER\LOCALS~1\Temp\pip-build-a1bo2jcu\deap
    'deap' failed to install. Exit code: 1
    
    opened by pntt 9
  • Support for subclass type with STGP

    Support for subclass type with STGP

    From [email protected] on January 18, 2013 10:17:47

    Below is the patch text generated by Mercurial. I am hoping this was the correct avenue to submit a change.

    # HG changeset patch
    # User Jason Zutty <[email protected]>
    # Date 1358521911 18000
    # Branch dev
    # Node ID 72e303c713bc7dd83b1286a7109e5881b5041b97
    # Parent  2bb81fcc9b7bf352e60d31814f7910152641f1d5
    Added a method polymorph to class PrimitiveSetTyped.  Polymorph allows the terminals from one type to be used wherever terminals of another type are permitted.
    
    diff -r 2bb81fcc9b7b -r 72e303c713bc deap/gp.py
    --- a/deap/gp.py    Mon Jan 14 11:08:22 2013 -0500
    +++ b/deap/gp.py    Fri Jan 18 10:11:51 2013 -0500
    @@ -304,6 +304,12 @@
             """
             return self.terms_count / float(self.terms_count + self.prims_count)
    
    +    def polymorph(self,supertype,subtype):
    +   """Allows the use of all terminals of a subtype,
    +   wherever the use of the supertype is permitted
    +   """
    +   self.terminals[supertype] = self.terminals[supertype] + self.terminals[subtype]
    +
     class PrimitiveSet(PrimitiveSetTyped):
         """Class same as :class:`~deap.gp.PrimitiveSetTyped`, except there is no 
         definition of type.
    

    Original issue: http://code.google.com/p/deap/issues/detail?id=12

    enhancement gp 
    opened by cmd-ntrf 9
  • Pool objects cannot be passed between processes or pickled

    Pool objects cannot be passed between processes or pickled

    I followed the multiprocessing approach described here: https://deap.readthedocs.io/en/master/tutorials/basic/part4.html

    import multiprocessing
    
    pool = multiprocessing.Pool()
    toolbox.register("map", pool.map)
    
    # Toolbox definition
    toolbox.map(toolbox.evaluate, offspring)
    

    The above code gets the error "Pool objects cannot be passed between processes or pickled". But when I use the dummy pool, it works fine.

    from multiprocessing.dummy import Pool
    

    But the latter uses multithreading. Please let me know what is wrong with the multiprocessing approach.
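
    For reference, a layout that is known to work: keep the pool in the main process only and register its bound map; nothing that gets pickled (individuals, the evaluation function) should hold a reference to the pool. A minimal sketch, with the registrations abbreviated:

    import multiprocessing

    from deap import base

    toolbox = base.Toolbox()
    # ... creator.create and the other toolbox.register calls at module level ...

    if __name__ == "__main__":
        pool = multiprocessing.Pool()
        toolbox.register("map", pool.map)
        # ... run the algorithm here, e.g. algorithms.eaSimple(...) ...
        pool.close()
        pool.join()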

    question 
    opened by discover59 8
  • When using multiprocessing, something goes wrong.

    When using multiprocessing, something goes wrong.

    In the evaluation, I tried to output the evaluation value and the individual at that time.

    def func(x,y,z,w):
        return x*y + z + 4*w
    
    train = []
    for i in range(DATASIZE):
        X = []
        for j in range(4):
            X.append(random.uniform(10,100))
        X.append(random.choice([func(X[0],X[1],X[2],X[3]),0,0,0,0]))
        train.append(X)
    
    
    def step_function(x):
      if x>0:
        return 1
      else:
        return 0
    
    def sigmoid(x):
      return 1 / (1+np.exp(-x))
    
    def mapnp(f,x):
        return np.array(list(map(f,x)))
    
    def paramfunc(vec, mat1, mat2, mat3):
        return mapnp(step_function, np.dot(mapnp(sigmoid,np.dot(mapnp(sigmoid, np.dot(vec, mat1)), mat2)), mat3))
    
    def listparamfunc(vec, paramlist):
        mat1 = [paramlist[0:4], paramlist[4:8], paramlist[8:12], paramlist[12:16]]
        mat2 = [paramlist[16:20], paramlist[20:24], paramlist[24:28], paramlist[28:32]]
        mat3 = paramlist[32:]
        return paramfunc(np.array(vec), np.array(mat1), np.array(mat2), np.array(mat3))
    
    def evalind(individual):
        profit = 0
        nptrain = np.array(train)
        result = listparamfunc(nptrain[:,0:4],individual)
        for x in range(len(train)):
            if result[x] == 1:
                profit = profit + train[x][4] - 1000
        ################################################
        print(profit, individual) ############ output ############
        ################################################
        return profit,
    
    
    creator.create("FitnessMax", base.Fitness, weights=(1.0,))
    creator.create("Individual", list, fitness=creator.FitnessMax)
    
    toolbox = base.Toolbox()
    toolbox.register("attr_float", random.uniform, -1, 1)
    toolbox.register("individual", tools.initRepeat, creator.Individual, toolbox.attr_float, 36)
    toolbox.register("population", tools.initRepeat, list, toolbox.individual)
    
    toolbox.register("evaluate", evalind)
    toolbox.register("mate", tools.cxUniform, indpb=MATE_GENOM_RATE)
    toolbox.register("mutate", tools.mutGaussian, indpb=MUTATE_GENOM_RATE, mu=MUTATE_MU, sigma=MUTATE_SIGMA)
    toolbox.register("select", tools.selTournament, tournsize=TOURNAMENT)
    
    
    
    def main(pop, gen=GEN):
        random.seed(time.time())
        if pop == None:
            pop = toolbox.population(n=500)
        hof = tools.HallOfFame(1)
    
        pop, log = algorithms.eaSimple(pop, toolbox, MATE_RATE, MUTATE_RATE, gen,halloffame=hof, verbose=True)
        return pop, log, hof
    
    
    
    
    if __name__ == "__main__":
    
        pool = multiprocessing.Pool(11)
        toolbox.register("map", pool.map)
    
        pop, log, hof = main(pop=None, gen=10)
    
        best = tools.selBest(pop, 1)[0]
        print("best: ", best)
        print("best.fitness.values : ", best.fitness.values)
        print("evalind(best) : ", evalind(best))
    

    This was the individual with the highest fitness:

    531564.2498946949 : [-0.014781777922400874, -0.3787021885147608, ....
    

    However, when I evaluated best (hof) again after completion, a different value came out. (The genome is the same.)

    best:  [-0.014781777922400874, -0.3787021885147608, ....
    best.fitness.values :  (531564.2498946949,)
    
    evalind(best) :  (265238.98228751967,)  #     !=531564.2498946949
    

    Turning off multiprocessing

    Several tests showed that multiprocessing was the cause. When I stopped using multiprocessing, the output was correct.

    #pool = multiprocessing.Pool(11)
    #toolbox.register("map", pool.map)
    
    best.fitness.values :  (511370.0290722571,)
    evalind(best) :  (511370.0290722571,)
    

    why? Thank you.

    opened by dotcom 8
  • Implement elitism

    Implement elitism

    I implemented the possibility of adding elitism to algorithms.eaSimple by using a new toolbox attribute ("replacement"), whose operators are defined in tools/replacement.py.

    opened by leocus 0
  • request for enhancement of operational readiness

    request for enhancement of operational readiness

    I'm intending to include deap as a backend for the godon project.

    I've only found references to scoop for operationally parallelizing the framework.

    Hence the question: are there near- or mid-term plans on your end to extend the set of supported orchestration engines? Examples could be Docker Swarm, Kubernetes or Dask.

    If not, could you give me pointers on what the stumbling blocks would be?

    opened by cherusk 2
  • AttributeError: module 'deap.creator' has no attribute 'Particle'

    AttributeError: module 'deap.creator' has no attribute 'Particle'

    I'm following this example: https://deap.readthedocs.io/en/master/examples/pso_basic.html

    When I run it, it shows the following error:

    AttributeError: module 'deap.creator' has no attribute 'Particle'
    

    When I check the codebase, I only see it defined in examples.
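
    For reference, creator.Particle only exists after the corresponding creator.create call has run, which the PSO example performs near the top of the script before building the toolbox. A minimal sketch, with the attribute list abbreviated (see the example for the exact definition):

    from deap import base, creator

    creator.create("FitnessMax", base.Fitness, weights=(1.0,))
    # The class must be created before anything refers to creator.Particle.
    creator.create("Particle", list, fitness=creator.FitnessMax,
                   speed=list, smin=None, smax=None, best=None)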

    opened by codezart 1
  • Add gp.MultiOutputTree and gp.Modi

    Add gp.MultiOutputTree and gp.Modi

    Implementation with modification of multiple-output genetic programming tree [Zhang, Yun & Zhang, Mengjie. (2005). A multiple-output program tree structure in genetic programming. 6-10.]

    Modification: Modi node accepts only one argument.

    Reference: https://www.researchgate.net/publication/228824043_A_multiple-output_program_tree_structure_in_genetic_programming

    Code authors: @SyrexMinus (Makar Shevchenko*), @AlekseyKorshuk (Aleksey Korshuk*), @ummagumm-a (Sinii Viacheslav*). *Innopolis University, Innopolis, Russia.

    opened by SyrexMinus 1
  • CMA StrategyMultiObjective Slowdown for number of objectives ~ 15

    CMA StrategyMultiObjective Slowdown for number of objectives ~ 15

    Hi there, I noticed that CMA StrategyMultiObjective gets prohibitively slow when the number of objectives reaches around 15. Is there any reason for this? I've tested with a dummy test function with mu=25, lambda=100. Cheers

    opened by romanovzky 2
  • How to generate random strings or characters for individuals

    How to generate random strings or characters for individuals

    I am currently trying to make a simple algorithm that can guess words. I am using the OneMax example as a base, but I can't find a way to generate the individual's attributes.

    What I tried doing is:

    res = string.ascii_letters
    the_toolbox.register("attr_bool", random.choice, res)
    

    But I get the following error:

    TypeError: array() argument 1 must be a unicode character, not getset_descriptor
    

    I don't really understand what's causing the error, and I haven't been able to find anything online.
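
    For reference, one arrangement that is known to work with a plain list-based individual (the word length below is arbitrary):

    import random
    import string

    from deap import base, creator, tools

    creator.create("FitnessMax", base.Fitness, weights=(1.0,))
    # A plain list container sidesteps the unicode typecode constraints of
    # array-based individuals.
    creator.create("Individual", list, fitness=creator.FitnessMax)

    toolbox = base.Toolbox()
    toolbox.register("attr_char", random.choice, string.ascii_letters)
    toolbox.register("individual", tools.initRepeat, creator.Individual,
                     toolbox.attr_char, n=10)

    print(toolbox.individual())  # e.g. ['q', 'W', 'e', ...]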

    opened by r8vnhill 1