Parameter-ensemble-differential-evolution - Shows how to do parameter ensembling using differential evolution.

Overview

Ensembling parameters with differential evolution

This repository shows how to ensemble the parameters of two trained neural networks using differential evolution. The steps are as follows:

  • Train two networks (with identical architectures) on the same dataset (CIFAR-10 here) but from two different random initializations.

  • Ensemble their weights using the following formula:

    w_t = w_o * ema + (1 - ema) * w_p

    w_o and w_p represent the learned parameters of the two trained networks.

  • Randomly initialize a network (same architecture as above) and populate its parameters with w_t computed using the above formula (see the sketch after this list).
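
As a concrete illustration of the interpolation step, here is a minimal sketch. It assumes tf.keras models; the names interpolate_weights, model_a, model_b, and target are illustrative, not the repository's exact code.

def interpolate_weights(model_a, model_b, target, ema):
    # Compute w_t = w_o * ema + (1 - ema) * w_p, layer by layer.
    w_o = model_a.get_weights()  # learned parameters of the first network
    w_p = model_b.get_weights()  # learned parameters of the second network
    w_t = [ema * a + (1.0 - ema) * b for a, b in zip(w_o, w_p)]
    target.set_weights(w_t)  # populate the freshly initialized network
    return target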

ema is usually chosen empirically by the developer. This project instead uses differential evolution to find it; a sketch of the search follows.
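
The search can be carried out with SciPy's differential_evolution. The sketch below is an illustration under stated assumptions, not the repository's exact code: it reuses the hypothetical interpolate_weights helper from above and assumes a held-out validation set (x_val, y_val) with integer labels, maximizing validation accuracy by minimizing its negative.

import numpy as np
from scipy.optimize import differential_evolution

def negative_val_accuracy(coeff, model_a, model_b, target, x_val, y_val):
    # coeff arrives as a length-1 array from the optimizer.
    interpolate_weights(model_a, model_b, target, float(coeff[0]))
    preds = target.predict(x_val, verbose=0).argmax(axis=1)
    return -np.mean(preds == y_val)  # differential_evolution minimizes

result = differential_evolution(
    negative_val_accuracy,
    bounds=[(0.0, 1.0)],  # ema is a mixing coefficient in [0, 1]
    args=(model_a, model_b, target, x_val, y_val),
    maxiter=20,
    seed=42,
)
best_ema = result.x[0]

Since the search space is one-dimensional and bounded, a small maxiter usually suffices; note that each candidate evaluation costs one full pass over the validation set.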

Below are the top-1 accuracies (on the CIFAR-10 test set) of the two individually trained models along with their ensembled variant:

  • Model one: 63.23%
  • Model two: 63.42%
  • Ensembled: 63.35%

With the more conventional average-prediction ensembling (sketched below), I was able to get to 64.92%. This is considerably better than what I got by ensembling the parameters. Nevertheless, the purpose of this project was simply to try out an idea.
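
For comparison, prediction averaging simply averages the two models' softmax outputs before taking the argmax. A minimal sketch, again assuming tf.keras models and integer test labels (names are illustrative):

# Average the two models' class probabilities, then take the argmax.
probs = (model_a.predict(x_test, verbose=0) +
         model_b.predict(x_test, verbose=0)) / 2.0
avg_accuracy = (probs.argmax(axis=1) == y_test).mean()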

Reproducing the results

Ensure the dependencies in requirements.txt are satisfied. Then train two models, making sure your working directory is the root of this project:

$ git clone https://github.com/sayakpaul/parameter-ensemble-differential-evolution
$ cd parameter-ensemble-differential-evolution
$ pip install -qr requirements.txt
$ for i in `seq 1 2`; do python train.py; done

Then follow the ensemble-parameters.ipynb notebook. You can also use the networks I trained; instructions are available inside the notebook.
