Overview

Synapses

A plug-and-play library for neural networks written in Python!

# run in the directory of your project:
pip install synapses-py==7.4.1

Neural Network

Create a neural network

Import Synapses, call NeuralNetwork.init and provide the size of each layer.

from synapses_py import NeuralNetwork, ActivationFunction, DataPreprocessor, Statistics
layers = [4, 6, 5, 3]
neuralNetwork = NeuralNetwork.init(layers)

neuralNetwork has 4 layers. The first layer has 4 input nodes and the last layer has 3 output nodes. There are 2 hidden layers with 6 and 5 neurons respectively.

Get a prediction

inputValues = [1.0, 0.5625, 0.511111, 0.47619]
prediction = \
        NeuralNetwork.prediction(neuralNetwork, inputValues)

prediction should be something like [ 0.8296, 0.6996, 0.4541 ].

Note that the lengths of inputValues and prediction are equal to the sizes of the input and output layers respectively.

Fit network

learningRate = 0.5
expectedOutput = [0.0, 1.0, 0.0]
fitNetwork = \
        NeuralNetwork.fit(
            neuralNetwork,
            learningRate,
            inputValues,
            expectedOutput
        )

fitNetwork is a new neural network trained with a single observation.

To train a neural network, you should fit it with multiple datapoints, as in the sketch below.
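Since NeuralNetwork.fit returns a new network instead of mutating the old one, training over a dataset is a fold. A minimal sketch, assuming an illustrative list of (inputValues, expectedOutput) pairs named dataset and an illustrative epoch count (neither is part of the library):

from functools import reduce

# illustrative labeled dataset: (inputValues, expectedOutput) pairs
dataset = [
    ([1.0, 0.5625, 0.511111, 0.47619], [0.0, 1.0, 0.0]),
    ([0.8, 0.7, 0.4, 0.3], [1.0, 0.0, 0.0])
]

def fitOnce(network, datapoint):
    (inputValues, expectedOutput) = datapoint
    return NeuralNetwork.fit(network, learningRate, inputValues, expectedOutput)

trainedNetwork = neuralNetwork
for _ in range(100):  # illustrative number of epochs
    trainedNetwork = reduce(fitOnce, dataset, trainedNetwork)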

Create a customized neural network

The activation function of the neurons created with NeuralNetwork.init is sigmoid. If you want to customize the activation functions and the weight distribution, call NeuralNetwork.customizedInit.

from random import random

# pick an activation function for each layer, by layer index
def activationF(layerIndex):
    if layerIndex == 0:
        return ActivationFunction.sigmoid
    elif layerIndex == 1:
        return ActivationFunction.identity
    elif layerIndex == 2:
        return ActivationFunction.leakyReLU
    else:
        return ActivationFunction.tanh

# initial weights are drawn uniformly from (-1.0, 1.0]
def weightInitF(_layerIndex):
    return 1.0 - 2.0 * random()

customizedNetwork = \
        NeuralNetwork.customizedInit(
            layers,
            activationF,
            weightInitF
        )

Visualization

Call NeuralNetwork.toSvg to get an SVG drawing of the network.

(figure: network drawing)

The color of each neuron depends on its activation function, while the transparency of the synapses depends on their weight.

svg = NeuralNetwork.toSvg(customizedNetwork)
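Since the drawing is returned as a plain string, it can be written to a file and opened in a browser. A minimal sketch (the file name is illustrative):

with open("network.svg", "w") as f:
    f.write(svg)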

Save and load a neural network

JSON instances are compatible across platforms! We can generate, train, and save a neural network in Python, then load it and make predictions in JavaScript!

toJson

Call NeuralNetwork.toJson on a neural network to get a string representation of it. Use it as you like: save the JSON to the file system or insert it into a database table.

json = NeuralNetwork.toJson(customizedNetwork)

ofJson

loadedNetwork = NeuralNetwork.ofJson(json)

As the name suggests, NeuralNetwork.ofJson turns a json string into a neural network.
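Together, the two functions give a save/load roundtrip through the file system. A minimal sketch, reusing the json string from above (the file name is illustrative):

# save the string representation
with open("network.json", "w") as f:
    f.write(json)

# load it back into a neural network
with open("network.json") as f:
    loadedNetwork = NeuralNetwork.ofJson(f.read())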

Encoding and decoding

One hot encoding turns discrete attributes into a list of 0.0s and 1.0s. Minmax normalization scales continuous attributes to values between 0.0 and 1.0. You can use DataPreprocessor for datapoint encoding and decoding.

The first parameter of DataPreprocessor.init is a list of tuples (attributeName, discreteOrNot).

setosaDatapoint = {
    "petal_length": "1.5",
    "petal_width": "0.1",
    "sepal_length": "4.9",
    "sepal_width": "3.1",
    "species": "setosa"
}

versicolorDatapoint = {
    "petal_length": "3.8",
    "petal_width": "1.1",
    "sepal_length": "5.5",
    "sepal_width": "2.4",
    "species": "versicolor"
}

virginicaDatapoint = {
    "petal_length": "6.0",
    "petal_width": "2.2",
    "sepal_length": "5.0",
    "sepal_width": "1.5",
    "species": "virginica"
}

datasetList = [ setosaDatapoint,
                versicolorDatapoint,
                virginicaDatapoint ]

dataPreprocessor = \
        DataPreprocessor.init(
             [ ("petal_length", False),
               ("petal_width", False),
               ("sepal_length", False),
               ("sepal_width", False),
               ("species", True) ],
             iter(datasetList)
        )

encodedDatapoints = map(lambda x:
        DataPreprocessor.encodedDatapoint(dataPreprocessor, x),
        datasetList
)

Converted to a list, encodedDatapoints equals:

[ [ 0.0     , 0.0     , 0.0     , 1.0     , 0.0, 0.0, 1.0 ],
  [ 0.511111, 0.476190, 1.0     , 0.562500, 0.0, 1.0, 0.0 ],
  [ 1.0     , 1.0     , 0.166667, 0.0     , 1.0, 0.0, 0.0 ] ]
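Decoding reverses the transformation, recovering the original attribute values from an encoded list. A sketch, assuming the library exposes a DataPreprocessor.decodedDatapoint function that takes the preprocessor and the encoded values:

# assumed decoding function; mirrors encodedDatapoint above
decoded = DataPreprocessor.decodedDatapoint(
        dataPreprocessor,
        [0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0]
)
# decoded should equal setosaDatapoint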

Save and load the preprocessor by calling DataPreprocessor.toJson and DataPreprocessor.ofJson.
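A minimal sketch of the preprocessor roundtrip:

dataPreprocessorJson = DataPreprocessor.toJson(dataPreprocessor)
loadedDataPreprocessor = DataPreprocessor.ofJson(dataPreprocessorJson)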

Evaluation

To evaluate a neural network, you can call Statistics.rootMeanSquareError and provide the expected and predicted values.

expectedWithOutputValuesList = \
        [ ( [ 0.0, 0.0, 1.0], [ 0.0, 0.0, 1.0] ),
          ( [ 0.0, 0.0, 1.0], [ 0.0, 1.0, 1.0] ) ]

expectedWithOutputValuesIter = \
        iter(expectedWithOutputValuesList)

rmse = Statistics.rootMeanSquareError(expectedWithOutputValuesIter)
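To evaluate a trained network end to end, pair each expected output with the network's prediction and feed the pairs to Statistics.rootMeanSquareError. A minimal sketch, reusing the illustrative dataset and trainedNetwork from the fitting example above:

expectedWithPredicted = [
    (expectedOutput, NeuralNetwork.prediction(trainedNetwork, inputValues))
    for (inputValues, expectedOutput) in dataset
]
rmse = Statistics.rootMeanSquareError(iter(expectedWithPredicted))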