Sample code from the Neural Networks from Scratch book.

Overview

Neural Networks from Scratch (NNFS) book code

Code from the NNFS book (https://nnfs.io), separated by chapter. The NNFS book is written in the Python programming language. If you're interested in the same content in 40+ other programming languages, check out: https://github.com/sentdex/nnfsix

Comments
  • chapter14 - Problem in Regularization_loss

    Hi, in our final code we calculated the regularization loss and then summed it with data_loss. Now the problem is that, I think, we didn't use the total loss as the final output of the last layer and multiply the partial gradients by it; instead, we just backpropagated the loss as if there were no regularization loss. (In summary, we didn't include the effect of the regularization loss in the forward-pass phase.) So I think we should use loss_activation.backward(loss, y) instead of loss_activation.backward(loss_activation.output, y). (A sketch of the pattern in question appears after this comment list.)

    I would be glad to receive your opinion.

    Best Regards

    opened by Ali-Mohammadi65748 0
  • Outputs not matching on newer versions of numpy.

    Using version 1.21.0 of numpy, the outputs do not match the outputs in the book. I was able to rectify this by finding the version you used in the first video of the companion series on your YouTube channel: 1.18.2. I am not sure whether you would want to add a comment to the readme, update the outputs, or update the nnfs package to work with newer versions. Adding it to the readme would be the least maintenance in the future. (A version-check sketch appears after this list.)

    opened by ThomAbe 1
  • Ch 10 Optimizer SGD parameter wrong in the book

    In the book (pp. 251, 271 & 276), learning_rate begins with a capital letter "L" in the class definition, but it doesn't in this code, and it shouldn't:

    def __init__(self, learning_rate=1., decay=0., momentum=0.):
        self.learning_rate = learning_rate

    opened by pstingley 0
  • Chapter 10 Adagrad text.

    In chapter 10, after the square root example, the text reads:

    Overall, the impact is the learning rates for parameters with smaller gradients are decreased slowly, while the parameters with larger gradients have their learning rates decreased faster
    

    I am confused by this: we are not updating the learning rate anywhere (other than the rate decay). Yes, the weights will be updated faster for parameters with bigger gradients, but much more slowly than they would be if no normalization were used. (The update is sketched after this list.)

    Am I correct, or am I misunderstanding it?

    opened by avs20 0
  • CH03_p61_inputs

    Thanks for the great book, just a minor, minor remark:

    inputs = [[1, 2, 3, 2.5],
              [2., 5., -1., 2],        <----- should be formatted with zeroes
              [-1.5, 2.7, 3.3, -0.8]]

    inputs = [..
              [2.0, 5.0, -1.0, 2.0],
              [..]]

    opened by ronaldgithub 0
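
For the regularization-loss comment above, here is a minimal, self-contained sketch of the mechanism in question, assuming L2 regularization (illustrative names and values, not the book's exact listing). The penalty is added to the reported total loss in the forward pass, while its gradient is added directly onto the layer's weight gradients in the backward pass; the combined softmax/cross-entropy backward() still receives the softmax outputs (predictions), not the scalar loss.

    import numpy as np

    # Illustrative stand-ins, not the book's listing.
    rng = np.random.default_rng(0)
    weights = rng.normal(size=(4, 3))
    weight_regularizer_l2 = 5e-4

    # Forward pass: the L2 penalty is added to the data loss for reporting.
    data_loss = 0.7  # stand-in for the cross-entropy data loss
    regularization_loss = weight_regularizer_l2 * np.sum(weights ** 2)
    loss = data_loss + regularization_loss

    # Backward pass: the penalty's own gradient, 2 * lambda * W, is added
    # straight onto the layer's weight gradients rather than being passed
    # through the loss object's backward call.
    dweights = rng.normal(size=weights.shape)  # stand-in data-loss gradient
    dweights += 2 * weight_regularizer_l2 * weights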
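For the NumPy-version comment, one low-maintenance option is a quick environment check before comparing outputs against the book. The known-good version (1.18.2) comes from the comment above, and the snippet assumes the book's nnfs helper package is installed.

    import numpy as np
    import nnfs

    nnfs.init()  # seeds NumPy's RNG and sets the default float dtype
    # The book's printed outputs were produced on an older NumPy (1.18.2,
    # per the comment above); newer versions such as 1.21.0 can diverge.
    print("numpy version:", np.__version__)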
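For the AdaGrad comment, here is a minimal, self-contained sketch of the update (illustrative values, not the book's Optimizer_Adagrad class). The learning_rate attribute itself is indeed never modified; what the quoted sentence calls a parameter's "learning rate" is the effective per-parameter step, learning_rate / (sqrt(cache) + epsilon), which shrinks as the cache of squared gradients grows, and the cache grows faster for parameters with larger gradients.

    import numpy as np

    learning_rate = 1.0
    epsilon = 1e-7

    weights = np.zeros(2)
    cache = np.zeros(2)  # per-parameter running sum of squared gradients

    # Two parameters: one with small gradients, one with large gradients.
    for gradients in (np.array([0.1, 2.0]), np.array([0.2, 1.5])):
        cache += gradients ** 2
        # learning_rate never changes; this ratio is the effective
        # per-parameter step, and it decays as the cache accumulates.
        effective_step = learning_rate / (np.sqrt(cache) + epsilon)
        weights += -effective_step * gradients
        print("effective step per parameter:", effective_step)
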
Yolox-bytetrack-sample - Python sample of MOT (Multiple Object Tracking) using YOLOX and ByteTrack

KazuhitoTakahashi 12 Nov 9, 2022
Code samples for my book "Neural Networks and Deep Learning"

Michael Nielsen 13.9k Dec 26, 2022
From Perceptron model to Deep Neural Network from scratch in Python.

Aditya Kahol 1 Jan 14, 2022
NuPIC Studio is an all-in-one tool that allows users to create an HTM neural network from scratch

NuPIC Studio is an all-in-one tool that allows users to create an HTM neural network from scratch, train it, collect statistics, and share it among the members of the community. It is not just a visualization tool but an HTM builder, debugger, and laboratory for experiments. It is ideal both for newcomers with little familiarity with the NuPIC code and for experts who want better productivity.

HTM Community 93 Sep 30, 2022
A graph neural network (GNN) model to predict protein-protein interactions (PPI) with no sample features

null 2 Jul 25, 2022
Complex-Valued Neural Networks (CVNN)

Done by @NEGU93 - J. Agustin Barrachina.

youceF 1 Nov 12, 2021
Code implementation from my Medium blog post: Transformers from Scratch in PyTorch

Note: this Transformer code does not include masked attention.

Frank Odom 27 Dec 21, 2022
Jupyter notebooks for the code samples of the book "Deep Learning with Python"

François Chollet 16.2k Dec 30, 2022
Code release for the ICML 2021 paper "PixelTransformer: Sample Conditioned Signal Generation".

Shubham Tulsiani 24 Dec 17, 2022
This repository contains notebook implementations of the following Neural Process variants: Conditional Neural Processes (CNPs), Neural Processes (NPs), Attentive Neural Processes (ANPs).

DeepMind 892 Dec 28, 2022
Code for "Neural Parts: Learning Expressive 3D Shape Abstractions with Invertible Neural Networks", CVPR 2021

Despoina Paschalidou 161 Dec 20, 2022
Machine Learning From Scratch. Bare bones NumPy implementations of machine learning models and algorithms with a focus on accessibility. Aims to cover everything from linear regression to deep learning.

Erik Linder-Norén 21.8k Jan 9, 2023
Scripts of Machine Learning Algorithms from Scratch. Implementations of machine learning models and algorithms using nothing but NumPy with a focus on accessibility. Aims to cover everything from basic to advanced.

Algo-ScriptML: Python implementations of some of the fundamental Machine Learning models and algorithms from scratch.

Algo Phantoms 81 Nov 26, 2022
Minimal deep learning library written from scratch in Python, using NumPy/CuPy.

SmallPebble project status: experimental, unstable.

Sidney Radcliffe 92 Dec 30, 2022
A PyTorch Lightning solution to training OpenAI's CLIP from scratch.

Cade Gordon 396 Dec 30, 2022
RoBERTa Marathi Language model trained from scratch during huggingface 🤗 x flax community week

RoBERTa base model for Marathi (मराठी भाषा), pretrained using a masked language modeling (MLM) objective.

Nipun Sadvilkar 23 Oct 19, 2022
A pytorch implementation of Detectron. Both training from scratch and inferring directly from pretrained Detectron weights are available.

Use this instead: https://github.com/facebookresearch/maskrcnn-benchmark

Roy 2.8k Dec 29, 2022
Keras-like implementation of deep learning architectures from scratch using NumPy.

MANU S PILLAI 5 Oct 10, 2021
This repo is to be freely used by ML devs to check GAN performance without coding from scratch.

Sagnik Roy 13 Jan 26, 2022