Deep Learning (with PyTorch)

Overview


This notebook repository now has a companion website, where all the course material can be found in video and textual format.

🇬🇧   🇨🇳   🇰🇷   🇪🇸   🇮🇹   🇹🇷   🇯🇵   🇸🇦   🇫🇷   🇮🇷   🇷🇺   🇻🇳   🇷🇸   🇵🇹   🇭🇺

Getting started

To be able to follow the exercises, you are going to need a laptop with Miniconda (a minimal version of Anaconda) and several Python packages installed. The following instructions work as-is for Mac and Ubuntu Linux users; Windows users need to install and work in the Git BASH terminal.

Download and install Miniconda

Please go to the Anaconda website. Download and install the latest Miniconda version for Python 3.7 for your operating system.

wget <http:// link to miniconda>
sh <miniconda*.sh>
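For example, on a 64-bit Linux machine this typically looks like the following (the exact installer URL is an assumption here; check the Miniconda download page for the current one):

wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
sh Miniconda3-latest-Linux-x86_64.sh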

Check out the git repository with the exercises

Once Miniconda is ready, check out the course repository and proceed with setting up the environment:

git clone https://github.com/Atcold/pytorch-Deep-Learning

Create isolated Miniconda environment

Change directory (cd) into the course folder, then type:

# cd pytorch-Deep-Learning
conda env create -f environment.yml
source activate pDL

Start Jupyter Notebook or JupyterLab

Start from terminal as usual:

jupyter lab

Or, for the classic interface:

jupyter notebook

Notebooks visualisation

Jupyter Notebooks are used throughout these lectures for interactive data exploration and visualisation.

We use dark styles for both GitHub and Jupyter Notebook. You should try to do the same, or they will look ugly. JupyterLab has a built-in selectable dark theme, so you only need to install something if you want to use the classic notebook interface. To see the content appropriately in the classic interface, install the following:
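For example, one common option is the jupyterthemes package with one of its dark themes (the specific package and theme here are an assumption, not necessarily the exact styling used for the course material):

# install the jupyterthemes package
pip install jupyterthemes

# apply a dark theme, e.g. chesterish
jt -t chesterish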

Comments
  • Chapter 5-2 docs

    Optimization techniques II

    We discuss adaptive methods for SGD such as RMSprop and ADAM. We also talk about normalization layers and their effects on the neural network training process. Finally, we discuss a real-world example of neural nets being used in industry to make MRI scans faster and more efficient.
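    As a rough illustration of the pieces mentioned above, here is a minimal PyTorch sketch (layer sizes and hyper-parameters are made up, not taken from the chapter):

    import torch
    import torch.nn as nn

    # a small classifier with a normalization layer
    model = nn.Sequential(
        nn.Linear(784, 256),
        nn.BatchNorm1d(256),   # normalization layer
        nn.ReLU(),
        nn.Linear(256, 10),
    )

    # adaptive optimizers discussed in the chapter (pick one)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    # optimizer = torch.optim.RMSprop(model.parameters(), lr=1e-3, alpha=0.99)

    # one illustrative training step on random data
    x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
    loss = nn.CrossEntropyLoss()(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()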

    Please let me know if any changes need to be made before merging.

    opened by guidopetri 16
  • Updates to current packages

    This:

    • Moves PyTorch from 0.4 to 1.1 (one tiny code change)
    • Moves Python from 3.6 to 3.7 (no changes to code, just env)
    • Moves 1-2 requirements out of notebooks and into environment (potential nasty scipy pip install from librosa avoided!)
    • Uses conda kernels so the correct environment kernel is available (all notebooks rerun to pick up proper kernel)
    • Adds JupyterLab (not required, but nice) - the interactive backend in the final notebook is still best in the classic interface. Try out the built-in dark mode!

    All notebooks seem to run (except for the noted minor issue with JupyterLab).
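    For reference, a minimal sketch of what such a conda environment file could look like (the package list and channels below are assumptions, not the actual environment.yml from this PR):

    name: pDL
    channels:
      - pytorch
      - conda-forge
      - defaults
    dependencies:
      - python=3.7         # moved up from 3.6
      - pytorch=1.1        # moved up from 0.4
      - torchvision
      - librosa            # pulled out of the notebooks and into the environment
      - jupyterlab         # optional, but nice
      - nb_conda_kernels   # exposes the pDL kernel to Jupyter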

    opened by henryiii 16
  • [FR & EN] YouTube subtitles

    Hi Alf :wave:,

    As indicated in my last email, I can't afford to wait for Yann's return without incurring a big delay on my side. So here are the subtitle files:

    • For English, this is the addition of the Unicode characters. In practice:
    1. The files not modified during this Unicode review: practicum 1 (didn't need Unicode), practicum 4 (the file contains blocks of 3 lines instead of the 2 used in the others), and lecture 12 (the only file I didn't translate into French)

    2. The finished files (full English review + Unicode): lectures 6 & 9

    3. The roughly clean files (partial English review + Unicode): lectures 1-3, 10, 11 + practicums 1-3, 7-8, 10

    4. The not-yet-clean files (no English review + Unicode): lectures 5-9, 12-15 + practicums 5-6, 9, 11-15

    • For French, these are all the subtitles (except for lecture 12, where I had huge problems understanding Mike Lewis's accent, so I preferred not to put anything rather than translate badly).

    I also added a disclaimer for the V2 of the French translation of the website, which should arrive this month. It should be my next and last PR, closing the French translation work :boom:

    Loïck

    opened by lbourdois 13
  • Broken image links in 3.3. Properties of natural signals

    The following image links are broken:

    • [x] Figure 2(a)
    • [x] Figure 2(b)
    • [x] Figure 3(a)
    • [x] Figure 3(b)

    See https://atcold.github.io/pytorch-Deep-Learning/en/week03/03-3/

    I think the images were originally obtained from this presentation: 02 - CNN.pdf

    See pages 10-11


    Also, small suggestions:

    • [x] Change Figure 4 to include R^7 and R^2 as in Slide 20. This would better match the text for Figure 4.

    • [x] Include a figure (4b maybe?) like that on Slide 21 to show what padding is doing

    opened by feedthebeat90 11
  • Portuguese translation

    Hi @Atcold! I would like to know how and where I should commit markdown files in Portuguese? I recall that you discussed something about this with @ebetica.

    opened by ricardobarroslourenco 11
  • [ZH] 13-3 Inline latex broken

    Hi @JonathanSum! Just for your info, there seems to be some broken inline LaTeX in lecture 13-3:

    [screenshot of the broken inline LaTeX, 2020-09-23]

    The rest of the lectures I've checked seem to be fine.

    opened by xcastilla 9
  • Reorganize the website structure

    This PR reorganizes the website structure, so we now have:

    en/
      index.md
  about.md
      week01/
      week02/
      ...
    zh/
      index.md
      about.md
      week01/
      week02/
      ...
    ...
    

    Hopefully it's less messy and easier to work with.

    After this is merged, I will pull the images out into a global directory as well.

    Also fixes some broken links in zh/index.md

    opened by ebetica 9
  • Problem visualizing spanish translation on github.io

    I found a rendering error on the github.io page for the file /docs/es/week02/02-1.md.

    The English version of the file appears before some parts, and the layout of the Spanish parts that follow the English ones gets a bit messed up.

    [screenshots of the rendering issue]

    opened by mt0rm0 8
  • [EN] Fix timers

    A PR that fixes the timings of the sbv files that I couldn't correct in PR #660, so as to avoid conflicts.

    I also took the opportunity to correct the few errors I caught while translating lecture 10.

    I also noticed that the sbv files for the practicums of weeks 14 and 15 were missing.

    opened by lbourdois 8
  • [ZH] translation of 06-2 and 06.md

    I have translated the first 50% of the RNN notes (06-2) into Chinese.

    I passed the deeplearning.ai course, and I have also written a few notebooks to help students with the Coursera TensorFlow time-series seq2seq notebook.

    opened by JonathanSum 8
  • Vanishing gradient notebook

    Poornima and I have compared an LSTM and an RNN and visualized the gradients with respect to the input. We see that the gradients for the RNN are much smaller than those for the LSTM.

    We were able to train on MNIST with a long input sequence using an LSTM, but failed to do so with an RNN.
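    A minimal sketch of that kind of comparison (illustrative only; the module sizes and the single-time-step backward pass are assumptions, not the notebook's actual code):

    import torch
    import torch.nn as nn

    # sequence length, batch, input size, hidden size (made-up values)
    T, B, D, H = 100, 1, 8, 32
    x = torch.randn(T, B, D, requires_grad=True)

    for name, net in [("RNN", nn.RNN(D, H)), ("LSTM", nn.LSTM(D, H))]:
        out, _ = net(x)
        # back-propagate from the last time step and inspect the gradient
        # that reaches the input at the first time step
        out[-1].sum().backward()
        print(name, "input-grad norm at t=0:", x.grad[0].norm().item())
        x.grad = None

    Typically the RNN's gradient at the earliest time step comes out much smaller than the LSTM's, which is the vanishing-gradient effect the notebook visualizes.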

    Hope this is useful. If we need to make any changes, please let us know!

    opened by karanchahal 8
  • Added controller trainer and improved truck class

    Added

    • new truck methods for randomizing the state within constraints
    • new truck methods for checking whether the truck is at the dock or off-screen
    • a training script for optimizing the controller

    Note

    I have not yet successfully trained the controller to convergence. I based the training off of this. On the website, they mention that the controller is hard to train. I have tried training it on the website with no success, so it seems that even their lessons are difficult to train. However, the code for training should be very similar to the code on the website. You may also alter the number of lessons, max time steps, learning rate, etc. to see if the model converges. I have been trying for over a week and have not succeeded yet.

    opened by dafaronbi 1
  • fix chinese version of 12-3

    I found that the Chinese version was basically machine-translated, which caused the LaTeX syntax to be broken. Of course, there are also a lot of unreasonable translations. This PR is mainly about fixing the broken LaTeX. I also did my best to fix some of the translations that made no sense.

    opened by vipcxj 0
  • Russian translation (dictionary)

    I would question some translations in the dictionary for Russian: I graduated this year and we never really translated everything. For example, it is more understandable to keep "one-hot" as-is in Russian rather than say "унитарный код". Basically, I've never heard anyone call it "унитарный код", to be honest...

    So I guess there is a choice between being academically strict or being understood.

    opened by xufana 4
  • Use conda instead of source activate

    I think source activate is a few years old now and isn't supported anymore. https://stackoverflow.com/questions/49600611/python-anaconda-should-i-use-conda-activate-or-source-activate-in-linux
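    For reference, with any reasonably recent conda (4.4 or later), the equivalent commands are:

    conda activate pDL      # instead of: source activate pDL
    conda deactivate        # instead of: source deactivate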

    opened by ebetica 0
  • [JA] (Japanese translation) Fixed clearly mistaken parts of the markdown

    Hello, I'm one of the students who was able to start studying deep learning thanks to all of you. I wanted to contribute at least a little, so I opened this request. The fixes only go up to 8-3 so far; I plan to keep fixing files through to the end.

    I mainly fixed simple markdown mistakes and Japanese errors. There were a few points that caught my attention, which I report below. I have not fixed most of them.

    • In 1-3, Figure 5 is commented out. The comment-out works when viewing the markdown file, but not on the web page. I'd appreciate it if someone who knows the cause could tell me.
    • In 2-3, the figure and caption for Figure 6 (three-layer neural network) don't match: "there are 3 classes (C=3)" should be K=3 according to the figure, and d=100 vs d=1000 also differ. The English text has the same issue.
    • In 3-1, the formula under "higher-dimensional convolutions" uses kl; shouldn't it be k, l? Same in the English text.
    • In 5-1, Figure 5, "vs" is not shown emphasized. The same holds for 8-3 Fig. 1: VAE vs. Classic AE, and for the English text.
    • In 6-1, right after Figure 4, "4) red obstacle" seems wrong, since the obstacle is not red; the English text also says "4) red obstacle", which looks equally off.
    • In 7-1, there is the statement "e is the energy and f is the free energy"; I believe both should be uppercase, E and F. For e, the English text is also lowercase.
    • In 7-2, in the equation right after "maximum likelihood makes the numerator large and the denominator small", I think the denominator should also carry a log. Same in the English text.
    • In 8-1, on the web page, _heads_ is displayed without emphasis. Cause unknown.
    • In 8-3, in the equation just before Fig. 4 ("Plot showing how relative entropy forces the bubbles to have variance = 1"), shouldn't V(z_i) be V(z)_i? In other words, Fig. 4 is the correct one. Same in the English text.

    It's a long message, but I'd be glad if you could read it. Thank you.

    opened by hiragaatsuya 2
Releases(dlsp19)
  • dlsp19(Jan 30, 2020)

    These are the notes for the Spring 2019 Deep Learning course at NYU. This course covers the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition.

    This is the initial draft of the course notes. They are based on a course developed for the African Masters of Machine Intelligence (AMMI). You can access that version here.

  • aims-fl18(Jan 30, 2020)

    The African Masters of Machine Intelligence (AMMI) is Africa's flagship programme in machine intelligence, led by the African Institute for Mathematical Sciences (AIMS). These lessons, developed over several years while I've been teaching at Purdue and NYU, are offered here for AMMI (AIMS).

    Prior to this course delivered for AMMI (AIMS), an earlier version was delivered and video-recorded for the Computational and Data Science for High Energy Physics (CoDaS-HEP) summer school at Princeton University. Please refer to that release here.

  • v1.0.0(Nov 5, 2018)

    Click CoDaS-HEP_2018 to jump to this release.

    These lessons, developed over several years while I've been teaching at Purdue and NYU, are offered here for the Computational and Data Science for High Energy Physics (CoDaS-HEP) summer school at Princeton University. The whole course has been recorded, and the playlist is available here. Check the slides for drawings of better visual quality.

Owner
Alfredo Canziani
Musician, math lover, cook, dancer, 🏳️‍🌈, and assistant professor of Computer Science at New York University