DEEP LEARNING FOR CODERS WITH FASTAI AND PYTORCH
This repository contains the projects I worked on while reading the book Deep Learning for Coders with fastai and PyTorch.
📚
NOTEBOOKS:
- The Introduction notebook is a comprehensive overview, covering projects such as Cat and Dog Classification, Semantic Segmentation, Sentiment Classification, Tabular Classification and a Recommendation System.
- The BearDetector notebook contains all the dependencies for a complete Image Classification project.
- The DigitClassifier notebook contains all the dependencies required for an Image Classification project built from scratch.
- The Image Classification notebook contains all the dependencies for Image Classification, such as getting image data ready for modeling (presizing and data block summary) and fitting the model (learning rate finder, unfreezing, discriminative learning rates, setting the number of epochs and using deeper architectures). It also explains the cross-entropy loss function. A short training sketch follows below.
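As a rough illustration of that training workflow, here is a minimal sketch using the fastai high-level API. The Oxford-IIIT Pets dataset, the capitalisation labeling rule and the exact learning rates are assumptions for the example, not values taken from the notebook.

```python
from fastai.vision.all import *

# Assumed example dataset: Oxford-IIIT Pets (cat filenames start with an uppercase letter)
path = untar_data(URLs.PETS) / "images"

dls = ImageDataLoaders.from_name_func(
    path, get_image_files(path),
    label_func=lambda fname: fname[0].isupper(),      # True for cats in this dataset
    item_tfms=Resize(460),                            # presizing: large crop on the CPU...
    batch_tfms=aug_transforms(size=224),              # ...then augment and resize on the GPU
)

learn = cnn_learner(dls, resnet34, metrics=error_rate)
learn.lr_find()                                       # pick a learning rate from the plot
learn.fit_one_cycle(3, 3e-3)                          # train the randomly initialized head
learn.unfreeze()                                      # then train the whole network
learn.fit_one_cycle(6, lr_max=slice(1e-6, 1e-4))      # discriminative learning rates
```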
5. MULTILABEL CLASSIFICATION AND REGRESSION
- The Multilabel Classification notebook contains all the dependencies required to understand Multilabel Classification, including explanations of how to initialize a DataBlock and DataLoaders (see the first sketch after this list). The Regression notebook contains all the dependencies required to understand Image Regression.
- The Imagenette Classification notebook contains all the dependencies required to train a state-of-the-art computer vision model, whether from scratch or using transfer learning. It contains explanations and implementations of Normalization, Progressive Resizing, Test Time Augmentation, Mixup Augmentation and Label Smoothing (several of these are illustrated after this list).
- The Collaborative Filtering notebook contains all the dependencies required to build a Recommendation System. It presents how gradient descent can learn latent factors and biases for items from a history of ratings, which in turn gives information about the data (see the sketch after this list).
- The Tabular Model notebook contains all the dependencies required for Tabular Modeling. It presents detailed explanations of two approaches to Tabular Modeling, Decision Tree Ensembles and Neural Networks, both sketched after this list.
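For the Multilabel Classification notebook, here is a minimal sketch of initializing a DataBlock and DataLoaders for multi-label targets. The PASCAL 2007 sample and its column names are assumptions used for illustration.

```python
import pandas as pd
from fastai.vision.all import *

# Assumed example dataset: the PASCAL 2007 multilabel sample shipped with fastai
path = untar_data(URLs.PASCAL_2007)
df = pd.read_csv(path / "train.csv")

dblock = DataBlock(
    blocks=(ImageBlock, MultiCategoryBlock),       # targets become one-hot encoded vectors
    splitter=ColSplitter("is_valid"),              # split on the validation-flag column
    get_x=lambda r: path / "train" / r["fname"],   # image path from the row
    get_y=lambda r: r["labels"].split(" "),        # space-separated list of labels
    item_tfms=RandomResizedCrop(128, min_scale=0.35),
)
dls = dblock.dataloaders(df)
dls.show_batch(max_n=9)
```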
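For the Imagenette Classification notebook, a rough sketch of combining normalization, Mixup, label smoothing and test-time augmentation in fastai; the image sizes, architecture and hyperparameters are illustrative assumptions.

```python
from fastai.vision.all import *

path = untar_data(URLs.IMAGENETTE_160)   # assumed small Imagenette variant
dls = ImageDataLoaders.from_folder(
    path, valid="val",
    item_tfms=Resize(160),
    batch_tfms=[*aug_transforms(size=128), Normalize.from_stats(*imagenet_stats)],
)

learn = Learner(
    dls, xresnet50(n_out=dls.c),
    loss_func=LabelSmoothingCrossEntropy(),   # label smoothing
    cbs=MixUp(),                              # mixup augmentation
    metrics=accuracy,
)
learn.fit_one_cycle(5, 3e-3)

# Test-time augmentation: average predictions over augmented versions of each image
preds, targs = learn.tta()
```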
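For the Collaborative Filtering notebook, a minimal sketch of learning latent factors and biases with fastai's collab_learner; the MovieLens 100k sample and the number of factors are assumptions for the example.

```python
import pandas as pd
from fastai.collab import *
from fastai.tabular.all import *

# Assumed example dataset: MovieLens 100k ratings
path = untar_data(URLs.ML_100k)
ratings = pd.read_csv(
    path / "u.data", delimiter="\t", header=None,
    names=["user", "movie", "rating", "timestamp"],
)

dls = CollabDataLoaders.from_df(ratings, item_name="movie", bs=64)

# Each user and each movie gets a vector of latent factors plus a bias term;
# gradient descent fits them so that dot(user, movie) + biases predicts the rating.
learn = collab_learner(dls, n_factors=50, y_range=(0, 5.5))
learn.fit_one_cycle(5, 5e-3)
```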
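For the Tabular Model notebook, a minimal sketch that contrasts the two approaches on the same processed table; the Adult census sample, the column choices and the hyperparameters are assumptions for illustration.

```python
import pandas as pd
from fastai.tabular.all import *
from sklearn.ensemble import RandomForestClassifier

# Assumed example dataset: the Adult census sample shipped with fastai
path = untar_data(URLs.ADULT_SAMPLE)
df = pd.read_csv(path / "adult.csv")

splits = RandomSplitter(valid_pct=0.2)(range_of(df))
to = TabularPandas(
    df, procs=[Categorify, FillMissing, Normalize],
    cat_names=["workclass", "education", "marital-status",
               "occupation", "relationship", "race"],
    cont_names=["age", "fnlwgt", "education-num"],
    y_names="salary", splits=splits,
)

# Approach 1: a decision-tree ensemble (random forest) on the processed columns
rf = RandomForestClassifier(n_estimators=100)
rf.fit(to.train.xs, to.train.y)
print("forest accuracy:", rf.score(to.valid.xs, to.valid.y))

# Approach 2: a neural network with embeddings for the categorical columns
dls = to.dataloaders(bs=64)
learn = tabular_learner(dls, metrics=accuracy)
learn.fit_one_cycle(3)
```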
9. NATURAL LANGUAGE PROCESSING
- The NLP notebook contains all the dependencies required to build a Language Model that can generate text and a Classifier Model that determines whether a review is positive or negative. It presents a state-of-the-art classifier built by fine-tuning a pretrained language model on the corpus of the task and then reusing its encoder for classification (see the sketch after this list).
- The DataMunging notebook contains all the dependencies required to use the mid-level API of fastai in Natural Language Processing and Computer Vision, which provides greater flexibility for applying transformations to data items (a sketch follows below).
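For the NLP notebook, a rough sketch of the fine-tune-then-classify workflow with fastai; the IMDb dataset, epoch counts and learning rates are assumptions for illustration.

```python
from fastai.text.all import *

path = untar_data(URLs.IMDB)   # assumed example dataset: IMDb movie reviews

# 1) Fine-tune a pretrained AWD-LSTM language model on the review corpus
dls_lm = TextDataLoaders.from_folder(path, is_lm=True, valid_pct=0.1)
lm = language_model_learner(dls_lm, AWD_LSTM, drop_mult=0.3, metrics=accuracy)
lm.fine_tune(1, 2e-2)
lm.save_encoder("finetuned_enc")   # keep everything except the final token-prediction head

# 2) Reuse the fine-tuned encoder inside a sentiment classifier
dls_clas = TextDataLoaders.from_folder(path, valid="test", text_vocab=dls_lm.vocab)
clf = text_classifier_learner(dls_clas, AWD_LSTM, drop_mult=0.5, metrics=accuracy)
clf.load_encoder("finetuned_enc")
clf.fine_tune(2, 1e-2)
```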
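For the DataMunging notebook, a minimal sketch of the mid-level API, where each input is run lazily through a list of Transforms; the IMDb folder layout and the transforms chosen are assumptions for the example.

```python
from fastai.text.all import *

# Assumed example dataset: IMDb reviews organised in train/test folders by label
path = untar_data(URLs.IMDB)
files = get_text_files(path, folders=["train", "test"])
splits = GrandparentSplitter(valid_name="test")(files)

# Mid-level API: one Transform pipeline for the inputs, one for the targets
x_tfms = [Tokenizer.from_folder(path), Numericalize]   # raw text -> tokens -> ids
y_tfms = [parent_label, Categorize]                    # folder name -> class index
dsets = Datasets(files, [x_tfms, y_tfms], splits=splits)

# Items are only padded into rectangular batches when the DataLoaders are built
dls = dsets.dataloaders(dl_type=SortedDL, before_batch=pad_input, bs=64)
```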
11. LANGUAGE MODEL FROM SCRATCH
- The LanguageModel notebook contains all the dependencies behind the AWD-LSTM architecture for Text Classification. It presents the implementation of a Language Model using a simple Linear Model, a Recurrent Neural Network, Long Short-Term Memory, Dropout Regularization and Activation Regularization. One of these stages is sketched below.
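A minimal sketch of one of those stages, an LSTM-based language model with dropout in plain PyTorch; the vocabulary size, hidden size and layer count are illustrative assumptions, and the notebook's version adds activation and weight regularization on top.

```python
import torch
import torch.nn as nn

class LMModel(nn.Module):
    """Predict the next token at every position of the input sequence."""
    def __init__(self, vocab_sz=1000, emb_sz=64, n_hidden=128, n_layers=2, p=0.4):
        super().__init__()
        self.emb = nn.Embedding(vocab_sz, emb_sz)
        self.lstm = nn.LSTM(emb_sz, n_hidden, n_layers, batch_first=True)
        self.drop = nn.Dropout(p)          # dropout regularization on the LSTM outputs
        self.head = nn.Linear(n_hidden, vocab_sz)
        self.h = None                      # hidden state carried across batches

    def forward(self, x):
        out, h = self.lstm(self.emb(x), self.h)
        self.h = tuple(t.detach() for t in h)   # truncated backpropagation through time
        return self.head(self.drop(out))

    def reset(self):
        self.h = None

# Tiny smoke test with random token ids (hypothetical shapes)
model = LMModel()
x = torch.randint(0, 1000, (8, 16))        # batch of 8 sequences, 16 tokens each
logits = model(x)                          # -> shape (8, 16, 1000)
```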
12. CONVOLUTIONAL NEURAL NETWORK
- The CNN notebook contains all the dependencies required to understand Convolutional Neural Networks. Convolutions are just a type of matrix multiplication with two constraints on the weight matrix: some elements are always zero and some elements are tied, i.e. forced to always have the same value (a worked example follows after this list).
- The ResNets notebook contains all the dependencies required to understand the implementation of skip connections, which allow deeper models to be trained; a minimal residual block is sketched after this list. ResNet is also the architecture of the pretrained models used for Transfer Learning.
- The Architecture Details notebook contains all the dependencies required to create complete state-of-the-art computer vision models. It presents some aspects of natural language processing as well.
- The Training notebook contains all the dependencies required to create a training loop and explores variants of Stochastic Gradient Descent; one variant is sketched after this list.
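To make the CNN notebook's point about convolution as a constrained matrix multiplication concrete, here is a small hand-rolled example; the 3x3 edge-detection kernel and the 6x6 input are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

# A single 3x3 kernel applied to a single-channel 6x6 image
img = torch.arange(36, dtype=torch.float32).reshape(1, 1, 6, 6)
kernel = torch.tensor([[-1., 0., 1.],
                       [-1., 0., 1.],
                       [-1., 0., 1.]]).reshape(1, 1, 3, 3)   # crude vertical-edge detector

out = F.conv2d(img, kernel)                        # shape (1, 1, 4, 4)

# The same result as a matrix multiplication: unfold extracts every 3x3 patch as a
# column, so the same 9 kernel weights are reused (tied) for every output location,
# and everything outside each patch contributes zero.
patches = F.unfold(img, kernel_size=3)             # (1, 9, 16)
out_mm = (kernel.reshape(1, 1, 9) @ patches).reshape(1, 1, 4, 4)

assert torch.allclose(out, out_mm)
```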
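For the ResNets notebook, a minimal sketch of a residual block with an identity skip connection in plain PyTorch; the channel count and the identity-only shortcut (no downsampling) are simplifying assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResBlock(nn.Module):
    """y = relu(conv_path(x) + x): the skip connection lets gradients flow past the convs."""
    def __init__(self, ch):
        super().__init__()
        self.conv1 = nn.Conv2d(ch, ch, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(ch)
        self.conv2 = nn.Conv2d(ch, ch, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(ch)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + x)       # identity shortcut: same shape in, same shape out

x = torch.randn(2, 64, 32, 32)       # hypothetical feature map
y = ResBlock(64)(x)                  # output has the same shape as the input
```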
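For the Training notebook, a rough sketch of one SGD variant, SGD with momentum, written as a bare training loop; the model, data and hyperparameters are placeholders.

```python
import torch
import torch.nn as nn

# Hypothetical model and data, just to make the loop runnable
model = nn.Linear(10, 1)
xb, yb = torch.randn(64, 10), torch.randn(64, 1)
loss_func = nn.MSELoss()

lr, beta = 0.01, 0.9
velocity = [torch.zeros_like(p) for p in model.parameters()]

for epoch in range(5):
    loss = loss_func(model(xb), yb)
    loss.backward()
    with torch.no_grad():
        for p, v in zip(model.parameters(), velocity):
            v.mul_(beta).add_(p.grad)   # running (momentum) average of the gradients
            p -= lr * v                 # step with the smoothed gradient
            p.grad.zero_()
```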
16. NEURAL NETWORK FOUNDATIONS
- The Neural Foundations notebook contains all the dependencies required to understand the foundations of deep learning, beginning with matrix multiplication and moving on to implementing the forward and backward passes of a neural net from scratch, as sketched below.
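A minimal sketch of a hand-written forward and backward pass for a one-hidden-layer network with a ReLU and mean-squared-error loss; the layer sizes and random data are placeholders.

```python
import torch

# Hypothetical data: 100 samples, 10 features, 1 regression target
x, y = torch.randn(100, 10), torch.randn(100, 1)
w1, b1 = torch.randn(10, 50), torch.zeros(50)
w2, b2 = torch.randn(50, 1), torch.zeros(1)

# Forward pass: linear -> relu -> linear -> MSE loss
l1 = x @ w1 + b1
a1 = l1.clamp_min(0.)
out = a1 @ w2 + b2
diff = out - y
loss = (diff ** 2).mean()

# Backward pass: apply the chain rule layer by layer, from the loss back to w1
out_g = 2. * diff / diff.numel()          # d(loss)/d(out)
w2_g, b2_g = a1.t() @ out_g, out_g.sum(0)
a1_g = out_g @ w2.t()
l1_g = a1_g * (l1 > 0).float()            # relu passes gradient only where l1 > 0
w1_g, b1_g = x.t() @ l1_g, l1_g.sum(0)
```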
17. CNN INTERPRETATION WITH CAM
- The CNN Interpretation notebook presents the implementation of Class Activation Maps in model interpretation. Class activation maps give insights into why a model predicted a certain result by showing the areas of images that were most responsible for a given prediction.
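A rough sketch of the class activation map idea using a plain PyTorch forward hook; the torchvision ResNet, the random input tensor and the choice of layer are illustrative assumptions rather than the notebook's exact code.

```python
import torch
from torchvision.models import resnet34, ResNet34_Weights

model = resnet34(weights=ResNet34_Weights.DEFAULT).eval()   # downloads pretrained weights
acts = {}

# Store the activations of the last convolutional stage during the forward pass
hook = model.layer4.register_forward_hook(lambda m, i, o: acts.update(feat=o))

x = torch.randn(1, 3, 224, 224)            # placeholder image tensor
with torch.no_grad():
    preds = model(x)
hook.remove()

cls = preds.argmax(dim=1).item()
# CAM: weight each 7x7 feature map by the final-layer weight of the predicted class,
# showing which spatial locations pushed the prediction towards that class.
weights = model.fc.weight[cls]                              # shape (512,)
cam = (weights[:, None, None] * acts["feat"][0]).sum(0)     # 7x7 heatmap
```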
18. FASTAI LEARNER FROM SCRATCH
- The Fastai Learner notebook contains all the dependencies required to understand the key concepts behind fastai's Learner by building one from scratch; a minimal sketch follows below.
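A minimal sketch of the core idea of a Learner, bundling model, data, loss function and optimizer behind a single fit method; this simplified stand-in is for illustration only, not the notebook's full implementation.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

class MiniLearner:
    """Bundle model, dataloader, loss function and optimizer behind one fit() method."""
    def __init__(self, model, train_dl, loss_func, lr=0.03):
        self.model, self.train_dl, self.loss_func = model, train_dl, loss_func
        self.opt = torch.optim.SGD(model.parameters(), lr=lr)

    def fit(self, n_epochs):
        for epoch in range(n_epochs):
            self.model.train()
            for xb, yb in self.train_dl:
                loss = self.loss_func(self.model(xb), yb)
                loss.backward()
                self.opt.step()
                self.opt.zero_grad()
            print(f"epoch {epoch}: loss {loss.item():.4f}")

# Hypothetical usage with random data
ds = TensorDataset(torch.randn(256, 10), torch.randn(256, 1))
learner = MiniLearner(nn.Linear(10, 1), DataLoader(ds, batch_size=32), nn.MSELoss())
learner.fit(3)
```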