| Name | Description | Category | Link |
| --- | --- | --- | --- |
| Training pix2pix | This notebook shows a simple pipeline for training pix2pix on a small dataset. Most of the code is based on this implementation. | GAN | |
| One Place | This notebook shows how to train, test, and then deploy models in the browser, all from a single notebook. A simple XOR example is used to demonstrate the concept. | Deployment | |
| TPU vs GPU | Colab now allows free training on TPUs. This notebook explains how to enable TPU training and reports benchmarks comparing TPU and GPU performance on the MNIST dataset (a minimal TPU-setup sketch follows the table). | TPU | |
| Keras Custom Data Generator | This notebook shows how to create a custom data generator in Keras (see the sketch below the table). | Data Generation | |
| Eager Execution (1) | TensorFlow traditionally works with static graphs: you first build the graph and then execute it later, which makes debugging a bit complicated. With eager execution you can evaluate operations directly, without creating a session (see the sketch below the table). | Dynamic Graphs | |
| Eager Execution (2) | In this notebook I explain different concepts of eager execution: variables, ops, gradients, custom gradients, callbacks, metrics, and creating models with tf.keras and saving/restoring them. | Dynamic Graphs | |
| Sketcher | Create a simple app that recognizes 100 drawing classes from the QuickDraw dataset. A simple CNN model is trained and then deployed in the browser as a sketch-recognizer app. | Deployment | |
| QuickDraw10 | In this notebook we provide QuickDraw10 as an alternative to MNIST. A script downloads and loads a preprocessed dataset of 10 classes with a training/testing split, and a simple CNN model is implemented for training and testing. | Data Preparation | |
| Autoencoders | An autoencoder consists of two parts: an encoder and a decoder. The encoder downsamples the data into a lower-dimensional representation, usually called the latent space, and the decoder reconstructs the original data from that representation (a minimal sketch follows the table). | Auto-encoder | |
| Weight Transfer | This tutorial explains how to transfer weights from a static-graph model built with TensorFlow to a dynamic model built with Keras. We first train a model in TensorFlow, then create the same model in Keras and transfer the trained weights between the two models (see the sketch below the table). | Weights Save and Load | |
| BigGan (1) | Create some cool GIFs by interpolating in the latent space of the BigGAN model. The model is imported from TensorFlow Hub. | GAN | |
| BigGan (2) | In this notebook I give a basic introduction to BigGANs and show how to interpolate between z-vector values. Moreover, I show the results of multiple experiments I made in the latent space of BigGANs. | GAN | |
| Mask R-CNN | In this notebook a pretrained Mask R-CNN model is used to predict bounding boxes and segmentation masks of objects. I used this notebook to create the dataset for training the pix2pix model. | Segmentation | |
| QuickDraw Strokes | A notebook exploring the drawing data of QuickDraw. I also illustrate how to make a cool animation of the drawing process in Colab. | Data Preparation | |
| U-Net | The U-Net model is a simple fully convolutional neural network used for binary segmentation, i.e. pixel-wise classification of foreground vs. background. In this notebook we use it to segment cats and dogs from arbitrary images. | Segmentation | |
| Localizer | A simple CNN with a regression branch to predict bounding-box parameters. The model is trained on a dataset of dogs and cats with bounding-box annotations around the heads of the pets. | Object Localization | |
| Classification and Localization | We create a simple CNN with two branches for classification and localization of cats and dogs (see the two-branch sketch after the table). | Classification, Localization | |
| Transfer Learning | A notebook about using MobileNet for transfer learning in TensorFlow. The model is very fast and achieves 97% validation accuracy on a binary classification dataset (a minimal sketch follows the table). | Transfer Learning | |
| Hand Detection | In this task we localize the right and left hands of each person appearing in a single frame. The model achieves around 0.85 IoU. | Detection | |
| Face Detection | In this task we use a simple version of SSD for face detection. The model was trained on fewer than 3K images using TensorFlow with eager execution. | Detection | |
| TensorFlow 2.0 | In this notebook we use the brand new TF 2.0 with eager execution enabled by default. We explore tensors, gradients, datasets, and much more. | Platform | |
| SC-FEGAN | In this notebook you can play with SC-FEGAN for face editing directly in the browser. | GAN | |
| Swift for TensorFlow | Swift for TensorFlow is a next-generation platform for machine learning that incorporates differentiable programming. In this notebook I go over its basics and also show how to create a simple NN and CNN. | Platform | |
| GCN | Ever asked yourself how to use convolutional networks on non-Euclidean data, for instance graphs? GCNs are becoming increasingly popular for solving such problems. I used deep GCNs to classify spammers and non-spammers (a sketch of one graph-convolution step follows the table). | Platform | |
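
A minimal sketch of enabling TPU training in Colab, referenced from the *TPU vs GPU* row. It uses the TF 2.x distribution API (`TPUClusterResolver` plus `TPUStrategy`, available from TF 2.3); the notebook itself may use an older path, so treat this as an illustration rather than the notebook's code. The small Keras model is a placeholder.

```python
import tensorflow as tf

# Connect to the Colab TPU and initialise it (TF 2.x API).
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)  # experimental.TPUStrategy before TF 2.3

# Anything built inside the strategy scope is replicated across the TPU cores.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),  # e.g. MNIST images
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
```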
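
For the *Keras Custom Data Generator* row, here is a minimal sketch of the standard pattern: subclass `keras.utils.Sequence` and implement `__len__` and `__getitem__`. The in-memory NumPy arrays and the class name are hypothetical; the notebook's own generator may differ.

```python
import numpy as np
from tensorflow import keras

class NumpyDataGenerator(keras.utils.Sequence):
    """Hypothetical generator that serves shuffled batches from in-memory arrays."""

    def __init__(self, x, y, batch_size=32, shuffle=True):
        self.x, self.y = x, y
        self.batch_size = batch_size
        self.shuffle = shuffle
        self.indices = np.arange(len(x))
        self.on_epoch_end()

    def __len__(self):
        # Number of batches per epoch.
        return int(np.ceil(len(self.x) / self.batch_size))

    def __getitem__(self, idx):
        # Slice out the indices belonging to batch `idx`.
        batch = self.indices[idx * self.batch_size:(idx + 1) * self.batch_size]
        return self.x[batch], self.y[batch]

    def on_epoch_end(self):
        # Reshuffle the sample order between epochs.
        if self.shuffle:
            np.random.shuffle(self.indices)

# Usage: model.fit(NumpyDataGenerator(x_train, y_train), epochs=5)
```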
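
For the two *Eager Execution* rows, a tiny sketch of what "evaluating operations directly" looks like. It is written against TF 2.x, where eager mode is the default (in TF 1.x the notebooks call `tf.enable_eager_execution()` first).

```python
import tensorflow as tf

# Operations run immediately; no graph construction or session is needed.
x = tf.constant([[1.0, 2.0], [3.0, 4.0]])
print(tf.matmul(x, x))          # a concrete tensor you can print right away

# Gradients are recorded on the fly with GradientTape.
w = tf.Variable(3.0)
with tf.GradientTape() as tape:
    loss = w * w
print(tape.gradient(loss, w))   # d(w^2)/dw at w=3 -> 6.0
```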
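
For the *Autoencoders* row, a minimal dense autoencoder sketch on flattened 784-dimensional inputs (e.g. MNIST). The layer sizes and the 32-dimensional latent space are arbitrary choices for illustration, not the notebook's architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers

latent_dim = 32  # size of the latent-space representation (an assumption)

# Encoder: downsample the 784-dim input to the latent space.
inputs = tf.keras.Input(shape=(784,))
encoded = layers.Dense(128, activation="relu")(inputs)
encoded = layers.Dense(latent_dim, activation="relu")(encoded)

# Decoder: reconstruct the original 784-dim input from the latent vector.
decoded = layers.Dense(128, activation="relu")(encoded)
decoded = layers.Dense(784, activation="sigmoid")(decoded)

autoencoder = tf.keras.Model(inputs, decoded)
autoencoder.compile(optimizer="adam", loss="binary_crossentropy")
# autoencoder.fit(x_train, x_train, epochs=10, batch_size=256)
```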
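
For the *Weight Transfer* row, a simplified sketch of the mechanism: weights travel between models as plain NumPy arrays, so the receiving Keras model only needs a matching architecture. The notebook's source model is a TF 1.x static graph (whose variable values would come out of a `sess.run`); here both sides are Keras models purely to keep the example self-contained.

```python
import tensorflow as tf

def make_model():
    # The same architecture has to exist on both sides,
    # otherwise the weight shapes will not match.
    return tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation="relu", input_shape=(4,)),
        tf.keras.layers.Dense(3, activation="softmax"),
    ])

source = make_model()   # stands in for the trained model
target = make_model()   # freshly initialised copy

# get_weights() returns plain NumPy arrays, so the same idea works when the
# source values come out of a TF 1.x session instead of a Keras model.
target.set_weights(source.get_weights())
```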
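
For the *Classification and Localization* row, a sketch of a two-branch CNN in the Keras functional API: one sigmoid output for cat vs. dog and one 4-unit regression output for the bounding box. The layer sizes, input resolution, and losses are assumptions, not the notebook's exact model.

```python
import tensorflow as tf
from tensorflow.keras import layers

inputs = tf.keras.Input(shape=(224, 224, 3))
x = layers.Conv2D(32, 3, activation="relu")(inputs)
x = layers.MaxPooling2D()(x)
x = layers.Conv2D(64, 3, activation="relu")(x)
x = layers.GlobalAveragePooling2D()(x)

# Branch 1: cat vs. dog classification.
class_out = layers.Dense(1, activation="sigmoid", name="class")(x)
# Branch 2: bounding-box regression (x, y, w, h).
box_out = layers.Dense(4, name="box")(x)

model = tf.keras.Model(inputs, [class_out, box_out])
model.compile(optimizer="adam",
              loss={"class": "binary_crossentropy", "box": "mse"})
```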
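
For the *Transfer Learning* row, a sketch of the usual MobileNet recipe: load the ImageNet-pretrained backbone without its top classifier, freeze it, and train a small binary head on top. The input size and head are assumptions.

```python
import tensorflow as tf

# Frozen MobileNet backbone pretrained on ImageNet, without its classifier head.
base = tf.keras.applications.MobileNet(input_shape=(224, 224, 3),
                                       include_top=False,
                                       weights="imagenet")
base.trainable = False

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # binary classifier head
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```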
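
For the *GCN* row, a NumPy sketch of the propagation rule from Kipf & Welling, H' = ReLU(D^-1/2 (A + I) D^-1/2 H W), which is the core of a graph-convolution layer. The toy graph and random feature/weight matrices are made up for illustration.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution step: aggregate neighbour features, then apply ReLU."""
    A_hat = A + np.eye(A.shape[0])              # add self-loops
    D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt    # symmetric normalisation
    return np.maximum(A_norm @ H @ W, 0.0)      # propagate and apply ReLU

# Toy graph: 4 nodes, 2 input features, 3 hidden units (made-up numbers).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)
H = np.random.rand(4, 2)
W = np.random.rand(2, 3)
print(gcn_layer(A, H, W).shape)  # (4, 3)
```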