Lightning Kitti
Semantic Segmentation with Pytorch-Lightning
Introduction
This is a simple demo of semantic segmentation on the Kitti dataset using Pytorch-Lightning, where the neural network is optimized by monitoring and comparing runs with Weights & Biases.
Pytorch-Lightning includes a logger for W&B that can be called simply with:
```python
from pytorch_lightning.loggers import WandbLogger
from pytorch_lightning import Trainer

wandb_logger = WandbLogger()
trainer = Trainer(logger=wandb_logger)
```
Refer to the documentation for more details.
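Beyond scalar metrics, a run can also log sample predictions for visual inspection. The sketch below is a minimal, hypothetical illustration (the `SegModel` module and its layers are stand-ins, not this repo's actual code) of logging segmentation masks through the logger's underlying `wandb` run:

```python
from torch import nn
import wandb
import pytorch_lightning as pl


class SegModel(pl.LightningModule):
    """Hypothetical minimal module; only the W&B image logging matters here."""

    def __init__(self, n_classes: int = 20):
        super().__init__()
        self.net = nn.Conv2d(3, n_classes, kernel_size=1)  # stand-in for a real segmentation net

    def forward(self, x):
        return self.net(x)

    def validation_step(self, batch, batch_idx):
        img, mask = batch
        pred = self(img).argmax(dim=1)  # predicted class index per pixel
        if batch_idx == 0:  # log one sample per validation epoch
            # WandbLogger exposes the underlying wandb run as `self.logger.experiment`
            self.logger.experiment.log({
                "examples": [wandb.Image(img[0].cpu(), caption="input"),
                             wandb.Image(pred[0].float().cpu(), caption="prediction")],
            })
```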
Hyper-parameters can be defined manually, and every run is automatically logged to Weights & Biases, making it easier to analyze and interpret results and to decide how to optimize the architecture.
You can also run sweeps to optimize hyper-parameters automatically.
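For example (a minimal sketch; the hyper-parameter names and values are illustrative, not this repo's actual ones), hyper-parameters can be attached to the logger explicitly:

```python
from pytorch_lightning.loggers import WandbLogger

wandb_logger = WandbLogger()
# Log the hyper-parameters chosen for this run; they become searchable,
# comparable fields in the W&B runs table.
wandb_logger.log_hyperparams({"lr": 1e-3, "batch_size": 4})
```

Inside a LightningModule, calling `self.save_hyperparameters()` in `__init__` achieves the same thing automatically in recent Pytorch-Lightning versions.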
Note: this example has been adapted from Pytorch-Lightning examples.
Usage
Notebook
- A quick way to run the training script is to open `notebook/tutorial.ipynb` and play with it.
Script
- Clone this repository.
- Download the Kitti dataset.
- The dataset will be downloaded in the form of a zip file named `data_semantics.zip`. Unzip the dataset inside the `lightning-kitti/data_semantic/` folder.
- Install dependencies through `requirements.txt`, `Pipfile` or manually (Pytorch, Pytorch-Lightning & Wandb).
- Log in or sign up for an account -> `wandb login`
- Run `python train.py` and add any optional args (see the sketch after this list).
- Visualize and compare your runs through the generated link.
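The available flags depend on `train.py` itself (if the script uses argparse, `python train.py --help` lists them). As a hedged sketch, optional args for such a script are typically wired up like this; the flag names below are illustrative, not necessarily the script's real ones:

```python
import argparse

def parse_args():
    # Illustrative flags only; the real script may expose different ones.
    parser = argparse.ArgumentParser(description="Train a segmentation model on Kitti")
    parser.add_argument("--lr", type=float, default=1e-3, help="learning rate")
    parser.add_argument("--batch_size", type=int, default=4, help="images per batch")
    parser.add_argument("--max_epochs", type=int, default=30, help="number of training epochs")
    return parser.parse_args()

if __name__ == "__main__":
    args = parse_args()
    print(vars(args))  # in train.py these would be passed to the model and Trainer
```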
Sweeps for hyper-parameter tuning
W&B Sweeps can be defined in multiple ways:
- with a YAML file - best for distributed sweeps and runs from command line
- with a Python object - best for notebooks
In this project we use a YAML file. You can refer to W&B documentation for more Pytorch-Lightning examples.
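As a hedged sketch, a `sweep.yaml` for this kind of project could look like the following (the metric and parameter names are illustrative, not necessarily the ones this repo uses):

```yaml
program: train.py        # script launched by the agent for each run
method: random           # search strategy: grid, random or bayes
metric:
  name: val_loss         # metric logged to W&B that the sweep optimizes
  goal: minimize
parameters:
  lr:
    min: 0.0001
    max: 0.1
  batch_size:
    values: [2, 4, 8]
```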
- Run `wandb sweep sweep.yaml`
- Run `wandb agent <sweep_id>` where `<sweep_id>` is given by the previous command
- Visualize and compare the sweep runs
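For completeness, the Python-object form mentioned above (better suited to notebooks) looks roughly like this; `train_fn` is a hypothetical wrapper around the training loop and the project name is an assumption:

```python
import wandb

# Same search space as the YAML sketch above, expressed as a dict
sweep_config = {
    "method": "random",
    "metric": {"name": "val_loss", "goal": "minimize"},
    "parameters": {
        "lr": {"min": 0.0001, "max": 0.1},
        "batch_size": {"values": [2, 4, 8]},
    },
}

def train_fn():
    # Hypothetical: the agent calls this once per trial; hyper-parameters
    # chosen by the sweep are available in run.config.
    with wandb.init() as run:
        print(run.config)

sweep_id = wandb.sweep(sweep_config, project="lightning-kitti")  # project name assumed
wandb.agent(sweep_id, function=train_fn, count=5)  # launch 5 trials
```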
Results
After running the script a few times, you will quickly be able to compare a large number of hyper-parameter combinations.
Feel free to modify the script and define your own hyperparameters.