
Lightning Kitti

Semantic Segmentation with Pytorch-Lightning

Introduction

This is a simple demo for performing semantic segmentation on the Kitti dataset using Pytorch-Lightning and optimizing the neural network by monitoring and comparing runs with Weights & Biases.

Pytorch-Lightning includes a logger for W&B that can be used simply with:

from pytorch_lightning.loggers import WandbLogger
from pytorch_lightning import Trainer

wandb_logger = WandbLogger()
trainer = Trainer(logger=wandb_logger)

Refer to the documentation for more details.

Hyper-parameters can be defined manually, and every run is automatically logged to Weights & Biases, making it easier to analyze and interpret results and decide how to optimize the architecture.
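For illustration, here is a minimal sketch of this pattern (the SegModel name and the hyper-parameter values are assumptions for the example, not necessarily this project's actual code): hyper-parameters passed to a LightningModule can be recorded with save_hyperparameters() so that the W&B logger picks them up on every run.

import pytorch_lightning as pl
from pytorch_lightning.loggers import WandbLogger

class SegModel(pl.LightningModule):
    def __init__(self, lr=1e-3, batch_size=4):  # illustrative hyper-parameters
        super().__init__()
        # Records the init arguments as hyper-parameters, so the attached
        # logger (here W&B) logs them automatically with the run
        self.save_hyperparameters()

wandb_logger = WandbLogger(project="lightning-kitti")
trainer = pl.Trainer(logger=wandb_logger)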

You can also run sweeps to optimize hyper-parameters automatically.

Note: this example has been adapted from Pytorch-Lightning examples.

Usage

Notebook

  • A quick way to run the training script is to open notebook/tutorial.ipynb and play with it.

Script

  1. Clone this repository.

  2. Download the Kitti dataset.

  3. The dataset is downloaded as a zip file named data_semantics.zip. Unzip it into the lightning-kitti/data_semantic/ folder.

  4. Install dependencies through requirements.txt, Pipfile or manually (Pytorch, Pytorch-Lightning & Wandb)

  5. Log in or sign up for a W&B account, then run wandb login

  6. Run python train.py and add any optional args

  7. Visualize and compare your runs through the generated link.


Sweeps for hyper-parameter tuning

W&B Sweeps can be defined in multiple ways:

  • with a YAML file - best for distributed sweeps and runs from command line
  • with a Python object - best for notebooks

In this project we use a YAML file. You can refer to the W&B documentation for more Pytorch-Lightning examples.
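For reference, the Python-object variant mentioned above can be sketched as follows; the search method, metric name, and parameter ranges are illustrative assumptions and do not reproduce this project's actual sweep.yaml.

import wandb

# Illustrative sweep definition; the project's real search space lives in sweep.yaml
sweep_config = {
    "method": "random",
    "metric": {"name": "val_loss", "goal": "minimize"},
    "parameters": {
        "lr": {"min": 1e-4, "max": 1e-1},
        "batch_size": {"values": [1, 2, 4]},
    },
}

def train():
    # Each agent run reads its assigned hyper-parameters from wandb.config
    with wandb.init() as run:
        lr = run.config.lr
        batch_size = run.config.batch_size
        ...  # build the LightningModule / Trainer with these values

sweep_id = wandb.sweep(sweep_config, project="lightning-kitti")
wandb.agent(sweep_id, function=train, count=10)

The YAML route below is equivalent, but plays better with launching agents from the command line.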

  1. Run wandb sweep sweep.yaml

  2. Run wandb agent SWEEP_ID, where SWEEP_ID is given by the previous command

  3. Visualize and compare the sweep runs


Results

After running the script a few times, you will be able to quickly compare a large number of hyper-parameter combinations.

Feel free to modify the script and define your own hyperparameters.

See the live report →

Comments
  • lightning + wandb + ddp: how to sweep config

    wandb.init(config=hyperparameter_defaults)
    # Config parameters are automatically set by W&B sweep agent
    config = wandb.config

    This works well when running only one instance of the run. I think I would prefer this, though:

    wandb_logger = WandbLogger(name=hparams.name, project=hparams.project_name)
    # The underlying wandb run (and its config) only exists on rank 0
    wandb_config = wandb_logger.experiment.config

    For the multiprocessing case, however, there is no way to get the same wandb config as in the previous snippet, since the wandb experiment only gets created if rank == 0. On rank 1, wandb_logger.experiment is None and therefore has no config.

    Is there another way to use the sweep configs of wandb in combination with pytorch lightning ddp?

    opened by tobiascz 3
  • Perform test-time augmentation?

    Hi,

    Thank you for providing such a detailed example of semantic segmentation with PyTorch Lightning. I was wondering if you have any idea about implementing test-time augmentation in the validation loop, and maybe saving the images as well? Any help is appreciated.

    Thank you, Best, SK

    opened by shreyaskamathkm 0
Owner
Boris Dayma
Sharing AI love ❤