NeRF Meta-Learning with PyTorch

Overview

nerf-meta is a PyTorch re-implementation of the NeRF experiments from the paper "Learned Initializations for Optimizing Coordinate-Based Neural Representations". Simply by initializing NeRF with meta-learned weights, we can achieve faster convergence and better reconstructions from sparse observations.

Be sure to check out the original resources from the authors:

Environment

  • Python 3.8
  • PyTorch 1.8
  • NumPy, imageio, imageio-ffmpeg

Photo Tourism

Starting from a meta-initialized NeRF, we can interpolate between camera pose, focal length, aspect ratio, and scene appearance. The videos below were generated with a NeRF only 5 layers deep, trained for ~100k iterations.

BrandenburgGate.mp4
TreviFountain.mp4
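The interpolations above amount to blending the per-image camera and appearance parameters between two registered photos. A minimal sketch (the parameter values and the 32-dim appearance embedding are illustrative, and the repository's actual rendering call is omitted):

```python
import numpy as np

def lerp(a, b, t):
    """Linear interpolation between two values/arrays."""
    return (1.0 - t) * a + t * b

# Hypothetical endpoint parameters for two registered photos
pose_a, pose_b = np.eye(4), np.eye(4)          # camera-to-world matrices
focal_a, focal_b = 500.0, 800.0                # focal lengths in pixels
app_a, app_b = np.zeros(32), np.ones(32)       # appearance embeddings

frames = []
for t in np.linspace(0.0, 1.0, num=60):
    pose = lerp(pose_a, pose_b, t)   # naive matrix lerp; slerp is better for rotations
    focal = lerp(focal_a, focal_b, t)
    app = lerp(app_a, app_b, t)
    frames.append((pose, focal, app))  # each tuple would be fed to the NeRF renderer
```

Each interpolated tuple is rendered to one video frame; because the network starts from a meta-learned initialization, all intermediate renders stay plausible.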

Data

Train and Evaluate

  1. Train NeRF on a single landmark scene using Reptile meta-learning:
    python tourism_train.py --config ./configs/tourism/$landmark.json
  2. Test Photo Tourism performance and generate an interpolation video of the landmark:
    python tourism_test.py --config ./configs/tourism/$landmark.json --weight-path $meta_weight.pth
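The Reptile loop driving step 1 alternates inner-loop SGD on one scene with a soft reset of the meta-weights toward the adapted weights. A toy sketch, with an analytic quadratic loss standing in for the NeRF photometric loss (all names and hyper-parameters here are illustrative, not the repository's API):

```python
import numpy as np

def reptile_step(meta_w, sample_task, inner_steps=32, inner_lr=5e-4, meta_lr=1e-3):
    """One Reptile outer iteration: adapt a copy of the meta-weights on a
    sampled task, then move the meta-weights a fraction of the way toward
    the adapted weights."""
    target = sample_task()                  # a "task" here is a target vector to fit
    w = meta_w.copy()
    for _ in range(inner_steps):            # inner loop: plain SGD on the task loss
        grad = 2.0 * (w - target)           # gradient of ||w - target||^2
        w -= inner_lr * grad
    return meta_w + meta_lr * (w - meta_w)  # Reptile meta-update

# Meta-train over many sampled tasks; meta_w drifts toward weights
# that are easy to adapt to any single task.
meta_w = np.zeros(3)
for _ in range(100):
    meta_w = reptile_step(meta_w, lambda: np.ones(3))
```

In the real scripts the inner loop minimizes the rendering loss of one landmark scene, but the outer update has exactly this form.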

View Synthesis from Single Image

Given a single input view, meta-initialized NeRF can generate a 360-degree video. The following ShapeNet video is generated with a class-specific NeRF (5 layers deep), trained for ~100k iterations.

ShapeNet.mp4
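The 360-degree video is rendered from a circular orbit of camera poses around the object. Pose generation can be sketched like this (radius, height, and frame count are illustrative; the NeRF rendering call itself is omitted):

```python
import numpy as np

def pose_on_circle(theta, radius=4.0, height=0.5):
    """Camera-to-world matrix looking at the origin from a circular orbit."""
    eye = np.array([radius * np.cos(theta), radius * np.sin(theta), height])
    forward = -eye / np.linalg.norm(eye)    # look toward the origin
    up = np.array([0.0, 0.0, 1.0])
    right = np.cross(forward, up)
    right /= np.linalg.norm(right)
    true_up = np.cross(right, forward)
    c2w = np.eye(4)                         # columns: right, up, -forward, position
    c2w[:3, 0], c2w[:3, 1] = right, true_up
    c2w[:3, 2], c2w[:3, 3] = -forward, eye
    return c2w

# 120 poses evenly spaced around the object -> one rendered frame each
orbit = [pose_on_circle(t) for t in np.linspace(0, 2 * np.pi, num=120, endpoint=False)]
```

Rendering the adapted NeRF once per pose and stitching the frames with imageio-ffmpeg yields the video above.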

Data

Train and Evaluate

  1. Train NeRF on a particular ShapeNet class using Reptile meta-learning:
    python shapenet_train.py --config ./configs/shapenet/$shape.json
  2. Optimize the meta-trained model on a single view and test on the other held-out views. This also generates a 360-degree video for each test object:
    python shapenet_test.py --config ./configs/shapenet/$shape.json --weight-path $meta_weight.pth
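Step 2 reports reconstruction quality on the held-out views as PSNR, which for images normalized to [0, 1] is simply:

```python
import numpy as np

def psnr(pred, gt):
    """Peak signal-to-noise ratio (dB) between two images in [0, 1]."""
    mse = np.mean((pred - gt) ** 2)
    return -10.0 * np.log10(mse)
```

Higher is better; each halving of the RMS error adds about 6 dB.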

Acknowledgments

I referenced several open-source NeRF and meta-learning code bases for this implementation. Specifically, I borrowed/modified code from the following repositories:

Thanks to the authors for releasing their code.
