Code for Neural Reflectance Surfaces (NeRS)
[arXiv] [Project Page] [Colab Demo] [Bibtex]
This repo contains the code for NeRS: Neural Reflectance Surfaces.
The code was tested with the following dependencies:
- Python 3.8.6
- Pytorch 1.7.0
- Pytorch3d 0.4.0
- CUDA 11.0
Installation
Setup
We recommend using conda to manage dependencies. Make sure to install a cudatoolkit compatible with your GPU.
```
git clone git@github.com:jasonyzhang/ners.git
cd ners
conda create -n ners python=3.8
conda activate ners
conda install -c pytorch pytorch=1.7.0 torchvision cudatoolkit=11.0
pip install -r requirements.txt
```
Installing Pytorch3d
Here, we list the recommended steps for installing Pytorch3d. Refer to the official installation directions for troubleshooting and additional details.
```
mkdir -p external
git clone https://github.com/facebookresearch/pytorch3d.git external/pytorch3d
cd external/pytorch3d
conda install -c conda-forge -c fvcore -c iopath fvcore iopath
conda install -c bottler nvidiacub
python setup.py install
```
If you need to compile for multiple architectures (e.g. Turing for a 2080 Ti and Pascal for a 1080 Ti), you can pass the architectures as an environment variable: `TORCH_CUDA_ARCH_LIST="Maxwell;Pascal;Turing;Volta" python setup.py install`.

If you get a warning about the default C/C++ compiler on your machine, you should compile Pytorch3d using the same compiler that your pytorch installation uses, likely gcc/g++. Try: `CC=gcc CXX=g++ python setup.py install`.
Acquiring Object Masks
To get object masks, we recommend using PointRend for COCO classes or GrabCut for other categories.
If using GrabCut, you can try this interactive segmentation tool.
Running the Code
Running on MVMC
Coming Soon!
Running on Your Own Objects
We recommend beginning with the demo notebook so that you can visualize the intermediate outputs. The demo notebook generates the 3D reconstruction and illumination prediction for the espresso machine (data included). You can also run the demo script:
```
python main.py --instance-dir data/espresso --symmetrize --export-mesh --predict-illumination
```
We also provide a Colab notebook that runs on a single GPU. Note that the Colab demo does not include the view-dependent illumination prediction. At the end of the demo, you can view the turntable NeRS rendering and download the generated mesh as an obj.
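The exported mesh is a standard Wavefront `.obj`, so a quick stdlib sketch can sanity-check a download by counting vertices and faces (the filename in the comment is hypothetical):

```python
from pathlib import Path

def obj_stats(path):
    """Count vertex and face lines in a Wavefront .obj file."""
    n_verts = n_faces = 0
    for line in Path(path).read_text().splitlines():
        if line.startswith("v "):       # vertex position
            n_verts += 1
        elif line.startswith("f "):     # face (polygon) definition
            n_faces += 1
    return n_verts, n_faces

# e.g. obj_stats("ners_mesh.obj") -> (num_vertices, num_faces)
```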
To run on your own objects, you will need to acquire images and masks. See `data/espresso` for an example of the expected directory structure.

We also provide the images and masks for all objects in the paper. All objects except the hydrant and robot should use the `--symmetrize` flag.
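Before launching a run, it can help to verify that every image has a matching mask. The sketch below assumes paired `images/` and `masks/` subdirectories matched by filename stem; check `data/espresso` for the authoritative layout:

```python
from pathlib import Path

def check_instance_dir(instance_dir):
    """Verify every image in images/ has a mask in masks/ (matched by stem)."""
    root = Path(instance_dir)
    images = {p.stem for p in (root / "images").glob("*") if p.is_file()}
    masks = {p.stem for p in (root / "masks").glob("*") if p.is_file()}
    missing = sorted(images - masks)
    if missing:
        raise ValueError(f"Images without masks: {missing}")
    return sorted(images)
```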
```
gdown https://drive.google.com/uc?id=1JWuofTIlcLJmmzYtZYM2SvZVizJCcOU_
unzip -o misc_objects.zip -d data
```
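To batch the downloaded objects through `main.py`, one might assemble per-object commands like the sketch below, dropping `--symmetrize` for the asymmetric objects. This helper is ours (not part of the repo); it only mirrors the flags shown above:

```python
from pathlib import Path

# Per the note above, hydrant and robot should not be symmetrized.
ASYMMETRIC = {"hydrant", "robot"}

def build_command(instance_dir):
    """Assemble the main.py invocation for one object directory."""
    name = Path(instance_dir).name
    cmd = ["python", "main.py", "--instance-dir", str(instance_dir),
           "--export-mesh", "--predict-illumination"]
    if name not in ASYMMETRIC:
        cmd.append("--symmetrize")
    return cmd
```

Each returned list can be passed directly to `subprocess.run`.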
Citing NeRS
If you find this code helpful, please consider citing:
```bibtex
@inproceedings{zhang2021ners,
  title={{NeRS}: Neural Reflectance Surfaces for Sparse-view 3D Reconstruction in the Wild},
  author={Zhang, Jason Y. and Yang, Gengshan and Tulsiani, Shubham and Ramanan, Deva},
  booktitle={Conference on Neural Information Processing Systems},
  year={2021}
}
```