Geometry-Consistent Neural Shape Representation with Implicit Displacement Fields


[project page][paper][cite]

overview

overview video

demos

cuda 11.1 and python 3.8

preparations

git clone https://github.com/yifita/idf.git
cd idf

# conda environment and dependencies
# update conda
conda update -n base -c defaults conda
# install requirements
conda env create --name idf -f environment.yml
conda activate idf

# download data. This will download 8 meshes and point clouds to data/benchmark_shapes
sh data/get_data.sh

surface reconstruction

# surface reconstruction from point cloud
# replace asian_dragon with another model name from the data/benchmark_shapes folder
python net/classes/runner.py net/experiments/displacement_benchmark/ablation/ablation_phased_scaledTanh_yes_act_yes_baseLoss_yes.json --name asian_dragon
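To reconstruct every benchmark shape in one go, a small driver can loop over the model names and invoke the runner once per shape. The sketch below is hypothetical and not part of the repository; in particular, the `.ply` extension and the `commands` helper are assumptions made for illustration:

```python
# Hypothetical batch driver (not part of the repository): run the
# reconstruction experiment once per benchmark shape.
import subprocess
from pathlib import Path

CONFIG = ("net/experiments/displacement_benchmark/ablation/"
          "ablation_phased_scaledTanh_yes_act_yes_baseLoss_yes.json")

def commands(shape_dir="data/benchmark_shapes"):
    """Build one runner invocation per mesh (assumes .ply files)."""
    return [["python", "net/classes/runner.py", CONFIG, "--name", p.stem]
            for p in sorted(Path(shape_dir).glob("*.ply"))]

if __name__ == "__main__":
    for cmd in commands():
        subprocess.run(cmd, check=True)
```

Each invocation is equivalent to the command above with a different `--name` argument.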

detail transfer

This example uses the provided base shapes:

sh data/get_dt_shapes.sh
python net/classes/runner.py net/experiments/displacement_benchmark/transfer/shorts_2phase.json

bibtex

@misc{yifan2021geometryconsistent,
      title={Geometry-Consistent Neural Shape Representation with Implicit Displacement Fields},
      author={Wang Yifan and Lukas Rahmann and Olga Sorkine-Hornung},
      year={2021},
      eprint={2106.05187},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}
Comments
  • Could not find 'net/experiments/transfer/exec.json'

    Thanks for your great work and excellent open-source code!

    However, when I try to train the base shapes for the source and target shapes, I could not find the file named 'net/experiments/transfer/exec.json'. Has it been added in the repository?

    Thanks!

    opened by RainbowRui 1
  • Directly rendering the IDF

    Thanks for your work! Maybe I overlooked it, but I didn't find any code for rendering the IDF (e.g. by means of sphere tracing). Are you planning to provide such code for rendering the IDF directly, because the images in your paper (where you compare to other work) seem to display such direct renderings.

    opened by coledea 1
  • Data scripts

    Hi @yifita!

    Thanks a lot for this exciting work!

    I wanted to mention a minor thing about the data download scripts. I ended up running both data/get_data.sh and data/get_dt_shapes.sh. Maybe we could combine both into a single script for clarity?

    Thanks again for sharing the code!!

    opened by pablopalafox 1
  • benchmark shapes

    Hi Yifan,

    IDF is excellent work and I want to use it as a baseline for my research. However, I downloaded benchmark_shapes.zip and it does not contain all the shapes used in your paper. I understand that several models may be restricted by copyright. If convenient, would you mind sharing the download links for these shapes and the preprocessing code for IDF? Many thanks for your help.

    opened by wangjingbo1219 0
  • The JSON file for detail transfer seems incorrect

    Hi, Thanks for open-sourcing this awesome work! I tried to train the detail transfer network by using the command in the Readme

    python net/classes/executor.py net/experiments/transfer/exec.json
    

    But the extracted PLY files are empty, containing only a single vertex, so the JSON file seems incorrect. Could you please upload the JSON file used for the experiment in the paper? Any help is appreciated.

    opened by iris112358 1
  • How to train a network with other shapes

    Hi,

    First, thank you for presenting this amazing work! I now want to train the network on shapes other than the benchmark shapes. Could you please share some instructions for doing so?

    Thanks

    opened by zhijieW94 1
  • Question about large memory usage when calculating chamfer distance with KDTree

    When trying to reproduce the experiment results, we see a peak in memory usage while calculating the chamfer distance with a KDTree. We are using a ~2M mesh with ~50k vertices, but this uses ~250G of memory and causes an out-of-memory error. Is this expected behavior?

    Thanks again for your help!

    opened by dc3505 0
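For anyone hitting the memory issue described above: the blow-up typically comes from querying all sample points against the KD-tree in one call. A minimal sketch of a chunked one-sided chamfer distance follows; this is not the repository's evaluation code, and `chamfer_one_sided` with its parameters is a hypothetical helper for illustration:

```python
# Chunked one-sided chamfer distance: query the KD-tree in fixed-size
# batches so peak memory stays bounded regardless of point count.
import numpy as np
from scipy.spatial import cKDTree

def chamfer_one_sided(points, ref_points, chunk=65536):
    """Mean nearest-neighbour distance from `points` to `ref_points`."""
    tree = cKDTree(ref_points)  # tree over the reference point set
    total = 0.0
    for start in range(0, len(points), chunk):
        dists, _ = tree.query(points[start:start + chunk], k=1)
        total += dists.sum()
    return total / len(points)
```

Computing both directions and averaging (or summing) them gives the symmetric chamfer distance at the same bounded memory cost.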
  • Is it possible to provide with the experiment setup for nglod?

    Thanks so much for sharing this! Is it possible to also share the experiment settings for nglod so that we can reproduce the results?

    Thanks in advance!

    opened by dc3505 0
Owner
Yifan Wang
PhD student @ ETH Zurich