SCANimate: Weakly Supervised Learning of Skinned Clothed Avatar Networks (CVPR 2021 Oral)
This repository contains the official PyTorch implementation of:
SCANimate: Weakly Supervised Learning of Skinned Clothed Avatar Networks
Full paper | 5min Presentation | Video | Project website | Poster
Installation
Please follow the instructions in ./installation.txt to install the environment and the SMPL model.
Run SCANimate
0. Activate the environment if it is not already activated:
$ source ./venv/scanimate/bin/activate
1. First, download the pretrained models, some motion sequences, and other files for the demo:
- Download an AIST++ dance motion sequence for test (CC BY 4.0 license):
$ . ./download_aist_demo_motion.sh
This script will create a data folder under the current directory, so make sure you run it from the SCANimate directory.
- Download pre-trained scanimats for the animation test: visit https://scanimate.is.tue.mpg.de/download.php, register, log in, read and agree to the license, and then download some demo scanimats. Unzip the zip file into the ./data directory.
- Download a subset of the CAPE data for the training demo: visit https://scanimate.is.tue.mpg.de/download.php, register, log in, read and agree to the license, and then download the data for the training demo. Unzip the zip file into the ./data directory.
- Now you should have a ./data directory under SCANimate. Within ./data you will have five directories: minimal_body, pretrained, pretrained_configs, test, and train.
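If you want to double-check the downloads before moving on, a minimal sketch like the one below (not part of the repository; it only checks for the five directories listed above) can verify the layout:

from pathlib import Path

# Illustrative sanity check (not part of SCANimate itself): confirm that the
# five directories described above exist under ./data.
expected = ["minimal_body", "pretrained", "pretrained_configs", "test", "train"]
data_dir = Path("./data")
missing = [name for name in expected if not (data_dir / name).is_dir()]
if missing:
    print(f"Missing under {data_dir}: {missing} -- re-check the download steps.")
else:
    print("All expected data directories are present.")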
Run animation demos
2. Now you can run the test demo with the following command:
$ python -m apps.test_scanimate -c ./data/pretrained_configs/release_03223_shortlong.yaml -t ./data/test/gLO_sBM_cAll_d14_mLO1_ch05
- You can replace the configuration file with other files under ./data/pretrained_configs/ to try other subjects.
- You can also replace the test motion with other sequences under ./data/test.
- The results will be generated under ./demo_result/results_test.
3. The generated mesh sequences can be rendered with the code under ./demo_result:
First, install Open3D (used for rendering the results):
$ pip install open3d==0.12.0
Then run:
$ python render/render_aist.py -i demo_result/results_test/release_03223_shortlong_test_gLO_sBM_cAll_d14_mLO1_ch05/ -o demo_result
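If you just want to eyeball a single output frame before rendering the whole sequence, a quick sketch using the Open3D package installed above could look like the following. This is not the repository's renderer, and the frame path is a placeholder; substitute any .ply produced under ./demo_result/results_test/:

import open3d as o3d

# Quick interactive preview of one generated frame (illustrative only).
# Replace the placeholder path with a real .ply from ./demo_result/results_test/.
mesh = o3d.io.read_triangle_mesh("demo_result/results_test/<sequence>/<frame>.ply")
mesh.compute_vertex_normals()  # needed for shaded display
o3d.visualization.draw_geometries([mesh])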
Run training demo
2. Now you can run the training demo with:
$ python -m apps.train_scanimate -c ./configs/example.yaml
The results can be found under ./demo_result/results/example.
3. Train on your own data: make your data follow the same structure as ./data/train/example_03375_shortlong, where a .ply file contains a T-pose SMPL body mesh and a folder contains the training frames. Each frame corresponds to two files: one .npz file containing the SMPL parameters that describe the body, and one .ply file containing the clothed scan. The body should align with the scan. Then change ./configs/example.yaml to point to your data directory and you are good to go!
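Before launching a full training run, it can save time to sanity-check your layout against the structure described above. The sketch below is illustrative only: the subject path is hypothetical, and it assumes each frame's .npz and .ply share the same file stem, as in the example data.

from pathlib import Path

# Illustrative layout check for a custom training subject (hypothetical path).
subject_dir = Path("./data/train/my_subject")

# Exactly one top-level .ply: the T-pose SMPL body mesh.
body_meshes = list(subject_dir.glob("*.ply"))
assert len(body_meshes) == 1, "expected exactly one T-pose SMPL body mesh (.ply)"

# One sub-folder of training frames: each frame is a .npz (SMPL parameters)
# plus a .ply (clothed scan). We assume each pair shares a file stem.
frame_dir = next(d for d in subject_dir.iterdir() if d.is_dir())
npz_stems = {p.stem for p in frame_dir.glob("*.npz")}
ply_stems = {p.stem for p in frame_dir.glob("*.ply")}
assert npz_stems == ply_stems, "each frame needs matching .npz/.ply files"
print(f"Found {len(npz_stems)} consistent frames in {frame_dir}.")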
Citations
If you find our code or paper useful for your research, please consider citing:
@inproceedings{Saito:CVPR:2021,
  title = {{SCANimate}: Weakly Supervised Learning of Skinned Clothed Avatar Networks},
  author = {Saito, Shunsuke and Yang, Jinlong and Ma, Qianli and Black, Michael J.},
  booktitle = {Proceedings IEEE/CVF Conf.~on Computer Vision and Pattern Recognition (CVPR)},
  month = jun,
  year = {2021},
  month_numeric = {6}
}