KGDet: Keypoint-Guided Fashion Detection (AAAI 2021)

Overview

This is an official implementation of the AAAI-2021 paper "KGDet: Keypoint-Guided Fashion Detection".

Architecture

Installation

To avoid dependency conflicts, please install this repo in a clean conda virtual environment.

First, enter the root directory of this repo. Install CUDA and PyTorch with conda.

conda install -c pytorch -c conda-forge pytorch==1.4.0 torchvision==0.5.0 cudatoolkit-dev=10.1 
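
To quickly verify the environment (a minimal sanity check, not part of the original instructions), you can run:

import torch
import torchvision

# Expect 1.4.0 / 0.5.0 and CUDA available if the conda install above succeeded.
print(torch.__version__, torchvision.__version__, torch.cuda.is_available())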

Then, install other dependencies with pip.

pip install -r requirements.txt

DeepFashion2API

cd deepfashion2_api/PythonAPI
pip install -e .
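
If you want to sanity-check this install: the DeepFashion2 API is derived from the standard COCO Python API, so (assuming the fork keeps the pycocotools import name) annotations can be browsed as below. The annotation path is only illustrative.

from pycocotools.coco import COCO  # installed by deepfashion2_api/PythonAPI

# Hypothetical path to a COCO-style DeepFashion2 annotation file.
coco = COCO('data/deepfashion2/annotations/validation.json')
img_ids = coco.getImgIds()
anns = coco.loadAnns(coco.getAnnIds(imgIds=img_ids[:1]))
print(len(img_ids), 'images;', len(anns), 'annotations on the first image')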

Main code

Our code is based on mmdetection, an open-source toolbox for benchmarking object detection methods.

cd ../../mmdetection
python setup.py develop
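
To confirm the build succeeded, importing the package and one of its compiled ops should work. This is just a quick check and assumes the bundled mmdetection keeps the standard ops module.

import mmdet
from mmdet.ops import nms  # compiled C++/CUDA extension built by setup.py

print(mmdet.__version__)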

Now that the repo is ready, go back to the root directory.

cd ..

Data Preparation

DeepFashion2

If you need to run experiments on the entire DeepFashion2 dataset, please refer to DeepFashion2 for detailed guidance. Otherwise, you can skip to the Demo dataset subsection.

After downloading and unpacking the dataset, please create a soft link from the code repository to the dataset's root directory.

ln -s <root dir of DeepFashion2> data/deepfashion2
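
A small check that the link resolves where you expect (illustrative only; the exact subdirectory names depend on how you unpacked DeepFashion2):

import os

root = 'data/deepfashion2'
# The symlink should resolve to the dataset root; the listed entries depend on
# how DeepFashion2 was unpacked (e.g. train/ and validation/ splits).
print(os.path.realpath(root))
print(sorted(os.listdir(root)))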

Demo dataset

We provide a subset (32 images) of DeepFashion2 to enable quick experiments.

Checkpoints

The checkpoints can be fetched from this OneDrive link.

Experiments

Demo

Test with 1 GPU

./mmdetection/tools/dist_test.sh configs/kgdet_moment_r50_fpn_1x-demo.py checkpoints/KGDet_epoch-12.pth 1 --json_out work_dirs/demo_KGDet.json --eval bbox keypoints
  • Result files will be stored as work_dirs/demo_KGDet.json (see the inspection sketch below).
  • If you only need the prediction results, you can drop --eval and its arguments.
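
Once the run finishes, the predictions can be inspected with standard tools. The sketch below is only an illustration; depending on the mmdetection version, --json_out may be split into per-task files such as work_dirs/demo_KGDet.bbox.json and work_dirs/demo_KGDet.keypoints.json.

import json

# Adjust the filename if your mmdetection version writes per-task result files.
with open('work_dirs/demo_KGDet.json') as f:
    predictions = json.load(f)

# COCO-style results are typically a list of dicts with image_id, category_id,
# bbox, score (and keypoints for keypoint results).
print(len(predictions), 'predictions')
print(predictions[0] if predictions else 'no predictions')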

DeepFashion2

Train with 4 GPUs

./mmdetection/tools/dist_train.sh configs/kgdet_moment_r50_fpn_1x-deepfashion2.py 4 --validate --work_dir work_dirs/TRAIN_KGDet
  • The running log and checkpoints will be stored in the work_dirs/TRAIN_KGDet directory, as specified by the --work_dir argument.
  • --validate runs a validation pass after each training epoch (see the config-inspection sketch below for adjusting training settings).
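
If you need to adjust training settings (epochs, optimizer, images per GPU), the config can be loaded and inspected with mmcv before launching. A minimal sketch, assuming the mmcv version pinned in requirements.txt exposes the usual Config API; the field names follow the common mmdetection 1.x layout and should be checked against the config file itself.

from mmcv import Config

cfg = Config.fromfile('configs/kgdet_moment_r50_fpn_1x-deepfashion2.py')
# Field names below follow the usual mmdetection 1.x config layout; verify them
# against the config file before relying on this snippet.
print(cfg.total_epochs, cfg.optimizer, cfg.data.imgs_per_gpu)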

Test with 4 GPUs

./mmdetection/tools/dist_test.sh configs/kgdet_moment_r50_fpn_1x-deepfashion2.py checkpoints/KGDet_epoch-12.pth 4 --json_out work_dirs/result_KGDet.json --eval bbox keypoints
  • Result files will be stored as work_dirs/result_KGDet.json (one way to load them back is sketched below).
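
For offline analysis, the result file can be loaded back alongside the ground-truth annotations through the COCO-style API installed earlier. The paths and the .bbox.json naming below are assumptions; adapt them to your converted annotations and actual output files.

from pycocotools.coco import COCO

# Illustrative paths: point them at your converted DeepFashion2 validation
# annotations and at the result file written by dist_test.sh.
coco_gt = COCO('data/deepfashion2/annotations/validation.json')
coco_dt = coco_gt.loadRes('work_dirs/result_KGDet.bbox.json')
print(len(coco_dt.getAnnIds()), 'predicted boxes on', len(coco_gt.getImgIds()), 'images')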

Customization

If you would like to run our model on your own data, you can imitate the structure of the demo_dataset (an image directory plus a JSON file), and adjust the arguments in the configuration file.
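
For reference, a minimal COCO-style annotation file for a custom image directory might look like the sketch below. All field values are illustrative; copy the category and keypoint definitions from the provided demo JSON rather than inventing them.

import json

# Illustrative skeleton of a COCO-style annotation file for custom images.
annotations = {
    "images": [
        {"id": 1, "file_name": "my_image_001.jpg", "width": 800, "height": 1200}
    ],
    "annotations": [
        {
            "id": 1,
            "image_id": 1,
            "category_id": 1,
            "bbox": [100, 150, 300, 450],  # x, y, width, height
            "area": 300 * 450,
            "iscrowd": 0,
            "keypoints": [],  # flattened x, y, visibility triplets
            "num_keypoints": 0,
        }
    ],
    "categories": [],  # copy this block from the demo annotation file
}

with open("my_dataset/annotations.json", "w") as f:
    json.dump(annotations, f)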

Acknowledgment

This repo is built upon RepPoints and mmdetection.

@inproceedings{qian2021kgdet,
  title={KGDet: Keypoint-Guided Fashion Detection},
  author={Qian, Shenhan and Lian, Dongze and Zhao, Binqiang and Liu, Tong and Zhu, Bohui and Li, Hai and Gao, Shenghua},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  volume={35},
  number={3},
  pages={2449--2457},
  year={2021}
}
Comments
  • Question about dataset for training?

    Thank you for the great research. I have a question about the keypoints field in the dataset: why is the number of keypoint values so large (882) when the maximum number of keypoints in DeepFashion2 is 38?

    opened by ThorPham 7
  • The demo experiment cannot be run successfully

    Hello.

    Thank you for the great work.

    I used your checkpoints to test the DeepFashion2 images, but I encountered some problems. I tried the following modifications:

    1. The file reppoints_detector_kp_gt.py is missing, so I commented out the import of RepPointsDetectorKpGT. I'm not sure if this affects the model accuracy.
    2. In the file https://github.com/ShenhanQian/KGDet/blob/master/configs/kgdet_moment_r50_fpn_1x-deepfashion2.py, the neck type is FPN2, but there is no FPN2 module. I changed the FPN2 type to FPN and commented out select_out=[2].

    Finally, I can run the demo. However, the output is not correct. Can you give some advice so that I can run the demo experiment successfully? Thanks.

    opened by Yhdian 3
  • Error occurs when running the Test with 1 gpu.

    I followed all the instructions to install this repo, but when I run the "Test with 1 GPU" demo, the following error occurs:

    Traceback (most recent call last):
      File "./mmdetection/tools/test.py", line 13, in <module>
        from mmdet.apis import init_dist
      File "/home/revolveai/projects/KGDet/mmdetection/mmdet/apis/__init__.py", line 2, in <module>
        from .inference import (inference_detector, init_detector, show_result,
      File "/home/revolveai/projects/KGDet/mmdetection/mmdet/apis/inference.py", line 10, in <module>
        from mmdet.core import get_classes
      File "/home/revolveai/projects/KGDet/mmdetection/mmdet/core/__init__.py", line 3, in <module>
        from .evaluation import *  # noqa: F401, F403
      File "/home/revolveai/projects/KGDet/mmdetection/mmdet/core/evaluation/__init__.py", line 5, in <module>
        from .eval_hooks import (CocoDistEvalmAPHook, CocoDistEvalRecallHook,
      File "/home/revolveai/projects/KGDet/mmdetection/mmdet/core/evaluation/eval_hooks.py", line 13, in <module>
        from mmdet import datasets
      File "/home/revolveai/projects/KGDet/mmdetection/mmdet/datasets/__init__.py", line 7, in <module>
        from .loader import DistributedGroupSampler, GroupSampler, build_dataloader
      File "/home/revolveai/projects/KGDet/mmdetection/mmdet/datasets/loader/__init__.py", line 1, in <module>
        from .build_loader import build_dataloader
      File "/home/revolveai/projects/KGDet/mmdetection/mmdet/datasets/loader/build_loader.py", line 8, in <module>
        from .sampler import DistributedGroupSampler, DistributedSampler, GroupSampler
      File "/home/revolveai/projects/KGDet/mmdetection/mmdet/datasets/loader/sampler.py", line 6, in <module>
        from mmcv.runner.utils import get_dist_info
    ImportError: cannot import name 'get_dist_info'

    Traceback (most recent call last):
      File "/home/revolveai/miniconda3/envs/pft_test/lib/python3.6/runpy.py", line 193, in _run_module_as_main
        "__main__", mod_spec)
      File "/home/revolveai/miniconda3/envs/pft_test/lib/python3.6/runpy.py", line 85, in _run_code
        exec(code, run_globals)
      File "/home/revolveai/miniconda3/envs/pft_test/lib/python3.6/site-packages/torch/distributed/launch.py", line 263, in <module>
        main()
      File "/home/revolveai/miniconda3/envs/pft_test/lib/python3.6/site-packages/torch/distributed/launch.py", line 259, in main
        cmd=cmd)
    subprocess.CalledProcessError: Command '['/home/revolveai/miniconda3/envs/pft_test/bin/python', '-u', './mmdetection/tools/test.py', '--local_rank=0', 'configs/kgdet_moment_r50_fpn_1x-demo.py', 'checkpoints/KGDet_epoch-12.pth', '--launcher', 'pytorch', '--json_out', 'work_dirs/demo_KGDet.json', '--eval', 'bbox', 'keypoints']' returned non-zero exit status 1.

    opened by Ahsan686 2
  • error: unrecognized arguments: --json_out work_dirs/demo_KGDet.json

    So I am trying to run this notebook on Colab, and I tried to run the line for the demo on test images.

    • I tried single-gpu testing:

      !python tools/test.py /content/drive/MyDrive/KGDet/KGDet/configs/kgdet_moment_r50_fpn_1x-demo.py /content/drive/MyDrive/KGDet/KGDet/checkpoints/KGDet_epoch-12.pth --json_out work_dirs/demo_KGDet.json --eval bbox keypoints

      and got the error: test.py: error: unrecognized arguments: --json_out work_dirs/demo_KGDet.json

      If I remove json from json_out, then it raises a pkl file error.

    • Alternatively, I tried:

      ! ./tools/dist_test.sh /content/drive/MyDrive/KGDet/KGDet/configs/kgdet_moment_r50_fpn_1x-demo.py /content/drive/MyDrive/KGDet/KGDet/checkpoints/KGDet_epoch-12.pth 1 --json_out work_dirs/demo_KGDet.json --eval bbox keypoints

      and got: /bin/bash: ./tools/dist_test.sh: /usr/bin/env: bad interpreter: Permission denied

    Is there a tutorial to run this on Colab?

    I also tried following mmdetection's Colab notebook tutorial with:

      from mmdet.apis import inference_detector, init_detector, show_result_pyplot
      config = '/content/drive/MyDrive/KGDet/KGDet/configs/kgdet_moment_r50_fpn_1x-demo.py'
      checkpoint = '/content/drive/MyDrive/KGDet/KGDet/checkpoints/KGDet_epoch-12.pth'
      model = init_detector(config, checkpoint, device='cuda:0')

    but ended up getting KeyError: 'RepPointsDetectorKp is not in the models registry'

    opened by kritanjalijain 1
  • RuntimeError: Error compiling objects for extension

    Hi, I am getting the following error: RuntimeError: Error compiling objects for extension

    I get this error when trying to run this command: !python3 setup.py develop

    opened by sameersharma00747 0
  • Support for higher CUDA versions

    Thank you for sharing such an amazing repo. I'm testing it but having trouble installing the dependencies, because Colab and my own laptop have CUDA 11.2 or higher, so I cannot install pytorch and mmcv as described in your README instructions. Do you have plans to support this?

    opened by gradient1706 0
Owner
Qian Shenhan