Hierarchical Attentive Recurrent Tracking

Overview

This is an official TensorFlow implementation of single-object tracking in videos using hierarchical attentive recurrent neural networks, as presented in the following paper:

A. R. Kosiorek, A. Bewley, I. Posner, "Hierarchical Attentive Recurrent Tracking", NIPS 2017.

Installation

Install TensorFlow v1.1 and the following dependencies (using pip install -r requirements.txt (preferred) or pip install [package]):

  • matplotlib==1.5.3
  • numpy==1.12.1
  • pandas==0.18.1
  • scipy==0.18.1

Demo

The notebook scripts/demo.ipynb contains a demo, which shows how to evaluate the tracker on an arbitrary image sequence. By default, it runs on images located in the imgs folder and uses a pretrained model. Before running the demo, please download the AlexNet weights (as described in the Training section).
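
As a rough illustration of the kind of input the demo consumes, the sketch below stacks an image sequence into one array and defines an initial bounding box for the target. The notebook handles this wiring itself; the (timesteps, 1, height, width, channels) layout, the *.png pattern, and the box values here are assumptions for illustration, not the repo's API.

     # A minimal sketch, not the notebook itself: load a frame sequence and an
     # initial box. Array layout and box convention are illustrative assumptions.
     import glob

     import matplotlib.image as mpimg
     import numpy as np

     frame_paths = sorted(glob.glob('imgs/*.png'))
     frames = np.stack([mpimg.imread(p) for p in frame_paths])[:, np.newaxis]

     # Hypothetical initial bounding box for the target in the first frame.
     init_bbox = np.asarray([[88., 250., 18., 35.]], dtype=np.float32)
     print(frames.shape, init_bbox.shape)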

Data

  1. Download the KITTI dataset from here. We need the left color images and the tracking labels.
  2. Unpack the data into a data folder; images should go into an image folder and labels into a label folder.
  3. Resize all the images to (height=187, width=621), e.g. by using the scripts/resize_imgs.sh script or the Python sketch below.
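
If you prefer Python over the shell script, here is a minimal sketch of the same resize step. It assumes Pillow is installed (it is not listed in requirements.txt) and that the images sit in per-sequence subfolders of the image folder; adjust the paths and glob pattern to your layout.

     # A sketch of the resize step, assuming Pillow and per-sequence subfolders.
     # Target size matches the README: width=621, height=187.
     import glob
     import os

     from PIL import Image

     IMG_DIR = 'data/image'  # adjust to your image folder
     for path in glob.glob(os.path.join(IMG_DIR, '*', '*.png')):
         img = Image.open(path)
         img.resize((621, 187), Image.BILINEAR).save(path)  # PIL size is (width, height)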

Training

  1. Download the AlexNet weights:

    • Execute scripts/download_alexnet.sh or
    • Download the weights from here and put the file in the checkpoints folder.
  2. Run

     python scripts/train_hart_kitti.py --img_dir=/path/to/image/folder --label_dir=/path/to/label/folder
    

The training script saves model checkpoints in the checkpoints folder and reports train and test scores every few epochs. You can run TensorBoard on that folder (tensorboard --logdir=checkpoints) to visualise training progress. Training should converge in about 400k iterations, which takes about 3 days. A couple of hours can pass between logging messages, so don't worry.

Evaluation on KITTI dataset

The scripts/eval_kitti.ipynb notebook contains the code necessary to prepare (IoU, timesteps) curves for the train and validation sets of KITTI. Before running the evaluation:

  • Download AlexNet weights (described in the Training section).
  • Update image and label folder paths in the notebook.
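
The curves report the mean intersection-over-union between predicted and ground-truth boxes as a function of the timestep within a sequence. The notebook computes this internally; the sketch below only illustrates the metric, and the [x1, y1, x2, y2] box convention is an assumption for the example, not necessarily the repo's label format.

     # Illustration of the (IoU, timesteps) metric, not the notebook's code.
     import numpy as np

     def iou(box_a, box_b):
         """IoU of two [x1, y1, x2, y2] boxes."""
         x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
         x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
         inter = max(0., x2 - x1) * max(0., y2 - y1)
         area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
         area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
         return inter / (area_a + area_b - inter)

     def iou_curve(pred, gt):
         """pred, gt: (timesteps, n_sequences, 4) arrays -> mean IoU per timestep."""
         return np.array([[iou(p, g) for p, g in zip(pred_t, gt_t)]
                          for pred_t, gt_t in zip(pred, gt)]).mean(axis=1)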

Citation

If you find this repo useful in your research, please consider citing:

@inproceedings{Kosiorek2017hierarchical,
   title = {Hierarchical Attentive Recurrent Tracking},
   author = {Kosiorek, Adam R and Bewley, Alex and Posner, Ingmar},
   booktitle = {Neural Information Processing Systems},
   url = {http://www.robots.ox.ac.uk/~mobile/Papers/2017NIPS_AdamKosiorek.pdf},
   pdf = {http://www.robots.ox.ac.uk/~mobile/Papers/2017NIPS_AdamKosiorek.pdf},
   year = {2017},
   month = {December}
}

License

This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 3 of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with this program. If not, see http://www.gnu.org/licenses/.

Release Notes

Version 1.0

  • Original version from the paper. It contains the KITTI tracking experiment.
Comments
  • ImportError: No module named core_rnn_cell_impl

    Hi, I'm using TensorFlow 1.2.1 and I get the following error:

    ImportErrorTraceback (most recent call last)
    <ipython-input> in <module>()
          9 from hart.data.kitti.tools import get_data
         10 from hart.model import util
    ---> 11 from hart.model.attention_ops import FixedStdAttention
         12 from hart.model.eval_tools import log_norm, log_ratios, log_values, make_expr_logger
         13 from hart.model.tracker import HierarchicalAttentiveRecurrentTracker as HART

    /home/wscuser/notebooks/hart/hart/model/attention_ops.py in <module>()
         22 import numpy as np
         23 import tensorflow as tf
    ---> 24 from tensorflow.contrib.rnn.python.ops.core_rnn_cell_impl import LSTMCell
         25 from tensorflow.python.ops.rnn_cell_impl import _RNNCell as RNNCell
         26 from tensorflow.python.util import nest

    ImportError: No module named core_rnn_cell_impl

    opened by IdoWSC 3
  • Issue in /neurocity/data/store.py

    Hello,

    I ran into an error when running the training script; the traceback is below. I added code to print the values of len(v) and len(self):

      File "/neurocity/data/store.py", line 87, in reset_data
        assert len(v) == len(self), (len(v), len(self))
    AssertionError: (1, 0)

    Thanks!

    opened by YantianZha 2
  • saliency

    I wonder whether you use saliency in this paper, and what the difference is between saliency and attention. I want to apply saliency to tracking.

    opened by ghost 1
  • How are the bounding boxes in the labels read?

    We are trying to run your code on another dataset with different image dimensions, but we get wrong bounding boxes. We assume you scale down and/or transform the coordinates of the bounding boxes in the labels; could you please explain how you use the bounding-box coordinates? Also, when you use the KITTI dataset, the bounding-box coordinates seem to be decimal; could you please explain how you read them? Thank you so much.

    opened by rasoulid 1
  • ValueError: invalid literal for int() with base 10: 'training'

    I get the following error:

    Traceback (most recent call last):
      File "/home/cbl/PycharmProjects/hart-master/scripts/train_hart_kitti.py", line 141, in <module>
        truncated_threshold=1., occluded_threshold=1, reverse=True, mirror=True)
      File "/home/cbl/PycharmProjects/hart-master/scripts/../hart/data/kitti/tools.py", line 475, in get_data
        truncated_threshold=truncated_threshold, occluded_threshold=occluded_threshold)
      File "/home/cbl/PycharmProjects/hart-master/scripts/../hart/data/kitti/parser.py", line 226, in __init__
        img_folder_or_paths = self._get_img_paths()
      File "/home/cbl/PycharmProjects/hart-master/scripts/../hart/data/kitti/parser.py", line 252, in _get_img_paths
        folders = sorted(folders, key=lambda x: int(os.path.basename(x).split('.')[0]))
      File "/home/cbl/PycharmProjects/hart-master/scripts/../hart/data/kitti/parser.py", line 252, in <lambda>
        folders = sorted(folders, key=lambda x: int(os.path.basename(x).split('.')[0]))
    ValueError: invalid literal for int() with base 10: 'training'

    ('parser=', ArgumentParser(prog='train_hart_kitti.py', usage=None, description=None, version=None, formatter_class=<class 'argparse.HelpFormatter'>, conflict_handler='error', add_help=True))
    ('args.img_dir=', '/home/cbl/PycharmProjects/hart-master/KITTI_resize/data_tracking_image_2')
    ('args.label_dir=', '/home/cbl/PycharmProjects/hart-master/KITTI_resize/data_tracking_label_2')
    ('args.alexnet_dir=', '/home/cbl/PycharmProjects/hart-master/checkpoints')

    Could you give an example of the img_folder, label_folder, train_fraction, and img_size arguments? I think my path is wrong.

    opened by ghost 0
  • ImportError: No module named 'component'?

    Hi! When I try to run the train_hart_kitti.py script, I get the error message "No module named 'component'". The full traceback is below:

    Traceback (most recent call last):
      File "../hart/scripts/train_hart_kitti.py", line 34, in <module>
        from hart.data.kitti.tools import get_data
      File "../hart/scripts/../hart/data/kitti/tools.py", line 28, in <module>
        import neurocity as nct
      File "../hart/scripts/../neurocity/__init__.py", line 22, in <module>
        from component.model.base import train_mode, test_mode, mode
    ImportError: No module named 'component'
    opened by chenfsjz 1
Owner
Adam Kosiorek
I'm a PhD student at the Oxford Robotics Institute. I work on Machine Learning for perception - I'm looking into external memory and attention for RNNs.