Code for Temporally Abstract Partial Models

Overview

This codebase accompanies the experimental section of the paper Temporally Abstract Partial Models (Khetarpal, Ahmed, Comanici, and Precup, 2021), published at NeurIPS 2021.

Installation

  1. Clone the affordances_option_models repository and cd into it:
git clone https://github.com/deepmind/affordances_option_models.git
  2. Install the requirements with pip install -r ./requirements.txt. It is recommended to use a virtualenv to isolate dependencies.

For example:

git clone https://github.com/deepmind/affordances_option_models.git

python3 -m virtualenv affordances
source affordances/bin/activate

pip install -r affordances_option_models/requirements.txt

Usage

  1. The first step of the experiment is to build, train, and save the low-level options: python3 -m affordances_option_models.lp_learn_options --save_path ./options, which will save the option policies into ./options/args/.... The low-level options are trained by creating a reward matrix for the 75 options (see option_utils.check_option_termination) and then running value iteration.
  2. The next step is to learn the option models, the policy over options, and the affordance models, all online: python3 -m affordances_option_models.lp_learn_model_from_options --path_to_options=./options/gamma0.99/max_iterations1000/options/. See Arguments below for how to select --affordances_name. A sketch combining both steps appears after this list.
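
A minimal end-to-end sketch of both steps (the exact options subdirectory under ./options/ depends on the arguments used in step 1):

python3 -m affordances_option_models.lp_learn_options --save_path ./options
python3 -m affordances_option_models.lp_learn_model_from_options \
  --path_to_options=./options/gamma0.99/max_iterations1000/options/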

Arguments

  1. The default arguments for lp_learn_options.py will produce a reasonable set of option policies.
  2. For lp_learn_model_from_options.py, use the --affordances_name argument to select the affordances used for model learning. For the heuristic affordances (everything, only_pickup_drop and only_relevant_pickup_drop), the learned model is evaluated via value iteration (i.e. planning) with every other affordance type. For the learned affordances, only the learned affordances are used in value iteration.

Experiments in Section 5.1

To reproduce the experiments with heuristic affordances, use the command

python3 -m affordances_option_models.lp_learn_model_from_options  \
--num_rollout_nodes=1 --total_steps=50000000 \
--seed=0 --affordances_name=everything

and run this command for every combination of the following arguments (a sweep sketch follows the list):

  • --seed=: 0, 1, 2, 3
  • --affordances_name=: everything, only_pickup_drop, only_relevant_pickup_drop.
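
For example, a minimal bash sweep over these combinations (illustrative only; it simply loops the command above over the listed values):

for seed in 0 1 2 3; do
  for affordances in everything only_pickup_drop only_relevant_pickup_drop; do
    python3 -m affordances_option_models.lp_learn_model_from_options \
      --num_rollout_nodes=1 --total_steps=50000000 \
      --seed="${seed}" --affordances_name="${affordances}"
  done
done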

Experiments in Section 5.2

To reproduce the experiments with learned affordances, use the command

python3 -m affordances_option_models.lp_learn_model_from_options  \
--num_rollout_nodes=1 --total_steps=50000000 --affordances_name=learned \
--seed=0 --affordances_threshold=0.0

and run this command for every combination of the following arguments (a sweep sketch follows the list):

  • --seed=: 0, 1, 2, 3
  • --affordances_threshold=: 0.0, 0.1, 0.25, 0.5, 0.75.
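
Likewise, a minimal bash sweep over these combinations (illustrative only; it simply loops the command above over the listed values):

for seed in 0 1 2 3; do
  for threshold in 0.0 0.1 0.25 0.5 0.75; do
    python3 -m affordances_option_models.lp_learn_model_from_options \
      --num_rollout_nodes=1 --total_steps=50000000 --affordances_name=learned \
      --seed="${seed}" --affordances_threshold="${threshold}"
  done
done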

Citation

If you use this codebase in your research, please cite the paper:

@misc{khetarpal2021temporally,
      title={Temporally Abstract Partial Models},
      author={Khimya Khetarpal and Zafarali Ahmed and Gheorghe Comanici and Doina Precup},
      year={2021},
      eprint={2108.03213},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}

Disclaimer

This is not an official Google product.
