StackRec: Efficient Training of Very Deep Sequential Recommender Models by Iterative Stacking

Overview

This repository provides the implementation of StackRec (SIGIR 2021), a method for efficiently training very deep sequential recommender models (e.g., NextItNet, GRec, SASRec, and SSEPT) by iterative stacking.

Datasets

You can download the pre-processed datasets:

We construct a large-scale session-based recommendation dataset (denoted as Video-6M) by collecting the interaction behaviors of nearly 6 million users over one week from a commercial recommender system. The dataset can be used to evaluate very deep recommendation models (up to 100 layers), such as NextItNet (as shown in our StackRec paper, SIGIR 2021). If you use this dataset in your work, please cite our NextItNet and StackRec papers.

@inproceedings{yuan2019simple,
  title={A simple convolutional generative network for next item recommendation},
  author={Yuan, Fajie and Karatzoglou, Alexandros and Arapakis, Ioannis and Jose, Joemon M and He, Xiangnan},
  booktitle={Proceedings of the Twelfth ACM International Conference on Web Search and Data Mining},
  year={2019}
}

@inproceedings{wang2020stackrec,
  title={StackRec: Efficient Training of Very Deep Sequential Recommender Models by Iterative Stacking},
  author={Wang, Jiachun and Yuan, Fajie and Chen, Jian and Wu, Qingyao and Li, Chengmin and Yang, Min and Sun, Yang and Zhang, Guoxiao},
  booktitle={Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval},
  year={2021}
}

File Description

requirements.txt: the Python dependencies for the experimental environment

train_nextitnet_sc1.sh: the shell script to train StackRec with NextItNet in the CL scenario
train_nextitnet_sc2.sh: the shell script to train StackRec with NextItNet in the TF scenario
train_nextitnet_sc3.sh: the shell script to train StackRec with NextItNet in the TS scenario
deep_nextitnet.py: the training file of NextItNet
deep_nextitnet_coldrec.py: the training file of NextItNet customized for the coldrec source dataset
data_loader.py: the dataset loading file of NextItNet and GRec
data_loader_finetune.py: the dataset loading file of NextItNet and GRec customized for the coldrec dataset
generator_deep.py: the model file of NextItNet
ops.py: the module file of NextItNet and GRec with stacking methods that double the number of blocks
ops_copytop.py: the module file of NextItNet with stacking methods that are allowed to stack the top blocks
ops_original.py: the module file of NextItNet with stacking methods without the alpha parameter
fineall.py: the training file of NextItNet customized for the coldrec target dataset

train_grec_sc1.sh: the shell script to train StackRec with GRec in the CL scenario
deep_GRec: the training file of GRec
generator_deep_GRec.py: the model file of GRec
utils_GRec.py: some tools for GRec

train_sasrec_sc1.sh: the shell script to train StackRec with SASRec in the CL scenario
baseline_SASRec.py: the training file of SASRec
Data_loader_SASRec.py: the dataset loading file of SASRec
SASRec_Alpha.py: the model file of SASRec

train_ssept_sc1.sh: the shell script to train StackRec with SSEPT in the CL scenario
baseline_SSEPT.py: the training file of SSEPT
Data_loader_SSEPT.py: the dataset loading file of SSEPT
SSEPT_Alpha.py: the model file of SSEPT
utils.py: utility functions for SASRec and SSEPT
Modules.py: the module file of SASRec and SSEPT with stacking methods

Stacking with NextItNet

Train in the CL scenario

Execute example:

sh train_nextitnet_sc1.sh

Train in the TF scenario

Execute example:

sh train_nextitnet_sc2.sh

Train in the TS scenario

Execute example:

sh train_nextitnet_sc3.sh

Stacking with GRec

Execute example:

sh train_grec_sc1.sh

Stacking with SASRec

Execute example:

sh train_sasrec_sc1.sh

Stacking with SSEPT

Execute example:

sh train_ssept_sc1.sh

Key Configuration

  • method: the stacking method; one of five options: from_scratch, stackC, stackA, stackR, and stackE
  • data_ratio: the percentage of training data used
  • dilation_count: the number of dilation factors {1, 2, 4, 8}
  • num_blocks: the number of residual blocks
  • load_model: whether to load a pre-trained model or not
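
These flags can be combined when invoking the training scripts. Below is a hedged sketch of what such an invocation might look like; the flag names follow the list above, but the exact argument syntax is an assumption and should be verified against train_nextitnet_sc1.sh and the argument parser in deep_nextitnet.py.

# Hypothetical example only: flag names come from the Key Configuration list,
# while the precise CLI syntax and values are assumptions -- check the
# provided shell scripts for the authoritative usage.
python deep_nextitnet.py \
  --method stackA \
  --data_ratio 0.5 \
  --dilation_count 4 \
  --num_blocks 8 \
  --load_model True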