Adjusting for Autocorrelated Errors in Neural Networks for Time Series

Overview

This repository is the official implementation of the paper "Adjusting for Autocorrelated Errors in Neural Networks for Time Series" (arXiv link).

For simplicity, we only provide the code for time series forecasting. However, it is straightforward to apply our method to other models and other time series tasks, as described in the appendix of the paper.
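As a rough sketch of the forecasting-time adjustment (in NumPy for brevity, with placeholder names — not the repository's exact code, which also handles batching and prepends the series average):

```python
import numpy as np

def adjusted_forecast(model, x, rho):
    """Sketch of the autocorrelation adjustment for one-step forecasting.

    x: array of shape (time, features); model: any forecaster that maps
    an adjusted input window to a prediction of y_t - rho * y_{t-1}.
    """
    # Input adjustment: feed the model x_t - rho * x_{t-1} instead of x_t.
    x_adj = x[1:] - rho * x[:-1]
    pred = model(x_adj)
    # Output adjustment: the model predicts y_t - rho * y_{t-1}; for a
    # one-step forecast, y_{t-1} is the last observed value, so add it back.
    return pred + rho * x[-1]

# e.g. with a trivial "forecaster" that returns the last adjusted step:
out = adjusted_forecast(lambda z: z[-1], np.arange(12.0).reshape(4, 3), 0.5)
```

Here `rho` is the error autocorrelation coefficient; in the paper it is learned jointly with the model parameters rather than fixed.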

Requirements

To install requirements:

pip3 install -r requirements.txt

Datasets

Available datasets are located in the directory data/forecasting. Traffic is not included because it exceeds the 100MB file size limit set by GitHub. However, you can download it here and format it into a .npy file. ADI-related datasets are not released because they are proprietary.

To use your own dataset, format it into a NumPy array of shape T×N (time steps × series) and save it into the data directory as a .npy file.
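For example, a synthetic dataset with T = 1000 time steps and N = 8 series could be written like this (the filename is just an illustration):

```python
import numpy as np

# Hypothetical dataset: 1000 time steps (rows) of 8 series (columns),
# i.e. the T x N layout the data loader expects.
data = np.random.randn(1000, 8).astype(np.float32)
np.save("my_dataset.npy", data)  # then place the file under the data directory

loaded = np.load("my_dataset.npy")  # round trip preserves shape and dtype
```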

Training and Evaluation

Example commands can be found in run.sh.

Citation

@inproceedings{sun2021adjusting,
    title={Adjusting for Autocorrelated Errors in Neural Networks for Time Series Regression and Forecasting},
    author={Fan-Keng Sun and Christopher I. Lang and Duane S. Boning},
    booktitle={Advances in Neural Information Processing Systems},
    year={2021},
}
Comments
  • some question about X_t − ρX_(t−1) in the code

    Hello! As the paper says, the model input should be X_t − ρX_(t−1). But the code in forecasting_runner is:

        if args.inp_adj:
            x = torch.cat([avg[None].repeat(bs, 1, 1), x], dim=1)
            x = x[:, 1:] - rho * x[:, :-1]
        prd_y = self.model(x)
        if args.out_adj:
            prd_y += rho * x[:, -1]

    By the time the output adjustment runs, x has already been overwritten, so x[:, -1] is the last step of the adjusted input x[:, 1:] - rho * x[:, :-1] rather than x_{t-1}. Do I misunderstand?

    opened by Lanturewen 1
  • i have some questions about forecasting_runner.py

    Hey, I don't quite understand the error correction applied to the input and output in the prediction part of the code. Why is the average value prepended and then differenced out in the input adjustment, and why is rho times the last column of x added in the output adjustment? These steps don't seem to be reflected in the paper. Can you help me? My email is [email protected]

    opened by pharaohback 1
Owner
Fan-Keng Sun
Ph.D. student @ EECS, MIT