Wind Speed Prediction using LSTMs in PyTorch

Overview

A PyTorch implementation of Deep-Forecast, an LSTM-based wind speed forecasting model.

Setup

  • Clone this repository : git clone https://github.com/Wizaron/deep-forecast-pytorch.git
  • Download and install Anaconda or Miniconda
  • Go to the "deep-forecast-pytorch" directory : cd deep-forecast-pytorch
  • Create environment : conda env create -f conda_environment.yml
  • Activate environment : source activate deep-forecast-pytorch

Code Structure

  • "data" : Stores data and scripts to prepare dataset for training.
  • "lib" : Stores miscellaneous scripts for training and testing.
    • "arch.py" : Defines network architecture
    • "model.py" : Defines model (Minibatching mechanism, optimization, criterion, fit, predict, etc.)
    • "prediction.py" : Metrics and plots to evaluate the performance of the trained model
    • "data.py" : Creates training, validation and testings datasets
    • "loader.py" : Creates Dataset loader for PyTorch
  • "train.py" : Main training script.
  • "test.py" : Main testing script.
  • "settings.py" : Defines hyper-parameters of the model.

Data

  • Data is downloaded from the Iowa Environmental Mesonet (IEM)
  • Download the data and save it under "data/raw"
  • To prepare the dataset, run the scripts in "data/scripts" (one possible windowing step is sketched below)
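
The sketch below shows one common way to turn a raw series into (window, target) training pairs. It is an assumption about what the preparation scripts do, and the synthetic series stands in for real IEM data:

    # Illustrative only; the repository's data/scripts may differ.
    import numpy as np

    def make_windows(series, window=24, horizon=1):
        """Slice a 1-D series into overlapping input windows and targets."""
        xs, ys = [], []
        for i in range(len(series) - window - horizon + 1):
            xs.append(series[i:i + window])
            ys.append(series[i + window:i + window + horizon])
        return np.asarray(xs, dtype=np.float32), np.asarray(ys, dtype=np.float32)

    speeds = np.sin(np.linspace(0, 20, 500))  # synthetic stand-in for a wind-speed series
    X, y = make_windows(speeds)
    print(X.shape, y.shape)  # (476, 24) (476, 1)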

Training and Testing

  • Train : python train.py --data [PATH OF PREPARED DATASET]
  • Test : python test.py --data [PATH OF PREPARED DATASET] --model [PATH OF THE SAVED MODEL]
  • For more info : python train.py --help, python test.py --help (a sketch of the underlying training loop follows)
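
As a rough picture of what happens under the hood when train.py runs, a minimal fit loop might look like the following. The optimizer, loss, and logging are assumptions rather than the repository's actual settings (see "settings.py" for the real hyper-parameters):

    # Minimal sketch of a fit loop; details are assumptions, not the repo's code.
    import torch
    import torch.nn as nn

    def fit(model, loader, epochs=10, lr=1e-3, device="cpu"):
        """Train with Adam on mean-squared error; one log line per epoch."""
        model.to(device)
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        loss_fn = nn.MSELoss()
        for epoch in range(epochs):
            total = 0.0
            for xb, yb in loader:
                xb, yb = xb.to(device), yb.to(device)
                opt.zero_grad()
                loss = loss_fn(model(xb), yb)
                loss.backward()
                opt.step()
                total += loss.item() * xb.size(0)
            print(f"epoch {epoch}: train MSE {total / len(loader.dataset):.4f}")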

train.py

  • It saves models and logs under "models".
  • At the end of training, it saves predictions under "outputs".

test.py

  • It saves predictions under the directory of the saved model (a metric sketch follows).
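
For reference, the kind of metrics "prediction.py" evaluates could be computed as below; the exact metrics and plots in the repository may differ:

    # Hedged example of MAE/RMSE evaluation; data values are made up.
    import numpy as np

    def mae(y_true, y_pred):
        return float(np.mean(np.abs(y_true - y_pred)))

    def rmse(y_true, y_pred):
        return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

    y_true = np.array([5.1, 4.8, 6.0])  # observed wind speeds
    y_pred = np.array([5.0, 5.2, 5.7])  # model forecasts
    print(f"MAE: {mae(y_true, y_pred):.3f}  RMSE: {rmse(y_true, y_pred):.3f}")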

Sample Outputs

Sample prediction plots of the trained model are available in the repository.