Awesome Treasure of Transformers Models Collection

Overview

🧑‍💻 👩‍💻 A collection of NLP deep learning models, each with a blog, video, official repo, and runnable code in a Jupyter Notebook 🧑‍💻 👩‍💻


| Sr No | Algorithm Name | Year | Blog | Video | Official Repo | Code |
|---|---|---|---|---|---|---|
| 1 | GPT-Neo | 2021 | | Youtube | | Open In Colab |
| 2 | Transformer | 2017 | | Youtube | | Open In Colab |
| 3 | BERT | 2018 | | Youtube | | Open In Colab |
| 4 | GPT | 2018 | | Youtube | | Open In Colab |
| 5 | Universal Transformer | 2018 | | Youtube | | Open In Colab |
| 6 | T-D | 2018 | | Youtube | | Open In Colab |
| 7 | GPT-2 | 2019 | | Youtube | | Open In Colab |
| 8 | T5 | 2019 | | Youtube | | Open In Colab |
| 9 | BART | 2019 | | Youtube | | Open In Colab |
| 10 | XLNet | 2019 | | Youtube | | Open In Colab |
| 11 | ALBERT | 2019 | | Youtube | | Open In Colab |
| 12 | Distil-BERT | 2019 | | Youtube | | Open In Colab |
| 13 | Transformer-XL | 2019 | | Youtube | | Open In Colab |
| 14 | XLM | 2019 | | Youtube | | Open In Colab |
| 15 | ViLBERT | 2019 | | Youtube | | Open In Colab |
| 16 | Sparse Transformer | 2019 | | Youtube | | Open In Colab |
| 17 | Levenshtein Transformer | 2019 | | Youtube | | Open In Colab |
| 18 | CTRL | 2019 | | Youtube | | Open In Colab |
| 19 | VideoBERT | 2019 | | Youtube | | Open In Colab |
| 20 | Compressive Transformer | 2019 | | Youtube | | Open In Colab |
| 21 | CuBERT | 2019 | | Youtube | | Open In Colab |
| 22 | BP-Transformer | 2019 | | Youtube | | Open In Colab |
| 23 | Adaptively Sparse Transformer | 2019 | | Youtube | | Open In Colab |
| 24 | Sandwich Transformer | 2019 | | Youtube | | Open In Colab |
| 25 | FSMT | 2019 | | Youtube | | Open In Colab |
| 26 | LXMERT | 2019 | | Youtube | | Open In Colab |
| 27 | VisualBERT | 2019 | | Youtube | | Open In Colab |
| 28 | GPT-3 | 2020 | | Youtube | | Open In Colab |
| 29 | ELECTRA | 2020 | | Youtube | | Open In Colab |
| 30 | Electric | 2020 | | Youtube | | Open In Colab |
| 31 | LongFormer | 2020 | | Youtube | | Open In Colab |
| 32 | mBART | 2020 | | Youtube | | Open In Colab |
| 33 | Performer | 2020 | | Youtube | | Open In Colab |
| 34 | ETC | 2020 | | Youtube | | Open In Colab |
| 35 | CodeBERT | 2020 | | Youtube | | Open In Colab |
| 36 | mT5 | 2020 | | Youtube | | Open In Colab |
| 37 | Reformer | 2020 | | Youtube | | Open In Colab |
| 38 | DeBERTa & DeBERTa-v2 | 2020 | | Youtube | | Open In Colab |
| 39 | Linformer | 2020 | | Youtube | | Open In Colab |
| 40 | RAG | 2020 | | Youtube | | Open In Colab |
| 41 | ProphetNet | 2020 | | Youtube | | Open In Colab |
| 42 | BigBird | 2020 | | Youtube | | Open In Colab |
| 43 | PLATO-2 | 2020 | | Youtube | | Open In Colab |
| 44 | Routing Transformer | 2020 | | Youtube | | Open In Colab |
| 45 | DeeBERT | 2020 | | Youtube | | Open In Colab |
| 46 | DynaBERT | 2020 | | Youtube | | Open In Colab |
| 47 | TernaryBERT | 2020 | | Youtube | | Open In Colab |
| 48 | MobileBERT | 2020 | | Youtube | | Open In Colab |
| 49 | Bort | 2020 | | Youtube | | Open In Colab |
| 50 | DeLighT | 2020 | | Youtube | | Open In Colab |
| 51 | PAR Transformer | 2020 | | Youtube | | Open In Colab |
| 52 | ConvBERT | 2020 | | Youtube | | Open In Colab |
| 53 | IB-BERT | 2020 | | Youtube | | Open In Colab |
| 54 | MacBERT | 2020 | | Youtube | | Open In Colab |
| 55 | RealFormer | 2020 | | Youtube | | Open In Colab |
| 56 | Sinkhorn Transformer | 2020 | | Youtube | | Open In Colab |
| 57 | SongNet | 2020 | | Youtube | | Open In Colab |
| 58 | Funnel Transformer | 2020 | | Youtube | | Open In Colab |
| 59 | SC-GPT | 2020 | | Youtube | | Open In Colab |
| 60 | SMITH | 2020 | | Youtube | | Open In Colab |
| 61 | BinaryBERT | 2020 | | Youtube | | Open In Colab |
| 62 | SqueezeBERT | 2020 | | Youtube | | Open In Colab |
| 63 | Feedback Transformer | 2020 | | Youtube | | Open In Colab |
| 64 | CamemBERT | 2020 | | Youtube | | Open In Colab |
| 65 | CPM | 2020 | | Youtube | | Open In Colab |
| 66 | DialoGPT | 2020 | | Youtube | | Open In Colab |
| 67 | DPR | 2020 | | Youtube | | Open In Colab |
| 68 | FlauBERT | 2020 | | Youtube | | Open In Colab |
| 69 | HerBERT | 2020 | | Youtube | | Open In Colab |
| 70 | LayoutLM | 2020 | | Youtube | | Open In Colab |
| 71 | LED | 2020 | | Youtube | | Open In Colab |
| 72 | LUKE | 2020 | | Youtube | | Open In Colab |
| 73 | M2M100 | 2020 | | Youtube | | Open In Colab |
| 74 | MBart and MBart-50 | 2020 | | Youtube | | Open In Colab |
| 75 | MegatronBERT | 2020 | | Youtube | | Open In Colab |
| 76 | MegatronGPT2 | 2020 | | Youtube | | Open In Colab |
| 77 | MPNet | 2020 | | Youtube | | Open In Colab |
| 78 | Pegasus | 2020 | | Youtube | | Open In Colab |
| 79 | PhoBERT | 2020 | | Youtube | | Open In Colab |
| 80 | QDQBERT | 2020 | | Youtube | | Open In Colab |
| 81 | RemBERT | 2020 | | Youtube | | Open In Colab |
| 82 | RetriBERT | 2020 | | Youtube | | Open In Colab |
| 83 | Speech2Text | 2020 | | Youtube | | Open In Colab |
| 84 | T5v1.1 | 2020 | | Youtube | | Open In Colab |
| 85 | TAPAS | 2020 | | Youtube | | Open In Colab |
| 86 | Wav2Vec2 | 2020 | | Youtube | | Open In Colab |
| 87 | XLM-ProphetNet | 2020 | | Youtube | | Open In Colab |
| 88 | XLM-RoBERTa | 2020 | | Youtube | | Open In Colab |
| 89 | XLSR-Wav2Vec2 | 2020 | | Youtube | | Open In Colab |
| 90 | Switch Transformer | 2021 | | Youtube | | Open In Colab |
| 91 | TNT | 2021 | | Youtube | | Open In Colab |
| 92 | Adaptive Span Transformer | 2021 | | Youtube | | Open In Colab |
| 93 | Primer | 2021 | | Youtube | | Open In Colab |
| 94 | Charformer | 2021 | | Youtube | | Open In Colab |
| 95 | MATE | 2021 | | Youtube | | Open In Colab |
| 96 | Nystromformer | 2021 | | Youtube | | Open In Colab |
| 97 | Subformer | 2021 | | Youtube | | Open In Colab |
| 98 | ESACL | 2021 | | Youtube | | Open In Colab |
| 99 | PermuteFormer | 2021 | | Youtube | | Open In Colab |
| 100 | NormFormer | 2021 | | Youtube | | Open In Colab |
| 101 | Fastformer | 2021 | | Youtube | | Open In Colab |
| 102 | AutoTinyBERT | 2021 | | Youtube | | Open In Colab |
| 103 | EGT | 2021 | | Youtube | | Open In Colab |
| 104 | Chinese Pre-trained Unbalanced Transformer | 2021 | | Youtube | | Open In Colab |
| 105 | GANFormer | 2021 | | Youtube | | Open In Colab |
| 106 | ClipBERT | 2021 | | Youtube | | Open In Colab |
| 107 | CodeT5 | 2021 | | Youtube | | Open In Colab |
| 108 | I-BERT | 2021 | | Youtube | | Open In Colab |
| 109 | ByT5 | 2021 | | Youtube | | Open In Colab |
| 110 | CANINE | 2021 | | Youtube | | Open In Colab |
| 111 | FNet | 2021 | | Youtube | | Open In Colab |
| 112 | LayoutLMV2 | 2021 | | Youtube | | Open In Colab |
| 113 | LayoutXLM | 2021 | | Youtube | | Open In Colab |
| 114 | GPT-J | 2021 | | Youtube | | Open In Colab |
| 115 | Hubert | 2021 | | Youtube | | Open In Colab |
| 116 | Perceiver | 2021 | | Youtube | | Open In Colab |
| 117 | RoFormer | 2021 | | Youtube | | Open In Colab |
| 118 | SegFormer | 2021 | | Youtube | | Open In Colab |
| 119 | SEW | 2021 | | Youtube | | Open In Colab |
| 120 | SEW-D | 2021 | | Youtube | | Open In Colab |
| 121 | Speech2Text2 | 2021 | | Youtube | | Open In Colab |
| 122 | Splinter | 2021 | | Youtube | | Open In Colab |
| 123 | TrOCR | 2021 | | Youtube | | Open In Colab |
| 124 | UniSpeech | 2021 | | Youtube | | Open In Colab |
| 125 | UniSpeech-SAT | 2021 | | Youtube | | Open In Colab |
| 126 | MarianMT | - | | | | Open In Colab |
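Nearly every model in the table above builds on the scaled dot-product attention introduced with the original Transformer (entry 2). As a rough, dependency-free illustration of that core operation (this sketch is not taken from any of the linked notebooks; the function and variable names are our own):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.
    Q, K, V are lists of vectors (lists of floats)."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Tiny example: 2 queries, 3 key/value pairs, d_k = 2.
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
print(attention(Q, K, V))
```

Real models differ mainly in how they learn Q, K, and V projections, stack attention with feed-forward layers, and (for the efficiency-oriented entries such as Linformer, Performer, or BigBird) approximate or sparsify this quadratic attention matrix.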

Instructions

  • Some blog links point to posts written in other languages (e.g. Chinese, Korean); please use Google Translate to read them.

How to Contribute


If you want to contribute to this project, please email us at [email protected].

🙏 🙏 Special thanks to Komal Lamba for contributing.


Copyright for the source code belongs to the original author(s). However, under fair use, you are encouraged to fork and contribute minor corrections and updates for the benefit of readers.


Thanks for reading!


Owner
Ashish Patel
AI Researcher & Senior Data Scientist at Softweb Solutions, Avnet Solutions (Fortune 500) | Rank 3 Kaggle Kernel Master