Transformers are all you need
In this workshop we will explore state-of-the-art NLP transformer models such as T5 and BERT, then build a model using the Hugging Face Transformers framework.
Table of Contents
The workshop is divided into three parts:
- Introduction to Transformers: why the hype
- A sneak peek at the theory behind Transformers
- Quick tour of the Hugging Face framework
Note that you can always open the notebooks on Google Colab (no need to install anything); you just need a stable internet connection:
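As a taste of the quick tour above, the snippet below runs a ready-made sentiment-analysis pipeline. This is a minimal sketch: the checkpoint name `distilbert-base-uncased-finetuned-sst-2-english` and the example sentence are illustrative assumptions, not part of the workshop tasks.

```python
# Minimal Hugging Face quick-tour sketch: a sentiment-analysis pipeline.
# The model checkpoint below is an assumption; any compatible one works.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

result = classifier("Transformers are all you need!")[0]
# result is a dict with a predicted "label" and a confidence "score"
print(result["label"], round(result["score"], 3))
```

The `pipeline` helper downloads the checkpoint on first use, which is why a stable internet connection matters on Colab.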
2. How to get started
- Fork this repository
- Create a branch named after yourself
- Go through the notebook and complete all tasks
- Submit a pull request
Your task is to fine-tune a classification model:
- Use Hugging Face Transformers and Datasets.
- Fine-tune it on one of the classification tasks of the GLUE benchmark (CoLA, to be specific).
- Use a checkpoint from the Hub ("distilbert-base-uncased", for example).
- Once finished, submit a pull request to this repo; make sure to place your .ipynb file in the
Useful resources: text_classification