WORK IN PROGRESS...
Notebooks from the Seminar:
Human Machine Readable < WS21/22
Introduction to programming
Georg Trogemann, Christian Heck, Mattis Kuhn, Ting Chun Liu
Basic Seminar Material/Sculpture/Code
Compact seminar, 11 am - 4 pm | 31.01.2022 until 11.02.2022
Online @ BigBlueButton
Academy of Media Arts Cologne
Email: [email protected], [email protected], [email protected]
Description
The generation of text by means of deep neural nets (natural language generation, NLG) has spread rapidly. Text-based dialog systems such as chatbots, assistance systems (Alexa/Siri) and robot journalism are increasingly used in news portals, e-commerce and social media; in short, wherever context-aware, natural-sounding or reader-friendly texts are to be generated from structured data. Deep-writing techniques have also found their way into the arts and literature with the help of models such as ELMo (Embeddings from Language Models), BERT (Bidirectional Encoder Representations from Transformers) and GPT-2/3 (Generative Pre-trained Transformer).
The goal of the seminar is for each student to have produced, by the end, a text based on one of the neural language models mentioned above, whether poem, prose, novella, essay, manifesto, shopping list or social bot.
The course is intended as a general introduction to programming. It teaches not only the skills to generate texts, but also the basics of Python, a universal programming language that can also be used to program images, PDFs or web applications. Furthermore, Python is the most widely used language for programming artificial intelligence, especially deep neural nets.
We ask for registration at [email protected] by 20.09.2021. No prior knowledge of programming is required to participate in the basic seminar.
Course
Week 1 (31.1. - 4.2.)
Hands on Python
- files
- ...
Week 2 (7.2. - 11.2.)
Hands on Markov Chains
- link??? n_order_text_generation.ipynb < text generation from zero-order (pure randomness) via first-order (probabilities from word frequencies) and second-order (Markov chain based on one token) to n-order (Markov chain based on n tokens); see the sketch after this list
- link??? markov_simple.ipynb < a simple, ready-to-use version of n-order Markov chains based on n_order_text_generation.ipynb
- link??? interactive_text_generation.ipynb < next-word recommendation via Markov chain
- link??? markov_basic.ipynb < word-level Markov chain
- link??? markov_n-grams.ipynb < word-level Markov chain based on n-grams
- link??? dictionary_list.ipynb < dictionaries & lists ...
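The linked notebooks are the reference; purely as an orientation, the core idea of a word-level, n-order Markov chain can be sketched in a few lines of Python. The function names (build_markov_chain, generate) and the corpus file name are made up for this sketch and do not come from the notebooks.

```python
import random
from collections import defaultdict

def build_markov_chain(text, order=2):
    """Map each n-gram (tuple of `order` words) to the list of words that follow it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, length=50, seed=None):
    """Start from a random (or given) n-gram and repeatedly pick one of its observed successors."""
    key = seed if seed in chain else random.choice(list(chain.keys()))
    output = list(key)
    order = len(key)
    for _ in range(length):
        successors = chain.get(tuple(output[-order:]))
        if not successors:   # dead end: this n-gram only occurs at the very end of the corpus
            break
        output.append(random.choice(successors))
    return " ".join(output)

corpus = open("corpus.txt", encoding="utf-8").read()   # any plain-text file
chain = build_markov_chain(corpus, order=2)
print(generate(chain, length=40))
```

A higher order stays closer to the source text; a lower order produces more surprising (and more broken) sequences.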
Hands on RNNs/LSTMs
- link??? Text generation with LSTM < notebook
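The LSTM notebook is not reproduced here; the following is a minimal character-level sketch with Keras, assuming a plain-text file corpus.txt as training material. The file name, network size and all hyperparameters are illustrative choices, not taken from the notebook.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical training corpus; any plain-text file will do.
text = open("corpus.txt", encoding="utf-8").read().lower()
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}

# Cut the corpus into overlapping sequences of maxlen characters,
# each labelled with the character that follows it.
maxlen, step = 40, 3
sentences = [text[i:i + maxlen] for i in range(0, len(text) - maxlen, step)]
next_chars = [text[i + maxlen] for i in range(0, len(text) - maxlen, step)]

# One-hot encode inputs and targets.
x = np.zeros((len(sentences), maxlen, len(chars)), dtype=bool)
y = np.zeros((len(sentences), len(chars)), dtype=bool)
for i, sentence in enumerate(sentences):
    for t, c in enumerate(sentence):
        x[i, t, char_to_idx[c]] = True
    y[i, char_to_idx[next_chars[i]]] = True

# A single LSTM layer followed by a softmax over the character set.
model = keras.Sequential([
    keras.Input(shape=(maxlen, len(chars))),
    layers.LSTM(128),
    layers.Dense(len(chars), activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(x, y, batch_size=128, epochs=5)

# Generate text by repeatedly sampling the next character.
seed = text[:maxlen]
generated = seed
for _ in range(300):
    x_pred = np.zeros((1, maxlen, len(chars)), dtype=bool)
    for t, c in enumerate(seed):
        x_pred[0, t, char_to_idx[c]] = True
    probs = model.predict(x_pred, verbose=0)[0].astype("float64")
    probs /= probs.sum()
    next_char = chars[np.random.choice(len(chars), p=probs)]
    generated += next_char
    seed = seed[1:] + next_char
print(generated)
```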
Hands on GPT-2/3
- GPT-2 (see the sketch below)
- Copilot
- AI-Dungeon
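For GPT-2, one common way to experiment locally is the Hugging Face transformers library. Whether the seminar uses this library is an assumption here; the sketch only shows how little code a first experiment needs, and the prompt is made up.

```python
from transformers import pipeline

# Downloads the small English GPT-2 model from the Hugging Face hub on first run.
generator = pipeline("text-generation", model="gpt2")

prompt = "The sculpture in the courtyard began to speak:"
result = generator(prompt, max_length=60, do_sample=True, temperature=0.9,
                   num_return_sequences=1)
print(result[0]["generated_text"])
```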
General Info
Executing the Notebooks:
Folder in KHM-Cloud:
- ??Here?? you can find some material for the seminar
Anaconda & Jupyter Notebooks
Hands on Jupyter Notebooks
- little-helpers-in-jupyter-notebooks.ipynb < introduction to Jupyter Notebooks (general info, installation & help functions)
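As a taste of what such little helpers look like, the cell below shows a few introspection tools available in a Jupyter/IPython code cell. Which helpers the notebook actually covers is an assumption; the `?` and `%` lines only work inside IPython/Jupyter, not in plain Python.

```python
# A few introspection helpers, to be run in a Jupyter/IPython code cell:
import random

help(random.choice)   # plain Python: print the docstring of any object
random.choice?        # IPython only: show the docstring in the help pane
random.choice??       # IPython only: also show the source code, if available
%whos                 # IPython magic: list all variables defined in this session
```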
Hands on Markdown
- Jupyter-Notebook: Markdown-basics.ipynb
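Markdown is normally written directly in a Markdown cell; just to illustrate a few of the basics covered there, the same constructs can also be rendered from a code cell. The concrete examples are illustrative and not taken from the notebook.

```python
from IPython.display import Markdown, display

# Render a few basic Markdown constructs from a code cell.
display(Markdown("""
# A heading

*italic*, **bold**, `inline code`

- a list item
- [a link](https://jupyter.org)
"""))
```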
Datasets
- scraper_wikipedia.ipynb < extract the text of specific Wikipedia articles (see the sketch below for the basic idea)
- scrape-load_textcorpora.ipynb < some basic examples and code snippets to scrape, load and walk through datasets
- dataset-list.md < just some resources of datasets & archives
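As a rough sketch of the idea behind scraper_wikipedia.ipynb, one way to fetch the plain text of an article is the public MediaWiki API. The function name fetch_wikipedia_text and the example title are made up for this sketch; the notebook may use a different approach or library.

```python
import requests

def fetch_wikipedia_text(title, lang="en"):
    """Fetch the plain-text extract of one Wikipedia article via the MediaWiki API."""
    url = f"https://{lang}.wikipedia.org/w/api.php"
    params = {
        "action": "query",
        "prop": "extracts",
        "explaintext": 1,   # plain text instead of HTML
        "format": "json",
        "titles": title,
    }
    pages = requests.get(url, params=params, timeout=10).json()["query"]["pages"]
    return next(iter(pages.values())).get("extract", "")

print(fetch_wikipedia_text("Markov chain")[:500])
```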