CodeJ

Overview

A Python project that generates code using either OpenAI's Codex or GPT-J (though GPT-J is not as capable as Codex).

Install requirements

pip install -r requirements.txt

GPT-J

First, set everything up (note: you can raise the temperature for more creative results):

from CodeJ import GenerationJ

lang = "Python"    # target language (see Supported Languages below)
temperature = 0.6  # higher values give more creative output
top_p = 1.0        # nucleus-sampling cutoff
J = GenerationJ("file_name")

Now call the method to generate code:

J.generate("Python", temperature, top_p, "Make a square", "Make a triangle using turtle")

For both the GPT-J and Codex versions, you can pass as many prompts as you want.
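For example, a minimal sketch that collects prompts in a list and unpacks them into the call (these prompt strings are made up for illustration):

prompts = [
    "Reverse a string",
    "Sort a list of tuples by the second item",
    "Read a CSV file and print each row",
]
J.generate(lang, temperature, top_p, *prompts)  # each string is passed as one prompt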

Codex

from CodeJ import Generation3

temperature = 0.3  # lower values give more deterministic output
gpt3 = Generation3("file_name", "Replace with your API key")
gpt3.generation("Python", temperature, "Make a square using the turtle module", "Take the power of each number in an array", "Print CodeJ is awesome")
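Rather than hard-coding the key, you can read it from an environment variable; OPENAI_API_KEY is just an assumed variable name here, not something CodeJ requires:

import os

api_key = os.environ["OPENAI_API_KEY"]  # raises KeyError if the variable is unset
gpt3 = Generation3("file_name", api_key)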

Supported Languages

Languages = ["MATLAB", "Julia", "C#", "Python", "Python3", "C++", "Q#", "F#", "JavaScript", "Kotlin", "Dart", "Java", "C", "CSS", "HTML"]
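If you want to fail fast on a typo, a small helper (not part of CodeJ's API, just a sketch) can validate the language before generating:

def check_language(lang):
    # Raise early instead of sending an unsupported language to the model.
    if lang not in Languages:
        raise ValueError(f"Unsupported language: {lang!r}")
    return lang

check_language("Python")  # OK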

Resources

Sign up for an API key here: https://beta.openai.com/signup. If you prefer, use the free GPT-J version instead.
