Second Mate

An open-source, mini imitation of GitHub Copilot for Emacs, using EleutherAI's GPT-Neo-2.7B (via the Hugging Face Model Hub).

Overview

GPT-Neo-2.7B is a much smaller model than the one behind Copilot, so it will likely not be as effective, but it can still be interesting to play around with!

(Demo: ./assets/demo1.gif)

Setup

Inference End / Backend

  1. Set the device to “cpu” or “cuda” in serve/server.py.
  2. The “priming” prompt is currently written for Python. If you want, modify it for another language or turn it off (priming subjectively seems to help).
  3. Launch serve/server.py. This starts a Flask app that exposes the model for sampling via a REST API.
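
The priming mentioned in step 2 amounts to prepending a short language hint to the user's prompt before sampling. A minimal sketch of the idea (the function name and header strings are illustrative assumptions, not the actual contents of serve/server.py):

```python
def prime_prompt(prompt, language="python"):
    """Prepend a short language-specific header so the model is nudged
    toward generating code in the expected language (illustrative only)."""
    headers = {
        "python": "# Python 3\n",
        "javascript": "// JavaScript\n",
    }
    # Unknown languages fall back to no priming at all
    return headers.get(language, "") + prompt

# Example: prime a Python snippet before sending it to the model
primed = prime_prompt("def fib(n):")
```

Turning priming off then simply means passing the prompt through unchanged.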

Emacs

  1. In emacs/secondmate.py, set the URL to “localhost” or the address the API is running on.
  2. Configure the Python interpreter and script path in emacs/secondmate.el. NOTE: The local Python script is a temporary patch which will be replaced by a GET request made directly in Elisp.
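
The temporary Python glue script roughly amounts to a GET request against the server. A hedged sketch using only the standard library (the endpoint path and parameter name are assumptions, not necessarily what emacs/secondmate.py actually uses; port 9900 is the default seen in user reports):

```python
import json
import urllib.parse
import urllib.request

def build_url(host, port, prompt):
    # Encode the prompt as a query parameter for the GET request
    query = urllib.parse.urlencode({"prompt": prompt})
    return f"http://{host}:{port}/complete?{query}"

def fetch_completion(host, port, prompt):
    # Send the request and decode the server's JSON response
    with urllib.request.urlopen(build_url(host, port, prompt)) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Example URL construction (no network needed):
url = build_url("localhost", 9900, "def add(a, b):")
```

Replacing this with a native Elisp GET request would remove the extra Python process entirely.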
Comments
  • Support launching server automatically from Emacs

    There isn't really a reason why the server couldn't be launched automatically from Emacs. This requires figuring out a few things though:

    • Locate a suitable Python interpreter.
    • Locate the Python script relative to the package file.
    • Query the user for connection options if necessary (unless they set them explicitly).
    • Pass those options to the Python script and launch it asynchronously.

    The design for this can be lifted from other interaction packages, such as CIDER or Sly. See also #3.

    Another thing worth thinking about is whether this really needs to use an HTTP service. If this is designed to run on localhost, talking with a subprocess makes a lot more sense and avoids security issues. I wrote a bunch before, though for interaction via Scheme rather than Emacs Lisp.
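
On the server side, passing connection options to the Python script could be sketched like this (the flag names are hypothetical; the current server.py may hard-code these values):

```python
import argparse

def parse_args(argv=None):
    # Connection options Emacs could pass when launching the server
    parser = argparse.ArgumentParser(description="Second Mate server")
    parser.add_argument("--host", default="localhost")
    parser.add_argument("--port", type=int, default=9900)
    parser.add_argument("--device", choices=["cpu", "cuda"], default="cpu")
    return parser.parse_args(argv)

# Example: the options Emacs might pass on launch
args = parse_args(["--port", "9900", "--device", "cpu"])
```

Emacs would then only need to locate the interpreter and script and start the process asynchronously with these flags.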

    discussion 
    opened by wasamasa 1
  • Provide installation instructions for the Python dependencies

    It's customary to provide a requirements.txt file to install Python dependencies. From what I can tell this would be Flask and transformers (see https://huggingface.co/transformers/installation.html).
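
Based on the dependencies named above (plus torch, which appears in the traceback below and is required for model loading), a minimal requirements.txt would presumably look like this — version pins omitted, so treat it as a sketch:

```text
flask
torch
transformers
```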

    opened by wasamasa 0
  • Contacting host...

    Hi, I am at the last stage of the setup and Emacs just gets stuck on “Contacting localhost:9900”, while the terminal outputs “Setting pad_token_id to eos_token_id:50256 for open-end generation”, but nothing happens. What am I meant to do?

    opened by ttuleyb 0
  • Not enough memory error running server.py

    The device is set to "cuda", and changing it to "cpu" returns the same error:

    python server.py

    Traceback (most recent call last):
      File "A:\xxxxxxxxx\emacs-secondmate\serve\server.py", line 10, in <module>
        model = AutoModelForCausalLM.from_pretrained(modelname)
      File "C:\Users\xxxxx\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\transformers\models\auto\auto_factory.py", line 395, in from_pretrained
        return model_class.from_pretrained(pretrained_model_name_or_path, *model_args, config=config, **kwargs)
      File "C:\Users\xxxxx\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\transformers\modeling_utils.py", line 1179, in from_pretrained
        model = cls(config, *model_args, **model_kwargs)
      File "C:\Users\xxxxx\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\transformers\models\gpt_neo\modeling_gpt_neo.py", line 905, in __init__
        self.transformer = GPTNeoModel(config)
      File "C:\Users\xxxxx\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\transformers\models\gpt_neo\modeling_gpt_neo.py", line 708, in __init__
        self.h = nn.ModuleList([GPTNeoBlock(config, layer_id=i) for i in range(config.num_layers)])
      File "C:\Users\xxxxx\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\transformers\models\gpt_neo\modeling_gpt_neo.py", line 708, in <listcomp>
        self.h = nn.ModuleList([GPTNeoBlock(config, layer_id=i) for i in range(config.num_layers)])
      File "C:\Users\xxxxx\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\transformers\models\gpt_neo\modeling_gpt_neo.py", line 542, in __init__
        self.mlp = GPTNeoMLP(inner_dim, config)
      File "C:\Users\xxxxx\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\transformers\models\gpt_neo\modeling_gpt_neo.py", line 521, in __init__
        self.c_fc = nn.Linear(embed_dim, intermediate_size)
      File "C:\Users\xxxxx\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\torch\nn\modules\linear.py", line 81, in __init__
        self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
    RuntimeError: [enforce fail at ..\c10\core\CPUAllocator.cpp:79] data. DefaultCPUAllocator: not enough memory: you tried to allocate 104857600 bytes.
    
    opened by GrahamboJangles 0
  • Allow option of using HuggingFace Accelerated Inference API

    If the user has access to the HuggingFace Accelerated Inference API [0], allow them to configure the appropriate API tokens and run inference on the model via the API rather than having to run the language model server locally.

    [0] https://api-inference.huggingface.co/docs/python/html/index.html
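
Querying the Inference API instead of a local server could be sketched as follows, using only the standard library. The payload shape follows the general Inference API conventions; treat the exact endpoint and response structure as assumptions:

```python
import json
import urllib.request

API_URL = "https://api-inference.huggingface.co/models/EleutherAI/gpt-neo-2.7B"

def build_request(token, prompt):
    # Build an authenticated POST request carrying the prompt as JSON
    payload = json.dumps({"inputs": prompt}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={"Authorization": f"Bearer {token}"},
        method="POST",
    )

def query(token, prompt):
    # Send the request and return the generated text from the response
    with urllib.request.urlopen(build_request(token, prompt)) as resp:
        return json.loads(resp.read().decode("utf-8"))[0]["generated_text"]
```

With this, the Emacs side would only need an API token configured instead of a local GPU or a large amount of RAM.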

    opened by samrawal 0
  • Convert Emacs code to an Emacs package

    • Remove top-level key binding commands and instead document what commands users should bind in their init file.
    • Add package header and footer.
    • Make sure that the package adheres to MELPA's coding guidelines.

    See also #3.

    opened by wasamasa 0
  • Could you explain how to get started with this/install via a package manager in the README?

    Hi, I love the project and would like to try it out. However, as an Emacs noob I'm not sure how to get started with this using something like use-package.

    Additionally, does the plugin automatically launch the server, or would you have to do that manually? If so, could it be made to be automatic?

    opened by shaunsingh 1
Owner
Sam Rawal
AI + Medicine.