Python MoE Libraries (4 repositories)
Tutel MoE: An Optimized Mixture-of-Experts Implementation
Project Tutel: an optimized Mixture-of-Experts implementation. Supported framework: PyTorch. Supported GPUs: CUDA (fp32 + fp16), ROCm (fp32).
344 stars · Dec 29, 2022
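As a rough illustration of how such a layer is wired in, here is a minimal sketch based on the `moe_layer` constructor shown in Tutel's README; the exact argument names (gate_type, experts, hidden_size_per_expert) should be verified against the project's current API, and the optimized kernels target the CUDA/ROCm GPUs noted above.

```python
import torch
from tutel import moe as tutel_moe

# Top-2 gated MoE layer with 2 feed-forward (FFN) experts per node,
# following the moe_layer constructor from Tutel's README.
moe_layer = tutel_moe.moe_layer(
    gate_type={'type': 'top', 'k': 2},
    model_dim=512,
    experts={'type': 'ffn', 'count_per_node': 2,
             'hidden_size_per_expert': 1024},
).cuda()  # Tutel's kernels run on CUDA/ROCm devices

x = torch.randn(4, 16, 512, device='cuda')  # (batch, tokens, model_dim)
y = moe_layer(x)                            # output keeps the input shape
print(y.shape)
```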
Fully asynchronous trace.moe API wrapper
AioMoe is a fully asynchronous trace.moe API wrapper. Installation: install the stable version from PyPI ($ pip install aiomoe) or get it from GitHub.
2 stars · Jun 26, 2022
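A hedged sketch of how an async wrapper like this is typically used: the class name `AioMoe` and the `search` coroutine below are assumptions drawn from the wrapper's description and the underlying trace.moe API (which looks up the anime scene an image frame comes from); check the project README for the real entry points.

```python
import asyncio
from aiomoe import AioMoe  # assumed entry-point class; see the project README


async def main():
    client = AioMoe()
    # trace.moe searches for the source anime of a given frame; a `search`
    # coroutine taking an image URL is assumed here for illustration.
    results = await client.search("https://example.com/frame.jpg")
    print(results)


asyncio.run(main())
```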
A fast MoE implementation for PyTorch
An easy-to-use and efficient system to support the Mixture of Experts (MoE) model for PyTorch.
873 stars · Jan 9, 2023
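This entry matches the FastMoE project (the `fmoe` package); assuming that identification is right, here is a minimal sketch using its documented `FMoETransformerMLP` module, with the argument names taken from the README and therefore worth double-checking.

```python
import torch
from fmoe import FMoETransformerMLP  # FastMoE's Transformer FFN-style MoE layer

# Drop-in MoE replacement for a Transformer feed-forward block: tokens are
# routed among `num_expert` FFN experts. FastMoE relies on a compiled CUDA
# extension, so the layer is placed on a CUDA device.
moe_mlp = FMoETransformerMLP(num_expert=4, d_model=512, d_hidden=2048).cuda()

x = torch.randn(8, 32, 512, device='cuda')  # (batch, seq_len, d_model)
y = moe_mlp(x)                              # output keeps the input shape
print(y.shape)
```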
A modern CLI to download anime automatically from Twist
Kurby is a simple CLI that uses the Twist website and its large collection to download anime automatically and for free.
48 stars · Dec 22, 2022