4 Repositories
Python moe Libraries
Tutel MoE: An Optimized Mixture-of-Experts Implementation
Project Tutel. Supported framework: PyTorch. Supported GPUs: CUDA (fp32 + fp16), ROCm (fp32).
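A minimal sketch of a single Tutel MoE layer, following the usage pattern in the project's README; the constructor keys (`gate_type`, `experts`, etc.) are taken from that example and may differ across Tutel versions:

```python
import torch
from tutel import moe as tutel_moe

# One top-2 gated MoE layer with 2 feed-forward experts per node.
# Keyword names follow Tutel's README example and may vary by release.
moe_layer = tutel_moe.moe_layer(
    gate_type={'type': 'top', 'k': 2},
    model_dim=1024,
    experts={
        'type': 'ffn',
        'count_per_node': 2,
        'hidden_size_per_expert': 4096,
        'activation_fn': lambda x: torch.nn.functional.relu(x),
    },
)

x = torch.randn(8, 1024)   # (tokens, model_dim)
y = moe_layer(x)           # tokens routed through their top-2 experts
print(y.shape)             # torch.Size([8, 1024])
```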
Fully asynchronous trace.moe API wrapper
AioMoe is a fully asynchronous trace.moe API wrapper. Installation: install the stable version from PyPI with `pip install aiomoe`, or get it from GitHub.
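Rather than guessing at aiomoe's exact class and method names, here is a minimal `aiohttp` sketch of the underlying trace.moe REST call (`GET https://api.trace.moe/search?url=...`) that such a wrapper automates; the image URL below is a placeholder:

```python
import asyncio
import aiohttp

TRACE_MOE_SEARCH = "https://api.trace.moe/search"  # public trace.moe endpoint

async def search_by_url(image_url: str) -> dict:
    """Search trace.moe for the anime scene matching an image URL."""
    async with aiohttp.ClientSession() as session:
        async with session.get(TRACE_MOE_SEARCH, params={"url": image_url}) as resp:
            resp.raise_for_status()
            return await resp.json()

async def main():
    data = await search_by_url("https://example.com/frame.jpg")  # placeholder URL
    best = data["result"][0]  # results come back sorted by similarity
    print(best["filename"], best["episode"], best["similarity"])

asyncio.run(main())
```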
A fast MoE impl for PyTorch
An easy-to-use and efficient system to support the Mixture of Experts (MoE) model for PyTorch.
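FastMoE provides ready-made layers such as `FMoETransformerMLP` that can stand in for a Transformer's feed-forward block. A minimal sketch, assuming the argument names below match the release you install:

```python
import torch
from fmoe import FMoETransformerMLP

# Drop-in MoE replacement for a Transformer feed-forward block:
# 32 experts, each a d_model -> d_hidden -> d_model MLP.
moe_ffn = FMoETransformerMLP(
    num_expert=32,
    d_model=1024,
    d_hidden=4096,
    activation=torch.nn.GELU(),
)

x = torch.randn(16, 1024)   # (tokens, d_model)
y = moe_ffn(x)
print(y.shape)              # torch.Size([16, 1024])
```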
A modern CLI to download animes automatically from Twist
Kurby is a nice and simple CLI that uses the Twist website and its huge collection to download anime automatically and for free.
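A sketch of the likely workflow; the install line and `--help` flag are conventional, but the download invocation is an assumption, so consult the project's README for the real commands:

```
$ pip install kurby        # install from PyPI
$ kurby --help             # list available commands and options
$ kurby <anime-slug>       # assumed invocation: download a series' episodes
```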