10 Repositories
Python accelerator libraries
Azure MLOps (v2) solution accelerators.
Azure MLOps (v2) solution accelerator. Welcome to the MLOps (v2) solution accelerator repository! This project is intended to serve as the starting point for MLOps implementation in Azure Machine Learning.
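The accelerator's templates ultimately submit Azure Machine Learning (v2) jobs. As a rough, hedged sketch of what that layer looks like with the azure-ai-ml Python SDK (the workspace identifiers, compute name, environment, and train.py script below are placeholders, not part of the accelerator itself):

```python
# Minimal sketch of submitting a training job with the Azure ML v2 Python SDK
# (the layer the accelerator's pipelines build on). Workspace identifiers,
# the compute cluster name, and ./src/train.py are placeholders.
from azure.ai.ml import MLClient, command
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

job = command(
    code="./src",                        # folder containing train.py
    command="python train.py --reg_rate 0.01",
    environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest",  # any environment in your workspace
    compute="cpu-cluster",
    display_name="train-model",
)

returned_job = ml_client.jobs.create_or_update(job)   # submit the job
print(returned_job.studio_url)                        # link to monitor it in the studio
```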
QDax is a tool to accelerate Quality-Diversity (QD) algorithms through hardware accelerators and massive parallelism
QDax: Accelerated Quality-Diversity. QDax is a tool to accelerate Quality-Diversity (QD) algorithms through hardware accelerators and massive parallelism.
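QD algorithms such as MAP-Elites keep an archive of diverse, high-performing solutions rather than a single optimum. The sketch below illustrates that loop in plain NumPy; it is not QDax's API (QDax vectorises the evaluation and mutation steps with JAX so they run batched on GPUs/TPUs):

```python
# Illustrative MAP-Elites loop (the core QD algorithm QDax accelerates).
# This is NOT QDax's API; QDax batches these steps with JAX on accelerators.
import numpy as np

rng = np.random.default_rng(0)

GRID = 16                 # cells per behaviour-descriptor dimension
archive = {}              # (i, j) cell -> (fitness, solution)

def evaluate(x):
    """Toy task: fitness = -||x||^2, behaviour descriptor = first two coordinates."""
    return -float(np.sum(x ** 2)), np.clip(x[:2], -1.0, 1.0)

def cell_of(desc):
    idx = np.floor((desc + 1.0) / 2.0 * GRID).astype(int)
    return tuple(np.clip(idx, 0, GRID - 1))

for step in range(10_000):
    if archive and step >= 128:
        # variation phase: mutate a randomly chosen elite from the archive
        key = list(archive.keys())[rng.integers(len(archive))]
        x = archive[key][1] + 0.1 * rng.standard_normal(8)
    else:
        x = rng.uniform(-1.0, 1.0, size=8)       # random initialisation phase
    fitness, desc = evaluate(x)
    cell = cell_of(desc)
    # a solution is kept only if it beats the current occupant of its cell
    if cell not in archive or fitness > archive[cell][0]:
        archive[cell] = (fitness, x)

best = max(f for f, _ in archive.values())
print(f"{len(archive)} cells filled, best fitness {best:.3f}")
```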
A C-like hardware description language (HDL) adding high-level synthesis (HLS)-like automatic pipelining as a language construct/compiler feature.
PipelineC
Pyccel stands for Python extension language using accelerators.
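Pyccel translates type-annotated Python into C or Fortran and compiles it. A hedged sketch of a typical kernel follows; the string-based array annotations mirror Pyccel's documented style, and the epyccel just-in-time call is assumed from its docs (exact options may differ between versions):

```python
import numpy as np

# Plain Python kernel with Pyccel-style string type annotations; Pyccel can
# translate this to C or Fortran via the `pyccel` CLI or, as assumed below,
# via the epyccel just-in-time helper (options vary by version).
def dot(a: 'float[:]', b: 'float[:]') -> float:
    s = 0.0
    for i in range(a.shape[0]):
        s = s + a[i] * b[i]
    return s

if __name__ == '__main__':
    from pyccel.epyccel import epyccel

    fast_dot = epyccel(dot)        # returns a compiled version of `dot`
    x = np.ones(1000)
    print(fast_dot(x, x))          # 1000.0
```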
A Python library for differentiable optimal control on accelerators.
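Because the dynamics and cost are written in an autodiff framework, the entire trajectory cost is differentiable with respect to the control sequence, and the solver can run on GPU/TPU. The sketch below shows that generic pattern in plain JAX; it is not this repository's own API:

```python
# Generic differentiable optimal control with JAX (not this repository's API):
# roll out simple dynamics, accumulate a quadratic cost, and descend the
# gradient of the total cost with respect to the whole control sequence.
import jax
import jax.numpy as jnp

dt = 0.1
T = 20                                       # planning horizon (illustrative)

def dynamics(x, u):
    # double integrator: state = [position, velocity], control = acceleration
    pos, vel = x
    return jnp.array([pos + dt * vel, vel + dt * u[0]])

def total_cost(us, x0):
    def step(x, u):
        x_next = dynamics(x, u)
        c = jnp.sum(x_next ** 2) + 0.01 * jnp.sum(u ** 2)
        return x_next, c
    _, costs = jax.lax.scan(step, x0, us)
    return jnp.sum(costs)

grad_fn = jax.jit(jax.grad(total_cost))      # d(cost)/d(control sequence), compiled

x0 = jnp.array([1.0, 0.0])
us = jnp.zeros((T, 1))
for _ in range(500):
    us = us - 0.05 * grad_fn(us, x0)         # plain gradient descent on the controls

print(total_cost(us, x0))
```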
An exploration of log domain "alternative floating point" for hardware ML/AI accelerators.
This repository contains the SystemVerilog RTL, C++, HLS (Intel FPGA OpenCL to wrap RTL code) and Python needed to reproduce the numerical results in the accompanying paper.
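In a log-domain (logarithmic number system) format, a value is stored as a sign plus the log of its magnitude, so multiplication collapses to an addition of exponents while addition needs a correction term; that trade-off is what makes it interesting for multiply-heavy ML accelerators. A toy Python illustration of the arithmetic (not the repository's RTL):

```python
# Toy logarithmic-number-system arithmetic: value = sign * 2**log2mag.
# Multiplication becomes addition of exponents; addition needs a correction
# term (a "Gaussian logarithm"), which real hardware approximates with small
# lookup tables. This only illustrates the idea; zero is not representable.
import math

def encode(x):
    return (1 if x >= 0 else -1, math.log2(abs(x)))

def decode(v):
    sign, log2mag = v
    return sign * 2.0 ** log2mag

def lns_mul(a, b):
    # exact in the log domain: multiply magnitudes by adding their logs
    return (a[0] * b[0], a[1] + b[1])

def lns_add(a, b):
    # same-sign case: 2**p + 2**q = 2**max(p,q) * (1 + 2**-(|p-q|))
    if a[1] < b[1]:
        a, b = b, a
    d = b[1] - a[1]                      # d <= 0
    if a[0] == b[0]:
        return (a[0], a[1] + math.log2(1.0 + 2.0 ** d))
    # opposite signs (equal magnitudes would need a zero special case, omitted)
    return (a[0], a[1] + math.log2(1.0 - 2.0 ** d))

x, y = encode(3.0), encode(-0.5)
print(decode(lns_mul(x, y)))   # -1.5
print(decode(lns_add(x, y)))   #  2.5
```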
The Hailo Model Zoo includes pre-trained models and a full building and evaluation environment
Hailo Model Zoo. The Hailo Model Zoo provides pre-trained models for high-performance deep learning applications. Using the Hailo Model Zoo you can measure the accuracy of each model.
CoSA: Scheduling by Constrained Optimization for Spatial Accelerators
CoSA is a scheduler for spatial DNN accelerators that generates high-performance schedules in one shot using mixed-integer programming.
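The key idea is to encode the mapping space as a mixed-integer program, working in the log domain so products of tile sizes become linear sums, and let an off-the-shelf solver produce the schedule in a single solve. Below is a toy formulation in that spirit using PuLP; it is not CoSA's actual model, constraints, or variable layout:

```python
# Toy "scheduling as MIP" sketch in the spirit of CoSA (not its real model):
# pick one tile size per loop dimension, in the log domain so the
# buffer-capacity constraint on the product of tiles becomes a linear sum.
import math
import pulp

loops = {"M": [1, 2, 4, 8, 16], "K": [1, 2, 4, 8, 16]}   # candidate tile sizes
BUFFER_WORDS = 64                                        # on-chip capacity

prob = pulp.LpProblem("toy_mapping", pulp.LpMaximize)

# Binary choice variable x[d][t] = 1 iff dimension d uses tile size t.
x = {d: {t: pulp.LpVariable(f"x_{d}_{t}", cat="Binary") for t in ts}
     for d, ts in loops.items()}

for d, ts in loops.items():                              # exactly one tile per dimension
    prob += pulp.lpSum(x[d][t] for t in ts) == 1

# log2(tile_M * tile_K) <= log2(capacity), linear in the binary variables.
log_footprint = pulp.lpSum(math.log2(t) * x[d][t]
                           for d, ts in loops.items() for t in ts)
prob += log_footprint <= math.log2(BUFFER_WORDS)

# Objective: maximise (the log of) buffer utilisation.
prob += log_footprint

prob.solve(pulp.PULP_CBC_CMD(msg=False))
chosen = {d: next(t for t in ts if x[d][t].value() > 0.5) for d, ts in loops.items()}
print(chosen)   # e.g. {'M': 16, 'K': 4} or another product equal to 64
```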
PyTorch Lightning Distributed Accelerators using Ray
Distributed PyTorch Lightning Training on Ray. This library adds new PyTorch Lightning plugins for distributed training using the Ray distributed computing framework.
PyTorch Lightning Distributed Accelerators using Ray
Distributed PyTorch Lightning Training on Ray. This library adds new PyTorch Lightning accelerators for distributed training using the Ray distributed computing framework.
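Both Ray entries integrate the same way: construct a Ray-backed plugin/strategy, pass it to a standard Lightning Trainer, and Ray places the training workers across the cluster. A hedged usage sketch follows; the RayPlugin name and its arguments are taken from README-style examples and have changed across releases (plugin vs. strategy), so treat them as assumptions:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl
import ray
from ray_lightning import RayPlugin   # assumption: newer releases rename this (e.g. RayStrategy)

class TinyRegressor(pl.LightningModule):
    """Minimal LightningModule so the sketch is self-contained."""
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(8, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.01)

    def train_dataloader(self):
        x = torch.randn(256, 8)
        return DataLoader(TensorDataset(x, x.sum(dim=1, keepdim=True)), batch_size=32)

ray.init()                              # connect to, or locally start, a Ray cluster

# Argument names below follow README-style examples and may differ by version.
plugin = RayPlugin(num_workers=2, num_cpus_per_worker=1, use_gpu=False)

trainer = pl.Trainer(max_epochs=2, plugins=[plugin])
trainer.fit(TinyRegressor())            # Lightning code is unchanged; Ray handles worker placement
```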