13 Repositories
Python Optimizer Libraries
Optimizers-visualized - Visualization of how different optimizers behave on local minima and saddle points of mathematical test functions.
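The repository's own code is not reproduced here; the following is a minimal PyTorch sketch of the idea it visualizes, comparing how two stock torch.optim optimizers move away from the saddle point of f(x, y) = x**2 - y**2 (the test function and hyperparameters are illustrative assumptions):

```python
import torch

def saddle(p):
    # f(x, y) = x**2 - y**2 has a saddle point at the origin
    return p[0] ** 2 - p[1] ** 2

start = torch.tensor([1.0, 1e-3])           # begin close to the saddle
for opt_cls in (torch.optim.SGD, torch.optim.Adam):
    p = start.clone().requires_grad_(True)
    optimizer = opt_cls([p], lr=0.05)
    for _ in range(50):
        optimizer.zero_grad()
        saddle(p).backward()
        optimizer.step()
    print(opt_cls.__name__, p.detach())     # how far each optimizer has escaped along y
```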
learned_optimization - Training and evaluating learned optimizers in JAX. A research codebase for training learned optimizers.
labml.ai Deep Learning Paper Implementations - A collection of simple PyTorch implementations of neural networks and related algorithms, each documented with explanations.
Open Optimizers - Repository for open research on optimizers; an experiment in sharing research and exploration as it happens.
AutoDSP - TL;DR: train custom adaptive filter optimizers without hand tuning or extra labels. Adaptive filtering algorithms are commonplace in signal processing.
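Not AutoDSP's code: below is a minimal NumPy sketch of the classic LMS adaptive filter, where the step size mu is exactly the kind of hand-tuned knob the project aims to remove; the toy system-identification setup is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([0.5, -0.3, 0.1])            # unknown system to identify
x = rng.standard_normal(1000)                  # input signal
d = np.convolve(x, true_w)[: len(x)]           # desired signal: output of the unknown system

w = np.zeros(3)                                # adaptive filter taps
mu = 0.05                                      # step size, the hand-tuned knob
for n in range(len(w), len(x)):
    u = x[n::-1][: len(w)]                     # most recent samples, newest first
    e = d[n] - w @ u                           # error against the desired signal
    w = w + mu * e * u                         # LMS update
print(w)                                       # approaches true_w
```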
bitsandbytes - A lightweight wrapper around CUDA custom functions, in particular 8-bit optimizers and quantization routines.
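A hedged usage sketch, assuming the bnb.optim.Adam8bit entry point described in the project's documentation; it mirrors torch.optim usage and requires a CUDA GPU.

```python
import torch
import bitsandbytes as bnb

model = torch.nn.Linear(1024, 1024).cuda()                    # 8-bit optimizers target CUDA GPUs
optimizer = bnb.optim.Adam8bit(model.parameters(), lr=1e-3)   # drop-in for torch.optim.Adam

loss = model(torch.randn(16, 1024, device="cuda")).pow(2).mean()
loss.backward()
optimizer.step()
optimizer.zero_grad()
```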
Linear Models - Implementations of LinearRegression, LassoRegression and RidgeRegression with appropriate regularizers, optimizers and cost functions.
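Not the repository's code: a minimal NumPy sketch of ridge (L2-regularized) linear regression fitted by gradient descent, the kind of model/regularizer/optimizer combination the repo implements; the data and hyperparameters are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.standard_normal(100)

w = np.zeros(3)
lr, lam = 0.1, 0.1                                   # step size and ridge strength
for _ in range(500):
    grad = X.T @ (X @ w - y) / len(y) + lam * w      # gradient of MSE plus L2 penalty
    w -= lr * grad
print(w)                                             # shrunken estimate of [2, -1, 0.5]
```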
Task-based End-to-end Model Learning in Stochastic Optimization - Repository by Priya L. Donti, Brandon Amos, and J. Zico Kolter containing the code accompanying the paper of the same name.
JAXopt - Hardware accelerated (GPU/TPU), batchable and differentiable optimizers in JAX; installable with pip.
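A hedged sketch assuming JAXopt's GradientDescent solver and its run(init_params, ...) interface, minimizing a small least-squares objective (the data is illustrative):

```python
import jax.numpy as jnp
from jaxopt import GradientDescent

def loss(params, X, y):
    return jnp.mean((X @ params - y) ** 2)          # least-squares objective

X = jnp.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
y = jnp.array([1.0, 2.0, 3.0])

solver = GradientDescent(fun=loss, maxiter=100)
params, state = solver.run(jnp.zeros(2), X=X, y=y)  # extra arguments are forwarded to `fun`
print(params)
```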
ML Optimizers from scratch using JAX - Toy implementations of some popular ML optimizers in Python/JAX.
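Not the repository's code: a minimal from-scratch SGD-with-momentum loop in JAX, the style of toy implementation the repo collects; the objective and hyperparameters are illustrative.

```python
import jax
import jax.numpy as jnp

def loss(w, X, y):
    return jnp.mean((X @ w - y) ** 2)

X = jnp.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])    # toy data
y = jnp.array([1.0, 2.0, 3.0])

w = jnp.zeros(2)
velocity = jnp.zeros_like(w)
lr, momentum = 0.01, 0.9

grad_fn = jax.grad(loss)
for _ in range(200):
    g = grad_fn(w, X, y)
    velocity = momentum * velocity - lr * g             # classic momentum update
    w = w + velocity
print(w, loss(w, X, y))
```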
Differentiable Optimizers with Perturbations in PyTorch - A PyTorch implementation of Differentiable Optimizers with Perturbations, originally released in TensorFlow.
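Not the repository's API: a minimal sketch of the underlying idea, a Monte-Carlo perturbed argmax that smooths the piecewise-constant maximizer into a soft map that is differentiable in expectation (the example scores and noise scale are hypothetical).

```python
import torch

def perturbed_argmax(theta, sigma=0.5, n_samples=1000):
    # Monte-Carlo estimate of E[one_hot(argmax(theta + sigma * Z))] with Z ~ N(0, I):
    # the hard argmax becomes a smooth distribution over choices.
    noise = torch.randn(n_samples, *theta.shape)
    indices = (theta.unsqueeze(0) + sigma * noise).argmax(dim=-1)
    one_hot = torch.nn.functional.one_hot(indices, num_classes=theta.shape[-1]).float()
    return one_hot.mean(dim=0)

theta = torch.tensor([1.0, 1.1, 0.2])
print(perturbed_argmax(theta))   # probability mass split between the two close scores
```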
torch-optimizer - A collection of optimizers for PyTorch, compatible with the torch.optim module; a minimal usage example is sketched below.
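A sketch of the usage pattern, assuming the DiffGrad optimizer shipped by the package; any torch.optim-style training loop should work the same way, and the tiny model and data here are illustrative.

```python
import torch
import torch_optimizer as optim

model = torch.nn.Linear(10, 1)
optimizer = optim.DiffGrad(model.parameters(), lr=0.001)   # used like any torch.optim optimizer

loss = model(torch.randn(4, 10)).pow(2).mean()
loss.backward()
optimizer.step()
optimizer.zero_grad()
```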
Solid - 🎯 A comprehensive gradient-free optimization framework written in Python. It contains basic versions of many of the most common optimization algorithms that do not require gradients of the objective.
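Solid's own API is not shown here; this is a generic hill-climbing sketch of the kind of gradient-free algorithm such frameworks provide, run on a hypothetical objective.

```python
import numpy as np

def objective(x):
    return np.sum((x - 3.0) ** 2)                  # hypothetical test function

rng = np.random.default_rng(0)
x = rng.standard_normal(2)
best = objective(x)
for _ in range(2000):
    candidate = x + 0.1 * rng.standard_normal(2)   # random local move, no gradients needed
    value = objective(candidate)
    if value < best:
        x, best = candidate, value
print(x, best)                                     # converges near [3, 3]
```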