Python "self" Libraries (self-attention, self-supervised learning, self-organizing maps, self-contained packaging)
Self-attention building blocks for computer vision applications in PyTorch. A general-purpose implementation of self-attention mechanisms, focused on computer vision modules; the repository is under active development.
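As a rough illustration of the idea behind such building blocks (a generic sketch, not this repository's API), a minimal single-head self-attention layer in PyTorch might look like this:

```python
import torch
import torch.nn as nn

class SelfAttention(nn.Module):
    """Minimal single-head self-attention over a sequence of token embeddings."""

    def __init__(self, dim: int):
        super().__init__()
        self.scale = dim ** -0.5
        self.to_qkv = nn.Linear(dim, dim * 3, bias=False)  # project to queries, keys, values
        self.to_out = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim), e.g. flattened image patches
        q, k, v = self.to_qkv(x).chunk(3, dim=-1)
        attn = torch.softmax(q @ k.transpose(-2, -1) * self.scale, dim=-1)
        return self.to_out(attn @ v)

# Usage on hypothetical patch tokens of dimension 64
tokens = torch.randn(2, 256, 64)
out = SelfAttention(dim=64)(tokens)  # same shape as the input: (2, 256, 64)
```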
LieTransformer. The authors' implementation of the LieTransformer used for the experiments in the paper "LieTransformer: Equivariant Self-Attention for Lie Groups".
A PyTorch implementation of the paper "Self-Supervised Graph Transformer on Large-Scale Molecular Data".
SoC-Winter. Check out some cool self-projects you can try your hand at to curb your boredom this December! These are short projects you can do yourself.
Subpar. A utility for creating self-contained Python executables, designed to work well with Bazel.
PyCryptodome. A self-contained Python package of low-level cryptographic primitives. It supports Python 2.7 and Python 3.4 and newer.
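As a quick taste of the library, the sketch below encrypts and authenticates a message with AES in EAX mode, following the pattern shown in the PyCryptodome documentation; the key handling here is illustrative only.

```python
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes

key = get_random_bytes(16)                      # 128-bit key, illustrative only
cipher = AES.new(key, AES.MODE_EAX)             # EAX gives authenticated encryption
ciphertext, tag = cipher.encrypt_and_digest(b"attack at dawn")

# Decrypt and verify using the nonce generated above
decipher = AES.new(key, AES.MODE_EAX, nonce=cipher.nonce)
plaintext = decipher.decrypt_and_verify(ciphertext, tag)
assert plaintext == b"attack at dawn"
```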
Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting. The original PyTorch implementation of Informer from the paper of the same name.
SE3 Transformer - PyTorch. Implementation of SE3-Transformers for Equivariant Self-Attention in PyTorch. May be needed for replicating AlphaFold2 results.
Lie Transformer - PyTorch (wip). Implementation of the Lie Transformer, Equivariant Self-Attention, in PyTorch. Only the SE3 version will be present in this repository.
Context Matters: Graph-based Self-supervised Representation Learning for Medical Images. Official PyTorch implementation of the paper of the same name.
Somoclu. A massively parallel implementation of self-organizing maps that accelerates training on multicore CPUs, GPUs, and clusters; it can rely on MPI to distribute the workload across a cluster.
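A minimal training run, assuming the somoclu Python interface described in the project documentation (the map size and data below are made up):

```python
import numpy as np
import somoclu

# Toy data: 1000 samples with 3 features each (somoclu expects float32)
data = np.random.rand(1000, 3).astype(np.float32)

# A 20x10 map; the heavy lifting runs in the parallel native core
som = somoclu.Somoclu(n_columns=20, n_rows=10)
som.train(data)

print(som.codebook.shape)  # trained codebook: (n_rows, n_columns, n_features)
```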
SOMPY. A Python library for Self-Organizing Maps (SOM). As much as possible, the structure of the SOM follows somtoolbox in MATLAB.
Lightly. A Python library and computer vision framework for self-supervised learning on images. The Lightly team are passionate engineers who want to make deep learning more efficient.
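Self-supervised methods of this kind typically train on two augmented views of the same image with a contrastive objective. The sketch below shows a generic NT-Xent-style loss in plain PyTorch; it is not Lightly's API, just an illustration of the underlying technique.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
    """NT-Xent (normalized temperature-scaled cross-entropy) over two augmented views.

    z1, z2: (batch, dim) embeddings of two augmentations of the same images.
    """
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2n, dim), unit-length
    sim = z @ z.t() / temperature                         # scaled cosine similarities
    sim.fill_diagonal_(float("-inf"))                     # never match a view with itself
    # The positive for sample i is its other augmentation, offset by n
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)])
    return F.cross_entropy(sim, targets)

# Usage with embeddings from any backbone (random tensors here for illustration)
loss = nt_xent_loss(torch.randn(8, 128), torch.randn(8, 128))
```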
shiv. A command-line utility for building fully self-contained Python zipapps, as outlined in PEP 441, but with all their dependencies included.
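For context, PEP 441 zipapps can also be produced with the standard-library zipapp module; the sketch below builds one from a hypothetical myapp/ directory, whereas shiv's value-add is bundling the dependencies as well.

```python
import zipapp

# Package a hypothetical "myapp/" directory (containing cli.py with a main()
# function) into a single executable archive, as outlined in PEP 441.
zipapp.create_archive(
    "myapp",
    target="myapp.pyz",
    interpreter="/usr/bin/env python3",
    main="cli:main",  # hypothetical entry point: myapp/cli.py, function main()
)
```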