CTR Algorithm
Learn CTR-related algorithms from papers, blogs, Zhihu posts, and similar sources.
Understand the principles and implement each one by hand.
PyTorch & TF2.0
Keep the mindset of an apprentice!
Hi, I've been reading your DIN code recently and there are a few places I don't quite understand; I'd appreciate your help!
1. `mask = (behaviors_x > 0).float().unsqueeze(-1)`: what exactly does this mask do, and why is it needed?
2. `attn_input = torch.cat([queries, user_behavior, queries - user_behavior, queries * user_behavior], dim = -1)`: the original attention input doesn't seem to include a `queries - user_behavior` term; why is it here?
3. `output = user_behavior.mul(attns.mul(mask))  # batch * seq_len * embed_dim`: why does the mask appear again here?
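For illustration, here is a minimal, self-contained sketch of that masking logic (the shapes, example values, and the `attns` placeholder are assumptions, not the repository's exact code). The mask marks which positions of the padded behavior sequence hold real items, and multiplying the attention scores by it again before the weighted pooling guarantees that padded time steps contribute nothing to the pooled interest vector; the `queries - user_behavior` and `queries * user_behavior` terms are explicit interaction features fed to the local activation unit's MLP.

```python
import torch

# Hypothetical shapes, for illustration only:
# behaviors_x   : (batch, seq_len)         padded behavior item ids, 0 = padding
# queries       : (batch, seq_len, embed)  candidate-item embedding broadcast over the sequence
# user_behavior : (batch, seq_len, embed)  embeddings of the historical behaviors
batch, seq_len, embed = 2, 4, 8
behaviors_x = torch.tensor([[3, 7, 0, 0],
                            [5, 2, 9, 1]])
queries = torch.randn(batch, seq_len, embed)
user_behavior = torch.randn(batch, seq_len, embed)

# 1 where a real behavior exists, 0 at padded positions
mask = (behaviors_x > 0).float().unsqueeze(-1)              # (batch, seq_len, 1)

# Query, behavior, and their difference / element-wise product as attention input
attn_input = torch.cat([queries,
                        user_behavior,
                        queries - user_behavior,
                        queries * user_behavior], dim=-1)   # (batch, seq_len, 4*embed)

# Stand-in for the activation-unit MLP: any map from attn_input to one score per position
attns = torch.randn(batch, seq_len, 1).sigmoid()

# Masking the scores again zeroes out padded positions before the weighted pooling
output = user_behavior.mul(attns.mul(mask))                 # (batch, seq_len, embed)
```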
Hello, while reading the DIEN PyTorch code I noticed a place that may be a bug.
In the Interest Extractor Layer, the first T-1 GRU hidden states should be kept for computing the auxiliary loss,
but line 152 of the DIEN code is
gru_embed=pad_interests[:,1:]
Doesn't this take the last T-1 hidden states?
I think it should be
gru_embed=pad_interests[:,:-1]
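For reference, here is a minimal sketch of the pairing the auxiliary loss expects (tensor names and shapes are illustrative assumptions, not the repository's code). In DIEN the auxiliary loss asks hidden state h_t to discriminate the next clicked item e_{t+1} from a sampled negative, so the states and their targets are offset by one step: the hidden states take the first T-1 positions while the positive targets take the last T-1.

```python
import torch
import torch.nn.functional as F

# Hypothetical tensors, for illustration only:
# pad_interests : (batch, T, hidden)  GRU hidden states h_1 .. h_T
# item_embed    : (batch, T, hidden)  behavior embeddings e_1 .. e_T
batch, T, hidden = 2, 5, 8
pad_interests = torch.randn(batch, T, hidden)
item_embed = torch.randn(batch, T, hidden)

hidden_states = pad_interests[:, :-1]   # h_1 .. h_{T-1}: first T-1 states
next_click    = item_embed[:, 1:]       # e_2 .. e_T   : positive targets

# Positive part of the auxiliary loss: score each (h_t, e_{t+1}) pair.
# Negatives would be scored the same way with sampled non-clicked items.
pos_logits = (hidden_states * next_click).sum(-1)   # (batch, T-1)
aux_pos_loss = -F.logsigmoid(pos_logits).mean()
```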
Hello! Your project is the most elegant recommendation code I've seen; as a beginner I've benefited a lot from it, and thank you for sharing your work for everyone to learn from. While reading the code I've had one small question. `fields = data_x.max().values` takes the largest index in each feature column. For example, if the first column contains the two distinct values 0 and 1, this line returns 1, even though there are actually two distinct features, so every column ends up one feature short when building the embedding. Shouldn't `torch.nn.Embedding(sum(feature_fields)+1, 1)` therefore be `torch.nn.Embedding(sum(feature_fields+1), 1)`? And why is 1 added after `sum(feature_fields)` in the first place? I'd appreciate your clarification, thanks!
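As background, here is a minimal, self-contained sketch of the offset-encoding idea the question is about (field sizes, variable names, and the offset scheme are illustrative assumptions, not the repository's exact code). Since column i holds indices 0..max_i, that field needs max_i + 1 embedding rows under this scheme, and the fields are packed into a single table with cumulative offsets so indices from different fields never collide.

```python
import torch

# Hypothetical example: field 0 takes values {0, 1}, field 1 takes values {0, 1, 2}
data_x = torch.tensor([[0, 2],
                       [1, 0],
                       [1, 1]])

# Largest index per field (what data_x.max().values returns per column in pandas)
feature_fields = data_x.max(dim=0).values                    # tensor([1, 2])

# Give each field a contiguous block of rows in one embedding table:
# field i needs feature_fields[i] + 1 rows (indices 0 .. max_i)
offsets = torch.cat([torch.zeros(1, dtype=torch.long),
                     (feature_fields + 1).cumsum(0)[:-1]])   # tensor([0, 2])
num_embeddings = int((feature_fields + 1).sum())             # 5 rows in total

embedding = torch.nn.Embedding(num_embeddings, 1)
global_idx = data_x + offsets            # shift each column into its own index range
scores = embedding(global_idx)           # (num_samples, num_fields, 1)
```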