Overview

MB-GMN

Code for MB-GMN, SIGIR 2021

For the Beibei data, run:

python .\labcode.py

For the Tmall data, run:

python .\labcode.py --data tmall --rank 2

For the IJCAI data, run:

python .\labcode_samp.py --data ijcai --rank 2 --graphSampleN 40000
Comments
  • Hello, could you tell me why running the code reports an error?

    I am using Python 3.6.6 with TensorFlow 1.14.0.

    Without making any changes to the code, running python labcode.py directly produces the following error. What could be the cause?

    tensorflow.python.framework.errors_impl.InternalError: 2 root error(s) found.
      (0) Internal: Blas xGEMMBatched launch failed : a.shape=[6520,1,96], b.shape=[6520,1,32], m=96, n=32, k=1, batch_size=6520
            [[{{node gradients/matmul_74_grad/MatMul_1}}]]
            [[Adam/update/_148]]
      (1) Internal: Blas xGEMMBatched launch failed : a.shape=[6520,1,96], b.shape=[6520,1,32], m=96, n=32, k=1, batch_size=6520
            [[{{node gradients/matmul_74_grad/MatMul_1}}]]
    0 successful operations. 0 derived errors ignored.

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "labcode.py", line 344, in <module>
        recom.run()
      File "labcode.py", line 50, in run
        reses = self.trainEpoch()
      File "labcode.py", line 246, in trainEpoch
        res = self.sess.run(target, feed_dict=feed_dict, options=config_pb2.RunOptions(report_tensor_allocations_upon_oom=True))
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 950, in run
        run_metadata_ptr)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1173, in _run
        feed_dict_tensor, options, run_metadata)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1350, in _do_run
        run_metadata)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1370, in _do_call
        raise type(e)(node_def, op, message)
    tensorflow.python.framework.errors_impl.InternalError: 2 root error(s) found.
      (0) Internal: Blas xGEMMBatched launch failed : a.shape=[6520,1,96], b.shape=[6520,1,32], m=96, n=32, k=1, batch_size=6520
            [[node gradients/matmul_74_grad/MatMul_1 (defined at labcode.py:198) ]]
            [[Adam/update/_148]]
      (1) Internal: Blas xGEMMBatched launch failed : a.shape=[6520,1,96], b.shape=[6520,1,32], m=96, n=32, k=1, batch_size=6520
            [[node gradients/matmul_74_grad/MatMul_1 (defined at labcode.py:198) ]]
    0 successful operations. 0 derived errors ignored.

    Errors may have originated from an input operation.
    Input Source operations connected to node gradients/matmul_74_grad/MatMul_1:
     ExpandDims_29 (defined at labcode.py:151)

    Input Source operations connected to node gradients/matmul_74_grad/MatMul_1:
     ExpandDims_29 (defined at labcode.py:151)

    Original stack trace for 'gradients/matmul_74_grad/MatMul_1':
      File "labcode.py", line 344, in <module>
        recom.run()
      File "labcode.py", line 38, in run
        self.prepareModel()
      File "labcode.py", line 198, in prepareModel
        self.optimizer = tf.train.AdamOptimizer(learningRate).minimize(self.loss, global_step=globalStep)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/training/optimizer.py", line 403, in minimize
        grad_loss=grad_loss)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/training/optimizer.py", line 512, in compute_gradients
        colocate_gradients_with_ops=colocate_gradients_with_ops)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/ops/gradients_impl.py", line 158, in gradients
        unconnected_gradients)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/ops/gradients_util.py", line 731, in _GradientsHelper
        lambda: grad_fn(op, *out_grads))
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/ops/gradients_util.py", line 403, in _MaybeCompile
        return grad_fn()  # Exit early
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/ops/gradients_util.py", line 731, in <lambda>
        lambda: grad_fn(op, *out_grads))
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/ops/math_grad.py", line 1511, in _BatchMatMulV2
        grad_y = math_ops.matmul(x, grad, adjoint_a=True, adjoint_b=False)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/util/dispatch.py", line 180, in wrapper
        return target(*args, **kwargs)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/ops/math_ops.py", line 2609, in matmul
        return batch_mat_mul_fn(a, b, adj_x=adjoint_a, adj_y=adjoint_b, name=name)
      File "/home/wangchao/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/ops/gen_math_ops.py", line 1677, in batch_mat_mul_v2
        "BatchMatMulV2", x=x, y=y, adj_x=adj_x, adj_y=adj_y, name=name)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/framework/op_def_library.py", line 788, in _apply_op_helper
        op_def=op_def)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/util/deprecation.py", line 507, in new_func
        return func(*args, **kwargs)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 3616, in create_op
        op_def=op_def)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 2005, in __init__
        self._traceback = tf_stack.extract_stack()

    ...which was originally created as op 'matmul_74', defined at:
      File "labcode.py", line 344, in <module>
        recom.run()
    [elided 0 identical lines from previous traceback]
      File "labcode.py", line 38, in run
        self.prepareModel()
      File "labcode.py", line 186, in prepareModel
        preds = self.predict(src, tgt)
      File "labcode.py", line 166, in predict
        return self._predict(src_ulat, src_ilat, predParams) * args.mult
      File "labcode.py", line 152, in _predict
        predEmbed = Activate(predEmbed @ params['w1'] + params['b1'], self.actFunc)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/ops/math_ops.py", line 884, in binary_op_wrapper
        return func(x, y, name=name)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/util/dispatch.py", line 180, in wrapper
        return target(*args, **kwargs)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/ops/math_ops.py", line 2609, in matmul
        return batch_mat_mul_fn(a, b, adj_x=adjoint_a, adj_y=adjoint_b, name=name)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/ops/gen_math_ops.py", line 1677, in batch_mat_mul_v2
        "BatchMatMulV2", x=x, y=y, adj_x=adj_x, adj_y=adj_y, name=name)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/framework/op_def_library.py", line 788, in _apply_op_helper
        op_def=op_def)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/util/deprecation.py", line 507, in new_func
        return func(*args, **kwargs)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 3616, in create_op
        op_def=op_def)
      File "/home/user/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 2005, in __init__
        self._traceback = tf_stack.extract_stack()

    opened by chadwang2012 2
  • How to use the trn_pv.part*.rar in the Tmall directory?

    About the Tmall data, I wonder how to use the trn_pv.part*.rar files. Do we need to extract the rar files and then use just one file as the input, or the combination of all the files? I would appreciate it very much if you could help me with this question.

    opened by Scofield666 1
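
A note on the first comment above (a suggestion, not an answer from the repository authors): in TensorFlow 1.x, "Blas xGEMMBatched launch failed" is very often a GPU memory problem, e.g. the GPU is already occupied by another process or the default allocator cannot reserve what it requests. A minimal sketch, assuming you adapt wherever labcode.py constructs its tf.Session, of enabling on-demand GPU memory allocation:

# Hedged workaround sketch, not part of MB-GMN: build the TF 1.x session with
# memory growth enabled so GPU memory is allocated as needed rather than all at once.
import tensorflow as tf

config = tf.ConfigProto()
config.gpu_options.allow_growth = True   # allocate GPU memory on demand
sess = tf.Session(config=config)         # use this in place of the default tf.Session()

If the error persists, lowering the batch size or checking that no other process is holding the GPU are other common ways to rule out memory exhaustion.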
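
A note on the second comment above (an assumption, not an official answer): trn_pv.part1.rar, trn_pv.part2.rar, ... are volumes of one split RAR archive, so extracting the first volume with any RAR-aware tool automatically pulls in the remaining parts and yields a single combined file to use as the input. A minimal sketch using the third-party rarfile package (which relies on the external unrar utility); the path is illustrative:

# Hedged sketch, not part of MB-GMN: extract a multi-volume RAR archive in Python.
# Equivalent shell command: unrar x Tmall/trn_pv.part1.rar
import rarfile

rf = rarfile.RarFile('Tmall/trn_pv.part1.rar')  # path is illustrative
rf.extractall('Tmall/')                         # later .partN volumes are read automatically
rf.close()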