On the Equivalence between Neural Network and Support Vector Machine

Codes for NeurIPS 2021 paper "On the Equivalence between Neural Network and Support Vector Machine".

Cite our paper

Yilan Chen, Wei Huang, Lam M. Nguyen, Tsui-Wei Weng, "On the Equivalence between Neural Network and Support Vector Machine", NeurIPS 2021.

@inproceedings{chen2021equiv,
  title={On the equivalence between neural network and support vector machine},
  author={Yilan Chen and Wei Huang and Lam M. Nguyen and Tsui-Wei Weng},
  booktitle={Advances in Neural Information Processing Systems},
  year={2021}
}

Overview

In this paper, we prove the equivalence between a neural network (NN) and a support vector machine (SVM): specifically, between an infinitely wide NN trained by the soft-margin loss and the standard soft-margin SVM with the neural tangent kernel (NTK), trained by subgradient descent. Our main theoretical results include establishing the equivalence, with finite-width bounds, between NNs and a broad family of L2-regularized kernel machines (KMs), which prior work cannot handle, and showing that every finite-width NN trained by such a regularized loss function is approximately a KM.
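Concretely, the soft-margin loss referred to above is the L2-regularized hinge loss. A minimal PyTorch sketch (names are illustrative, not the repo's exact implementation):

import torch

def soft_margin_loss(outputs, labels, params, reg):
    # outputs: model predictions f(x_i), shape (n,)
    # labels: binary labels y_i in {-1, +1}, shape (n,)
    # params: model parameters, for the L2 regularization term
    # reg: regularization strength lambda
    hinge = torch.clamp(1.0 - labels * outputs, min=0.0).mean()
    l2 = sum((p ** 2).sum() for p in params)
    return hinge + reg * l2

Minimizing this loss over an infinitely wide NN is what the paper shows to be equivalent to training the corresponding soft-margin SVM with the NTK.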

Furthermore, we demonstrate that our theory enables three practical applications:

  • a non-vacuous generalization bound for the NN via the corresponding KM;
  • a non-trivial robustness certificate for the infinite-width NN (where existing robustness verification methods, e.g. IBP, Fast-Lin, and CROWN, give vacuous bounds);
  • infinite-width NNs that are intrinsically more robust than those obtained from previous kernel regression.

See our paper and slides for details.

Equivalence between infinite-width NNs and a family of KMs

Code overview

  • train_sgd.py: train the NN and the SVM with the NTK by stochastic subgradient descent, then plot the results to verify the equivalence.

  • generalization.py: compute non-vacuous generalization bound of NN via the corresponding KM.

  • regression.py: kernel ridge regression with NTK.

  • robust_svm.py:

    • test(): evaluate the robustness of the NN using IBP, or of the SVM using the method from our paper.
    • test_regressions(): evaluate the robustness of kernel ridge regression models using our method.
    • bound_ntk(): calculate the lower and upper bounds of the NTK for a two-layer fully-connected NN.
  • ibp.py: functions to calculate IBP bounds, specialized for the NTK parameterization (a minimal sketch of the generic procedure appears after this list).

  • models/model.py: code for constructing fully-connected neural networks with the NTK parameterization.

  • config/:

    • svm_sgd.yaml: configurations and hyper-parameters for training the NN and SVM.
    • svm_gene.yaml: configurations and hyper-parameters for calculating the generalization bound.
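As background for ibp.py, interval bound propagation (IBP) pushes elementwise input bounds through the network layer by layer. A minimal sketch for a fully-connected ReLU network (the generic procedure; the repo's version is adapted to the NTK parameterization, e.g. its layer scaling):

import torch

def ibp_bounds(weights, biases, x, eps):
    # Propagate the L-infinity ball [x - eps, x + eps] through the network.
    lower, upper = x - eps, x + eps
    for i, (W, b) in enumerate(zip(weights, biases)):
        center = (upper + lower) / 2
        radius = (upper - lower) / 2
        new_center = W @ center + b          # affine layer moves the center
        new_radius = W.abs() @ radius        # |W| maps the radius
        lower, upper = new_center - new_radius, new_center + new_radius
        if i < len(weights) - 1:             # ReLU on hidden layers only
            lower, upper = lower.clamp(min=0), upper.clamp(min=0)
    return lower, upper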

Required environment

This code has been tested with the following environment:

python==3.8.8
torch==1.8.1
neural-tangents==0.3.6

Other required packages can be installed using Conda as follows:

conda create -n equiv-nn-svm python=3.8
conda activate equiv-nn-svm
conda install numpy tqdm matplotlib seaborn pyyaml

To install PyTorch, please follow the instructions at https://pytorch.org/get-started/locally/. For the installation and usage of neural-tangents, please follow the instructions at https://github.com/google/neural-tangents.
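For example, a CPU-only setup with the tested versions can typically be installed via pip (use the links above for platform-specific builds):

pip install torch==1.8.1
pip install neural-tangents==0.3.6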

Experiments

Train NN and SVM to verify the equivalence

python train_sgd.py
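For intuition, the SVM side of this experiment runs stochastic subgradient descent on the L2-regularized hinge loss over the NTK Gram matrix. A minimal sketch (illustrative only, not train_sgd.py itself):

import numpy as np

def svm_subgradient_step(alpha, K, y, batch, lr, reg):
    # f(x_i) = (K @ alpha)[i]; K is the NTK Gram matrix, batch an index array.
    margins = y[batch] * (K[batch] @ alpha)
    grad = reg * (K @ alpha)                  # gradient of (reg/2) * alpha^T K alpha
    for i in batch[margins < 1]:              # examples violating the margin
        grad -= y[i] * K[i] / len(batch)      # hinge-loss subgradient
    return alpha - lr * grad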

Example of the SGD results

SGD results

Example of the GD results

GD results

Computing non-vacuous generalization bound of NN via the corresponding KM

python generalization.py
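For orientation, a classical margin-based bound for kernel machines illustrates the quantities involved (a generic Bartlett-Mendelson-style bound, not necessarily the exact bound computed by generalization.py):

import numpy as np

def kernel_margin_bound(K, alpha, y, gamma, delta):
    # Generic bound: test error <= empirical margin loss
    #   + 2 * B * sqrt(tr(K)) / (gamma * n) + 3 * sqrt(log(2/delta) / (2n)),
    # where B = ||f||_H is the RKHS norm of f = sum_j alpha_j k(., x_j).
    n = len(y)
    margin_loss = np.mean(y * (K @ alpha) < gamma)
    B = np.sqrt(alpha @ K @ alpha)
    rademacher = B * np.sqrt(np.trace(K)) / n
    return margin_loss + 2 * rademacher / gamma + 3 * np.sqrt(np.log(2 / delta) / (2 * n))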

Example of the generalization bound results

Generalization bound results

Robustness verification of NN

Add the paths to your NN models in the code, grouped by width, and specify the width of the models you want to verify. Then run the test() function in robust_svm.py.

python -c "import robust_svm; robust_svm.test('nn')"

Robustness verification of SVM

Add the paths to your SVM models in the code, then run the test() function in robust_svm.py.

python -c "import robust_svm; robust_svm.test('svm')"

Robustness verification results

Train kernel ridge regression models with the NTK

python regression.py
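For reference, kernel ridge regression with the NTK takes only a few lines with the neural-tangents API. A minimal sketch with synthetic data (the architecture and ridge value are illustrative, not the repo's configuration):

import numpy as np
import neural_tangents as nt
from neural_tangents import stax

x_train = np.random.randn(100, 10)
y_train = np.sign(np.random.randn(100, 1))
x_test = np.random.randn(20, 10)

# Two-layer fully-connected network; kernel_fn gives its infinite-width NTK.
_, _, kernel_fn = stax.serial(stax.Dense(512), stax.Relu(), stax.Dense(1))

# Closed-form kernel (ridge) regression; diag_reg is the ridge regularizer.
predict_fn = nt.predict.gradient_descent_mse_ensemble(
    kernel_fn, x_train, y_train, diag_reg=1e-4)
y_pred = predict_fn(x_test=x_test, get='ntk')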

Robustness verification of kernel ridge regression models

Run the test_regressions() function in robust_svm.py.

python -c "import robust_svm; robust_svm.test_regressions()"

Robustness verification results
