Peer Loss functions
This repository is the (multi-class, deep learning) PyTorch implementation of "Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates", accepted at ICML 2020.
Required Packages & Environment
Supported OS: Windows, Linux, Mac OS X; Python: 3.6/3.7;
Deep Learning Library: PyTorch (GPU required)
Required Packages: NumPy, pandas, scikit-learn, tqdm, torch, and the standard-library modules random and csv (Keras is additionally required if you want to estimate the noise transition matrix).
Utilities
This repository includes:
Details of running (weighted) Peer Loss functions on MNIST, Fashion-MNIST, CIFAR-10, and CIFAR-100 with different noise settings are given in the README.md file in each folder.
The workflow of weighted Peer Loss functions is as follows:
Decision boundary visualization
Given a 2D synthetic dataset, the decision boundaries learned by training with Cross-Entropy loss degrade noticeably when the noise rate is high. In contrast, the decision boundaries obtained with Peer Loss functions remain tight even under heavy label noise.
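To make the idea concrete, below is a minimal NumPy sketch of the (unweighted) peer loss term from the paper: the usual loss on each sample, minus alpha times the loss evaluated on a randomly paired peer prediction and an independently sampled peer label. Function names and the use of NumPy here are illustrative only; the code in this repository implements the loss in PyTorch.

```python
import numpy as np

def cross_entropy(logits, labels):
    # Per-sample cross-entropy computed from raw logits (log-sum-exp trick).
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels]

def peer_loss(logits, labels, alpha=1.0, seed=None):
    """Sketch of peer loss: l(f(x_i), y_i) - alpha * l(f(x_j1), y_j2),
    where the peer prediction x_j1 and peer label y_j2 are drawn
    independently from the batch."""
    rng = np.random.default_rng(seed)
    n = len(labels)
    peer_pred = rng.permutation(n)   # indices for peer predictions
    peer_label = rng.permutation(n)  # independent indices for peer labels
    base = cross_entropy(logits, labels)
    peer = cross_entropy(logits[peer_pred], labels[peer_label])
    return (base - alpha * peer).mean()
```

With alpha = 0 this reduces to ordinary cross-entropy; the peer term penalizes classifiers that fit randomly re-paired (prediction, label) combinations, which is what makes the loss robust to label noise without requiring knowledge of the noise rates.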
Citation
If you use our code, please cite the following paper:
@inproceedings{liu2020peer,
title={Peer loss functions: Learning from noisy labels without knowing noise rates},
author={Liu, Yang and Guo, Hongyi},
booktitle={International Conference on Machine Learning},
pages={6226--6236},
year={2020},
organization={PMLR}
}
Related Code