Code for Low-Cost Algorithmic Recourse for Users With Uncertain Cost Functions

Overview

EMS-COLS-recourse

Initial Code for Low-Cost Algorithmic Recourse for Users With Uncertain Cost Functions

Folder structure:

  • data folder contains raw and final preprocessed data, along with the pre-processing script.
  • src folder contains the code for our method.
  • trained_model folder contains the trained black-box model checkpoint.

Setting up the environment

conda create -n rec_gen python=3.8.1
conda activate rec_gen
pip install -r requirements.txt

Steps for running experiments.

Change the current working directory to src:

cd ./src/
  1. Run data_io.py to dump the MCMC cost samples.
python ./utils/data_io.py --save_data --data_name adult_binary --dump_negative_data --num_mcmc 1000

python ./utils/data_io.py --save_data --data_name compas_binary --dump_negative_data --num_mcmc 1000
  2. Run the main experiments for COLS and P-COLS (a combined loop sketch follows the commands).
python run.py --data_name adult_binary --num_mcmc 1000 --model ls --num_cfs 10 --project_name exp_main --budget 5000
python run.py --data_name compas_binary --num_mcmc 1000 --model ls --num_cfs 10 --project_name exp_main --budget 5000

python run.py --data_name adult_binary --num_mcmc 1000 --model pls --num_cfs 10 --project_name exp_main --budget 5000
python run.py --data_name compas_binary --num_mcmc 1000 --model pls --num_cfs 10 --project_name exp_main --budget 5000
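The four commands above can be collapsed into a single loop. This is only a convenience sketch, assuming a bash-compatible shell and that ls and pls are the models used for COLS and P-COLS respectively:
for data in adult_binary compas_binary; do
  for model in ls pls; do
    # same flags as the individual commands above
    python run.py --data_name $data --num_mcmc 1000 --model $model --num_cfs 10 --project_name exp_main --budget 5000
  done
done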
  3. Run the ablation experiments (a loop sketch follows the commands).
python run.py --data_name adult_binary --num_mcmc 1000 --model ls --num_cfs 10 --project_name exp_ablation --budget 3000 --eval cost
python run.py --data_name adult_binary --num_mcmc 1000 --model ls --num_cfs 10 --project_name exp_ablation --budget 3000 --eval cost_simple
python run.py --data_name adult_binary --num_mcmc 1000 --model ls --num_cfs 10 --project_name exp_ablation --budget 3000 --eval proximity
python run.py --data_name adult_binary --num_mcmc 1000 --model ls --num_cfs 10 --project_name exp_ablation --budget 3000 --eval sparsity
python run.py --data_name adult_binary --num_mcmc 1000 --model ls --num_cfs 10 --project_name exp_ablation --budget 3000 --eval diversity
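The five ablation commands differ only in the --eval flag, so they can also be run as a loop (a bash sketch over the same evaluation settings):
for metric in cost cost_simple proximity sparsity diversity; do
  # evaluate the same configuration under each ablation objective
  python run.py --data_name adult_binary --num_mcmc 1000 --model ls --num_cfs 10 --project_name exp_ablation --budget 3000 --eval $metric
done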
  4. Run experiments varying the budget (a loop sketch follows the commands).
python run.py --data_name adult_binary --model ls --num_cfs 10 --num_users 100 --project_name exp_budget --budget 500
python run.py --data_name adult_binary --model ls --num_cfs 10 --num_users 100 --project_name exp_budget --budget 1000
python run.py --data_name adult_binary --model ls --num_cfs 10 --num_users 100 --project_name exp_budget --budget 2000
python run.py --data_name adult_binary --model ls --num_cfs 10 --num_users 100 --project_name exp_budget --budget 3000
python run.py --data_name adult_binary --model ls --num_cfs 10 --num_users 100 --project_name exp_budget --budget 5000
python run.py --data_name adult_binary --model ls --num_cfs 10 --num_users 100 --project_name exp_budget --budget 10000
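Equivalently, the budget sweep can be written as a loop (a bash sketch over the same budget values):
for budget in 500 1000 2000 3000 5000 10000; do
  # sweep the search budget while keeping all other flags fixed
  python run.py --data_name adult_binary --model ls --num_cfs 10 --num_users 100 --project_name exp_budget --budget $budget
done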
  5. Run experiments varying the number of counterfactuals; replace model_name with the desired model, ls or pls (a loop sketch follows the commands).
python run.py --data_name adult_binary --model model_name --num_cfs 1 --num_users 100 --project_name exp_cfs --budget 5000
python run.py --data_name adult_binary --model model_name --num_cfs 2 --num_users 100 --project_name exp_cfs --budget 5000
python run.py --data_name adult_binary --model model_name --num_cfs 3 --num_users 100 --project_name exp_cfs --budget 5000
python run.py --data_name adult_binary --model model_name --num_cfs 5 --num_users 100 --project_name exp_cfs --budget 5000
python run.py --data_name adult_binary --model model_name --num_cfs 10 --num_users 100 --project_name exp_cfs --budget 5000
python run.py --data_name adult_binary --model model_name --num_cfs 20 --num_users 100 --project_name exp_cfs --budget 5000
python run.py --data_name adult_binary --model model_name --num_cfs 30 --num_users 100 --project_name exp_cfs --budget 5000
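A loop form of the counterfactual sweep (a bash sketch; MODEL is a placeholder to set to ls or pls, the two models used in the main experiments):
MODEL=ls  # or pls
for k in 1 2 3 5 10 20 30; do
  # sweep the number of counterfactuals per user
  python run.py --data_name adult_binary --model $MODEL --num_cfs $k --num_users 100 --project_name exp_cfs --budget 5000
done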
  6. Run experiments varying the number of Monte Carlo samples (a loop sketch follows the commands).
  • Run these two commands for each num_mcmc value you want to test; they are shown here with num_mcmc set to 5. Replace model_name with ls or pls.
python ./utils/data_io.py --save_data --data_name adult_binary --dump_negative_data --num_mcmc 5

python run.py --data_name adult_binary --num_mcmc 5 --model model_name --num_cfs 10 --project_name exp_mcmc --budget 5000 --num_users 100
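The two steps can be combined into a loop that re-dumps the cost samples and runs the experiment for each setting (a bash sketch; the num_mcmc values below are only illustrative, and ls is used as an example model):
for m in 5 50 500 1000; do
  # regenerate the MCMC cost samples for this num_mcmc value, then run the experiment
  python ./utils/data_io.py --save_data --data_name adult_binary --dump_negative_data --num_mcmc $m
  python run.py --data_name adult_binary --num_mcmc $m --model ls --num_cfs 10 --project_name exp_mcmc --budget 5000 --num_users 100
done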

To train a new black-box model

  • Run this right after preprocessing the data.
python train_model.py --data_name adult --max_epochs 1000 --check_val_every_n_epoch=1 --learning_rate=0.0001