# MPMAB_BEACON

This is the code used for the paper "Heterogeneous Multi-player Multi-armed Bandits: Closing the Gap and Generalization", NeurIPS 2021.

## Requirements

- Python 3.7
- matplotlib
- numpy
- scipy

## Experiments

1. Random instances with linear or proportional fairness reward functions: `exp_random_instances.py`
2. `(M, K) = (5, 5)` with the linear reward function: `exp_linear.py`
3. `(M, K) = (6, 8)` with the max-min reward function: `exp_minimal.py`
4. `(M, K) = (6, 8)` with the proportional fairness reward function: `exp_proportional_fairness.py`
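For orientation, the experiments above differ in the system-level reward function applied to the M players' expected rewards. The sketch below is a hypothetical illustration of those three objectives (linear, max-min, and proportional fairness), not code from this repository; the helper names and the example reward vector are made up for clarity.

```python
import numpy as np

# Hypothetical helpers illustrating the three system reward functions named
# above, applied to a vector mu of per-player expected rewards. These are
# illustrative sketches, not the paper's actual implementation.

def linear_reward(mu):
    # Linear reward: the sum of all players' expected rewards.
    return float(np.sum(mu))

def max_min_reward(mu):
    # Max-min fairness: the worst-off player's expected reward.
    return float(np.min(mu))

def proportional_fairness_reward(mu, eps=1e-12):
    # Proportional fairness: the sum of log rewards (eps guards log(0)).
    return float(np.sum(np.log(np.asarray(mu) + eps)))

mu = np.array([0.2, 0.5, 0.9])  # example per-player expected rewards
print(linear_reward(mu))   # 1.6
print(max_min_reward(mu))  # 0.2
print(proportional_fairness_reward(mu))
```

Each experiment script can then be run directly, e.g. `python exp_linear.py`.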