Multi-Armed Bandits

1 benchmark · 1262 papers

Multi-armed bandits are a class of problems in which a fixed, limited amount of resources must be allocated among competing alternatives (arms) so as to maximize expected gain, when each arm's reward distribution is only partially known and is learned through repeated selection. These problems typically involve an exploration/exploitation trade-off: spending trials on poorly understood arms versus pulling the arm currently believed to be best.
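A minimal sketch of this trade-off is the epsilon-greedy strategy: with probability epsilon pull a random arm (explore), otherwise pull the arm with the highest estimated reward (exploit). The function below is an illustrative example, not taken from any benchmark implementation; the Bernoulli arm means and parameter values are assumptions for the demo.

```python
import random

def epsilon_greedy_bandit(true_means, steps=5000, epsilon=0.1, seed=0):
    """Run epsilon-greedy on Bernoulli arms (illustrative sketch).

    true_means: assumed success probability of each arm.
    Returns the estimated mean reward per arm and the total reward collected.
    """
    rng = random.Random(seed)
    n_arms = len(true_means)
    counts = [0] * n_arms        # number of pulls per arm
    estimates = [0.0] * n_arms   # running mean reward per arm
    total = 0.0
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)            # explore: random arm
        else:
            arm = max(range(n_arms), key=lambda a: estimates[a])  # exploit
        reward = 1.0 if rng.random() < true_means[arm] else 0.0
        counts[arm] += 1
        # incremental update of the running mean for the pulled arm
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
        total += reward
    return estimates, total
```

With enough steps, the estimate for the best arm converges toward its true mean, while the small exploration rate keeps every arm's estimate from going stale.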

(Image credit: Microsoft Research)

Benchmarks

Multi-Armed Bandits on Mushroom