Multi-Player Multi-Armed Bandit: Can We Still Collaborate at Homes Without "Zoom"?
Video Link: https://www.youtube.com/watch?v=tMHufa9Pdqg
Yuanzhi Li (Carnegie Mellon University)
https://simons.berkeley.edu/talks/multi-player-multi-armed-bandit-can-we-still-collaborate-homes-without-zoom
Mathematics of Online Decision Making
Tags:
Simons Institute
theoretical computer science
UC Berkeley
Computer Science
Theory of Computation
Theory of Computing
Yuanzhi Li
Mathematics of Online Decision Making