AAAI Publications, Workshops at the Twenty-Seventh AAAI Conference on Artificial Intelligence

Volatile Multi-Armed Bandits for Guaranteed Targeted Social Crawling
Zahy Bnaya, Rami Puzis, Roni Stern, Ariel Felner

Last modified: 2013-06-29

Abstract

We introduce a new variant of the multi-armed bandit problem, called the Volatile Multi-Armed Bandit (VMAB), in which arms may appear and disappear over time. We present a general policy for VMAB with proven regret bounds. We then model the problem of collecting intelligence on profiles in social networks as a VMAB, and experimental results show the superiority of our proposed policy.
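The paper's actual VMAB policy and its regret analysis are given in the full text and are not reproduced here. As a purely illustrative sketch of the volatile setting, one might adapt a UCB1-style index policy to an arm set that changes between rounds (all class and method names below are hypothetical):

```python
import math

class VolatileUCB:
    """Illustrative UCB1-style policy over a volatile arm set: arms may
    appear and disappear between rounds. This is a sketch of the setting,
    not the paper's proposed policy."""

    def __init__(self):
        self.counts = {}  # arm id -> number of pulls so far
        self.means = {}   # arm id -> empirical mean reward
        self.t = 0        # total number of rounds played

    def select(self, available):
        """Choose an arm from the set of currently available arms."""
        self.t += 1
        # Pull any newly arrived (never-pulled) arm once first.
        for arm in available:
            if self.counts.get(arm, 0) == 0:
                return arm
        # Otherwise maximize the UCB1 index: mean + exploration bonus.
        return max(available,
                   key=lambda a: self.means[a]
                   + math.sqrt(2 * math.log(self.t) / self.counts[a]))

    def update(self, arm, reward):
        """Incorporate the observed reward for the pulled arm."""
        n = self.counts.get(arm, 0) + 1
        self.counts[arm] = n
        mean = self.means.get(arm, 0.0)
        self.means[arm] = mean + (reward - mean) / n
```

Because the policy indexes arms by identifier rather than by a fixed array position, arms can be added to or dropped from `available` at any round without resetting the learner's statistics for the arms that remain.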
