Estimation Bias in Multi-Armed Bandit Algorithms for Search Advertising


In search advertising, the search engine needs to select the most profitable advertisements to display, which can be formulated as an instance of online learning with partial feedback, also known as the stochastic multi-armed bandit (MAB) problem. In this paper, we show that the naive application of MAB algorithms to search advertising for advertisement selection will produce sample selection bias that harms the search engine by decreasing expected revenue and “estimation of the largest mean” (ELM) bias that harms the advertisers by increasing game-theoretic player-regret. We then propose simple bias-correction methods with benefits to both the search engine and the advertisers.
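The abstract's central claim, that adaptively selecting ads with a bandit algorithm skews the resulting value estimates, can be illustrated with a small Monte Carlo sketch. The snippet below is not the paper's method or its bias correction; it simply runs the standard UCB1 rule on two hypothetical ads with assumed click probabilities and compares the final empirical means, and their maximum, against the true values. All parameters (arm means, horizon, number of runs) are assumptions chosen for illustration.

```python
# Illustrative sketch (not the paper's method): under an adaptive allocation
# rule such as UCB1, the per-arm empirical means and the maximum of the
# empirical means can differ systematically from the true means, which is the
# kind of bias the abstract describes. Arm values and horizon are assumptions.
import numpy as np

rng = np.random.default_rng(0)
true_means = np.array([0.5, 0.45])  # assumed click-through rates of two ads
T = 1000                            # rounds per simulated run
runs = 500                          # Monte Carlo repetitions

def ucb1_final_means(true_means, T, rng):
    """Run UCB1 for T rounds and return each arm's final empirical mean."""
    k = len(true_means)
    counts = np.zeros(k)
    sums = np.zeros(k)
    # pull each arm once to initialise the estimates
    for a in range(k):
        sums[a] += rng.random() < true_means[a]
        counts[a] += 1
    for t in range(k, T):
        ucb = sums / counts + np.sqrt(2.0 * np.log(t + 1) / counts)
        a = int(np.argmax(ucb))                 # play the most optimistic arm
        sums[a] += rng.random() < true_means[a]  # Bernoulli click feedback
        counts[a] += 1
    return sums / counts

est = np.array([ucb1_final_means(true_means, T, rng) for _ in range(runs)])
print("true means:             ", true_means)
print("avg empirical means:    ", est.mean(axis=0))       # per-arm estimation bias
print("avg max empirical mean: ", est.max(axis=1).mean())  # plug-in estimate of the largest mean
print("true largest mean:      ", true_means.max())
```

Comparing the printed averages to the true values shows that the empirical means collected under the adaptive policy are not unbiased estimates, motivating the bias-correction methods the paper proposes.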


Sunday December 8, 2013 2:00pm - 6:00pm PST
Harrah's Special Events Center, 2nd Floor
  Posters
  • Poster #Sun08
