Freezing and sleeping: Tracking experts that learn by evolving past posteriors

Authors
Publication date 2009
Host editors
  • M. van Erp
  • H. Stehouwer
  • M. van Zaanen
Book title Benelearn 09: the 18th Annual Belgian-Dutch Conference on Machine Learning: proceedings of the conference
Event 18th Annual Belgian-Dutch Conference on Machine Learning (Benelearn 09), Tilburg, the Netherlands
Pages (from-to) 91-92
Publisher Tilburg: Tilburg centre for Creative Computing (TiCC), Tilburg University
Organisations
  • Interfacultary Research - Institute for Logic, Language and Computation (ILLC)
Abstract
A problem posed by Freund is how to efficiently track a small pool of experts out of a much larger set. This problem was solved when Bousquet and Warmuth introduced their mixing past posteriors (MPP) algorithm in 2001.
In Freund’s problem the experts would normally be considered black boxes. However, in this paper we re-examine Freund’s problem in the case where the experts have internal structure that enables them to learn. In this case the problem has two possible interpretations: should the experts learn from all data, or only from the subsequence on which they are being tracked? The MPP algorithm solves the first case. We generalise MPP to address the second option. Our results apply to any expert structure that can be formalised using (expert) hidden Markov models. Curiously enough, for our interpretation there are two natural reference schemes: freezing and sleeping. For each scheme, we provide an efficient prediction strategy and prove the relevant loss bound.
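To make the tracking setting concrete, the following is a minimal sketch of a mixing-past-posteriors-style weight update in the spirit of Bousquet and Warmuth's MPP algorithm. The specifics here are assumptions for illustration, not the paper's method: binary outcomes under log loss, and a uniform mixing scheme over all past posteriors with a single mixing rate `alpha`.

```python
def mpp_predict(expert_probs, outcomes, alpha=0.05):
    """Sequentially predict P(outcome = 1) by mixing expert forecasts.

    expert_probs: list over rounds of per-expert probabilities for outcome 1.
    outcomes: list of 0/1 outcomes, one per round.
    Returns the algorithm's predictions, one per round.

    Illustrative sketch only: uniform mixing over past posteriors is one
    of several mixing schemes; the choice of scheme governs which pools
    of experts can be tracked efficiently.
    """
    n = len(expert_probs[0])
    weights = [1.0 / n] * n          # uniform prior over experts
    past_posteriors = [weights[:]]   # keep posteriors for later mixing
    predictions = []
    for probs, y in zip(expert_probs, outcomes):
        # Predict with the current weighted mixture of expert forecasts.
        p = sum(w * q for w, q in zip(weights, probs))
        predictions.append(p)
        # Bayesian update: reweight each expert by its likelihood of y.
        likes = [q if y == 1 else 1.0 - q for q in probs]
        post = [w * l for w, l in zip(weights, likes)]
        z = sum(post)
        post = [w / z for w in post]
        past_posteriors.append(post)
        # Mix the fresh posterior with the average of all past posteriors,
        # so an expert that was good earlier can be recovered quickly.
        avg_past = [sum(ws[i] for ws in past_posteriors) / len(past_posteriors)
                    for i in range(n)]
        weights = [(1 - alpha) * post[i] + alpha * avg_past[i]
                   for i in range(n)]
    return predictions
```

Mixing in past posteriors (rather than a uniform distribution, as in plain fixed share) is what lets the algorithm switch back cheaply to an expert it has tracked before, which is the essence of Freund's small-pool problem.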
Document type Conference contribution
Language English
Published at http://benelearn09.uvt.nl/Proceedings_Benelearn_09.pdf