Conference paper
Weighted one-against-all
Alina Beygelzimer, John Langford, et al.
AAAI 2005
We present and analyze an agnostic active learning algorithm that works without keeping a version space. This is unlike all previous approaches where a restricted set of candidate hypotheses is maintained throughout learning, and only hypotheses from this set are ever returned. By avoiding this version space approach, our algorithm sheds the computational burden and brittleness associated with maintaining version spaces, yet still allows for substantial improvements over supervised learning for classification.
Mudhakar Srivatsa, Bong-Jun Ko, et al.
SRDS 2008
Alina Beygelzimer, Chang-Shing Perng, et al.
KDD 2001
John Langford, John Shawe-Taylor
NIPS 2002