Wasserstein learning of deep generative point process models
Shuai Xiao, Mehrdad Farajtabar, et al.
NeurIPS 2017
In this brief, we propose a novel multilabel learning framework, called multilabel self-paced learning, which incorporates the self-paced learning (SPL) scheme into the multilabel learning regime. Specifically, we first propose a new multilabel learning formulation that introduces a self-paced function as a regularizer, so as to simultaneously prioritize label learning tasks and training instances in each iteration. Because different multilabel learning scenarios often call for different self-paced schemes, we also provide a general way to derive the desired self-paced functions. To the best of our knowledge, this is the first work to study multilabel learning by jointly taking into account the complexities of both training instances and labels. Experimental results on four publicly available data sets demonstrate the effectiveness of our approach compared with state-of-the-art methods.
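As a rough sketch of the kind of objective the abstract describes (the notation below is illustrative and not taken from the paper): for n training instances x_i with labels y_{il}, l = 1, ..., L, self-paced multilabel learning can be posed as a joint minimization over the model parameters w and instance-label weights v_{il} in [0, 1],

\min_{w,\; v \in [0,1]^{n \times L}} \;\; \sum_{i=1}^{n} \sum_{l=1}^{L} v_{il}\, \ell\big(y_{il}, f_l(x_i; w)\big) \;+\; g(v; \lambda),

where \ell is a per-label loss, f_l is the predictor for label l, and g(v; \lambda) is a self-paced regularizer. Under the classical hard-weighting choice g(v; \lambda) = -\lambda \sum_{i,l} v_{il}, the optimal weights select only instance-label pairs whose loss falls below \lambda, and \lambda is gradually increased so that progressively harder instances and labels are admitted as training proceeds.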