Aurélie C. Lozano, Naoki Abe, et al.
KDD 2009
Margin-maximizing properties play an important role in the analysis of classification models such as boosting and support vector machines. Margin maximization is theoretically interesting because it facilitates generalization error analysis, and practically interesting because it offers a clear geometric interpretation of the models being built. We formulate and prove a sufficient condition for the solutions of regularized loss functions to converge to margin-maximizing separators as the regularization vanishes. This condition covers the hinge loss of SVMs, the exponential loss of AdaBoost, and the logistic regression loss. We also generalize it to multi-class classification problems and present margin-maximizing multi-class versions of logistic regression and support vector machines.
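The convergence result summarized above can be sketched compactly. The following display is a minimal sketch under assumed notation not given in the listing itself: linearly separable data pairs (x_i, y_i) with y_i ∈ {−1, +1}, a monotone loss L, and an l_p-norm penalty with regularization weight λ.

\[
  \hat{w}(\lambda) \;=\; \arg\min_{w}\; \sum_{i=1}^{n} L\bigl(y_i\, w^\top x_i\bigr) \;+\; \lambda\, \lVert w \rVert_p^p ,
\]
\[
  \lim_{\lambda \to 0}\; \frac{\hat{w}(\lambda)}{\lVert \hat{w}(\lambda) \rVert_p}
  \;=\; \arg\max_{\lVert w \rVert_p = 1}\; \min_{i}\; y_i\, w^\top x_i ,
\]

That is, when L satisfies the sufficient condition (as the hinge, exponential, and logistic losses do), the normalized regularized solutions converge to the l_p-margin-maximizing separator as the regularization vanishes.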
Daniela Pucci De Farias, Nimrod Megiddo
NeurIPS 2003
Eyal Kishon, Trevor Hastie, et al.
Journal of Robotic Systems
Saharon Rosset
KDD 2005