Optimizing the Classification Cost using SVMs with a Double Hinge Loss
This work presents a binary classifier with a double hinge loss. Such classifiers have the option to reject observations when the cost of rejection is lower than that of misclassification. To train the classifier, the standard SVM optimization problem is modified to minimize a double hinge loss, a convex surrogate loss function. The impact of the classifier is illustrated through results obtained on artificial data and medical data.
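The abstract does not write out the double hinge loss itself. A minimal sketch of one common formulation, the generalized hinge of Bartlett and Wegkamp, is given below; the rejection-cost parameter d and the slope (1-d)/d on the negative side are assumptions taken from that formulation, not details stated in this abstract.

```python
import numpy as np

def double_hinge(z, d=0.25):
    """Illustrative double hinge loss (Bartlett-Wegkamp form, an
    assumption here, not the abstract's exact definition).

    z : signed margin y * f(x)
    d : rejection cost, assumed in (0, 1/2)

    Piecewise linear: slope -(1-d)/d for z < 0 (penalizing confident
    errors more), slope -1 on [0, 1), and zero for z >= 1. Written
    compactly as the max of three affine pieces.
    """
    a = (1.0 - d) / d  # steeper slope below zero
    return np.maximum.reduce([np.zeros_like(z), 1.0 - z, 1.0 - a * z])
```

Because the loss is a pointwise maximum of affine functions, it is convex, which is what makes it usable as a surrogate in the modified SVM objective described above.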
This work is licensed under a Creative Commons Attribution 3.0 License.