Revisiting Example Dependent Cost-Sensitive Learning with Decision Trees

Overview

Typical approaches to classification treat class labels as disjoint. For each training example, it is assumed that exactly one class label correctly describes it, and that all other labels are equally bad. We know, however, that this binary notion of good and bad labels is too simplistic in many scenarios and can hurt accuracy. In example dependent cost-sensitive learning, each label is instead a vector representing a data point's affinity for each of the classes. At test time, the goal is not to minimize the misclassification rate, but to maximize that affinity.

We propose a novel example dependent cost-sensitive impurity measure for decision trees. Our multi-class experiments show that this new impurity measure improves test performance while retaining the fast test times of standard classification trees. We compare our classifier to classification trees and other cost-sensitive methods on three classes of computer vision problems (tracking, descriptor matching, and optical flow) and show improvements in all three domains.
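To make the idea concrete, below is a minimal MATLAB sketch of how per-example cost vectors can drive a tree split. This is an illustrative, generic cost-sensitive criterion, not the impurity measure proposed in the paper, and the function names (costSensitiveSplit, nodeCost) are hypothetical.

    % Illustrative sketch only; NOT the paper's proposed impurity measure.
    % costs is an N x K matrix: costs(i,k) is the cost of predicting class k
    % for example i (low cost = high affinity).
    function [bestFeat, bestThresh] = costSensitiveSplit(X, costs)
        [N, D] = size(X);
        bestGain = -inf; bestFeat = 1; bestThresh = X(1,1);
        parentCost = nodeCost(costs, true(N,1));
        for d = 1:D
            for t = unique(X(:,d))'               % candidate thresholds
                left = X(:,d) <= t;
                if ~any(left) || all(left), continue; end
                childCost = nodeCost(costs, left) + nodeCost(costs, ~left);
                gain = parentCost - childCost;    % cost reduction of the split
                if gain > bestGain
                    bestGain = gain; bestFeat = d; bestThresh = t;
                end
            end
        end
    end

    function c = nodeCost(costs, idx)
        % Cheapest single-label assignment for the node: sum each class's
        % per-example costs over the node, then take the minimum over classes.
        c = min(sum(costs(idx,:), 1));
    end

Note that when each cost vector is one-hot (zero for the true class, one elsewhere), nodeCost reduces to a standard misclassification count, so the ordinary classification tree falls out as a special case.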
Publications

Code

MATLAB source code: V1.0 available here.

Contact: omacaodh (@) cs.ucl.ac.uk