Revisiting Example Dependent Cost-Sensitive Learning with Decision Trees



Overview

Typical approaches to classification treat class labels as disjoint: for each training example, it is assumed that exactly one class label correctly describes it, and that all other labels are equally bad. We know, however, that this binary notion of good and bad labels is too simplistic in many scenarios and can hurt accuracy. In the promising realm of example dependent cost-sensitive learning, each label is instead a vector representing a data point's affinity for each of the classes. At test time, our goal is not to minimize the misclassification rate, but to maximize that affinity.
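
To make this concrete, below is a minimal MATLAB sketch with made-up numbers (illustrative only, not taken from the paper or its released code) contrasting the standard 0/1 view with the affinity view for a single test example:

    % Illustrative sketch with hypothetical values (not the released code).
    % Standard classification: one "true" label, and every mistake is
    % equally bad. Example dependent cost-sensitive view: each example has
    % its own affinity for every class, and a prediction earns that affinity.
    a    = [0.10 0.65 0.25];           % this example's affinity for 3 classes
    yHat = 3;                          % some classifier's prediction
    [~, yStar] = max(a);               % the single "best" label (class 2)
    loss01   = double(yHat ~= yStar);  % 0/1 loss: 1, a full error
    affinity = a(yHat);                % affinity earned: 0.25, partial credit

Predicting class 3 here counts as a complete mistake under 0/1 loss, but under the affinity view it still earns 0.25, reflecting that class 3 is nearly as suitable as class 2.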

We propose a novel example dependent cost-sensitive impurity measure for decision trees. Our multi-class experiments show that this new impurity measure improves test performance while retaining the fast test times of standard classification trees. We compare our classifier to classification trees and other cost-sensitive methods on three classes of computer vision problems: tracking, descriptor matching, and optical flow, showing improvements in all three domains.
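
As a rough illustration of how such an impurity measure could look, the sketch below replaces the per-class example counts used by the standard entropy impurity with accumulated affinity mass. This is an assumed stand-in written for exposition, not the exact measure from the paper:

    % Hedged sketch of an affinity-weighted entropy impurity (an assumption
    % for illustration; the paper's actual measure may differ). A is an
    % n-by-k matrix where A(i,c) is example i's affinity for class c among
    % the n examples reaching the current tree node.
    function I = affinityImpurity(A)
        m = sum(A, 1);            % total affinity mass per class at this node
        q = m / sum(m);           % affinity-weighted class distribution
        q = q(q > 0);             % drop empty classes to avoid log2(0)
        I = -sum(q .* log2(q));   % entropy of that distribution
    end

Under any impurity of this kind, splits are still chosen to maximize the impurity decrease, exactly as in a standard classification tree; the learned tree is an ordinary decision tree, which is why the fast test times are retained.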


Publications

Revisiting Example Dependent Cost-Sensitive Learning with Decision Trees
Oisin Mac Aodha and Gabriel J. Brostow
ICCV 2013
[paper] [supp] [bibtex]


Code

MATLAB source code: v1.0 available here.


Contact: omacaodh (@) cs.ucl.ac.uk