- A Lazy Man's Approach to Benchmarking: Semisupervised Classifier Evaluation and Recalibration
- IEEE Conference on Computer Vision and Pattern Recognition: CVPR 2013
- Book/source title: Proceedings: 2013 IEEE Conference on Computer Vision and Pattern Recognition: CVPR 2013: 23-28 June 2013, Portland, Oregon, USA
- Los Alamitos, CA: IEEE Computer Society Conference Publishing Services
- Document type: Conference contribution
- Faculty of Science (FNWI)
- Informatics Institute (IVI)
How many labeled examples are needed to estimate a classifier's performance on a new dataset? We study the case where data is plentiful but labels are expensive. We show that by making a few reasonable assumptions about the structure of the data, it is possible to estimate performance curves, with confidence bounds, from a small number of ground-truth labels. Our approach, which we call Semisupervised Performance Evaluation (SPE), is based on a generative model for the classifier's confidence scores. In addition to estimating the performance of classifiers on new datasets, SPE can be used to recalibrate a classifier by re-estimating the class-conditional confidence distributions.
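The core idea can be illustrated with a minimal sketch (not the authors' code, and simpler than their model): treat the classifier's confidence scores as a two-component mixture, one component per class, here assumed Gaussian for simplicity. A handful of labeled scores anchors the components, the plentiful unlabeled scores refine the fit via EM, and the fitted class-conditional distributions then yield a performance estimate without labeling the whole dataset. All names and the synthetic data below are illustrative.

```python
import math
import random

def _pdf(x, mu, sigma):
    """Gaussian density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def fit_two_class_mixture(labeled, unlabeled, iters=100):
    """EM for a two-component Gaussian mixture over confidence scores.
    `labeled` is a list of (score, y) pairs with y in {0, 1}; the
    responsibilities of labeled scores are clamped to the known label."""
    s0 = [s for s, y in labeled if y == 0]
    s1 = [s for s, y in labeled if y == 1]
    # Initialise each component from the few labeled scores.
    mu = [sum(s0) / len(s0), sum(s1) / len(s1)]
    sig = [1.0, 1.0]
    pi = [0.5, 0.5]
    xs = [s for s, _ in labeled] + list(unlabeled)
    fixed = {i: y for i, (_, y) in enumerate(labeled)}  # clamped points
    for _ in range(iters):
        # E-step: responsibility of the positive component for each score.
        r = []
        for i, x in enumerate(xs):
            if i in fixed:
                r.append(float(fixed[i]))
            else:
                p0 = pi[0] * _pdf(x, mu[0], sig[0])
                p1 = pi[1] * _pdf(x, mu[1], sig[1])
                r.append(p1 / (p0 + p1))
        # M-step: re-estimate mixing weights, means, standard deviations.
        n1 = sum(r)
        n0 = len(xs) - n1
        pi = [n0 / len(xs), n1 / len(xs)]
        mu = [sum((1 - ri) * x for ri, x in zip(r, xs)) / n0,
              sum(ri * x for ri, x in zip(r, xs)) / n1]
        sig = [max(1e-3, math.sqrt(sum((1 - ri) * (x - mu[0]) ** 2 for ri, x in zip(r, xs)) / n0)),
               max(1e-3, math.sqrt(sum(ri * (x - mu[1]) ** 2 for ri, x in zip(r, xs)) / n1))]
    return pi, mu, sig

def estimated_accuracy(pi, mu, sig, thresh=0.0):
    """Accuracy of thresholding scores at `thresh`, estimated from the
    fitted mixture via the Gaussian CDF (no extra labels needed)."""
    def cdf(x, m, s):
        return 0.5 * (1.0 + math.erf((x - m) / (s * math.sqrt(2.0))))
    # Negatives correctly below the threshold + positives correctly above it.
    return pi[0] * cdf(thresh, mu[0], sig[0]) + pi[1] * (1.0 - cdf(thresh, mu[1], sig[1]))

random.seed(0)
# Synthetic confidence scores: negatives around -1, positives around +1.
unlabeled = ([random.gauss(-1, 0.5) for _ in range(500)] +
             [random.gauss(+1, 0.5) for _ in range(500)])
labeled = ([(random.gauss(-1, 0.5), 0) for _ in range(5)] +
           [(random.gauss(+1, 0.5), 1) for _ in range(5)])
pi, mu, sig = fit_two_class_mixture(labeled, unlabeled)
acc = estimated_accuracy(pi, mu, sig, thresh=0.0)
```

Here only ten labels are used, yet the mixture is fit on a thousand scores; the same fitted class-conditional distributions could also serve the recalibration use mentioned in the abstract.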