Abstract
Feature quantization is a crucial component of efficient large-scale image retrieval and object recognition. By quantizing local features into visual words, one hopes that features that match each other obtain the same word ID, so that similarities between images can be measured via the corresponding histograms of visual words. Because of appearance variations, however, matching local features do not always fall into the same quantization cell, and traditional quantization methods do not take the distribution of matched features into account. In this paper, we investigate how to encode additional prior information on the feature distribution via entropy optimization, leveraging ground-truth correspondence data. We propose a computationally efficient optimization scheme for large-scale vocabulary training. Our experiments suggest that the entropy-optimized vocabulary outperforms unsupervised quantization methods in terms of recall and precision for feature matching. We also demonstrate the advantage of the optimized vocabulary for image retrieval.
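For readers unfamiliar with the pipeline the abstract refers to, the sketch below illustrates a standard bag-of-visual-words setup: local descriptors are quantized to a vocabulary and images are compared via histograms of word IDs. This is background only, not the paper's entropy-optimized training; the use of scikit-learn's KMeans, the function names (`build_vocabulary`, `quantize`, `word_histogram`, `histogram_similarity`), and the random stand-in descriptors are illustrative assumptions.

```python
# Minimal bag-of-visual-words sketch (illustrative only, not the paper's
# entropy-optimized vocabulary): descriptors are assigned to their nearest
# cluster centre ("visual word"), and images are compared via normalized
# histograms of word IDs.
import numpy as np
from sklearn.cluster import KMeans


def build_vocabulary(descriptors, n_words=1000, seed=0):
    """Cluster a sample of local descriptors into a visual vocabulary."""
    kmeans = KMeans(n_clusters=n_words, random_state=seed, n_init=10)
    kmeans.fit(descriptors)
    return kmeans


def quantize(vocabulary, descriptors):
    """Assign each descriptor the ID of its nearest visual word."""
    return vocabulary.predict(descriptors)


def word_histogram(word_ids, n_words):
    """L1-normalized histogram of visual-word occurrences for one image."""
    hist = np.bincount(word_ids, minlength=n_words).astype(float)
    return hist / max(hist.sum(), 1.0)


def histogram_similarity(h1, h2):
    """Histogram intersection: higher means more similar images."""
    return float(np.minimum(h1, h2).sum())


# Usage with random stand-ins for SIFT-like 128-D descriptors:
rng = np.random.default_rng(0)
train_desc = rng.random((5000, 128))
vocab = build_vocabulary(train_desc, n_words=200)
img_a, img_b = rng.random((300, 128)), rng.random((250, 128))
h_a = word_histogram(quantize(vocab, img_a), 200)
h_b = word_histogram(quantize(vocab, img_b), 200)
print("similarity:", histogram_similarity(h_a, h_b))
```

The paper's contribution concerns how the vocabulary itself is trained (using ground-truth correspondences and an entropy objective rather than unsupervised clustering); that step is not reproduced in this sketch.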
| Original language | English |
| --- | --- |
| Title of host publication | Computer Vision Workshops (ICCV Workshops), 2011 IEEE International Conference on |
| Publisher | IEEE - Institute of Electrical and Electronics Engineers Inc. |
| Pages | 1386-1393 |
| Number of pages | 8 |
| ISBN (Print) | 978-1-4673-0062-9 |
| Publication status | Published - 2011 |
| Event | 1st IEEE Workshop on Information Theory in Computer Vision and Pattern Recognition (ICCV 2011) - Barcelona, Spain; duration: 6 Nov 2011 → 13 Nov 2011 |
Conference
| Conference | 1st IEEE Workshop on Information Theory in Computer Vision and Pattern Recognition (ICCV 2011) |
| --- | --- |
| Country/Territory | Spain |
| City | Barcelona |
| Period | 2011/11/06 → 2011/11/13 |
Subject classification (UKÄ)
- Computer Vision and Robotics (Autonomous Systems)
- Mathematics
Free keywords
- visual vocabulary
- entropy optimization