Tree quantization for large-scale similarity search and classification

Artem Babenko, Victor Lempitsky

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    70 Citations (Scopus)

    Abstract

    We propose a new vector encoding scheme (tree quantization) that obtains lossy compact codes for high-dimensional vectors via tree-based dynamic programming. Similarly to several previous schemes such as product quantization, these codes correspond to codeword numbers within multiple codebooks. We propose an integer programming-based optimization that jointly recovers the coding tree structure and the codebooks by minimizing the compression error on a training dataset. In the experiments with diverse visual descriptors (SIFT, neural codes, Fisher vectors), tree quantization is shown to combine fast encoding and state-of-the-art accuracy in terms of the compression error, the retrieval performance, and the image classification error.
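    The abstract notes that, like product quantization, the codes are codeword numbers within multiple codebooks. The following is a minimal, hypothetical sketch of that baseline product-quantization encoding (not the paper's tree quantization itself): a D-dimensional vector is split into M subvectors, and each subvector is encoded as the index of its nearest codeword in a per-subspace codebook. All sizes and names here are illustrative; real codebooks would be learned, e.g. by k-means on training data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    D, M, K = 8, 4, 16        # vector dim, number of codebooks, codewords per codebook
    d = D // M                # subvector dimensionality

    # Toy codebooks; in practice these are learned (e.g. k-means per subspace).
    codebooks = rng.normal(size=(M, K, d))

    def pq_encode(x):
        """Return M codeword indices, one per subvector."""
        code = np.empty(M, dtype=np.int64)
        for m in range(M):
            sub = x[m * d:(m + 1) * d]
            dists = np.linalg.norm(codebooks[m] - sub, axis=1)
            code[m] = np.argmin(dists)
        return code

    def pq_decode(code):
        """Reconstruct the (lossy) vector from its compact code."""
        return np.concatenate([codebooks[m, code[m]] for m in range(M)])

    x = rng.normal(size=D)
    code = pq_encode(x)       # M small integers: the compact code
    x_hat = pq_decode(code)   # lossy reconstruction of x
    ```

    Tree quantization generalizes this idea by additionally learning, via integer programming, a coding tree structure over dimensions jointly with the codebooks, so as to minimize the compression error on a training set.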

    Original language: English
    Title of host publication: IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2015
    Publisher: IEEE Computer Society
    Pages: 4240-4248
    Number of pages: 9
    ISBN (Electronic): 9781467369640
    DOIs
    Publication status: Published - 14 Oct 2015
    Event: IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2015 - Boston, United States
    Duration: 7 Jun 2015 – 12 Jun 2015

    Publication series

    Name: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
    Volume: 07-12-June-2015
    ISSN (Print): 1063-6919

    Conference

    Conference: IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2015
    Country/Territory: United States
    City: Boston
    Period: 7/06/15 – 12/06/15

