PARTICULAR OBJECT RETRIEVAL WITH INTEGRAL MAX-POOLING OF CNN ACTIVATIONS

24 Feb 2016 | Giorgos Tolias, Ronan Sicre, Hervé Jégou
This paper addresses image retrieval using Convolutional Neural Network (CNN) activations, focusing on particular object retrieval. The authors propose a compact feature vector that encodes multiple image regions without feeding multiple inputs to the network. They extend integral images to handle max-pooling on convolutional layer activations, enabling efficient localization of matching objects. The resulting bounding box is used for image re-ranking, significantly improving the performance of existing CNN-based recognition pipelines. The method outperforms traditional methods on the Oxford5k and Paris6k datasets, demonstrating its effectiveness in both the filtering and re-ranking stages of image retrieval. The contributions include a novel compact feature representation, an efficient localization method, and a simple query expansion technique, which together enhance overall retrieval performance.
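To make the pooling idea concrete, below is a minimal sketch (not the authors' code) of approximate max-pooling over rectangular regions of a convolutional feature map using integral images: the per-channel max is approximated by a high-order power mean, which reduces region pooling to box sums over an integral image of the powered activations. The exponent p, the region boxes, the function names, and the use of NumPy are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def integral_image(x):
    """2D integral image with a zero row/column prepended."""
    ii = np.zeros((x.shape[0] + 1, x.shape[1] + 1), dtype=np.float64)
    ii[1:, 1:] = np.cumsum(np.cumsum(x, axis=0), axis=1)
    return ii

def box_sum(ii, y0, x0, y1, x1):
    """Sum of values inside the box [y0, y1) x [x0, x1) via the integral image."""
    return ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]

def approx_region_max_pool(activations, box, p=10.0):
    """
    Approximate per-channel max-pooling over one rectangular region.
    max(x) is approximated by the power mean (sum x^p / n)^(1/p), so pooling
    any region only needs box sums on integral images of x^p.
    activations: array of shape (C, H, W) from a convolutional layer.
    box: (y0, x0, y1, x1) in feature-map coordinates, exclusive upper bounds.
    """
    y0, x0, y1, x1 = box
    n = (y1 - y0) * (x1 - x0)
    desc = np.empty(activations.shape[0])
    for c, fmap in enumerate(activations):
        ii = integral_image(fmap ** p)   # in practice precomputed once per image
        desc[c] = (box_sum(ii, y0, x0, y1, x1) / n) ** (1.0 / p)
    return desc

# Toy usage: pool two regions of a random activation tensor and aggregate them
# into a single L2-normalized descriptor (sum-aggregation of regional vectors).
acts = np.random.rand(512, 30, 40)            # e.g. last conv layer of a VGG-like net
regions = [(0, 0, 30, 20), (0, 20, 30, 40)]   # illustrative boxes, not the paper's grid
vec = sum(approx_region_max_pool(acts, r) for r in regions)
vec /= np.linalg.norm(vec)
```

Because every region reuses the same precomputed integral images, sliding a candidate box over the feature map to localize the query object (for re-ranking) costs only a few array lookups per box rather than a full pooling pass.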