Dense Adaptive Self-Correlation (DASC) Descriptor


Seungryong Kim1
Dongbo Min2
Bumsub Ham3
Minh N. Do4
Kwanghoon Sohn3
Korea University1, Ewha Womans University2, Yonsei University3, UIUC4

[CVPR'15 paper]
[TPAMI'17 paper]
[TPAMI'17 supp]
[Code]
[Data]



Challenging multi-modal and multi-spectral image pairs, from top to bottom: RGB-NIR, flash/no-flash, different-exposure, and blur-sharp images. The third and fourth columns show the results of warping the images in the second column to those in the first column using dense correspondence maps estimated with DAISY and with our DASC descriptor, respectively.


Establishing dense correspondences between multiple images is a fundamental task in many applications. However, finding reliable correspondences between multi-modal or multi-spectral images remains unsolved due to their challenging photometric and geometric variations. In this paper, we propose a novel dense descriptor, called dense adaptive self-correlation (DASC), to estimate dense multi-modal and multi-spectral correspondences. Based on the observation that the self-similarity existing within an image is robust to imaging modality variations, we define the descriptor as a series of adaptive self-correlation measures between patches sampled by randomized receptive field pooling, where the sampling pattern is obtained via discriminative learning. The computational redundancy of dense description is dramatically reduced by applying fast edge-aware filtering. Furthermore, to address geometric variations including scale and rotation, we propose a geometry-invariant DASC (GI-DASC) descriptor that effectively leverages the DASC through a superpixel-based representation. For a quantitative evaluation of the GI-DASC, we build a novel multi-modal benchmark with varying photometric and geometric conditions. Experimental results demonstrate the outstanding performance of the DASC and GI-DASC in many cases of dense multi-modal and multi-spectral correspondence.
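The core idea can be sketched in a few lines: for each pixel, sample pairs of patches inside a local support window and describe the pixel by the correlations between the patches in each pair. The following is a deliberately simplified, hypothetical sketch — it uses a purely random sampling pattern instead of the learned pattern, computes one descriptor at a time rather than densely, and omits the edge-aware filtering and the geometric invariance of GI-DASC; the function name and parameters are illustrative, not the authors' API.

```python
import numpy as np

def dasc_like_descriptor(img, y, x, num_pairs=64, win=15, patch=5, seed=0):
    """Simplified sketch of a DASC-style self-similarity descriptor.

    For the pixel (y, x), sample `num_pairs` random patch pairs inside a
    local support window of size `win` and measure the normalized
    cross-correlation of each pair. (y, x) must lie at least win // 2
    pixels from the image border. The actual DASC uses a learned sampling
    pattern and fast edge-aware filtering; this sketch is random and local.
    """
    rng = np.random.default_rng(seed)
    half_w, half_p = win // 2, patch // 2

    def patch_at(cy, cx):
        return img[cy - half_p:cy + half_p + 1,
                   cx - half_p:cx + half_p + 1].astype(np.float64)

    desc = np.empty(num_pairs)
    for i in range(num_pairs):
        # Random center offsets for the two patches, kept inside the window.
        dy1, dx1, dy2, dx2 = rng.integers(-half_w + half_p,
                                          half_w - half_p + 1, size=4)
        p1 = patch_at(y + dy1, x + dx1)
        p2 = patch_at(y + dy2, x + dx2)
        # Zero-mean normalized cross-correlation between the two patches.
        p1, p2 = p1 - p1.mean(), p2 - p2.mean()
        denom = np.sqrt((p1 ** 2).sum() * (p2 ** 2).sum())
        ncc = (p1 * p2).sum() / denom if denom > 1e-12 else 0.0
        # Keep only positive correlations: self-similarity descriptors
        # typically encode structural agreement, not anti-correlation.
        desc[i] = max(ncc, 0.0)
    return desc
```

Because each entry compares two patches *within the same image*, the descriptor captures internal structure rather than absolute intensities, which is why it tolerates modality changes (e.g. RGB vs. NIR) that break direct intensity- or gradient-based matching.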