
    Deep Highly Interrelated Hashing for Fast Image Retrieval

    • Abstract: In recent years, with the explosive growth of image data, methods that combine hashing with deep learning have shown excellent performance in large-scale image retrieval. Most mainstream deep supervised hashing methods adopt a pairwise strategy, using label information to build a similarity matrix that constrains the hash codes; this instance-pairwise similarity matrix is an n×n matrix, where n is the number of training samples, so the computational cost of such methods is high and they do not scale to large image collections. To address this, this paper proposes a unary deep supervised hashing method, deep highly interrelated hashing, which appends a hash layer to a convolutional neural network to obtain hash codes and performs fast image retrieval by computing Hamming distances between the resulting low-dimensional codes. In particular, to make the learned hash codes more discriminative, a highly interrelated loss function is proposed to constrain hash-code generation: it adjusts the distances between features by changing the model's sensitivity to the weight matrix, maximizing inter-class distances while minimizing intra-class distances. The method enables fast and accurate large-scale image retrieval and can be applied to a wide range of convolutional neural networks. Extensive experiments on three large-scale public datasets, CIFAR-10, NUS-WIDE, and SVHN, show that its retrieval performance surpasses current mainstream methods.
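
    The retrieval pipeline described above, a convolutional network with an appended hash layer whose binarized outputs are compared by Hamming distance, can be sketched as follows. This is a minimal illustration assuming PyTorch; the ResNet-18 backbone, the 48-bit code length, and all identifiers are assumptions made for the example, not the authors' exact architecture.

```python
# Minimal sketch of a deep hashing retrieval pipeline: a CNN backbone
# with an appended hash layer, sign binarization, and Hamming-distance
# ranking. Backbone, code length, and names are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import models


class DeepHashNet(nn.Module):
    """CNN backbone with a hash layer producing k-bit relaxed codes."""

    def __init__(self, num_bits: int = 48):
        super().__init__()
        backbone = models.resnet18(weights=None)
        backbone.fc = nn.Identity()                 # drop the classifier head
        self.backbone = backbone
        self.hash_layer = nn.Linear(512, num_bits)  # the added hash layer

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # tanh keeps activations in (-1, 1) so that sign binarization
        # at retrieval time loses little information.
        return torch.tanh(self.hash_layer(self.backbone(x)))


def to_binary(codes: torch.Tensor) -> torch.Tensor:
    """Binarize relaxed codes in (-1, 1) to {0, 1} hash codes."""
    return (codes > 0).to(torch.uint8)


def hamming_distances(query: torch.Tensor, database: torch.Tensor) -> torch.Tensor:
    """Hamming distance from one query code to every database code."""
    return (query.unsqueeze(0) != database).sum(dim=1)


if __name__ == "__main__":
    net = DeepHashNet(num_bits=48).eval()
    with torch.no_grad():
        db_codes = to_binary(net(torch.randn(16, 3, 224, 224)))  # database
        q_code = to_binary(net(torch.randn(1, 3, 224, 224)))[0]  # query
    ranking = torch.argsort(hamming_distances(q_code, db_codes))
    print(ranking[:10])  # indices of the 10 nearest database images
```

    Because the codes are short binary vectors, ranking a database reduces to bitwise comparisons rather than floating-point distance computations, which is what makes this kind of retrieval fast at scale.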

       
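    The abstract does not give the formula of the highly interrelated loss, so the following is only one plausible reading of "adjusting feature distances by changing the model's sensitivity to the weight matrix": a normalized-softmax classification loss whose scale factor controls that sensitivity, pulling same-class codes together and pushing different classes apart. The class name, the default scale value, and the formulation are assumptions for illustration, not the paper's loss.

```python
# Hedged sketch of an inter-/intra-class separation loss, assuming
# PyTorch. This is NOT the paper's exact loss; it is a normalized
# softmax with a scale hyperparameter standing in for "sensitivity
# to the weight matrix".
import torch
import torch.nn as nn
import torch.nn.functional as F


class ScaledNormalizedSoftmax(nn.Module):
    def __init__(self, num_bits: int, num_classes: int, scale: float = 16.0):
        super().__init__()
        # One weight vector per class, compared against hash-layer outputs.
        self.weight = nn.Parameter(torch.randn(num_classes, num_bits))
        self.scale = scale  # larger scale -> sharper class separation

    def forward(self, codes: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Cosine similarity between codes and class weights; rescaling the
        # logits changes how strongly the weight matrix shapes the
        # inter-class and intra-class distances of the learned codes.
        logits = self.scale * F.linear(F.normalize(codes), F.normalize(self.weight))
        return F.cross_entropy(logits, labels)


if __name__ == "__main__":
    criterion = ScaledNormalizedSoftmax(num_bits=48, num_classes=10)
    codes = torch.tanh(torch.randn(8, 48, requires_grad=True))
    labels = torch.randint(0, 10, (8,))
    loss = criterion(codes, labels)
    loss.backward()
    print(loss.item())
```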

