1. School of Mechanical Engineering, Shenyang Jianzhu University; 2. Key Laboratory of Optical-Electronics Information Processing, Shenyang Institute of Automation, Chinese Academy of Sciences; 3. Excellence XinShiDai Certification Co., Ltd.
Current deep-learning-based infrared and visible image fusion methods usually cannot perceive the salient regions of the source images. As a result, the fused image fails to highlight the typical features of the infrared and visible images, and the ideal fusion effect cannot be achieved. To address these issues, an improved residual dense generative adversarial network structure suitable for infrared and visible image fusion tasks is designed. First, the improved residual dense block is used as the basic network to construct the generator and the discriminator, and a squeeze-and-excitation network based on the attention mechanism is introduced to capture salient features along the channel dimension, adequately preserving the thermal radiation information of infrared images and the texture details of visible images. Second, a relativistic average discriminator is used to measure the relative difference between the fused image and each of the two source images, and this difference guides the generator to preserve the missing source-image information. Finally, experimental results on multiple image fusion datasets, such as TNO, show that the proposed method generates fused images with clear targets and rich details; compared with the fusion method based on the residual network, the edge intensity and average gradient are improved by 64.56% and 64.94%, respectively.
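The squeeze-and-excitation (SE) channel attention mentioned above can be sketched as follows. This is a minimal NumPy illustration of the general SE mechanism (global average pooling, a two-layer bottleneck, and channel rescaling), not the authors' exact network; the weight matrices `w1` and `w2` and the channel layout are assumptions for the sketch.

```python
import numpy as np

def se_block(feature_map, w1, w2):
    """Channel attention via squeeze-and-excitation.

    feature_map: array of shape (channels, height, width)
    w1: bottleneck weights of shape (channels // r, channels)
    w2: expansion weights of shape (channels, channels // r)
    """
    # Squeeze: global average pooling over the spatial dims -> (channels,)
    z = feature_map.mean(axis=(1, 2))
    # Excitation: FC -> ReLU -> FC -> sigmoid, giving per-channel weights in (0, 1)
    s = np.maximum(w1 @ z, 0.0)
    s = 1.0 / (1.0 + np.exp(-(w2 @ s)))
    # Scale: reweight each channel of the input by its attention weight
    return feature_map * s[:, None, None]
```

Channels whose global statistics the excitation stage scores highly are passed through nearly unchanged, while less salient channels are suppressed, which is how the fusion network can emphasize thermal targets or texture detail per channel.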
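The relativistic average discriminator can likewise be sketched in isolation. The sketch below follows the standard relativistic average GAN formulation (the critic score on real samples should exceed the *average* critic score on fake samples, and vice versa); the function names and the use of raw critic scores as inputs are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def _sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relativistic_avg_d_loss(c_real, c_fake):
    """Discriminator loss of a relativistic average GAN.

    c_real, c_fake: 1-D arrays of raw (pre-sigmoid) critic scores.
    """
    d_real = _sigmoid(c_real - c_fake.mean())  # real vs. average fake
    d_fake = _sigmoid(c_fake - c_real.mean())  # fake vs. average real
    return -(np.log(d_real).mean() + np.log(1.0 - d_fake).mean())

def relativistic_avg_g_loss(c_real, c_fake):
    """Generator loss: the roles of real and fake scores are swapped."""
    return relativistic_avg_d_loss(c_fake, c_real)
```

Because the loss is defined on the *relative* difference between the fused image and each source image, the generator is pushed toward whichever source information (thermal radiation or texture) the fused result is currently missing.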