Multi-feature person re-identification based on cross-attention mechanism

Authors: WU Xinyi, DENG Zhiliang, LIU Yunping, DONG Juan, LI Jiaqi

CLC number: TP391.4

Funding: National Natural Science Foundation of China (51875293); National Key Research and Development Program of China (2018YFC1405703)
Abstract:

Existing person re-identification (Re-ID) methods often struggle with inaccurate feature extraction and with environmental noise being mistaken for person features. To address this, we propose a multi-feature fusion branch network for person Re-ID based on dynamic convolution and attention mechanisms. First, considering uncertainties during capture such as illumination changes, human posture variation, and object occlusion, dynamic convolution is used to replace the static convolution in ResNet50, yielding a more robust Dy-ResNet50 model. Second, given the large differences in camera viewpoint and the likelihood of pedestrians being occluded by objects, self-attention and cross-attention mechanisms are embedded into the backbone network. Finally, the cross-entropy loss and the hard-sample triplet loss are jointly used as the model's loss function, and experiments are carried out on the public DukeMTMC-ReID, Market-1501, and MSMT17 datasets, with comparisons against mainstream network models. The results show that the proposed model outperforms current mainstream models in Rank-1 (first hit) accuracy and mAP (mean average precision) on all three datasets, indicating high identification accuracy.
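
The first step, dynamic convolution, replaces each static convolution in ResNet50 with an input-conditioned mixture of K parallel kernels. Below is a minimal PyTorch sketch of one common formulation (attention over convolution kernels); the kernel count K, the softmax temperature, and the pooling-based attention head are assumptions, since the abstract does not specify the exact Dy-ResNet50 layer design.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class DynamicConv2d(nn.Module):
        """Input-conditioned mixture of K convolution kernels (a sketch;
        the paper's Dy-ResNet50 layer may differ in detail)."""
        def __init__(self, in_ch, out_ch, kernel_size, stride=1, padding=0,
                     K=4, temperature=30.0):
            super().__init__()
            self.stride, self.padding, self.temperature = stride, padding, temperature
            # K candidate kernels stored as one tensor: (K, out_ch, in_ch, k, k)
            self.weight = nn.Parameter(
                0.02 * torch.randn(K, out_ch, in_ch, kernel_size, kernel_size))
            # Lightweight attention head producing one weight per kernel
            self.attn = nn.Sequential(
                nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(in_ch, K))

        def forward(self, x):
            B, C, H, W = x.shape
            # Per-sample softmax weights over the K kernels
            pi = F.softmax(self.attn(x) / self.temperature, dim=1)      # (B, K)
            K_, O, I, kh, kw = self.weight.shape
            # Mix the kernels per sample, then run one grouped conv (group = sample)
            w = (pi @ self.weight.view(K_, -1)).view(B * O, I, kh, kw)
            out = F.conv2d(x.reshape(1, B * C, H, W), w,
                           stride=self.stride, padding=self.padding, groups=B)
            return out.view(B, O, out.shape[-2], out.shape[-1])

    # Usage: drop-in replacement for a 3x3 static conv inside a ResNet block
    layer = DynamicConv2d(64, 64, kernel_size=3, padding=1)
    y = layer(torch.randn(2, 64, 32, 32))   # -> (2, 64, 32, 32)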

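The second step embeds self-attention and cross-attention into the backbone. The sketch below shows a cross-attention block in which queries from one feature branch attend to keys/values from another; which branches feed each side is an assumption, as the abstract leaves the wiring unspecified. Self-attention is the special case where both inputs are the same feature map.

    import torch
    import torch.nn as nn

    class CrossAttention(nn.Module):
        """Queries from feat_q attend to keys/values from feat_kv
        (illustrative pairing; dim must be divisible by num_heads)."""
        def __init__(self, dim, num_heads=8):
            super().__init__()
            self.mha = nn.MultiheadAttention(dim, num_heads, batch_first=True)
            self.norm_q = nn.LayerNorm(dim)
            self.norm_kv = nn.LayerNorm(dim)

        def forward(self, feat_q, feat_kv):
            # feat_q, feat_kv: (B, C, H, W) feature maps from two branches
            B, C, H, W = feat_q.shape
            q = self.norm_q(feat_q.flatten(2).transpose(1, 2))     # (B, HW, C)
            kv = self.norm_kv(feat_kv.flatten(2).transpose(1, 2))  # (B, H'W', C)
            out, _ = self.mha(q, kv, kv)                           # cross-attention
            out = out + q                                          # residual connection
            return out.transpose(1, 2).view(B, C, H, W)

    # Self-attention as the degenerate case: attend a branch to itself
    block = CrossAttention(dim=256)
    f = torch.randn(2, 256, 16, 8)
    same = block(f, f)                              # self-attention
    cross = block(f, torch.randn(2, 256, 16, 8))    # cross-branch attention
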
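The final step combines two objectives: cross-entropy loss for identity classification and a hard-sample (batch-hard) triplet loss for metric learning. The sketch below shows the standard batch-hard formulation; the margin value and the equal weighting of the two terms are assumptions not stated in the abstract.

    import torch
    import torch.nn.functional as F

    def batch_hard_triplet_loss(embeddings, labels, margin=0.3):
        """For each anchor, use the hardest positive (farthest same-ID sample)
        and hardest negative (closest different-ID sample) in the mini-batch.
        Assumes PK sampling, i.e., every batch holds several identities."""
        dist = torch.cdist(embeddings, embeddings, p=2)      # (B, B) pairwise L2
        same = labels.unsqueeze(0) == labels.unsqueeze(1)    # (B, B) same-ID mask
        d_ap = (dist * same.float()).max(dim=1).values       # hardest positive
        d_an = dist.masked_fill(same, float("inf")).min(dim=1).values  # hardest negative
        return F.relu(d_ap - d_an + margin).mean()

    # Joint objective (equal weighting assumed):
    # loss = F.cross_entropy(logits, labels) + batch_hard_triplet_loss(feats, labels)
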
Cite this article:

WU Xinyi, DENG Zhiliang, LIU Yunping, DONG Juan, LI Jiaqi. Multi-feature person re-identification based on cross-attention mechanism[J]. Journal of Nanjing University of Information Science & Technology (Natural Science Edition), 2024, 16(4): 461-471.

History:
  • Received: 2023-11-13
  • Published online: 2024-08-07
  • Published in print: 2024-07-28
