Occluded Person Re-identification via Saliency-Guided Patch Transfer


2024 | Lei Tan, Jiaer Xia, Wenfeng Liu, Pingyang Dai, Yongjian Wu, Liujuan Cao
The paper introduces Saliency-Guided Patch Transfer (SPT), a data-driven strategy that improves the robustness of person re-identification (ReID) models against occlusion. SPT leverages real-world occlusion scenarios already present in the training dataset to generate realistic occluded samples: salient patch selection on a vision transformer divides each image into an identity set and an occlusion set, and recombining these sets across images produces high-quality occluded samples. The paper further proposes an Occlusion-Aware Intersection over Union (OIoU) with mask rolling to select the most suitable identity-occlusion pairs, along with a class-ignoring strategy for stable training. Extensive experiments on occluded and holistic ReID benchmarks demonstrate that SPT significantly improves the performance of ViT-based ReID algorithms in occluded scenarios.
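The core augmentation can be pictured as a patch-level cut-and-paste between two training images. The sketch below is a minimal, illustrative approximation under assumptions not spelled out in this summary: per-patch saliency scores (e.g., from class-token attention) split each image into an identity set (high saliency) and an occlusion set (low saliency), and pasting a donor image's occlusion-set patches onto a target image yields a synthetic occluded sample. The function names, the keep ratio, and the exact selection rule are hypothetical, not the paper's specification.

```python
# Minimal sketch of saliency-guided patch transfer (assumed behaviour, not the
# authors' implementation). Random tensors stand in for images and saliency maps.
import torch


def split_patches(saliency: torch.Tensor, keep_ratio: float = 0.5) -> torch.Tensor:
    """Return a boolean patch-grid mask: True = identity set (salient patches)."""
    gh, gw = saliency.shape
    scores = saliency.reshape(-1)
    k = int(keep_ratio * scores.numel())
    idx = scores.argsort(descending=True)
    identity_mask = torch.zeros_like(scores, dtype=torch.bool)
    identity_mask[idx[:k]] = True              # top-k salient patches -> identity set
    return identity_mask.reshape(gh, gw)


def transfer_patches(target: torch.Tensor, donor: torch.Tensor,
                     donor_occ_mask: torch.Tensor, patch: int = 16) -> torch.Tensor:
    """Paste the donor's occlusion-set patches onto the target image (C, H, W)."""
    out = target.clone()
    gh, gw = donor_occ_mask.shape
    for i in range(gh):
        for j in range(gw):
            if donor_occ_mask[i, j]:
                ys, xs = i * patch, j * patch
                out[:, ys:ys + patch, xs:xs + patch] = donor[:, ys:ys + patch, xs:xs + patch]
    return out


if __name__ == "__main__":
    target = torch.rand(3, 256, 128)           # identity image
    donor = torch.rand(3, 256, 128)            # image supplying occlusion patches
    donor_saliency = torch.rand(16, 8)         # per-patch saliency for the donor (16x8 grid)
    donor_identity = split_patches(donor_saliency)
    occluded = transfer_patches(target, donor, ~donor_identity)
    print(occluded.shape)                      # torch.Size([3, 256, 128])
```

In the full method, the OIoU criterion with mask rolling would additionally score candidate identity-occlusion pairings before the transfer, rather than pairing images at random as this toy usage does.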