Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields

14 Apr 2017 | Zhe Cao, Tomas Simon, Shih-En Wei, Yaser Sheikh
The paper presents an efficient method for real-time multi-person 2D pose estimation using Part Affinity Fields (PAFs), a non-parametric representation that learns to associate detected body parts with the individuals they belong to in an image. The architecture encodes global context, allowing a greedy bottom-up parsing step that maintains high accuracy while achieving real-time performance regardless of the number of people in the image. Part locations and their pairwise associations are learned jointly through two branches of a sequential prediction process. The method placed first in the COCO 2016 keypoints challenge and significantly outperformed the previous state of the art on the MPII Multi-Person benchmark in both accuracy and efficiency. The paper also discusses the core challenges of multi-person pose estimation, such as spatial interference between overlapping people and runtime that grows with the number of people in top-down approaches, and shows how the proposed bottom-up method addresses them.
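To make the association step concrete, below is a minimal Python sketch of the idea behind PAF-based limb scoring and greedy matching. It is not the authors' released implementation: the function names (`paf_score`, `greedy_match`), the array layout, and the sampling/threshold parameters are illustrative assumptions. The core idea it reproduces is the paper's one: score a candidate limb by the line integral of the predicted vector field along the segment between two joint candidates, then assign pairs greedily in order of score.

```python
import numpy as np

def paf_score(paf, p1, p2, n_samples=10):
    """Approximate the PAF line integral along the segment p1 -> p2.

    `paf` is assumed to be an H x W x 2 array: the predicted 2D vector
    field for one limb type (e.g. neck -> right shoulder). `p1`, `p2`
    are (x, y) joint candidates extracted from the confidence maps and
    assumed to lie inside the image.
    """
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    d = p2 - p1
    norm = np.linalg.norm(d)
    if norm < 1e-6:
        return 0.0
    u = d / norm  # unit vector along the candidate limb
    scores = []
    # Sample the field at evenly spaced points between the two joints
    # and measure how well it aligns with the limb direction.
    for t in np.linspace(0.0, 1.0, n_samples):
        x, y = (p1 + t * d).round().astype(int)
        scores.append(paf[y, x] @ u)
    return float(np.mean(scores))

def greedy_match(paf, src_peaks, dst_peaks, thresh=0.05):
    """Greedy bipartite assignment: take best-scoring pairs first,
    never reusing a joint candidate on either side."""
    cands = sorted(
        ((paf_score(paf, a, b), i, j)
         for i, a in enumerate(src_peaks)
         for j, b in enumerate(dst_peaks)),
        reverse=True,
    )
    used_i, used_j, limbs = set(), set(), []
    for s, i, j in cands:
        if s > thresh and i not in used_i and j not in used_j:
            used_i.add(i)
            used_j.add(j)
            limbs.append((i, j, s))
    return limbs
```

In the full method this matching is run once per limb type, and the matched limbs are then stitched into whole-body skeletons; because each limb type is resolved independently and greedily rather than by global optimization, the parsing cost stays nearly constant as the number of people grows, which is what enables the real-time performance the summary describes.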