ATOM: Accurate Tracking by Overlap Maximization


11 Apr 2019 | Martin Danelljan*, Goutam Bhat*, Fahad Shahbaz Khan, Michael Felsberg
The paper "ATOM: Accurate Tracking by Overlap Maximization" addresses the limitations of current visual tracking methods, which often rely on simple multi-scale search for target bounding box estimation, leading to suboptimal accuracy. The authors propose a novel tracking architecture that includes dedicated target estimation and classification components. The target estimation component is trained offline to predict the overlap between the target object and an estimated bounding box, incorporating high-level knowledge about the object through extensive offline learning. The classification component is trained online to ensure robust discrimination against distractors. The final tracking framework sets a new state-of-the-art on five challenging benchmarks, achieving a 15% relative gain over the previous best approach on the TrackingNet dataset while running at over 30 FPS. The code and models are available at <https://github.com/visionml/pytracking>.The paper "ATOM: Accurate Tracking by Overlap Maximization" addresses the limitations of current visual tracking methods, which often rely on simple multi-scale search for target bounding box estimation, leading to suboptimal accuracy. The authors propose a novel tracking architecture that includes dedicated target estimation and classification components. The target estimation component is trained offline to predict the overlap between the target object and an estimated bounding box, incorporating high-level knowledge about the object through extensive offline learning. The classification component is trained online to ensure robust discrimination against distractors. The final tracking framework sets a new state-of-the-art on five challenging benchmarks, achieving a 15% relative gain over the previous best approach on the TrackingNet dataset while running at over 30 FPS. The code and models are available at <https://github.com/visionml/pytracking>.